[if AI was] removed once discovered, I don’t see any problem
The problem was that it was sold to consumers at all without consent. You don't get off scot-free when you accidentally leave some cocaine for the in-laws to find. There was malicious intent just in not disclosing its usage.
OpenOrca
You bring this up and imply we're agreeing here, but I find it odd that you immediately backtrack and say that AI usage in general, not OpenOrca usage, is a-okay. It's entirely irrelevant what AI tool you use if the company didn't disclose it.
Can I blame someone for using placeholder texture [using] AI
Yes. A placeholder texture should be obviously a placeholder. As soon as it isn't, it has failed at its job. By using AI like this, you're effectively making QA's job ten times harder, since now they have to scrutinize every texture to make sure it isn't AI generated.
left and right creating drama [about CO]
Yes, because the information came out on a large platform, which allowed far more people to hear about it than when the game was initially released.

I actually played through Clair Obscur about two months ago and gave it a very hearty review on Steam, but as soon as I heard that they used AI without disclosing it, that changed into a very charged negative thumbs-down. It's very easy to pretend that people just hate things because they're popular; you see it all the time with YouTubers and movies, after all. Unfortunately, there's usually a very good reason behind what irks these people.
The vast majority was made by hand, and the game is good
You seem to have missed my main point, and it's not just here, either. You alluded to this frame of mind multiple times across these past two comments. My entire argument is that it was incredibly scummy not to disclose the usage of AI, robbing the buyer of any agency in the matter.
Me responding to your other points is really just entertaining them rather than engaging in a thoughtful discussion, because the only response you had to this main point was along the lines of the quote I'm replying to above, which is really just moving the goalposts.

By acting like it doesn't matter because everything else is good, you're kind of weirdly conceding the point: if it didn't matter that they did it, then why wouldn't they disclose it?