Man, this is one legal mess we're going to have to iron out as a society. I see both sides: obviously a creator doesn't want their work used in a way they don't approve of…on the other hand, we severely limit AI development if we can't draw on the collective work of society as a whole. And policing it may be a LOT harder than people realize…taking that too far, while it protects authors and creatives, may ultimately mean falling behind competitor countries in this area.
For games, at least it kind of makes sense to want to use a model that doesn’t have things trained from libraries or television/movies. You don’t want to be talking to an NPC in a Star Wars game that keeps referencing Harry Potter as an example lol…might be a little immersion breaking haha.
But also, AI usage could bring development a step forward. Indie devs may be able to produce AAA-quality experiences on their normal budget, and likewise hobbyists may be able to create indie-level games.
I could see AI potentially marrying a lot of entertainment silos in the future. We may move beyond movies, TV shows, and gaming into more collective "experiences" that combine the best aspects of all of these mediums.
Idk what the answer is but it’s going to be interesting to see how it plays out.
It really just requires a single step of indirection. Instead of indie dev using AI directly, they pay Joe's Asset Shack for their assets which may or may not be generated.
Can you explain how that seems logical? It makes it impossible for anyone but the mega-rich to use. AAA developers alone will be able to reap the benefits of generative AI and outcompete indie devs who can't afford models that meet these ridiculous restrictions.
It'll prevent indie artists from having their work plagiarized over and over without payment from indie "devs" who honestly shouldn't have the right to exist as "developers" if they can't afford to actually hire artists and such.
It'd be one thing if they made an agreement to get assets from artists for cheap or for free as a favor, but just plain putting them all out of business permanently by letting a machine steal their work forever is another thing entirely.
At this point it feels like Tim Sweeney is a generative AI which has been exclusively trained on taking a data set from Steam's and Gabe's decisions and inverting them. And that's it.
I'm with Tim Sweeney here - why restrict creativity with arbitrary restrictions like that? We already have some amazing 1-person games; how many more would we have with this immense productivity boost? I'm excited for more games even if that means more trash out there. I have the brain power to sift through it if it means another Stardew Valley.
Because it is copyright laundering, which is illegal.
We are just too early in the tech for that to be established. But see the open cases against Microsoft's Copilot.
I'm surprised people here on an open source, free software project are defending copyright so fiercely. AI is learning, not copying, and even if you disagree - fuck copyright and fuck protectionism. There's so much shit to do in this world and we're back to "looms will end the world" nonsense. The propaganda machine is rolling hard on this one.
The problem is more that generative AI is trained on the actual work done by other, actual people. And so far we have no legal framework for how those people should get paid in turn.
Plus, let's not for a moment imagine that Sweeney is saying this out of a firmly held personal belief. His position is based entirely on his reactionary stance toward Steam. Steam goes against generative AI -> Sweeney is in favor of it. If Steam said they were against eating live babies, you can sure as hell bet he'd sing the praises of that, too.
I agree with both your statement about AI training and Sweeney. However, I do believe there is a legitimate argument for using generative AI in game development, and I therefore also think Sweeney has a legitimate point, even if he's doing it as a reaction to Steam.
Something often acknowledged as okay in art (or any creative endeavor) is inspiration. Legally, we can go even further: copying is okay as long as the thing being copied is sufficiently transformed into something that can be considered new. Take, for example, different artists' versions of a character such as Pikachu. We might recognize them all as Pikachu, but also acknowledge that each is unique and obviously the creation of one particular artist.
Why is this process a problem when it's done with technology? I, as a human, didn't get permission from someone else to transform their work. It's okay when I do it, but not when it's done algorithmically? Why?
I think this is a legitimate question that has valid arguments either way, but it's a question that needs to be answered, and I don't think a blanket response of "it's bad because it's stealing other people's work" is appropriate. If the model is very bad and clearly spits out exact replicas of the inputs, that's obviously a bad thing, just as it would be equally bad if I traced someone else's work. But what about the models that don't do that, and spit out unique works never seen before? Not all models are equal in this sense.
Why does everyone have to be paid for everything? The real dilemma is whether AI is learning or remixing, and the science is on the side of learning while all the grifters are on the side of remixing. All of these lawsuits, like the Getty Images one, are for profit. They are grifting off this, and people blindly fall for the propaganda thinking they are protecting "the little guy" when the vast majority of the world's copyright is owned by mega corporations. Fuck that.