AI this, AI that - you can't go anywhere without something trying to force AI on you. Usually a company trying to get you to buy into what they've wasted billions on. So indie devs have begun fighting back with their No Gen AI Seal.
This might be a little off-topic, but I've noticed what seems to be a trend of anti-AI discourse ignoring programmers. Protect artists, writers, animators, actors, voice-actors... programmers, who? No idea if it's because they're partly to blame, or people are simply unaware code is also stolen by AI companies—still waiting on that GitHub Copilot lawsuit—but the end result appears to be a general lack of care about GenAI in coding.
LLMs are going to make senior devs indispensable. So far, from what I've seen, they're not great at solving unusual cases; they shine most on boilerplate and generic problems.
So juniors are never going to learn to code, and then companies will have to pay for experienced people.
Juniors rarely think hard about unionizing, and the seniors will have job security and therefore no strong motivation.
I hope devs will unionize in any case, LLMs or not, like any other specialization.
I think it's because most programmers use and appreciate the tool. This might change once programmers start to blame gen AI for not having a job anymore.
And programmers retain complete control of the output - it's just a bit of text that you can adapt as needed. Same as looking up snippets from Stack Overflow. Programmers are used to finding some snippet, checking if it actually works, and then adapting it to the rest of their code, so it doesn't feel like introducing media that you didn't create, but like a faster version of what everyone was already doing.
I noticed a bad trend with my colleagues who use Copilot, ChatGPT, etc. They not only use them to write code, but also trust them with design decisions, which are generally poor.
Another thing is that those people also hate working on existing code, claiming it is complicated and offering to write their own version of it (which also ends up complicated). I suspect it's because Copilot doesn't help as much when code is more mature.
There remains a significant enclave that rejects it, but yeah, it's definitely smaller than equivalent groups in other mentioned professions. Hopefully things won't get that far. I think the tech is amazing, but it's an immense shame that so many of my/our peers don't give a flying fuck about ethics.
> There remains a significant enclave that rejects it, but yeah, it's definitely smaller than equivalent groups in other mentioned professions.
Reporting in.
> I think the tech is amazing, but it's an immense shame that so many of my/our peers don't give a flying fuck about ethics.
Yup. Very much agreed here. There are some uses that are acceptable, but it's a bit hard to say that any are ethical, given the ethically bankrupt foundations of its training data.
Indie studio teams are pretty small, so it's possible. I personally hate that the word Copilot ever even appears, and I never use auto-generated code, but more so I'm sure the stamp refers to art, textures, and sound.