My friend is a technical writer and just lost her job because "ChatGPT can do what you do!"
She then informed me that she knew 11 other people who got fired due to the same thing. And now those companies are realizing how badly they fucked up and are frantically trying to rehire.
My experience tells me that GPT is only good if a trained professional is behind the screen. If you fire a technician or a professional and fully replace them with GPT, it'll be on you when it backfires.
Replacing humans with AI is a bit like replacing a trained professional with a minimum-wage call center worker from a third-world country. Sure, it saves money, and they can kind of do the job well enough that if you squint it looks like the same thing. But the output is inevitably going to be subpar unless you retain a human expert as a manager.
Anyone who has ever had to deal with code from India knows this all too well.
It's like firing all your mechanics at a repair shop and letting the front office people fix the cars because they already have the tools. But they don't know how to fix a car.
As intended. LLMs are either good, or they're easy to control and censor/direct in what they answer. You can't have both. Unlike a human with actual intelligence, who can self-censor or intelligently evade or circumvent compromising answers, LLMs can't do that because they're not actually intelligent. A product has to be controllable by its client, so to control it, you have to lobotomize it.
Yeah, I've lost count of the number of articles or comments claiming "AI can't do X," where I then immediately test it and see that the current models absolutely do X with no issue — and then go back and spot the green ChatGPT icon, or a comment about using the free version.
GPT-3.5 is a moron. The state-of-the-art models have come a long way since then.