I’ll see people responding to fucken lemmy comments with “i ran the question through gpt and...” like what the fuck?
It’s literally the same thing as saying “I asked some RANDOM dude and this is what he said. Also I have no reason to believe he’s even the slightest bit educated.”
If you really wanna just throw some fucking spaghetti at the wall, YOU CAN DO THAT WITHOUT AI.
This is coming from someone who hates google, but if this person’s entire family had died, I would put a LOT of that blame on them before google.
I applaud your optimism that most people can do this without AI, but have you gone and met people? Most people are not capable of producing torrents of shameless bullshit, since conscience or awareness of social and/or professional costs rears its head at some point.
If they can't do it themselves, then they have no idea whether the output is good. If they want to run it through the bullshit machine, they shouldn't post the output unless they know it is accurate.
Can they, though? Sure, in theory Google could hire millions of people to write overviews that are equally idiotic, but obviously that is not something they would actually do.
I think there's an underlying ethical theory at play here, which goes something like: it is fine to fill the internet with half-plagiarized nonsense, as long as nobody dies, or at least, as long as Google can't be held culpable.
The millions of people writing overviews would definitely be more reliable, that's for sure.
For one thing, they understand the concept of facts.
We don't need a fancy word that makes it sound like AI is actually intelligent when talking about how AI is frequently wrong and unreliable. AI being wrong is like someone who misunderstood something, or took a joke literally, repeating it as factual.
When people are wrong we don't call it hallucinating unless their senses are altered. AI doesn't have senses.
Does everyone else see this? These are the exact type of out-of-town haters we really want. I also think calling LLMs all but delusional is too generous, and I mean that unironically.
mods can you please ban "david gerard" or whatever his name really is. ai hate is already out of hand without people coming to push their agenda like this
Huh. I was making my own garlic oil this way (without advice from an LLM, mind you), and I was today years old when I learned this carries the risk of botulism (albeit small). So in a way, an LLM has potentially saved my life by causing the chain of events which taught me something new.