Wishing for my death or a World War. Either will do. Because FML or this world.
I find celeb ads pretty disgusting. Ever watched an ad for a gambling app? It almost always features a celeb. All I can say is, we have to know that celebs will do whatever it takes for more money. They don’t care about their audience. The onus is on us not to get influenced by them.
Thing about RDR is that the challenges are a pain to complete. The same holds true for RDR2. I never bother with them.
RDR was the first game I ever completed. Such an amazing experience.
Playing through Psychonauts 2. Almost at the end.
How about a submarine ride by OceanGate?
Sounds dumb.
Greed will do that to any company.
I read it as a depiction of how phones have become spyware. The opposite of privacy.
Now that I think about it, children develop critical thinking at around the age of 10. Perhaps you are right. But the question remains: will LLMs develop such critical thinking on their own, or are we still missing something?
Is using authoritative sources foolproof? For example, is everything written on Wikipedia factually correct? I don’t believe so unless I actually check it. Also, what about Reddit or Stack Overflow? Can they be considered factually correct? To some extent, yes. But not completely. That is why most of these LLMs give such arbitrary answers. They extrapolate from information they have no way of knowing or understanding.
Why do you even think that? Children don’t ask questions? Don’t try to find answers?
This is something I already mentioned previously. LLMs have no way of fact-checking, no measure of truth or falsity built in. In the training process, they probably accept every piece of text as true. This is very different from how our minds work. When faced with a piece of text, we have many ways to deal with it, ranging from accepting it as is, to going on the internet to verify it, to actually designing and conducting experiments to prove or disprove the claim. So, yeah, what ChatGPT outputs is probably bullshit.
Of course, the solution is for ChatGPT to be trained on text labelled with some measure of truth. But LLMs need so much data that labelling it all would be extremely slow and expensive, and suddenly the fast-moving world of AI would screech to almost a halt, which would be unacceptable to the investors.
I call it The Price of Love.
I just realised that we are more worried about the mortality of our loved ones than our own.
Running in circles in an open field is a worthwhile aspiration. We should all aspire to it, IMO.
High quality human contact, in a workplace?
Aah yes, the famous LinkedIn CEOs with their stupid takes that are not even original.
Being able to explore the world is much more satisfying to me than being able to beat a boss. To me, exploration is the main appeal of Elden Ring.