I shouldn't still need to point this out, but most boomers are poor. They also have a lot of medical issues and are mistreated by the health system and by aged care. On top of that, they are mentally abused by cynical political strategists who target their fears and drive them half crazy.
If you haven't noticed, the wealth divide is increasing and the rich are getting further away from regular people. That is happening despite rich boomers dying: their wealth is being inherited, and their replacements are just as mean.
Any time you see a divisive comment like "old people / immigrants / trans people are at fault," question the source and whether it is worth repeating. It's possible a lot of such comments originate with enemies of free societies. Community is important. It's the foundation for a fair and peaceful society, and we can't have community while we have this mindset of excluding the other.
I sort of see why UHC would go in this direction. Case law (as I understand it) around AI and algorithms is that they can't be held accountable for how they make decisions. Because they are black boxes, they are given a pass.
When a human reviews and denies a claim, it can be scrutinized, making it possible for the human to be reviewed and discredited. And a human might even, gasp, approve a case.
When it's AI, it can't be reviewed. It can be blamed, but since it carries no responsibility, it can't be sued.
We really need more regulation around algorithms and AI or more shit like this is going to happen.
Seriously. I am almost 100% sure the guy in the picture was my waiter at the Boathouse restaurant in Disney Springs (Florida) on the night of the shooting. The person in the picture was nowhere near the state of New York.
UnitedHealth uses the algorithm to "prematurely and in bad faith discontinue payment for healthcare services," the complaint said.
"This is an example of how AI is being utilized not to help people but to line the pockets of corporations and their shareholders," Clarkson said.
When these coverage denials are appealed to federal administrative law judges, about 90% are reversed, the complaint said, demonstrating the "blatant inaccuracy" of the algorithm. Only a tiny fraction of patients appeal the denials at all, Clarkson said.
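For anyone wondering how a 90% reversal rate can still be profitable, here's a rough back-of-the-envelope sketch. The 90% figure is from the complaint; everything else (claim counts, dollar values, and especially the appeal rate, which the complaint only describes as "a tiny fraction") is a made-up placeholder:

```python
# Back-of-the-envelope: why denials can pay off even when ~90% of appeals succeed.
# All numbers except the 90% reversal rate are hypothetical placeholders.

denied_claims = 100_000          # hypothetical number of denied claims
avg_claim_value = 10_000         # hypothetical average value per claim, in dollars
appeal_rate = 0.005              # hypothetical: only a "tiny fraction" of patients appeal
reversal_rate = 0.90             # from the complaint: ~90% of appealed denials are reversed

appealed = denied_claims * appeal_rate
paid_after_appeal = appealed * reversal_rate
never_paid = denied_claims - paid_after_appeal

print(f"Claims appealed:            {appealed:,.0f}")
print(f"Denials reversed on appeal: {paid_after_appeal:,.0f}")
print(f"Denials that stick:         {never_paid:,.0f}")
print(f"Money kept by the insurer:  ${never_paid * avg_claim_value:,.0f}")
```

With those placeholder numbers, almost all of the denied money stays denied, no matter how often the appeals succeed.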
Ah, yes, creating more work for an already overloaded legal and healthcare system, while pocketing the profits from people who won't fight it. Man, I LOVE capitalism...
Insurance companies practice medicine without a license all the time, and those practices result in death. We are just supposed to pretend that's the correct way to live.
The AI they use is also not intended to separate those who need care from those who don't; instead, it is meant to separate those who would successfully appeal the decision from those who wouldn't. This is how the UHC denial rate was able to shoot up so fast: from 10.9% in 2020 to 32% in 2023. There have to be a lot of excess deaths, personal bankruptcies, and homelessness hiding behind that statistic.
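Purely as an illustration of that incentive (not UHC's actual system, which nobody outside the company has seen), a "deny whoever probably won't appeal" rule is trivially simple to write; note that medical necessity never enters the decision:

```python
# Toy illustration only: NOT UnitedHealth's real model, just the incentive
# described above. A predictor of "will this patient appeal?" lets the
# decision rule optimize for denials that stick, not for medical necessity.

from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: str
    medically_necessary: bool      # what a clinician would say
    predicted_appeal_prob: float   # what the hypothetical model scores

def deny_decision(claim: Claim, appeal_threshold: float = 0.2) -> bool:
    """Return True to deny. Medical necessity is never consulted."""
    return claim.predicted_appeal_prob < appeal_threshold

claims = [
    Claim("A-1", medically_necessary=True,  predicted_appeal_prob=0.05),
    Claim("A-2", medically_necessary=True,  predicted_appeal_prob=0.60),
    Claim("A-3", medically_necessary=False, predicted_appeal_prob=0.10),
]

for c in claims:
    print(c.claim_id, "DENIED" if deny_decision(c) else "approved")
```

The patient who clearly needs care but is unlikely to fight back (A-1) gets denied, while the one likely to appeal (A-2) gets approved, which is exactly the perverse outcome being described.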
There's at least one chronic pain patient who allegedly took it out on one CEO, causing one excess death. It'll be nuts if his jury nullifies his alleged crime.
Abstraction. These people are in such a different class compared to the average person that they basically don't associate with them, so they never see the consequences of their downright evil actions... up until a scorned man with nothing to lose makes those consequences very clear, very quickly.
The most defining trait of sociopaths and psychopaths is having no empathy, so for people like this the suffering of others, whether of their own making or not, has about as much emotional impact as the "suffering" of a leaf of lettuce when they eat it, or of a piece of paper they crumple and throw in the trashcan.
This also means that, amongst other things, they feel no guilt whatsoever (would you feel guilt from eating a leaf of lettuce?!) and will sleep like babies with the blood of thousands on their hands.
Yup, I was in an AI/ML training session about a year ago, and there were a few hospital execs there too. They were absolutely giddy about the prospect of using AI to deny care to unprofitable patients.