People Want To Use Things But Not Own The Consequences Of Their Use.
[Edit 2: I think anyone commenting should identify how much they use Facebook in their comment lol]
On the list of people I describe in the subject, I place myself first. If you're here to defend yourself by showing me your receipts, congratulations, you win; I just saved us who knows how much time. I'm typing this out in an attempt to describe a phenomenon, not persuade you of anything in particular, other than that this is a thing I see happening a lot; too much, would be my take.
I'm just gonna grab [a] most egregious example, but I would like to talk about this, not as a horrific fail, but as an exemplar; at the moment I believe that most people categorize it as the former.
[edit: there really is no "most" egregious example, and I just thought of a much worse one, and unlike Facebook I am fully guilty of this one: I own and drive a car, a lot, and boy am I ignoring some real world consequences there.]
That example being, Facebook Acted As The Main Propaganda Outlet For A Genocide Of The Rohingya In Myanmar, and therefore, Anyone Who Uses Facebook Is Using A Tool That Has Bloodstains On It And Is Somehow Not Horrified.
To more easily conceptualize this, it's much the same as me needing a shovel, and having a neighbour who I happen to know murdered someone with their shovel but has not been arrested for it, and right when I need the shovel, they walk over with their bloodstained shovel and offer to let me use it for my non-murder task. And I just go, "Wow, how convenient that you happened to be here with that bright-red shovel just now, I think I'll use this one of yours with the little spatters of brain on it, instead of walking over to my shed and getting my own shovel out!"
We are talking about murder here. Facebook was used to foment mass murder, and in a world that made sense, Zuckerberg would have been handed over to the ICC years ago, along with Henry Kissinger and a number of others who instead hang out at the Nobel Peace Prize club where Barack makes a mean Mai Tai.
The problems that people use Facebook to constructively solve are connections to family and close friends, event and interest-group organizing, the marketplace, and, for the avid user, a daily journal.
These problems could each be solved using something else that is just as gratis. It might take a little more effort, but then you maybe don't ever have to touch the remains of a human life that once existed and now does not, due to this particular device being used to end that life.
But it seems that it's more convenient, easy, zero effort, to simply ignore the gore.
That's what I see on the internet. I don't think anyone has ever accepted a bloodstained shovel and set to digging a ditch with it who didn't also feel that their life was next if they didn't, but as long as there are no visible bloodstains, as long as it's just a few articles and podcasts from known radical leftists, eh, look at little Jimmy's recital, isn't he cute?
In 1993, Radio Télévision Libre des Mille Collines (RTLM) started broadcasting hate speech in Rwanda. They used technology presumably manufactured and sold by multinational corporations who had no mechanism to prevent abuse of the platform they created. Should we blame the manufacturers of radio broadcast equipment for the Rwandan genocide?
No, but that radio station should definitely be shut down and its operators handed over to the ICC, just like Facebook and Zuck.
By your silly analogy, I would have a problem with all the physical equipment manufacturers that Facebook buys its servers, switches, etc. from. It's not about the equipment; it's about allowing the operator of that radio station to keep operating it, and not just that, about continuing to listen to a station run by that broadcaster in a different market, because in your market it's all car ads and vaccine denial instead.
Facebook didn't generate the objectionable content. They created a mechanism for people to communicate with one another, like radio did a century earlier. Asking Facebook to check to make sure people don't misuse the platform is like asking radio manufacturers to make sure their equipment doesn't fall into the hands of evildoers.
What would you have had Facebook do, specifically? What practical steps would you have wanted them to take? Could those steps reasonably be taken for every country in the world?
To maintain the analogy: what if the radio equipment were somehow designed to transmit a stronger, farther-reaching signal when the DJs were broadcasting hate speech and military commands, but a weaker, shorter-range signal when DJs discussed crimes against humanity? Facebook isn't a truly open platform; its algorithms dictate what users see and what goes viral.
I see where you're going, but I think it's important to note that the Facebook algorithm wasn't intentionally boosting hate; it just tries to maximize engagement. The unintended consequence is that hate gets boosted, because it gets engagement from both the haters and the hated.
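To make that mechanism concrete, here's a minimal toy sketch in Python. This is not Facebook's actual ranking system; the numbers, post titles, and scoring function are all made up purely to illustrate the incentive being described: a ranker that scores posts by total engagement can't tell applause from outrage, so a divisive post that provokes both sides outranks a benign one that only its fans interact with.

```python
# Toy model: rank posts purely by total engagement (likes, comments,
# shares, angry reacts all count the same). Hypothetical data, not
# any real platform's algorithm.

posts = [
    # (title, engagement from supporters, engagement from opponents)
    ("Little Jimmy's recital", 120, 0),                  # benign: only fans react
    ("Inflammatory rumor about a minority", 150, 140),   # both sides react
]

def engagement_score(post):
    _, supporters, opponents = post
    # An engagement-maximizing ranker has no notion of "good" vs "bad"
    # attention; outrage counts exactly like applause.
    return supporters + opponents

ranked = sorted(posts, key=engagement_score, reverse=True)
for title, supporters, opponents in ranked:
    print(f"{supporters + opponents:4d}  {title}")

# The inflammatory post ranks first (290 vs 120) even though fewer
# people actually endorse it, because it harvests engagement from
# both the haters and the hated.
```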
Agreed. I personally struggle with the word "intentionally," however. Meta was aware of the negative side effects of their content algorithms well before the recent Myanmar violence and did nothing to remedy them. There were internal reports about teen suicide and eating disorders several years prior that they tried to hush up, and of course the Cambridge Analytica data privacy scandal, which revealed the extent to which Facebook was supplying third parties with user info and which was directly responsible for increased partisanship in the 2016 & 2020 election cycles; probably (imo) they share some blame for recent hate crimes in the US accordingly. And now we know they definitively hold blame for increased violence in Myanmar. If they knew the effect their platform had and did nothing about it, that to me seems intentional. Just my 2¢