European Commissioner Thierry Breton and the UK’s Michelle Donelan want Elon Musk to get a grip on gruesome Israel attack videos posted on X.
Elon Musk has until the end of Wednesday to respond to demands from Brussels to remove graphic images and disinformation linked to the violence in Israel from his social network X — or face the full force of Europe's new social media rules.
Thierry Breton, the European Union commissioner who oversees the bloc's Digital Services Act (DSA) rules, wrote to the owner of X, formerly Twitter, to warn Musk of his obligations under the bloc's content rules.
If Musk fails to comply, the EU's rules state X could face fines of up to 6 percent of its global annual revenue. Under the regulations, social media companies are obliged to remove all forms of hate speech, incitement to violence and other gruesome images or propaganda that promote terrorist organizations.
Since Hamas launched its violent attacks on Israel on October 7, X has been flooded with images, videos and hashtags depicting — in graphic detail — how hundreds of Israelis have been murdered or kidnapped. Under X's own policies, such material should also be removed immediately.
Woah, child sex abuse isn't the same as war though. People already take it plenty seriously, and nobody is glorifying it (out in the open).
Your analogy isn't a 1:1 representation of the topic at hand. All it does is pivot from the actual topic to something that's easier for you to argue against.
I never said it was the same; it's called a comparison. We ban images of sex abuse because of the harm sharing those images does to victims. Hamas has gone to a lot of effort to film and disseminate what they did in Israel online, and they are doing so with the intent of harming the victims' families. While there may not be laws in the USA prohibiting the sharing of this content, I would still argue that it is morally reprehensible, given that you are participating in something intended to do harm.
What it's 'intended' to do doesn't really matter. If you notice, people aren't supporting Hamas. They see these videos and they're rallying in support of Israel.
Wow. It's almost like, exactly as I said, showing people instead of telling them causes them to take war seriously.
There have been plenty of studies on gore and death content suggesting it can cause trauma similar to PTSD, with some people affected more than others. On X right now, if you go looking for information about what is happening, you're quite likely to be shown extremely violent images, so you can't really avoid them short of avoiding X entirely. Plenty of these images and videos aren't even related to this conflict, and are just misinformation / ragebait.
Can you show me what studies you're talking about?
I have a feeling you're referring specifically to studies that focus on people who are paid to moderate this content. If you share what studies you're talking about we can know for sure.
You really don't need to look further than the clinical data on PTSD. A sufficient amount of any form of trauma can cause mental health issues including but not limited to PTSD. Watching an execution video has a large potential to cause a severe trauma response, especially if the victims are people you know or love, or are members of your community.
There are plenty of real-world examples of content moderation teams at social media companies suffering from their exposure to extreme content.
Traumatizing people is one of the core goals of terrorism, precisely because it does lasting damage.