
This is what Russian propaganda looks like in 2024

www.npr.org /2024/06/06/g-s1-2965/russia-propaganda-deepfakes-sham-websites-social-media-ukraine

Pro-Russia social media accounts amplifying stories about divisive political topics such as immigration and campus protests over the war in Gaza.

Influence operations linked to Russia take aim at a disparate range of targets and subjects around the world. But their hallmarks are consistent: attempting to erode support for Ukraine, discrediting democratic institutions and officials, seizing on existing political divides and harnessing new artificial intelligence tools.

"They're often producing narratives that feel like they're throwing spaghetti at a wall," said Andy Carvin, managing editor at the Atlantic Council's Digital Forensic Research Lab, which tracks online information operations. "If they can get more people on the internet arguing with each other or trusting each other less, then in some ways their job is done."

6 comments
  • "It's absolutely true that when you look at an individual campaign, it's just as likely as not that it hasn't had a huge amount of influence, which is why Russia just does it again and again, or in a different form, or targeting a different group," the Digital Forensic Research Lab's Carvin said. "It's almost like producing cheaply manufactured goods and just getting it out there in the world, hoping that maybe one particular gadget ends up becoming the popular toy of the season, even if the others completely fail."

    Many researchers who study disinformation warn against seeing the hand of Russia as an all-powerful puppeteer, especially since so much of what its mouthpieces amplify is homegrown.

    I think we're severely underestimating the damage and impact of Russian influence, just as we've spent decades underestimating the damage and impact of Fox News propaganda.

    Amplifying something "homegrown" rather than creating a narrative from whole cloth doesn't make it any less impactful. On a scale of 0-100, turning a fringe party's volume from 0.1 up to 10 still looks low, but that's a 100-fold amplification. It's the difference between a fringe idea remaining fringe and it being accepted as a variation on "normal."

    That's why thirty years ago, white supremacy was a fringe group that would be toxic to anyone even touched by it. Now, thanks to normalization by Trump and Fox News - and yes, Russia - there are open white supremacists (though they only occasionally say the quiet part loud) in Congress.

    Russia is normalizing fringe right-wing, populist and totalitarian policies. I think they are not only having an impact, they are winning in recent elections. Yes, proving it is difficult, and that's why no news source is ready to claim Russia caused it. But they are injecting poison into the veins of the world. You might say it's "trace" amounts, but given a long enough timescale, it is going to be fatal.

    • I think we’re severely underestimating the damage and impact of Russian influence, just as we’ve spent decades underestimating the damage and impact of Fox News propaganda.

      And just as we'll underestimate the damage of the same garbage coming out of China.

      It really is concerning that there's so little understanding and reporting about the influence campaigns and interference by the Chinese government, particularly when that's much more often what users are running across online these days.

      Russia really kicked the game off leading up to 2016, but China is the one running with the ball right now, and I'd say they're being much more effective with how they create and steer narratives not just in the US but across Europe and South America.

      China has quietly managed to gain outsized influence within corporations like Google and other social media companies, not just in how those services function within China's own borders. You can see how thin the divide between Google's "Chinese" version and the rest of the world really is in the way they moderate political speech on their platforms. It wasn't so long ago that they "accidentally" banned a bunch of Chinese keywords on Western versions of the platform. And even now you will get shadowbans and comment removal for criticism of the CCP on YT, though it's unclear exactly what keywords and criteria they're using. The worst part is that it's entirely opaque.

      It's really weird how differently the media approaches Russian and CCP influence campaigns. Even as Russia declines and democracies wise up to its tricks, none of those lessons or that research is being applied toward countering Chinese efforts to undermine democratic institutions, or toward educating the public about them.

  • Pro-Russia social media accounts amplifying stories about divisive political topics such as immigration and campus protests over the war in Gaza.

    They forgot discontent about the economy. There's also the key factor of (sometimes, but not always) linking all three topics directly to Biden, even though his actual record on all three could be summarized as "not Bernie Sanders, but also several standard deviations better than most Democrats, to the point that he's actually trying to help, and miles (or several hundred miles) better than Trump."

    "I care how migrants are treated, and that's why I can't vote for Biden over Trump" is, if you take a second to examine the reality involved, all you really need to hear to know that the person you're talking to is motivated by something much darker and more dishonest than actual concern for what happens to migrants.

    The volume of posts, articles and websites that Russian-linked operations produce is being boosted by artificial intelligence — another new factor that sets 2024 apart from previous election cycles.

    I am constantly curious to maybe find some poster on Lemmy that's actually literally a bot, or whose answers are being generated by a bot. I haven't done it yet. I wish glitch tokens still worked.

  • This is the best summary I could come up with:


    "They're often producing narratives that feel like they're throwing spaghetti at a wall," said Andy Carvin, managing editor at the Atlantic Council's Digital Forensic Research Lab, which tracks online information operations.

    Since the invasion of Ukraine, the European Union has banned Russian media outlets including RT, Sputnik, Voice of Europe and RIA Novosti from publishing or broadcasting within the bloc.

    That hasn’t stopped RT articles from proliferating across hundreds of other websites widely available in Europe, according to a recent report from the German Marshall Fund of the United States, the University of Amsterdam and the Institute for Strategic Dialogue.

    "We discovered RT articles reposted to third-party websites targeting audiences from Iraq to Ethiopia to New Zealand, often without any indication that the content was sourced from a Russian propaganda outlet," the researchers wrote.

    Covert influence campaigns based in Russia, as well as in China, Iran and Israel, have begun using AI in their attempts to manipulate public opinion and shape politics, according to recent reports from OpenAI, Meta and Microsoft.

    A Russian operation that Microsoft calls Storm-1679 used AI to fake actor Tom Cruise's voice narrating a phony Netflix documentary disparaging the International Olympic Committee.


    The original article contains 1,190 words, the summary contains 196 words. Saved 84%. I'm a bot and I'm open source!
