Three things to know about foreign disinformation campaigns

October 31, 2024

As election day nears, U.S. adversaries—including Russia, China, and Iran—ramp up efforts to exert influence by spreading false narratives and sowing discord.

Peter Pomerantsev and Anne Applebaum

Each day brings new headlines about foreign entities attempting to interfere in the upcoming U.S. elections. Just last week, intelligence officials warned, “Foreign actors—particularly Russia, Iran, and China—remain intent on fanning divisive narratives to divide Americans and undermine Americans’ confidence in the U.S. democratic systems.”

SNF Agora fellows Anne Applebaum and Peter Pomerantsev have spent years examining propaganda, a topic they explore on their recently launched podcast, Autocracy in America, produced by The Atlantic.

Applebaum and Pomerantsev built on the topics they discuss on their podcast during a panel discussion held earlier this month at the Hopkins Bloomberg Center.

Here’s what you need to know:

  1. Foreign interference is coming from multiple directions.

China, Russia, and Iran, among others, are all conducting influence operations in the United States, although with differing goals.

Russian state media, for example, funded a group of conservative social media personalities who created videos criticizing America’s ongoing support for Ukraine. The country has also run operations that support former President Donald Trump’s current presidential campaign. In contrast, Pomerantsev said, Iran is working against Trump.

Still, Applebaum said, the end goal is the same.

“Russia, China, Iran, Venezuela, and others collaborate around a set of narratives, and really one big narrative,” she said. “The idea is autocracy is stable and safe, and democracy, especially American democracy, is used and divided.”

  2. Meta’s algorithm change may have been counterproductive.

Meta—the parent company of Facebook, Instagram, and Threads—tweaked its algorithm to limit the amount of political content users see. It was an effort to reduce the spread of disinformation on the platform.

But, panelists said, the move may have backfired.

“It helps the bad actors because it means people who do real journalism, who collect facts … have seen their engagement levels plummet,” Applebaum said.

  3. Platforms have gotten better at breaking up networks.

But Renée DiResta, author of Invisible Rulers: The People Who Turn Lies into Reality, said Meta has improved in one key area since the 2020 election: disrupting networks of users running disinformation campaigns.

For example, in November 2023, Facebook broke up three “coordinated inauthentic behavior” networks with roots in China and Russia. They consisted of thousands of fake accounts posing as Americans and aiming to manipulate public debate.

DiResta said steps such as those taken by Meta can limit the distribution of disinformation.