Five Fantastic Facts
- Viral Propaganda: In 2016, over 30 million Americans unknowingly shared Russian-made propaganda.
- Daily Exposure: Each American, on average, was exposed to four Russian propaganda posts per day in October 2016.
- Minimal Clean-up: Post-election, Facebook announced the removal of only 301 groups and pages managed by the Russian troll farm.
- Financial Investment: Russia is estimated to have spent $125 million on boosted posts and social media ads during the 2016 election period.
- Mysterious Death: The mastermind behind the Russian propaganda blitz in 2016 died in a mysterious plane crash last year.
It’s Not Meddling, It’s Targeting
The narrative surrounding Russia and the 2016 election has been thoroughly dissected. The term “meddling” is commonly used, but it doesn’t capture the precision and sophistication of the tactics involved. This isn’t about changing votes after they’re cast or eliminating candidates; it’s about using everyday, legal advertising techniques to shape how voters make up their minds.
In 2024, as another contentious election approaches, it’s not just Russia in the game. Other state actors are employing similar tactics to influence American voters. By the end of this discussion, you’ll better understand how these tactics work, enabling you to spot the propaganda that’s likely to flood your feeds as November draws near.
The Troll Farm
Our story begins in St. Petersburg, Russia, at the Internet Research Agency (IRA). This nondescript, four-story building housed an elaborate operation resembling a high-quality marketing agency. Each floor had a specific role in the agency’s mission to influence global perceptions:
- Website Creation: The first floor focused on creating seemingly legitimate news websites controlled by Russian propagandists. Examples include D.C. Weekly, New York News Daily, Chicago Chronicle, and Miami Chronicle. In 2016, a Russian-run page called Blacktivist amassed over 500,000 followers—more than the official Black Lives Matter page.
- Blogging: The second floor was dedicated to bloggers who worked 12-hour shifts, each producing 10 posts per shift. Their content echoed Russian media narratives and was generated around the clock.
- Account Creation and Commenting: The third floor was responsible for creating and managing fake social media accounts. Each account produced roughly 126 comments and two posts per day, spreading the operation’s reach exponentially.
- Meme Creation: The fourth floor focused on designing memes to elicit emotional responses, flooding internet forums and social networks with manipulated imagery.
Beating the Algorithms
The manipulation often began with a simple friend request. Thousands of fake profiles would send out friend requests, and even if only a small percentage of recipients accepted, that was enough to infiltrate real networks. Here’s how it worked:
- Initial Acceptance: A handful of mutual friends makes a request look trustworthy, so more people accept it, creating a domino effect.
- Algorithm Manipulation: Once a fake profile gains a foothold, the propagandists engage with posts through likes, shares, and comments, tricking social media algorithms into promoting the content more widely. As real people interact with this content, it gains even more traction, sometimes reaching mainstream news outlets.
A notable example was the misinformation about Hillary Clinton’s health. After a minor stumble from heat exhaustion, Russian propagandists fueled rumors of brain damage. This narrative, amplified through fake profiles and real people, became a leading story on major news networks like CNN, despite being entirely false.
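To make the ranking dynamic described above concrete, here is a deliberately simplified sketch in Python. It is a toy model, not any platform’s actual ranking formula: the weights in `engagement_score` and every number in the example are invented for illustration. The point is only that a coordinated burst of fake likes, shares, and comments inflates exactly the signals a recommender reads as genuine interest.

```python
from dataclasses import dataclass

# Toy model of an engagement-based ranking score -- NOT any platform's real
# algorithm, just an illustration of why coordinated fake engagement works.

@dataclass
class Post:
    likes: int = 0
    comments: int = 0
    shares: int = 0

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments and shares count for more than likes
    # because they look like "active" engagement to a recommender.
    return 1.0 * post.likes + 3.0 * post.comments + 5.0 * post.shares

# A post with ordinary, organic engagement.
organic = Post(likes=40, comments=5, shares=2)

# The same post after a coordinated burst from a few hundred fake accounts.
boosted = Post(likes=40 + 200, comments=5 + 60, shares=2 + 40)

print(engagement_score(organic))  # 65.0
print(engagement_score(boosted))  # 645.0 -- roughly ten times higher
```

Real recommender systems use far richer features, but the failure mode is the same: from engagement counts alone, the system cannot tell whether the interest behind them is real.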
Targeting and Personalized Messaging
The effectiveness of any ad campaign lies in its targeting. Social media platforms track every interaction to build detailed profiles of users, allowing advertisers to target specific groups effectively. The IRA used this to its advantage:
- Data from Hacks: In 2015, Russian agents hacked voter registration databases, gaining access to basic information like name, address, age, email, phone number, and party affiliation.
- Matching Profiles: Using this data, propagandists matched names and emails with social media profiles, creating targeted audiences based on party affiliation.
- Tailored Messaging: They then sent highly personalized content designed to resonate with specific groups, from pro-Bernie Sanders messages to pro-Texas secession content.
This wasn’t about supporting one candidate; it was about sowing division and amplifying controversial issues to destabilize American society.
The KPI: Distrust in Democracy
Contrary to popular media narratives, the goal of this propaganda campaign wasn’t merely to elect Donald Trump. The true mission, confirmed by Putin himself, was to undermine faith in democracy. Here are some real-world examples:
- Engineered Protests: Using two Facebook groups, Heart of Texas and United Muslims of America, Russian propagandists organized opposing protests on May 21, 2016. This led to a real-world confrontation that required police intervention and received national media coverage, all for the low cost of $200.
- Anti-Trump Protest: A fake Russian group, BlackMattersUS, organized a protest against Trump that was attended by thousands and shared with 61,000 users on social media, costing only $400.
- Current Conflicts: Recently, Russia has used similar tactics to exploit divisions over the Israel-Hamas conflict, organizing protests and counter-protests that fuel ongoing tensions.
Spotting the Propaganda
The challenge now is recognizing and resisting these tactics. Here are some tips to help you identify potential propaganda:
- Question the Source: Verify the legitimacy of the source. Look beyond the headline and check if reputable news outlets are reporting the same story.
- Be Skeptical of Emotional Triggers: Propaganda often uses emotionally charged language and imagery to elicit strong reactions. If a post makes you angry or fearful, take a moment to fact-check before sharing.
- Check the Engagement: Analyze who is engaging with the content. A sudden surge in comments or shares from profiles that seem fake or have minimal activity could be a red flag; a rough sketch of this kind of check follows this list.
- Diversify Your News Sources: Relying on a single news source makes you more susceptible to propaganda; drawing on several outlets gives you a broader perspective.
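For the engagement check in particular, it can help to see the idea written down as a rough heuristic. The sketch below is illustrative only: the `Profile` fields, the 30-day and 5-post thresholds, the follow-ratio rule, and the 50% cutoff are all assumptions invented for this example, and no platform exposes data in exactly this form. What matters is the shape of the signals: account age, posting history, and follower/following ratios.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Profile:
    created: date        # when the account was created
    total_posts: int     # lifetime posts by the account
    followers: int
    following: int

def looks_suspicious(p: Profile, today: date) -> bool:
    """Rough heuristic with invented thresholds, not an established standard."""
    age_days = (today - p.created).days
    if age_days < 30 and p.total_posts < 5:
        return True   # brand new and nearly empty: classic throwaway account
    if p.following > 10 * max(p.followers, 1):
        return True   # mass-following pattern used to farm friend requests
    return False

def engagement_red_flag(engagers: list[Profile], today: date) -> bool:
    """Flag a post when most of the accounts engaging with it look fake."""
    if not engagers:
        return False
    flagged = sum(looks_suspicious(p, today) for p in engagers)
    return flagged / len(engagers) > 0.5   # arbitrary cutoff, for illustration
```

In practice you can apply the same idea informally: click through a handful of the profiles in a post’s comments and shares, and notice how old the accounts are and what else they have posted.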
Conclusion
As we approach another critical election, understanding these tactics is crucial. Propaganda is more sophisticated and pervasive than ever, and recognizing its presence is the first step in mitigating its impact. Stay informed, stay skeptical, and help protect the integrity of our democratic processes.