The Growth of Influence Operations

This post is a primer on influence operations: what they are, how they operate, and what intelligence agencies might do to counter them.

Echo Chambers

We’re all familiar with how arguments, counterarguments, ideas, and viewpoints flow freely through social media. Left alone, popular opinions get shared, unpopular opinions get swept under the rug, and consensus, be it transient and unstable or “common sense” and long-lived, can form organically.

This dynamic is open to abuse from actors who wish to manipulate public opinion via online influence operations.

#NotInMyNameTheresaMay

Let’s look at a real-life example of how political topics can be hijacked and public opinion swayed by an influence operations campaign. Back in April 2018, then-UK Prime Minister Theresa May roused the ire of the British electorate with her proposal to join the US military response in Syria following the chemical attack on the city of Douma. A slew of negative reactions poured in on Twitter under the hashtag #NotInMyNameTheresaMay. A Twitter poll started by a UK Labour Party activist showed massive opposition to “bombing Syria”. Newspapers, both in the UK and internationally, reported on the unpopularity of the Prime Minister’s decision.

Things weren’t exactly as they seemed, though. The hashtag and the poll were real enough, created in good faith by activists in the UK. The problem was that, in amongst the legitimate chatter, there was an apparently coordinated effort by Russia-linked accounts to amplify the negative response and weaken May’s mandate.[1]

The trolls hijacked the hashtag and tipped the poll. And though they made up a minority of users engaging with the existing campaign, their influence was enough that Russian platforms such as Sputnik News and RT had the ammo to publish articles such as “Not in my name, Theresa May: Social Media users oppose UK strikes in Syria” and “43% of Britons lack appetite for war in Syria”.

In the end, the campaign failed to dissuade those in power from committing forces to the joint military effort, though not for lack of trying; a British government report indicated a 4,000% increase in Kremlin-linked bot and troll activity around the campaign.

It’s fake news

As the term suggests, influence operations serve to push an agenda. They may be politically driven, but they don’t have to be. They can target foreign entities or domestic. They can persuade or defame or just sow confusion. They can churn out their own “news” and propaganda and can spread their message widely through networks of bots and trolls. They always try to appear indigenous to the target audience.

Ongoing projects such as the Influence Effort Database [2] have been tracking the rise of influence operations, from high-profile, high-impact foreign campaigns, such as the 2017 Saudi and UAE effort to diplomatically isolate Qatar or the interference in the 2016 US Presidential election, to purely domestic affairs. Russia, via its Internet Research Agency, is the most prolific source of influence operations, with more than half of the entries in the Influence Effort Database listing it as the attacking country (though Iran, Saudi Arabia, and China have made some recent headway).

Botnets, Troll Farms and Content Mills

No two influence operations are exactly alike, and tools and strategies are constantly evolving. Still, some constants recur across operations: automated accounts (bots), human-controlled fake accounts (trolls), fabricated, recycled, and distorted news articles churned out by content mills, and targeted messaging are all commonly seen.
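
To make the bot category concrete, here is a minimal sketch, in Python, of the kind of first-pass heuristic an analyst might apply to a suspect account: near-clockwork posting intervals combined with heavily duplicated content. The function name, inputs, and thresholds are illustrative assumptions, not a real detection API.

```python
from statistics import pstdev

def looks_automated(timestamps, texts,
                    max_interval_stdev=5.0,  # seconds; illustrative threshold
                    max_unique_ratio=0.3):   # illustrative threshold
    """Crude heuristic for spotting automated accounts.

    `timestamps` (epoch seconds) and `texts` are parallel lists of one
    account's posts. A sketch under assumed thresholds, not a production
    detector.
    """
    if len(timestamps) < 10:
        return False  # too few posts to judge
    # Regularity: bots often post on a fixed schedule, so the spread of
    # inter-post intervals is unnaturally small.
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    clockwork = pstdev(intervals) < max_interval_stdev
    # Duplication: amplification accounts recycle the same text.
    duplicated = len(set(texts)) / len(texts) < max_unique_ratio
    return clockwork and duplicated
```

Real detectors weigh dozens of such signals (account age, follower graphs, client metadata); the point is that individually weak heuristics like these are the building blocks.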

Let’s take Iran’s efforts against Israel as an example.

One case was The Tel Aviv Times, a Hebrew-language news website controlled entirely by Iran.[3] The operation mobilized bots and trolls on Facebook, Twitter, and Instagram to push the supposedly local content to users in Israel. The website itself was a content mill, mixing stories lifted wholesale from legitimate news sites with articles whose headlines or text had been twisted to fit Iran’s goals (e.g., to heighten the perception of threats to Israel). Its influence is hard to quantify, given the seemingly broad aim of the operation, but the website attracted around 65,000 hits per month in its heyday.

Whack-a-mole

Countering an influence operation can be tricky.

Just finding active operations requires laborious analysis. And once you’ve found the accounts and worked out the message they’re pushing, what then?
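
On the “finding” side, here is a minimal sketch of one classic red flag analysts hunt for: bursts of near-identical posts from many distinct accounts within a short window. The field names (`text`, `author`, `ts`) and thresholds are hypothetical, assuming post data has already been collected.

```python
from collections import defaultdict

def coordinated_bursts(posts, window_secs=300, min_accounts=5):
    """Flag bursts of near-identical posts from many distinct accounts.

    `posts` is assumed to be a list of dicts with hypothetical 'text',
    'author', and 'ts' (epoch seconds) fields.
    """
    by_text = defaultdict(list)
    for post in posts:
        # Normalize whitespace and case so trivial variations still match.
        key = " ".join(post["text"].lower().split())
        by_text[key].append(post)

    flagged = []
    for text, group in by_text.items():
        group.sort(key=lambda p: p["ts"])
        authors = {p["author"] for p in group}
        # Many distinct accounts posting the same text inside a short
        # window is a classic amplification signature.
        if (len(authors) >= min_accounts
                and group[-1]["ts"] - group[0]["ts"] <= window_secs):
            flagged.append((text, sorted(authors)))
    return flagged
```

Clusters like these are a starting point for manual review, not proof of coordination on their own; the “what then” is where things get harder.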

Where applicable, agencies can attempt to get suspected accounts banned by flagging them as having breached the platform’s terms of service (TOS). But each time you get rid of one account, what stops the operation from mobilizing a new one? Or a hundred new ones?

From the perspective of countering influence operations, the best-placed entities would seem to be the social media platforms themselves. However, for a number of reasons (not least that bots and fake accounts inflate their user base, and thus their revenue), the platforms have been less than proactive in this regard.

Agencies could try to engage directly with the target audiences of influence operations, highlighting the material as propaganda, creating counter-narratives, or debunking claims. Such interventions have been shown to reduce users’ emotional engagement with campaign content and their likelihood of “liking” or sharing it. More broadly, educating people on how to be savvier with social media may be the most robust, long-term solution.[4]

Influence operations are here to stay, and analysts need the tools and the means to counter them. For an attacker, they’re cheap and low risk. But if they hit right, the consequences can be huge.

References 

  1. https://medium.com/dfrlab/trolltracker-pro-kremlin-trolls-deployed-ahead-of-syria-strikes-e49acc68c8ff
  2. Martin, D. A., Shapiro, J. N., & Ilhardt, J. G. (2020). Trends in online foreign influence efforts (v2). Princeton Univ. Press.
  3. https://www.haaretz.com/israel-news/.premium-israeli-cyber-security-company-iran-created-fake-hebrew-news-sites-1.6463020
  4. Helmus, T. C., Marrone, J. V., Posard, M. N., & Schlang, D. (2020). Russian Propaganda Hits Its Mark: Experimentally Testing the Impact of Russian Propaganda and Counter-Interventions. Santa Monica, CA: RAND Corporation. https://www.rand.org/pubs/research_reports/RRA704-3.html