Organic cyberwars

The respectable-sounding Internet Research Agency (IRA) is a media organisation set up by the Russian government in 2013, initially to exert influence over Ukrainian and Russian citizens. Some time before the 2016 US election the IRA redirected these operations toward influencing online political discussion in the US; it has also operated in other countries, though that activity has drawn less attention so far.

Two reports released this week about political influence online provide detailed evidence of the Russian IRA's covert social media influence campaigns.

Organic reach

The reports refer to organic online activity, which covers genuine content supplied by consumers: tweets, Facebook posts, blogs and YouTube clips, as well as comments, reposts, links, likes and subscriptions (or follows). Organic activity thus encompasses non-commercial consumer activity online, in contrast with advertising paid for by sponsors, which usually appears conspicuously as banner ads, pop-ups and branded inserts in news feeds and videos.

Advertisers obviously design their interventions to persuade and to reinforce a brand. Political parties and other interests deploy such “non-organic” tools for persuasion and propaganda as well. Advertisers can also encourage organic posts by customers, which extends the reach of the brand for free as part of a customer engagement strategy.

The reports I referred to above use the term “organic reach” to describe the processes by which advertisers or other organisations exert influence by posing as ordinary social media consumers, i.e. by passing their output off as organic activity: fake consumer-led online activity.

Covert and fake organic tactics of the kind deployed by the Russian IRA include posts, feeds, comments, likes and other content, all delivered from false or misattributed accounts, organisations, communities and individuals.

A malign organisation can certainly place paid ads that deliver misinformation and claim to come from a source other than the one running the influence operation. Meanwhile, bots, algorithms and pools of human operatives posing as legitimate “organic” social media users simulate organic activity.

Sowing confusion

Here is one of the tactics used by Russia’s Internet Research Agency: voter suppression. It is directed at social media users within a demographic that can be readily identified and tends to vote in a particular direction, e.g. the average African American voter is, or was, assumed likely to vote Democrat in certain US states.

To suppress that vote, the malign agent presents members of that group with “tweets designed to create confusion about voting rules,” according to the report The Tactics and Tropes of the Internet Research Agency (DiResta et al. 2018). Voters might, for example, be led to think they can get someone else to vote on their behalf, or that they can cast their vote online. The Russian IRA might also encourage that demographic to vote for a third party or an independent minority candidate, which would effectively be a “wasted” vote.

The third method is to persuade members of that group not to go to the polling station at all, as their vote supposedly won’t make a difference anyway.

I think that particular demographic was singled out because there did seem to be a correlation between African American voter turnout and the Democratic vote in some states; that correlation was easy to identify, and a drop in turnout could have made a difference to who won the election.
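To make “easy to identify” concrete, here is a minimal, purely hypothetical sketch in Python of the kind of simple correlation check an analyst could run between a demographic’s turnout and a party’s vote share. The figures are invented for illustration and are not taken from either report.

import numpy as np

# Purely hypothetical percentages for six imaginary states,
# invented for illustration; not data from either report.
turnout = np.array([58.0, 61.5, 55.2, 63.0, 52.8, 60.1])      # demographic turnout (%)
vote_share = np.array([47.5, 49.8, 45.9, 51.2, 44.1, 49.0])   # party vote share (%)

# Pearson correlation coefficient between turnout and vote share.
r = np.corrcoef(turnout, vote_share)[0, 1]
print(f"correlation between turnout and vote share: r = {r:.2f}")

A strongly positive r in such a check would be consistent with the point above: where turnout in that group falls, the associated party’s vote share tends to fall with it, though correlation alone says nothing about cause.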

That’s one example of how such an influence campaign can work, in this case targeting US citizens from outside the country. Combine it with sophisticated monitoring of social media users, targeted organic reach and hacking into private records, and you have the makings of cyber-subversion, if not cyber warfare.

References

  • DiResta, Renee, Kris Shaffer, Becky Ruppel, David Sullivan, Robert Matney, Ryan Fox, Jonathan Albright, and Ben Johnson. 2018. The Tactics and Tropes of the Internet Research Agency. Washington, DC: Senate Select Committee on Intelligence.
  • Howard, Philip N., Bharath Ganesh, Dimitra Liotsiou, John Kelly, and Camille François. 2018. The IRA, Social Media and Political Polarization in the United States, 2012-2018. Oxford: Computational Propaganda Research Project, University of Oxford.

Note

  • Image above is the EU-Russia border at Narva, Estonia.
