Recent Trends in Online Foreign Influence Efforts

Foreign governments have used social media to influence politics in a range of countries by promoting propaganda, advocating controversial viewpoints, and spreading disinformation. We analyze 53 distinct foreign influence efforts (FIEs) targeting 24 different countries from 2013 through 2018. FIEs are defined as (i) coordinated campaigns by one state to impact one or more specific aspects of politics in another state, (ii) through media channels, including social media, (iii) by producing content designed to appear indigenous to the target state. The objectives of such campaigns can be quite broad and to date have included influencing political decisions by shaping election outcomes at various levels, shifting the political agenda on topics ranging from health to security, and encouraging political polarization. We draw on more than 460 media reports to identify FIEs, track their progress, and classify their features.

Introduction

Information and Communications Technologies (ICTs) have changed the way people communicate about politics and access information on a wide range of topics (Foley 2004, Chigona et al. 2009). Social media in particular has transformed communication between leaders and voters by enabling direct politician-to-voter engagement outside traditional avenues, such as speeches and press conferences (Ott 2017). In the 2016 U.S. presidential election, for example, social media platforms were more widely viewed than traditional editorial media and were central to the campaigns of both Democratic candidate Hillary Clinton and Republican candidate Donald Trump (Enli 2017).

These technological developments, however, have also resulted in new challenges for democratic systems; foreign actors have sought to exploit ICTs to influence politics in a range of countries by promoting propaganda, advocating controversial viewpoints, and spreading disinformation. High-profile episodes of such foreign influence efforts (FIEs), such as Russian efforts to influence the outcome of the 2016 U.S. presidential election, have prompted numerous studies on the subject (Boyd et al. 2018, Risch et al. 2018, Howard et al. 2018). Many of these studies, however, extrapolate from isolated examples of Russian efforts to polarize public opinion abroad (see Hegelich & Janetzko 2016, Connell & Vogler 2017, Hellman & Wagnsson 2017, among others).

We advance the literature by applying a consistent definition and set of coding criteria to the full set of identified FIEs since 2013. (The results described in this article were first discussed in our report Trends in Online Foreign Influence Efforts, released in late June 2019.) Drawing on a wide range of media reports, we identified 53 FIEs targeting 24 countries from 2013 through 2018. In total, 72% of the campaigns were conducted by Russia, with China, Iran, and Saudi Arabia accounting for most of the remainder. Our findings highlight the breadth of FIEs to date, suggest that a small number of actors launch these campaigns even though they are not technically challenging to conduct, and illustrate the broad spectrum of their political objectives. This paper, and the data described herein, offer high-level context for the growing literature on state-sponsored disinformation campaigns. The remainder of the paper proceeds as follows.
Section 2 describes the coding rules, inclusion criteria, and process used to create our data on FIEs. Section 3 provides descriptive statistics and highlights trends over time. Section 4 discusses the implications of these trends and potential directions for future research.

Foreign Influence Effort Database

We define FIEs as (i) coordinated campaigns by one state to impact one or more specific aspects of politics in another state, (ii) through media channels, including social media, (iii) by producing content designed to appear indigenous to the target state. To be included in our data, FIEs must meet all three criteria.

Under this definition, FIEs are distinct from both traditional propaganda and disinformation (or 'fake news,' to use a colloquial term). The former involves political information provided by country X about country Y in ways that do not seek to mask its origin (such as Voice of America broadcasts about the U.S.S.R. during the Cold War) and may be true or false. Our definition also excludes local political activity, such as disinformation about country X produced by political actors in country X and spread on social media. Finally, the veracity of the content being promoted is not part of the definition: FIEs may involve promoting solely true content, solely false or misleading information, or some combination of the two.

Data development

Our data draw on a wide range of media reports to identify FIEs, track their progress, and classify their features. Drawing on more than 460 news articles (full list available), we identified 53 FIEs targeting at least 24 different countries from 2013 through 2018. We also searched a wide range of previous academic research, building a database of 326 studies of online propaganda, influence operations, and voters' media consumption (e.g. the Australian Strategic Policy Institute's review of efforts to influence elections in democracies, Hanson et al. 2017). In total, 72% of the campaigns we identified were conducted by Russia; China, Iran, and Saudi Arabia accounted for most of the remainder.

We also identified more than 40 distinct influence efforts that met some, but not all, of our inclusion criteria. In 2016, for example, pro-Kremlin and Russian state-funded media wrote negative stories about NATO's operation in Estonia, many of which contained clear falsehoods (Nimmo 2017). This information operation involved spreading incorrect information on social media, but it was not an FIE under our definition because the content was not meant to appear as though it were produced in Estonia.

We built our data in three steps, following standard practice:

1) Develop a coding schema. Our database reflects the influencer's strategic decisions as well as the operational choices that must be made by any organization conducting multiple distinct influence campaigns over time (e.g. which platforms to target in a given effort), as the Russian Internet Research Agency (IRA) did from mid-2014 through at least 2018 (Mueller 2019, pp. 4-8, 14-35). Such campaigns require country-specific strategies along several dimensions, including topics to post about, platforms to use, tactics to employ, and so on. Figure 1, below, summarizes the relational database we developed to categorize FIEs.

2) Identify candidate influence efforts. Once the coding schema was developed, we examined 463 stories about influence efforts from 41 countries across a range of sources.
We first reviewed material from major sources, including ABC News, BBC News, Politico, Reuters, The Economist, The Guardian, The Independent, The Mirror, The New York Times, The Telegraph, The Wall Street Journal, The Washington Post, and Wired Magazine. We then searched for additional information on media websites and expert blogs, including Al-Monitor, Buzzfeed, Freedom House, Human Rights Watch, Medium (including reports by DFRLabs), Quartz, The Atlantic, The Daily Beast, The Daily Dot, The Hill, The Intercept, The New Republic, The Observer, The New Statesman, The Register, and The Verge. Finally, we reviewed all working papers and articles by the Computational Propaganda Project at Oxford University and the Social Media and Political Participation (SMaPP) Lab at New York University.

3) Code values for all FIEs. We identified 93 candidate FIEs across the sources described above. Of those, 53 met our inclusion criteria, based on sources in English as well as Arabic, French, Spanish, and Russian, as appropriate. Each FIE was reviewed and evaluated by one of the authors as well as two student research assistants.

The 53 cases identified from 2013 through the end of 2018 surely represent a lower bound on the number of distinct FIEs to date; media reporting in the languages we could access may not have captured all FIEs within this time frame, and some FIEs may have gone undetected altogether.

Our methodology is similar to that of some other efforts. Bradshaw & Howard (2018), for example, report on domestically produced propaganda in which political parties or governments use social media to manipulate public opinion. As in this report, they focus on coordinated campaigns rather than lone actors, identifying 48 cases around the world. Their methodology resembles ours in that they search news reports for information, review the cases with help from a research team, and check the results with experts. Woolley & Howard (2017) use a different approach to study computational propaganda. They examine both purely domestic influence campaigns and campaigns targeting foreign countries by analyzing tens of millions of posts on seven different social media platforms during elections held between 2015 and 2017 in Brazil, Canada, China, Germany, Poland, Taiwan, Russia, Ukraine, and the United States.

Key fields

Each FIE is identified as an attacker-target-political goal triple. This design allows us to draw inferences about changes in tactics over time, as well as about the allocation of effort by attacking organizations, which must make tradeoffs between time spent on different political goals. For each FIE we record the following fields:

• Political Goal. Describes the objective of the effort.