Summary contributed by Samuel Curtis, a Schwarzman Scholar and member of our learning community
*Authors of original paper & link at the bottom
Mini-summary: Social media platforms allow political actors not only to reach massive audiences but also to fine-tune target audiences by location or demographic characteristics, making them increasingly popular domains for carrying out political agendas. Governments across the world—democratic and authoritarian alike—are expanding the capacity and sophistication of their “cyber troops” operations to capitalize on this medium of communication. In this report, Samantha Bradshaw and Philip N. Howard document the characteristics of 48 countries’ computational propaganda campaigns. While the size, funding, and coordination capacities of each country’s online operations vary, one thing remains clear regardless of location: social media platforms face an increased risk of artificial amplification, content suppression, and media manipulation.
In a short time, social media platforms’ roles have expanded from their nascent stages—places for users to connect, share entertainment, discuss popular culture, and stay in touch with each other’s day-to-day lives—to fields of operations for large-scale political and ideological warfare. In these online theaters, “cyber troops” carry out missions to manipulate public opinion for political purposes by disseminating and amplifying “computational propaganda” (the use of automation, algorithms, and big-data analytics to manipulate public life). While many readers may be familiar with the large, robust cyber operations based in Russia, China, the US, or North Korea, this 2018 report by Samantha Bradshaw and Philip N. Howard at the Oxford Internet Institute illuminates formally organized social media campaigns developing across the world, in countries large and small, rich and poor, authoritarian and democratic alike.
Bradshaw and Howard point out that in the past, governments relied on “blunt instruments” to block or filter information, but modern social media platforms allow for more precise information control: they can reach large numbers of people while simultaneously enabling micro-targeting based on location or demographic traits. This versatility is precisely what has made social media platforms suitable tools for shaping discourse and nudging public opinion. The value of controlling discourse online is evidenced by the growth in coordinated attempts to influence public opinion documented in this report—48 countries, up from 28 the previous year (though the authors do note that their data may not be comprehensive).
While cyber troops operate under all sorts of governments, the authors point out that their roles differ by context. In emerging and Western democracies, political bots are used to poison the information environment, polarize voting constituencies, promote distrust, and undermine democratic processes. In authoritarian regimes, governing parties use computational propaganda as just one tool in a portfolio of tactics to shape the ruling party’s narrative, stamp out counter-narratives, and subvert elections.
Computational propaganda operations can target both foreign and domestic audiences, and can be conducted by government agencies, politicians and parties, private contractors, civil society organizations, or citizens and influencers. They may deploy a number of “valence strategies” (strategies defined by the attractiveness or aversiveness of their content): spreading pro-government or pro-party propaganda, attacking the opposition or mounting smear campaigns, or diverting conversations and criticism away from important issues. Often, cyber troops operate through fake accounts, which may be automated, human-controlled, or hybrid “cyborg” accounts—in which operators combine automation with elements of human curation, making them particularly difficult to identify and moderate.
Cyber troops also employ a suite of communication strategies. They may amplify messages by creating content, posting forum comments, and replying to genuine or artificial users; or they may suppress content by launching targeted campaigns that falsely mass-report legitimate content or users, so that accounts or posts are temporarily or permanently removed from the platform. The authors comment that, logically, automated accounts are most common on platforms that make automation easy—namely Twitter—but that cyber campaigns take place across all common forms of social media. They also note that one-fifth of the countries showed evidence of disinformation campaigns operating over chat applications (WhatsApp, WeChat, Telegram, etc.); many of these countries are in the Global South, where such applications are widely used and large public group chats are common.
The report also details the size, resources, status (permanent or temporary), level of coordination, and capacity of countries’ cyber troop operations. Teams range in size from dozens (in countries such as Argentina or Kyrgyzstan) to thousands or tens of thousands (the UK and Ukraine, respectively) to even millions (China). Countries also differ in the budgets they allocate to computational propaganda, the degree to which their teams coordinate with other firms and actors, and how often they operate—whether full-time and year-round, or only around critical dates such as elections.
The report concludes by calling on democracies to take action by formulating guidelines that discourage bad actors from exploiting computational propaganda: “To start to address these challenges [outlined in the report], we need to develop stronger rules and norms for the use of social media, big data and new information technologies during elections.” Notably, the phrase “rules and norms” leaves ambiguity as to who should develop, implement, and enforce such reforms: social media platforms or governments? This was likely intentional, as the question of who should regulate speech in democracies warrants a paper in its own right.
Original paper by Samantha Bradshaw and Philip N. Howard (University of Oxford): https://comprop.oii.ox.ac.uk/research/cybertroops2018/