Montreal AI Ethics Institute

Democratizing AI ethics literacy


Research summary: Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation

August 26, 2020

Summary contributed by Nga Than, a Ph.D. student in the Sociology program at the City University of New York – The Graduate Center.

Link to full paper + authors listed at the bottom.


Mini-summary: Online media manipulation has become a global phenomenon. Bradshaw and Howard examine it by focusing on “cyber troops”: organized governmental and political actors who manipulate public opinion via social media. The report provides detailed accounts of such groups across 28 countries. The authors investigate the types of messages, valences, and communication strategies that cyber troops use, and compare their organizational forms, resources, and capacities. They find that organized social media manipulation is a pervasive, global phenomenon. Some organizations target domestic populations, while others try to influence the public opinion of foreign populations. Authoritarian regimes tend to run organized social media manipulation campaigns aimed at their own domestic populations. In democratic regimes, cyber troop campaigns tend to target foreign publics, while political-party-supported campaigns target domestic voters. Over time, the dominant mode of organizing cyber troops has shifted from military operations to private, for-profit communication firms that work with governments.

Full summary:

Social media plays an increasingly important role in shaping public life and public discussion. It helps form public opinion and serves as a source of information across the globe. Governments and political actors have increasingly turned these communication platforms to their own advantage, spending growing amounts of money to employ people who generate content, influence public opinion, and engage with domestic and foreign audiences. Bradshaw and Howard assemble a unique dataset of organized media manipulation organizations to understand this global trend.

The authors start by defining the term “cyber troops,” which refers to “government, military or political‐party teams committed to manipulating public opinion over social media.” They maintain that these groups play a growing role in shaping public opinion. They then describe how they gathered information to construct a unique dataset for analyzing the size, scale, and extent to which different kinds of political regimes deploy cyber troops to influence and manipulate the public online. The authors rely mainly on news media sources written in English to find information such as budgets, personnel, organizational behavior, and communication strategies. They further corroborate and supplement this information by consulting country experts and reports from research institutes and civil society organizations.
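
To make the structure of such a dataset concrete, the sketch below shows what a single country-level record could look like in Python. It is illustrative only, under the assumption that each entry captures the variables the summary mentions (budget, personnel, organizational form, strategies, and targets); the field names are hypothetical and are not the authors’ actual coding scheme.

    from dataclasses import dataclass, field
    from typing import Optional

    # Hypothetical record layout for illustration; not the authors' coding scheme.
    @dataclass
    class CyberTroopRecord:
        country: str
        regime_type: str                    # e.g. "authoritarian" or "democratic"
        organizational_form: str            # e.g. "government", "political party", "private contractor"
        target_audience: str                # "domestic", "foreign", or "both"
        strategies: list = field(default_factory=list)  # e.g. ["bots", "fake accounts", "original content"]
        estimated_budget_usd: Optional[float] = None    # often unavailable in public sources
        estimated_personnel: Optional[int] = None

    # Example entry, loosely based on the summary's discussion of China below.
    china = CyberTroopRecord(
        country="China",
        regime_type="authoritarian",
        organizational_form="government",
        target_audience="domestic",
        strategies=["original content", "pro-government messaging"],
        estimated_personnel=2_000_000,
    )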

The authors find that cyber troops adopt a wide range of strategies, tools, and techniques for social media manipulation. These include commenting on social media posts to engage with citizens, targeting individuals, using both real and fake social media accounts as well as bots to spread propaganda and pro-government messages, and creating original content. Messages to users range from positive, to harassing and verbally abusive, to neutral language designed to distract public attention from important issues. Individual users are targeted in order to silence political dissent; this method is considered the most harmful to targeted individuals, who often receive real-life threats and suffer reputational damage. Cyber troops also create original content, such as videos and blog posts published under online aliases.

Cyber troops also take a wide range of organizational forms, structures, and capacities. The authors observe that some governments have their own in-house teams, while others outsource these activities to private contractors, galvanize volunteers, or hire private citizens to spread political messages on the Internet.

In some countries, organized media manipulation is carried out by a small team, while in others it involves a large network of government employees. A notable example is China, which has more than two million individuals working to promote party ideology. The research team also found that these groups operate with very different budgets, although incomplete information was a persistent problem because such figures are not readily available. In authoritarian regimes, governments tend to fund these activities, while in democratic regimes, political parties tend to be the main drivers of organized social media manipulation.

Bradshaw and Howard show that cyber troops are heterogeneous in their organizational structure. Such organizations can have five different types of structure: (1) a clear hierarchy and reporting structure; (2) content review by superiors; (3) strong coordination across agencies or teams; (4) weak coordination across agencies or teams; and (5) liminal teams. Cyber troops also engage in capacity-building activities, ranging from training staff to improve the skills associated with producing and disseminating propaganda, to providing rewards or incentives for high-performing individuals, to investing in research and development projects.
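
As a quick illustrative restatement of this typology (not code from the original study), the five structure types can be written as a simple enumeration:

    from enum import Enum

    # Labels paraphrase the report's five structure categories.
    class OrgStructure(Enum):
        CLEAR_HIERARCHY = "clear hierarchy and reporting structure"
        CONTENT_REVIEW = "content review by superiors"
        STRONG_COORDINATION = "strong coordination across agencies or teams"
        WEAK_COORDINATION = "weak coordination across agencies or teams"
        LIMINAL = "liminal teams"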

This cross-country comparative research highlights the heterogeneous nature of cyber troops’ activities around the world: they are growing in size, scope, and organizational resources and capacity. The paper has important implications for researchers, civil society organizations, and private citizens, prompting them to question how their online activities are shaped and influenced by government-funded groups. It also raises important questions about the social media environment in which government cyber operations can operate, shape public opinion, and sometimes divert public attention from important issues.


Original paper by Samantha Bradshaw, Philip N. Howard: https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/07/Troops-Trolls-and-Troublemakers.pdf

Want quick summaries of the latest research & reporting in AI ethics delivered to your inbox? Subscribe to the AI Ethics Brief. We publish bi-weekly.
