Coordinated (in)authentic behaviour online: A systematic review and typology
Keywords: Campaign, Social Media, Mobilisation, Influence, Theoretical
Abstract
Social media platforms allow citizens to share their opinions online but are also vulnerable to coordinated campaigns that mimic genuine citizen behavior and seek to manipulate public discourse and opinion. A growing number of studies from a wide range of disciplines investigate this phenomenon, using diverse concepts such as astroturfing (Keller et al., 2020), coordinated inauthentic behavior (Giglietto et al., 2020), social bots (Boichak et al., 2021), or computational propaganda (Woolley & Howard, 2019). While the scholarly interest and the potential adverse consequences for democratic opinion formation and political participation underline the importance of this research field, the conceptual variety complicates a comprehensive understanding of these practices and of interrelated phenomena of strategic coordinated influence operations online. What this field of research lacks is an overview that sorts existing conceptualizations of coordinated online campaigns and clarifies the overlaps and delineations between these approaches.
This paper fills this gap by conducting a systematic literature review and putting forward a typology of coordinated (in)authentic online behavior. Using a comprehensive search string, we identified 458 studies from various disciplines in the Web of Science and EBSCOhost databases. A team of four reviewers will code the identified literature following clearly defined categories and instructions. We aim to provide a systematic overview of the conceptualizations in this body of literature. Drawing on this review, we propose a typology of coordinated online behavior that delineates different forms of inauthentic, authentic, automated, and human communication as well as different degrees of deceptive intent. Within this typology, we examine the specific functions of coordinated behavior employed by different actor types and the context-specific means of coordination. Importantly, we formulate hypotheses regarding the role of digital platform affordances for different types of coordinated behavior, paving the way for future work on cross-platform analyses of coordinated manipulation attempts.
Overall, our research addresses the need for a unified and operationalizable definition to guide further empirical research. By providing a synthesized conceptual framework, we lay the groundwork for future empirical studies in three areas: assessing the prevalence of inauthentic coordinated behavior relative to authentic behavior, distinguishing between harmful and harmless coordination, and measuring the impact of coordinated behavior on public opinion.
Boichak, O., Hemsley, J., Tromble, R., & Tanupabrungsun, S. (2021). Not the bots you are looking for: Patterns and effects of orchestrated interventions in the U.S. and German elections. International Journal of Communication, 15, 814–839.
Giglietto, F., Righetti, N., Rossi, L., & Marino, G. (2020). It takes a village to manipulate the media: Coordinated link sharing behavior during 2018 and 2019 Italian elections. Information, Communication & Society, 23(6), 867–891. https://doi.org/10.1080/1369118X.2020.1739732
Keller, F. B., Schoch, D., Stier, S., & Yang, J. (2020). Political astroturfing on Twitter: How to coordinate a disinformation campaign. Political Communication, 37(2), 256–280. https://doi.org/10.1080/10584609.2019.1661888
Woolley, S., & Howard, P. (2019). Computational propaganda: Political parties, politicians, and political manipulation on social media. Oxford University Press.