Social bots and synthetic interactions to stage digital extremist armies (part 3) – by Daniele M. Barone

According to a report by GNET, while Daesh (and jihadist groups in general) relied heavily on bot technology, racially and ethnically motivated violent extremist networks have so far refrained from widespread bot usage, mostly because of their different objectives and the more permissive online environment in which they operate.[i] Nevertheless, far-right or conspiracy groups’ use of social bots can still highlight further communication branches in which AI can be exploited.

Bot exploitation to arm digital crowds

The following focus on the relationship between extremist or conspiracy narratives and social bots aims to outline additional macro-areas of communication in which social bots can be exploited for malicious goals. It does not exclude that other religiously, ethnically, or politically motivated extremist or terrorist groups could also use these applications of social bots for their purposes.

  • Programmed defamation campaigns

According to the US Department of Homeland Security, social media bots can be used to harass users, overwhelming them to the point of deactivation.[ii]

Harassment campaigns have long been an issue in online spaces[iii] and carry a twofold implication when perpetrated by extremist groups: they reinforce the group’s community and narratives while depriving their targets of the means to defend themselves against defamation.

In 2017, U.S. far-right activists helped amplify a leak of hacked emails belonging to Emmanuel Macron, during his campaign for the French presidency, with a disinformation campaign consisting of rumors, fake news, and forged documents. An analysis by the Atlantic Council found that, on Twitter, the hashtag #MacronLeaks reached 47,000 tweets in three and a half hours and appeared in almost half a million tweets in twenty-four hours.[iv] The hashtag was first used by Jack Posobiec, an internet performer and writer for the far-right news organization The Rebel, who said he had shared a post he saw on 4chan.[v] Researchers found that the immediate, frequent, and concentrated engagement around #MacronLeaks clearly indicated the use of social bots, which also helped move the hashtag from the United States to France.[vi]
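The kind of "immediate, frequent, and concentrated engagement" described above can be quantified in a simple way. The sketch below is a hypothetical illustration, not the methodology used by the cited researchers: it flags time windows whose tweet volume vastly exceeds the typical window, a crude signal of automated amplification. The function name, window size, and threshold factor are all assumptions chosen for illustration.

```python
from collections import Counter

def burst_windows(timestamps, window=60, factor=10):
    """Flag time windows whose tweet volume exceeds `factor` times the
    median window volume -- a crude signal of coordinated, bot-like posting.
    `timestamps` are seconds elapsed since the start of observation."""
    counts = Counter(int(t) // window for t in timestamps)
    volumes = sorted(counts.values())
    median = volumes[len(volumes) // 2]
    return sorted(w for w, c in counts.items() if c > factor * max(median, 1))

# Synthetic example: an hour of sparse organic chatter plus one injected burst.
organic = [i * 30.0 for i in range(120)]         # ~2 tweets per minute
burst = [1800.0 + i * 0.5 for i in range(600)]   # 600 tweets in 5 minutes
flagged = burst_windows(organic + burst)         # minutes 30-34 stand out
```

Real bot-detection systems combine many more signals (account age, posting cadence, content similarity), but even this one-feature heuristic shows why a concentrated spike is so conspicuous against organic activity.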

  • Support and spread polarized views and fake news

Research from the University of California analyzed the use of social bots in left-leaning and right-leaning tweets during the 2020 US elections.[vii] In the macro-group of bots that tweeted right-leaning content, researchers also found clusters of bots posting highly structured tweets with links and references to conspiracy theories (e.g., QAnon,[viii] “gate” conspiracies such as #obamagate,[ix] and Covid conspiracies)[x] and links to conspiracy news organizations and websites.

These kinds of bot networks are built on bots designed to post content based on the major topics discussed in the communities they try to blend into. Once they have gained a credible profile, they can disseminate disinformation or conspiracy theories as efficiently as human users’ accounts. This mechanism tends to make fake content more and more realistic, with the risk of blurring the line between legitimate political views and extremist narratives, while attracting broader support.[xi]

Furthermore, the next generation of bots threatens to move beyond text generation to audio and video manipulation. Over time, disinformation campaigns on social media are likely to be aided by deepfakes, fake audio or visual content that has been manipulated or generated using generative adversarial networks (GANs).[xii] One example is the fake video showing Ukrainian President Volodymyr Zelenskyy calling on Ukrainian citizens to stop fighting Russian soldiers and surrender their weapons, and claiming he had already fled Kyiv.[xiii]

  • Promote events

Neo-Nazis and white supremacists use bots for announcing and promoting events, such as marches and conferences.

In June 2021, a post forwarded by a French neo-Nazi channel belonging to the “Cercle des Amis d’Adolf Hitler” announced an event titled “Adolf Hitler: Une Vie, Des Valeurs” (“Adolf Hitler: A Life, Values”) to be held in Paris. It added that those interested could use @Cercle_Hitler_Bot to register for the event.[xiv]

Furthermore, extremist events can also be exploited by state-sponsored botnets to spread extremist narratives and discord.

In the aftermath of the white supremacist rally in Charlottesville, Virginia, researchers found that a large number of automated bots generating Twitter posts helped make right-wing conspiracy theories and rallying cries about Charlottesville go viral. The analyzed sample of social bots included pro-Russian accounts pushing content from the state-controlled outlets Russia Today and Sputnik.[xv]

Coordinated waves of a digital crowd

Theoretically, crowd behavior can be compared to fluid dynamics. A crowd’s density doesn’t let people move forward continuously, so they need to stop and wait for another opportunity to advance, generating “stop-and-go waves.”[xvi] In these terms, digital crowds’ behavior should also be better analyzed and understood because, even though they are physically dispersed, they can be considered a collectively intelligent complex system with unlimited growth.[xvii]

Uncontrolled exposure to extremist narratives or disinformation can have an impact on collective behavior and “when perturbed, complex systems tend to exhibit finite resilience followed by catastrophic, sudden, and often irreversible changes,”[xviii] similarly to stop-and-go waves.

Social bots can help coordinate the extent of these waves, but it is still not clear how much the manipulation of digital crowds can reverberate in real life or in policymaking.[xix] Indeed, while existing research has extensively studied bot detection, research on bot coordination is still emerging and requires more in-depth analysis.[xx]

Even though those running social bots are not always identifiable, as bots can be exploited either for provocative campaigns or as part of an information war,[xxi] and conspiracy or extremist content tends to follow current events even in the absence of coordinated campaigns,[xxii] recurring patterns in the topics and language used in coordinated botnet activities can still be detected and should be analyzed more closely.

For instance, some cases show a coordinated shift of social bots to different stories: coordinated attempts to expand and update disinformation or extremist narratives to follow an agenda, pushing new topics, terms, and hashtags into the social media environment.

In this respect, a study on bots and Covid-related misinformation analyzed social bot tweets from January to August 2020. Some of these bots, identified between 2011 and 2019, predated the pandemic and were originally designed for non-Covid purposes, such as promoting product hashtags, retweeting political candidates, and spreading links to malicious content.[xxiii]

Other researchers found that, in the wake of Russia’s invasion of Ukraine, online activity on Twitter surged by nearly 20%. The analysis highlighted that ethnically motivated extremist accounts, such as those posting content on the New World Order (NWO) conspiracy,[xxiv] shifted almost entirely from topics related to Covid, a secret group controlling the global economy, and speculation about the end times[xxv] into Ukraine and Putin themes.[xxvi]
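A coordinated topic pivot of this kind leaves a measurable trace: the hashtag distribution of an account cluster before an event diverges sharply from its distribution after. The sketch below is a hypothetical illustration, not the methodology of the cited studies; it uses Jensen-Shannon divergence (a standard measure of distance between two probability distributions) over hashtag frequencies, and the hashtags and function names are assumptions for the example.

```python
from collections import Counter
from math import log2

def js_divergence(before, after):
    """Jensen-Shannon divergence (base 2) between two hashtag frequency
    distributions: 0.0 means identical topics, 1.0 means fully disjoint."""
    p, q = Counter(before), Counter(after)
    n_p, n_q = sum(p.values()), sum(q.values())
    vocab = set(p) | set(q)
    # Mixture distribution; every term in vocab has positive mass here.
    m = {t: 0.5 * (p[t] / n_p + q[t] / n_q) for t in vocab}
    def kl(dist, n):
        return sum((dist[t] / n) * log2((dist[t] / n) / m[t])
                   for t in vocab if dist[t])
    return 0.5 * kl(p, n_p) + 0.5 * kl(q, n_q)

# Synthetic cluster: conspiracy hashtags before the invasion, Ukraine after.
pre = ["#nwo", "#greatreset", "#endtimes"] * 50
post = ["#ukraine", "#putin"] * 70 + ["#nwo"] * 10
shift = js_divergence(pre, post)  # close to 1.0: the cluster pivoted topics
```

Tracking this score over sliding time windows for a fixed set of suspected accounts would surface exactly the kind of abrupt, cluster-wide pivot the researchers describe, whereas organic communities tend to drift gradually.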

Understanding the exploitation of botnets could help increase public awareness and prevent users and public figures from involuntarily becoming echo chambers for malicious social bot clusters. This could be a valuable tool to prevent state or non-state actors from generating unpredictable waves of digital crowds to their advantage.

These are not marginal aspects because, as the above-mentioned theory on crowd behavior and fluid dynamics explains, even though waves do not always portend a collapse, a stop-and-go wave can also be a warning signal that the situation in the crowd is becoming critical.[xxvii]

[i] Veilleux-Lepage Y., Daymon C., and Archambault E. (June 7, 2022) Learning from Foes: How Racially and Ethnically Motivated Violent Extremists Embrace and Mimic Islamic State’s Use of Emerging Technologies. Global Network on Extremism & Technology.

[ii] US Department of Homeland Security (May 2018) NATIONAL PROTECTION AND PROGRAMS DIRECTORATE – Office of Cyber and Infrastructure Analysis.

[iii] Geiger S.R. (2016) Bot-based collective blocklists in Twitter: The counterpublic moderation of harassment in a networked public space. Information, Communication, and Society 19(6).

[iv] Jeangène Vilmer J. (June 2019) The “Macron Leaks” Operation: A Post-Mortem. Atlantic Council.

[v] Volz D. (May 7, 2017) U.S. far-right activists, WikiLeaks and bots help amplify Macron leaks: researchers. Reuters.

[vi] Southern Poverty Law Center. Jack Posobiec.

[vii] Ferrara E., Chang H., Chen E., Muric G., and Patel J. (October 2020) Characterizing social media manipulation in the 2020 U.S. presidential election. First Monday, 25(11).

[viii] Roose K. (September 3, 2021) What Is QAnon, the Viral Pro-Trump Conspiracy Theory? The New York Times.

[ix] Wolfe J. (May 14, 2020) Explainer: Trump keeps raising ‘Obamagate.’ What’s that? Reuters.

[x] Pertwee E., Simas C., and Larson H.J. (March 10, 2022) An epidemic of uncertainty: rumors, conspiracy theories and vaccine hesitancy. Nature.

[xi] Rovny J. (February 29, 2012) Where do radical right parties stand? Position blurring in multidimensional competition. Cambridge University Press.

[xii] United Nations Interregional Crime and Justice Research Institute (UNICRI) and the United Nations Office of Counter-Terrorism (UNCCT) (2022) Algorithms And Terrorism: The Malicious Use Of Artificial Intelligence For Terrorist Purposes.


[xiv] Stalinsky S. (April 13, 2022) Neo-Nazis And White Supremacists Are Using Telegram Bots To Recruit Members, Disseminate Content, Maintain Supporter Anonymity, Promote Events, And Obtain Information About Individuals To Be Targeted For Attack. MEMRI.

[xv] Arnsdorf I. (August 23, 2017) Pro-Russian Bots Take Up the Right-Wing Cause After Charlottesville. ProPublica.

[xvi] Lamb E. (January 17, 2017) How Fluid Dynamics Can Help You Navigate Crowds. Smithsonian Magazine.

[xvii] Aradau C., Blanke T. (2014) The Politics of Digital Crowds. Lo Squaderno, vol. 33.

[xviii] Holtz J. (June 14, 2021) Communication technology, study of collective behavior must be crisis discipline, researchers argue. University of Washington.

[xix] Schreiber M. (March 4, 2022) Bot holiday: Covid disinformation down as social media pivot to Ukraine. The Guardian.

[xx] Khaund T., Kirdemir B., Agarwal N., Liu H., Morstatter F. (August 19, 2021) Social Bots and Their Coordination During Online Campaigns: A Survey. IEEE Transactions on Computational Social Systems.

[xxi] Cantini R., Marozzo F., Talia D., and Trunfio P. (January 4, 2022) Analyzing Political Polarization on Social Media by Deleting Bot Spamming. Special Issue – Big Data and Cognitive Computing: 5th Anniversary Feature Papers.

[xxii] Schreiber M. (March 4, 2022) Bot holiday: Covid disinformation down as social media pivot to Ukraine. The Guardian.

[xxiii] McKenzie H., Giorgi S., Devoto A., Rahman M., Ungar L., Schwartz H.A., Epstein D.H., Leggio L., and Curtis B. (May 20, 2021) Bots and Misinformation Spread on Social Media: Implications for COVID-19. Journal of Medical Internet Research.

[xxiv] Flores M. (May 30, 2022) The New World Order: The Historical Origins of a Dangerous Modern Conspiracy Theory. Middlebury Institute of International Studies at Monterey.

[xxv] Barkun M. (May 2012) A Culture of Conspiracy: Apocalyptic Visions in Contemporary America. California Scholarship Online.

[xxvi] NCRI insight report (March 1, 2022) New World Order Conspiracy Theories and Anti-Nato Rhetoric Surging on Twitter Amid Russian Invasion of Ukraine.

[xxvii] Lamb E. (January 17, 2017) How Fluid Dynamics Can Help You Navigate Crowds. Smithsonian Magazine.