-- KamilaKoronska - 09 Jan 2025

UK Southport Riots: Reverse engineering ‘dis-influencer’ and far-right networks with CIB and AI methods

Team Members


  • Link(s) to video / poster / slides / data sets

1. Introduction

From 30 July to 5 August 2024, far-right, anti-immigration riots erupted in England and Northern Ireland. The violent disturbances followed a tragic knife attack in Southport, England, where three young girls were fatally stabbed during a children’s dance class.

Shortly after the incident, false claims identifying the perpetrator as a Muslim asylum seeker began circulating online, initially appearing on the Channel 3 website (which purports to be a legitimate news outlet, but whose origins are unclear) and later being picked up and amplified by far-right groups, such as the English Defence League (EDL).

During the riots, a wave of Islamophobic and anti-immigrant sentiments was shared online by accounts from various parts of the world. The riots were accompanied by a surge in divisive online activity, including template-like posts criticising UK Prime Minister Keir Starmer and the Labour government for suppressing so-called "organic protest." Many accounts promoting this narrative appeared suspicious, with histories limited to amplifying divisive content.

The police themselves identified the spread of online disinformation as a significant factor in mobilising individuals who participated in the riots. Several platforms, including Telegram and X, were recognised for playing their part.

The Institute for Strategic Dialogue (ISD) highlighted Telegram's central role in far-right organizing following the Southport stabbing, with posting activity on related channels increasing by 327% in the 10 days afterwards. Many posts attacked Keir Starmer, the Labour Party, and mainstream media like the BBC and Sky News, accusing them of enabling immigration policies and fostering anti-white British sentiment. Anti-establishment narratives further targeted the 'MSM' for allegedly fueling social division.

Far-right Telegram groups shared links to mainstream platforms like YouTube, X, TikTok, Facebook, and Instagram, spreading fake news about the perpetrator's background. On X, these false claims, amplified by so-called ‘dis-influencer’ figures such as Elon Musk and Nigel Farage, garnered millions of views. The visibility of such content on major platforms helped normalize the hate being promoted, extending its reach beyond fringe communities.

This project aims to map and analyze the topology of “dis-influencer” networks on Facebook and X by utilizing the CIB (Coordinated Inauthentic Behavior) methodology and tools like CooRTweet. Far-right supporters associated with the EDL are known for acting independently, so identifying signs of a disinformation campaign could either illuminate shifts in EDL behavior or suggest the involvement of other actors in spreading disinformation and polarising messaging.

2. Research Questions

  1. How do “dis-influencer” networks on Facebook and X interact and coordinate in spreading far-right disinformation? What patterns emerge from their network topology using the coordinated inauthentic behavior (CIB) methodology?

  2. To what extent do far-right supporters associated with the English Defence League (EDL) engage in CIB? Can we find evidence suggesting the involvement of external actors (outside far-right groups) in amplifying Southport disinformation?

  3. How effective are tools like CooRTweet and LLMs (e.g., ChatGPT) in detecting and analyzing the digital footprints of dis-influencer campaigns?

3. Tools

CooRTweet

For our work we used the Coordinated Detection Sharing Service powered by CooRTweet, a user interface for the R package CooRTweet (Righetti & Balluff, 2024) designed to analyze social media content based on coordinated sharing patterns. The service accepts a CSV dataset, performs coordinated network analysis according to the CooRTweet methodology, and visualizes the resulting networks for dynamic exploration. Despite its name, CooRTweet and the Coordinated Detection Sharing Service are platform-agnostic: they are applicable to any social media platform and content type, and support cross-platform and multi-modal network analysis. The tool was built using resources and networks within the vera.ai project.
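The coordinated-sharing logic that CooRTweet applies can be illustrated with a minimal sketch, written here in Python rather than the package's native R. The field names (`account_id`, `object_id`, `timestamp`) echo the CooRTweet input format, but the sample data and function are illustrative assumptions, not the package's actual implementation:

```python
from itertools import combinations
from collections import defaultdict

def coordinated_pairs(posts, time_window=60):
    """Find pairs of accounts that shared the same object_id within
    `time_window` seconds of each other (a simplified version of the
    co-sharing logic CooRTweet is built on).
    Each post is a dict: {"account_id", "object_id", "timestamp"},
    with the timestamp in epoch seconds."""
    by_object = defaultdict(list)
    for p in posts:
        by_object[p["object_id"]].append(p)

    pairs = defaultdict(int)  # (account_a, account_b) -> number of co-shares
    for shares in by_object.values():
        shares.sort(key=lambda p: p["timestamp"])
        for a, b in combinations(shares, 2):
            if a["account_id"] == b["account_id"]:
                continue  # same account re-posting is not coordination
            if b["timestamp"] - a["timestamp"] <= time_window:
                pair = tuple(sorted((a["account_id"], b["account_id"])))
                pairs[pair] += 1
    return dict(pairs)

# Hypothetical sample: two accounts share the same URL 30 s apart,
# a third shares it much later.
posts = [
    {"account_id": "acc1", "object_id": "url_x", "timestamp": 0},
    {"account_id": "acc2", "object_id": "url_x", "timestamp": 30},
    {"account_id": "acc3", "object_id": "url_x", "timestamp": 500},
]
print(coordinated_pairs(posts, time_window=60))
# → {('acc1', 'acc2'): 1}
```

Repeated co-shares between the same pair of accounts raise that pair's edge weight, and the resulting weighted graph is what gets clustered and visualised.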

4. Datasets

A dataset from CrowdTangle was downloaded during the UK Southport riots, just before Meta phased out the tool. It consists of over 16,000 posts, with total interactions nearing two million. Additional data was scraped from Twitter/X using Zeeschuimer, employing a corpus of hashtags generated through the snowballing technique and close reading of content associated with the (then) trending hashtag #FarRightThugsUnite. Only posts with a minimum engagement of 30 likes were included. The date range for both datasets was set to one week, beginning on July 31, 2024.
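The filtering criteria above (one-week window from 31 July 2024, minimum of 30 likes) can be sketched as a small preprocessing step. This is a minimal illustration in Python; the column names (`created_at`, `like_count`) and the file path are hypothetical, not the actual export schema:

```python
import csv
from datetime import datetime, timedelta

START = datetime(2024, 7, 31)
END = START + timedelta(days=7)  # one-week window
MIN_LIKES = 30                   # engagement threshold

def filter_posts(rows):
    """Keep posts inside the one-week window that meet the like threshold.
    Column names ("created_at", "like_count") are assumptions."""
    kept = []
    for row in rows:
        created = datetime.fromisoformat(row["created_at"])
        if START <= created < END and int(row["like_count"]) >= MIN_LIKES:
            kept.append(row)
    return kept

# Usage with a CSV export (path is illustrative):
# with open("southport_posts.csv", newline="", encoding="utf-8") as f:
#     posts = filter_posts(csv.DictReader(f))
```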

5. Methodology

ChatGPT prompt engineering and testing the model

Two stages of prompt engineering were implemented during the project. In the first stage, we instructed the model to generate labels based on its analysis and understanding of the text in the object_id field, as well as Facebook and Twitter/X account names. This process produced a set of initial labels, which we then used to label clusters within the smaller, one-minute graph.

Steps for Training the Model:

We trained the ChatGPT model to produce labels for the larger network, focusing on distinguishing between different types of templated and similar content. This refinement was essential for conducting a more fine-grained qualitative analysis. The objective was to identify and quantify the degree to which each cluster was dominated by templated or repetitive content, which could indicate the use of astroturfing strategies. The group of coders agreed on the following labels:
  • Call to action (anything involving doing a specific action)
  • Claims and assertions ("x entity is like Y", "all X do Y", "A% of Y group do B")
  • Open questions ("why is everything like this")
  • Praise ("hooray for X", "good work X group")
  • Criticism ("this government is bad", "open borders have failed")
  • Insult ("screw X group/demographic", "f** ")
  • Allegiance denial ("We are not far right extremists!")
1. Building the pipeline

2. Designing labels

3. Manual labelling of 329 posts by the coders

4. Data cleaning and preparation (text standardisation)

5. Few-shot prompting: passing a prompt that labels the dataset based on the examples provided

6. Running the model
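The few-shot prompting step above can be sketched as follows. The label set comes from the codebook; the example posts and the commented-out model call are illustrative assumptions, not the exact prompts used in the project:

```python
# Label set from the codebook agreed by the coders.
LABELS = [
    "Call to action", "Claims and assertions", "Open questions",
    "Praise", "Criticism", "Insult", "Allegiance denial",
]

def build_prompt(examples, post):
    """Assemble a few-shot prompt: the manually labelled examples first,
    then the new post for the model to label."""
    lines = [
        "Label the final post with exactly one of: " + ", ".join(LABELS) + ".",
        "",
    ]
    for text, label in examples:
        lines.append(f'Post: "{text}"\nLabel: {label}')
    lines.append(f'Post: "{post}"\nLabel:')
    return "\n".join(lines)

# Hypothetical few-shot examples drawn from the manually coded set.
examples = [
    ("Everyone out on the streets this Saturday!", "Call to action"),
    ("We are not far right extremists!", "Allegiance denial"),
]
prompt = build_prompt(examples, "This government is bad")

# The prompt would then be sent to the model, e.g. via the OpenAI API:
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(
#     model="gpt-4o",
#     messages=[{"role": "user", "content": prompt}],
# )
# label = reply.choices[0].message.content.strip()
```

Keeping the prompt construction separate from the API call makes it easy to re-run the same labelled examples against different models or prompt variants.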

6. Findings

Hard Left Amplification Campaigns

By setting the time window to one minute in CooRTweet, we aimed to identify instances of algorithmic manipulation on Facebook and X/Twitter. This time frame was chosen to detect the use of bots or coordinated amplification campaigns, often orchestrated by media outlets, opinion shapers, or advocacy groups. This process resulted in the identification of 16 clusters, each representing different types of coordination triggered by various elements of the object_id. We utilized ChatGPT to generate labels for clusters based on the content of the object_id field, which included body text, images, and sequences of hashtags.

We observed several instances of mass cross-posting, shared account administration, and consistent messaging, all critical to the success of amplification campaigns, primarily attributed to accounts associated with Hard-Left or Socialist movements. This is an interesting finding, suggesting that the far-right supporters visible in the one-week network did not shape the conversation around the Southport attack in the same way.

Apart from news snippets that might have been organic, there were also posts directly targeting the current Keir Starmer-led government. These originated from accounts registered on Facebook as “digital creators,” such as David Simpson, who is openly pro-Russian and was actively posting at the time in multiple open groups within this cluster, such as “fb_Oh, Jeremy Corbyn !!” and “fb_The #Socialist Unity Movement.” Despite the cluster's openly anti-far-right stance, we also observed private users cross-posting public messages that appeared to include antisemitic undertones, such as accusing Keir Starmer of treating Muslim people as inferior to Jews living in the UK. Similarly, we noted the coordinated use of hashtags like #TimeForChange and #TimeForChangeIsNow, which mirrored the rhetoric employed by far-right groups at the time, calling for drastic cultural and governmental change (#EnoughisEnough). This alignment seemed dissonant with the fact that a relatively new centre-left government had recently been elected to replace the previous Tory administration.

Additionally, in the “Corbyn supporters & Tory Haters” cluster we observed shared administration, with posts from the “Stand By Jeremy Corbyn” page frequently re-appearing in other groups, such as the 5.8K-member “Rishi Sunak as PM, Not In Our Name.” Content across these pages and groups relating to the Southport riots was thematically consistent, focusing on support for Jeremy Corbyn, criticism of far-right extremism, and opposition to figures like Rishi Sunak and Nigel Farage. For example, counter-protest calls and anti-racism campaigns were replicated across groups to amplify reach, despite relatively modest audience sizes (e.g., 14,372 likes for “Stand By Jeremy Corbyn”). Notably, many of these discussions occurred within clusters that showed strong support for Jeremy Corbyn, the former leader of the UK Labour Party. This suggests that, despite being grassroots, supposedly leaderless movements, these clusters demonstrated a clear tendency to align with and praise a specific "leadership style."

It should be added that the synchronisation in the pro-Corbyn, socialist clusters was also of a different kind from that observed among groups advocating for rejoining the EU, criticising Brexit, and supporting progressive activism. There we saw more randomised, organic cross-posting behaviour, seemingly the result of people with a vested interest in the political discussions surrounding Brexit.

7. Discussion

  • Criticism of the current government from both far-left and far-right groups.
  • Far-right and far-left groups at times used almost identical language to criticise the government.
  • Mass cross-posting, shared account administration, and consistent messaging within a tight one-minute time window were only present on the LEFT.
  • Contrary to expectations, the chief dis-influencer, Elon Musk, did not appear to be highly coordinated with other accounts participating in the conversations surrounding the violent protests that erupted across the UK. Similarly, Nigel Farage from Reform UK showed no significant coordination. The only prominent account that was both visible and demonstrated coordination was Tommy Robinson.

8. Conclusions

9. References

Topic revision: r4 - 21 Jan 2025, KamilaKoronska