From 30 July to 5 August 2024, far-right, anti-immigration riots erupted in England and Northern Ireland. The violent disturbances followed a tragic knife attack in Southport, England, where three young girls were fatally stabbed during a children’s dance class.
Shortly after the incident, false claims identifying the perpetrator as a Muslim asylum seeker began circulating online. They first appeared on the Channel 3 website, which purports to be a legitimate news outlet but whose origins are unclear, and were later picked up and amplified by far-right groups such as the English Defence League (EDL).
During the riots, a wave of Islamophobic and anti-immigrant sentiment was shared online by accounts from various parts of the world. This was accompanied by a surge in divisive activity, including template-like posts accusing UK Prime Minister Keir Starmer and the Labour government of suppressing so-called "organic protest." Many accounts promoting this narrative appeared suspicious, with histories limited to amplifying divisive content.
The police themselves identified the spread of online disinformation as a significant factor in mobilising individuals who participated in the riots. Several platforms, including Telegram and X, were singled out for their role. The Institute for Strategic Dialogue (ISD) highlighted Telegram's central role in far-right organizing following the Southport stabbing, with posting activity on related channels increasing by 327% in the 10 days afterwards. Many posts attacked Keir Starmer, the Labour Party, and mainstream media outlets such as the BBC and Sky News, accusing them of enabling immigration policies and fostering anti-white British sentiment. Anti-establishment narratives further targeted the 'MSM' for allegedly fueling social division.
Far-right Telegram groups shared links to mainstream platforms like YouTube, X, TikTok, Facebook, and Instagram, spreading fake news about the perpetrator's background. On X, these false claims, amplified by so-called ‘dis-influencer’ figures such as Elon Musk and Nigel Farage, garnered millions of views. The visibility of such content on major platforms helped normalize the hate being promoted, extending its reach beyond fringe communities.
This project aims to map and analyze the topology of "dis-influencer" networks on Facebook and X by applying the Coordinated Inauthentic Behavior (CIB) methodology and tools such as CooRTweet. Far-right supporters associated with the EDL are known for acting independently, so identifying signs of a coordinated disinformation campaign could either illuminate shifts in EDL behavior or suggest the involvement of other actors in spreading disinformation and polarising messaging.
How do “dis-influencer” networks on Facebook and X interact and coordinate in spreading far-right disinformation? What patterns emerge from their network topology using the coordinated inauthentic behavior (CIB) methodology?
To what extent do far-right supporters associated with the English Defence League (EDL) engage in CIB? Can we find evidence suggesting the involvement of external actors (outside far-right groups) in amplifying Southport disinformation?
How effective are tools like CooRTweet and LLMs (e.g., ChatGPT) in detecting and analyzing the digital footprints of dis-influencer campaigns?
Labels for the larger, one-week graph were generated from the object_ids, as well as Facebook and Twitter/X account names. This process produced a set of initial labels, which we then used to label clusters within the smaller, one-minute graph.
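A minimal sketch of how this kind of LLM-assisted labelling could be scripted, rather than done through the ChatGPT interface, is shown below using the OpenAI Python client; the model name, prompt wording, and the label_cluster helper are illustrative assumptions, not the project's exact setup.

```python
from openai import OpenAI

# Assumes the openai>=1.0 Python client and an OPENAI_API_KEY in the environment.
client = OpenAI()

def label_cluster(sample_posts: list[str]) -> str:
    """Return a short descriptive label for one coordination cluster.

    `sample_posts` would hold a handful of object_id values (body text,
    hashtag sequences, image descriptions) drawn from that cluster; the
    model name and prompt wording are illustrative only.
    """
    prompt = (
        "You are labelling clusters of coordinated social media posts about "
        "the 2024 Southport riots. Based on the sample posts below, return a "
        "short label (max 6 words) describing the cluster's main theme and "
        "political leaning.\n\n" + "\n---\n".join(sample_posts)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()
```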
By setting the time window to one minute in CooRTweet, we aimed to identify instances of algorithmic manipulation on Facebook and X/Twitter. This time frame was chosen to detect the use of bots or coordinated amplification campaigns, often orchestrated by media outlets, opinion shapers, or advocacy groups. This process resulted in the identification of 16 clusters, each representing a different type of coordination triggered by different elements of the object_id. We used ChatGPT to generate labels for the clusters based on the content of the object_id field, which included body text, images, and sequences of hashtags.
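CooRTweet itself is an R package, so the sketch below is only a conceptual Python illustration of the time-window logic behind this step, not the package's API: posts that share the same object_id and are published by different accounts within 60 seconds of one another are treated as a coordinated pair. The file name and column names are assumptions for the example.

```python
import pandas as pd

# Hypothetical input: one row per post, with columns account_id, object_id
# (body text + image reference + hashtag sequence) and timestamp.
posts = pd.read_csv("southport_posts.csv", parse_dates=["timestamp"])

TIME_WINDOW = pd.Timedelta(seconds=60)  # the one-minute window described above

coordinated_pairs = []
for object_id, shares in posts.sort_values("timestamp").groupby("object_id"):
    rows = shares.to_dict("records")
    # Compare each share with later shares of the same object; two different
    # accounts posting within 60 seconds of each other count as a coordinated pair.
    for i, first in enumerate(rows):
        for second in rows[i + 1:]:
            if second["timestamp"] - first["timestamp"] > TIME_WINDOW:
                break
            if first["account_id"] != second["account_id"]:
                coordinated_pairs.append(
                    (first["account_id"], second["account_id"], object_id)
                )

edges = pd.DataFrame(coordinated_pairs, columns=["account_a", "account_b", "object_id"])
# Account pairs linked by many near-simultaneous shares form the clusters analysed below.
print(edges.value_counts(["account_a", "account_b"]).head(10))
```

Aggregating such pairs into an account-to-account network, and then labelling its clusters as described above, reproduces the general shape of the analysis.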
We observed several practices critical to the success of amplification campaigns: mass cross-posting, shared account administration, and consistent messaging, primarily attributed to accounts associated with Hard-Left or Socialist movements. This is an interesting finding, as it suggests that the far-right supporters visible in the one-week network did not influence the conversation around the Southport attack in a comparable way.
Apart from news snippets that might have been organic, there were also posts directly targeting the current Keir Starmer-led government. These originated from accounts registered on Facebook as “digital creators,” such as David Simpson, who is openly pro-Russian and was actively posting at the time in multiple open groups within this cluster, such as “fb_Oh, Jeremy Corbyn !!” and “fb_The #Socialist Unity Movement.” Despite the cluster's openly anti-far-right stance, we also observed private users cross-posting public messages that appeared to include antisemitic undertones, such as accusing Keir Starmer of treating Muslim people as inferior to Jews living in the UK. Similarly, we noted the coordinated use of hashtags like #TimeForChange and #TimeForChangeIsNow, which mirrored the rhetoric employed by far-right groups at the time, calling for drastic cultural and governmental change (#EnoughisEnough). This alignment seemed dissonant with the fact that a relatively new centre-left government had recently been elected to replace the previous Tory administration.
Additionally, in the "Corbyn supporters & Tory Haters" cluster we observed shared administration, with posts from the "Stand By Jeremy Corbyn" page frequently re-appearing in other groups, such as the 5.8K-member "Rishi Sunak as PM, Not In Our Name." Content across these pages and groups relating to the Southport riots was thematically consistent, focusing on support for Jeremy Corbyn, criticism of far-right extremism, and opposition to figures like Rishi Sunak and Nigel Farage. For example, counter-protest calls and anti-racism campaigns were replicated across groups to amplify reach, despite relatively modest audience sizes (e.g., 14,372 likes for "Stand By Jeremy Corbyn"). Notably, many of these discussions occurred within clusters that showed strong support for Jeremy Corbyn, the former leader of the UK Labour Party. This suggests that, despite presenting as grassroots, supposedly leaderless movements, these clusters demonstrated a clear tendency to align with and praise a specific "leadership style."
It should be added that the synchronisation in the pro-Corbyn, socialist clusters was of a different kind from that observed among groups advocating for rejoining the EU, criticising Brexit, and supporting progressive activism. There we saw more randomized, organic cross-posting behaviour, seemingly driven by people with a vested interest in the political discussions surrounding Brexit.