The Facebook of Tomorrow:
An analysis of the Irish far right channels on Telegram

Team Members

Project facilitators: Salvatore Romano, Fraser Crichton. www.aiforensics.org

Design facilitator: Giada Germanò

Participants: Andrea Rouchon, Jenne van der Wees, John Mehegan, Judith de Bruin, Kevin Ha, Lauren Romijn, Mandresy Andriantsoanavalona, Marie Corradi, Nele Goutier, Pietro Tramontin, Rune Saugmann, Sanne Homan, Siyi Zhou, Xueyan Cao.

Contents

Team Members
Contents
Summary of Key Findings
Introduction
Initial Data Sets
Research Questions
Methodology
Findings
Discussion
References
Posters

Summary of Key Findings

The most significant finding of this project was that Irish far-right Telegram channels were rife with hate speech and conspiracy theories, including in posts from political figures and parties. The data analysis identified the most frequently discussed topics, the common topics mentioned across both unique and duplicate posts, and how quickly these topics spread. The sentiment analysis mapped the complex emotional landscape within the far-right Telegram posts, and the image clustering revealed hate speech and antisemitic imagery.

Introduction

Over time, social media de-platforming, especially in the aftermath of the January 6 United States Capitol attack, has led far-right groups to increasingly turn to Telegram as their platform of choice. Telegram’s popularity surged in January 2021, surpassing 500 million users, fueled by heightened concerns over privacy and allegations of censorship on platforms such as Twitter, WhatsApp, and Facebook (Nicas et al., 2021). These groups are drawn to Telegram for its permissive environment, which allows them to disseminate hate speech without the constraints of censorship or the risk of removal (Al-Rawi, 2021).

The far-right in Ireland has mirrored a global trend by migrating to Telegram, which has become their platform of choice for political communication and mobilization. As outlined in a Vice article, most far-right Irish political parties, groups, and prominent individuals have established a presence on Telegram. This shift reflects the platform’s role as a key tool for these actors to spread propaganda, organize activities, and amplify hate speech, leveraging Telegram’s lenient content moderation policies (Lanigan, 2021).

Political violence, intimidation and threats. In the period leading up to the general election, far-right online threats of violence, along with real-world harassment and intimidation, were widespread in Ireland (Gallagher & McDonald, 2024). Examples include threats to burn down the Dáil (Healy, 2024) and direct threats targeting candidates and political figures (ISD, 2024).

Two weeks ago, a group of up to a dozen people, mainly men with their faces masked, gathered outside the home of Children’s Minister Roderic O’Gorman (Healy, 2024). Leo Varadkar, former Health Minister and Taoiseach, has had protests outside his Dublin home on several occasions over the years (Healy, 2024) and has been subject to death threats (McCann, 2024). Last January, a man was charged with making a threat to kill or cause serious harm to Galway TD and chief whip Hildegarde Naughton (Healy, 2024). Women politicians in particular have been targeted, for example through a hoax bomb threat at the home of Helen McEntee, a threat to kill Mary Lou McDonald (O’Toole, 2024), and intense misogyny and abuse online from these individuals (Gallagher & McDonald, 2024).

Post-electoral conspiracies. During the European and local elections in Ireland in 2024, the Institute for Strategic Dialogue (ISD) analyzed posts across platforms including X, Telegram, Facebook, YouTube, and TikTok. This analysis uncovered claims alleging “foreign interference,” primarily targeting the legal rights of non-citizens to participate in local elections. Additionally, conspiracy theories about “voter harvesting” by political parties and non-governmental organizations were prominently circulated (ISD, 2024).

An article on TheJournal.ie (O’Connor, 2024) highlights how disinformation and conspiracy theories during Ireland’s 2024 local and European elections drew heavily from tactics used in the U.S. “Stop the Steal” campaign. Despite the absence of any proven election fraud in Ireland, allegations surfaced targeting non-citizens, NGOs, and media organizations, echoing patterns seen in American electoral disinformation. Claims of “foreign interference” misrepresented non-citizens’ legal voting rights, with accusations that asylum seekers and refugees were manipulated to favor specific political agendas.

These narratives, amplified online, reflect a broader global trend of using disinformation to erode trust in democratic processes, sow division, and pre-emptively delegitimize electoral outcomes. The project addresses the urgency of countering these tactics with platform transparency and stronger EU regulation.

X and Meta. After Trump’s election, won with the backing of Elon Musk, the social media landscape is shifting rapidly. Mark Zuckerberg has announced that content moderation on Meta’s platforms will be replaced with a community notes system, echoing the approach of X. Journalists and academics are leaving X, while far-right activists return to the platform because of its relaxed moderation. Under these circumstances, Telegram’s far-right users may migrate to X and Meta.

This project examines the channels of the Irish far-right on Telegram, using the Irish general election of 29 November 2024 as a case study. The project group analyzed hate speech and conspiracy narratives from politically significant channels, tracing their spread across Telegram and other platforms.

Initial Data Sets

The project group worked with a dataset containing 10,275 posts from 95 different far-right Telegram channels. The data was scraped between 6 November and 7 December 2024. From this dataset, 2,020 URLs were extracted, sorted by platform, and categorized as TikTok, Facebook, X, far-right news outlets, political party channels, or mainstream news outlets. Visual content (3,685 images and videos) was also extracted for further analysis.
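To make the URL extraction and platform categorization concrete, a minimal sketch follows. It assumes the posts are held in a pandas dataframe with a hypothetical 'message' column; the file name and the domain-to-category mapping are illustrative, not the project's actual configuration.

```python
# Hypothetical sketch; the 'message' column, file name and domain lists are
# illustrative, not the project's actual mapping.
import re
from urllib.parse import urlparse

import pandas as pd

URL_RE = re.compile(r"https?://\S+")

# Example domain lists; the far-right outlet, party-channel and mainstream-news
# categories would each require a curated list of domains.
PLATFORMS = {
    "tiktok.com": "TikTok",
    "facebook.com": "Facebook",
    "twitter.com": "X",
    "x.com": "X",
}

def categorise(url: str) -> str:
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    return PLATFORMS.get(domain, "other")

df = pd.read_csv("telegram_posts.csv")  # hypothetical export of the scraped posts
urls = df["message"].fillna("").str.findall(URL_RE).explode().dropna()
url_counts = urls.map(categorise).value_counts()
```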

Research Questions

  1. To what extent are candidates subjected to harassment within far-right Telegram channels during and after the election period?

  2. How are conspiracy theories about the election results propagated within far-right Telegram channels, and what narratives dominate these discussions?

  3. How does the language and discourse (vernacular) of harassment and conspiracy narratives on Telegram compare to that on other platforms such as X and Facebook/Instagram?

Methodology

The project group wanted to tackle three different research questions, while adding to existing research done by the Institute for Strategic Dialogue (ISD). We tested out many different methods of analysis, including but not limited to network analysis, topic modeling, and sentiment analysis. We started with analysing coordinated behaviours across channels.

Understanding duplication behaviour in Telegram posting

In order to understand what topics are discussed in the Telegram posts, we first conduct topic modeling to extract all the potential topics. Manual thematic analysis would take too long for this volume of data, and existing models for automated labeling come with limitations, in particular the requirement to set a fixed number of topics before modeling with both BERT- and LDA-based approaches. To address these problems, we use unsupervised learning with a network approach that detects both the number of topics and the topics themselves.

We first consider each post as a node and calculate the cosine similarity of every pair of posts. For the 10,275 posts, there are 526,851 pairs of posts with similarity scores ranging from 0 to 1. We form a semantic network with an edge between two posts whenever their similarity score is above 0.1, using the similarity score as the edge weight. We then apply the Louvain community detection algorithm to cluster the nodes into groups based on their edge weights. The algorithm returns clusters consisting of one or more posts; in our analysis, we only consider clusters with more than one post. Each cluster receives a group ID, which we assign to the posts that belong to it.
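A minimal sketch of this network construction and clustering is shown below. It assumes TF-IDF vectors for the cosine similarities (the text representation is not specified in the report) and the Louvain implementation shipped with networkx.

```python
# Minimal sketch; TF-IDF vectors and the networkx Louvain implementation are assumptions.
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

posts = ["..."]  # the 10,275 Telegram post texts

# Vectorise posts and compute pairwise cosine similarities.
vectors = TfidfVectorizer().fit_transform(posts)
sim = cosine_similarity(vectors)

# Semantic network: one node per post, an edge when similarity > 0.1,
# weighted by the similarity score.
G = nx.Graph()
G.add_nodes_from(range(len(posts)))
for i in range(len(posts)):
    for j in range(i + 1, len(posts)):
        if sim[i, j] > 0.1:
            G.add_edge(i, j, weight=float(sim[i, j]))

# Louvain community detection on the edge weights.
communities = nx.community.louvain_communities(G, weight="weight", seed=42)

# Assign a group ID to every post and keep only clusters with more than one post.
group_id = {node: gid for gid, nodes in enumerate(communities) for node in nodes}
clusters = [nodes for nodes in communities if len(nodes) > 1]
```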

In order to understand the duplication behaviour in the Telegram dataset, we separate the posts into two groups: one collects posts that are posted only once, and the other contains near-identical posts duplicated by the same or different channels at different times (Table 1).

Corpus | Number of nodes | Number of edges | Number of clusters | Number of topics with more than one post
Duplicate posts | 1,532 | 20,498 | 145 | 8
Unique posts | 19,565 | 1,369,383 | 95 | 20

Table 1 shows a summary of the number of topics/clusters estimated in the semantic network of posts.

After grouping all the posts, we asked an LLM to summarize each group. We used the following prompts to extract topics/themes from each group of posts; a sketch of this labelling step follows the prompts.

Prompt 1: You are an expert in topic modeling and topic modeling analysis. Here are a group of posts from telegram, can you generalize it into one topic for me?

Prompt 2: You are a qualitative researcher who is an expert on thematic analysis. If there is one theme in this document, what would it be?

Prompt 3: What are the important social issues discussed in this document?
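As an illustration of the labelling step, a hypothetical sketch using an OpenAI-style chat completions client is shown below; the report does not specify which model or client was used, so the model name and client here are placeholders.

```python
# Hypothetical sketch of the LLM labelling step, assuming an OpenAI-style chat API;
# the report does not name the model or client actually used.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT_3 = "What are the important social issues discussed in this document?"

def label_group(posts_in_group: list[str]) -> str:
    """Ask the model to summarise one cluster of Telegram posts into topics."""
    document = "\n\n".join(posts_in_group)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[
            {"role": "system", "content": "You are a qualitative researcher."},
            {"role": "user", "content": f"{PROMPT_3}\n\n{document}"},
        ],
    )
    return response.choices[0].message.content
```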

Topics in Duplicate Corpus | Topics in Unique Corpus
Cultural and Political Livestream Events: Festivals, Discussions, and Independent Media. | Skepticism and Resistance Against Government Policies, Media Narratives, and Globalist Agendas
Irish Politics, Society, and Current Affairs in the Context of the 2024 General Election | Global Skepticism Toward Governance, Media, and Socio-Political Movements
Irish politics, societal unrest, and national identity issues | Polarization and Distrust in Sociopolitical Narratives and Global Structures
Political polarization, global power dynamics, and narratives of control and resistance. | Global and Local Political Discontent: Examining Governance, Media Narratives, and Socio-Political Change
Socio-political dissent and mobilization against perceived systemic and governmental overreach | Political Frustration and Socio-Economic Challenges: A Critical Examination of Governance, Policy, and Society
Community-driven nationalism and cultural preservation | Global Political Turbulence and Socio-Economic Challenges
Rant | Resistance to Institutional Narratives and Advocacy for Political, Social, and Health Transparency
Dissemination of Conspiracy Theories and Speculations Around Political Events, Global Alliances, and Alleged Secret | Global Resistance to Institutional Policies and Advocacy for National Sovereignty and Individual Freedoms
Expressions of Political and Cultural Identity through Advocacy, Events, and Resistance Narratives. | Cultural and Political Polarization Amidst Resistance to Institutional and Societal Shifts
Health Product Marketing and Claims of Natural Remedies | Global Socio-Political Turbulence and Resistance to Institutional Narratives
  | Global Conflict and Governance: Analyzing Political Crises, Military Actions, and Socio-Economic Challenges

Results of Prompt 1: You are an expert in topic modeling and topic modeling analysis. Here are a group of posts from telegram, can you generalize it into one topic for me?



Topics in Duplicate Corpus | Topics in Unique Corpus
Community-driven digital engagement and cultural initiatives | Distrust in Institutional Authority and Advocacy for Alternative Narratives
Political discourse and public opinion in Ireland | Resistance to Perceived Authoritarianism and Advocacy for Sovereignty, Freedom, and Truth
Nationalism and its intersection with political and societal challenges in Ireland | Challenging Established Norms: Resistance to Authority, Cultural Shifts, and Dominant Narratives
Narratives of Power, Resistance, and Sociopolitical Control | Challenging Authority: Resistance to Institutional Control, Media Narratives, and Socio-Political Dynamics
Grassroots activism and resistance against perceived socio-political and economic injustices | Disillusionment with Governance and Advocacy for Socio-Political Change
Mobilization and solidarity around nationalist identity | Widespread Political Dissent and Advocacy for Systemic Change
Rant | Dissent and Advocacy for Autonomy in the Face of Institutional Power and Socio-Political Change
Speculative Narratives of Global Transformation and Political Secrecy | Challenging Institutional Authority and Advocating for Autonomy and Societal Accountability
Community Mobilization and Resistance: Advocacy for Justice, Rights, and Collective Identity | Societal Backlash Against Ideological and Institutional Hegemony
Promotion of Alternative Health Solutions with Emphasis on Natural Remedies and Anti-Mainstream | Navigating Global Unrest: Resistance to Authority and Advocacy for Sovereignty and Social Justice
  | Global Instability and Resistance: Examining Conflict, Governance, and Societal Struggles
  | Societal Polarization and the Struggle for Ideological and Political Dominance

Results of Prompt 2: You are a qualitative researcher who is an expert on thematic analysis. If there is one theme in this document, what would it be?

Topics in Duplicate Corpus | Topics in Unique Corpus
Modern Slavery and Human Trafficking | Modern Slavery and Human Trafficking
Censorship | Censorship
Immigration and Refugee Policies | Immigration and Refugee Policies
Gender and Sexual Violence | Gender and Sexual Violence
Election Dynamics and Governance | Election Dynamics and Governance
Land Use and Food Security | Land Use and Food Security
Public Health and COVID-19 Vaccination | Public Health and COVID-19 Vaccination
Military and Geopolitical Conflicts | Military and Geopolitical Conflicts
Nationalism vs. Globalism | Nationalism vs. Globalism
Media Bias and Transparency | Media Bias and Transparency
Political Polarization | Identity Politics and Social Movements

Results of Prompt 3: What are the important social issues discussed in this document?

Prompt 3 gave the most useful results for the topics we wished to analyze. We therefore labelled each post in our dataset with the topics generated by Prompt 3.

Time series analysis

After obtaining all the topics, we want to know when a topic is discussed, how often it is discussed, how popular it is, and how much attention it gets. To answer these questions, we add a few more variables to the topics. We extract the timestamp from each post and append it to the corresponding topic, and we do the same with engagement metrics such as views, forwards, and likes. We calculate the average time gap, and the deviation of the time gaps, between mentions of each topic, and we draw timelines of the volume and engagement of each topic.
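As an illustration, a pandas sketch of these computations follows; it assumes a dataframe with hypothetical columns 'topic', 'timestamp', 'views', and 'forwards', one row per labelled post.

```python
# Illustrative pandas sketch; the file name and column names are hypothetical.
import pandas as pd

df = pd.read_csv("telegram_posts_labelled.csv", parse_dates=["timestamp"])

def gap_hours(ts: pd.Series) -> pd.Series:
    """Hours between consecutive mentions of a topic."""
    return ts.sort_values().diff().dt.total_seconds() / 3600

# Average time gap between mentions of each topic, and its standard deviation.
gap_stats = df.groupby("topic")["timestamp"].agg(
    mean_gap_h=lambda ts: gap_hours(ts).mean(),
    std_gap_h=lambda ts: gap_hours(ts).std(),
)

# Daily post volume and engagement per topic, for the timeline plots.
daily_volume = df.set_index("timestamp").groupby("topic").resample("D").size()
daily_views = df.set_index("timestamp").groupby("topic")["views"].resample("D").sum()
daily_forwards = df.set_index("timestamp").groupby("topic")["forwards"].resample("D").sum()
```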

Conspiracy theories topic modeling

We worked with the de-duplicated dataset and a “seed” list of conspiracy-theory-related terms (“plantation”, “invasion”, “voting”, “agenda 2030”, “crime”, “climate”, “globalism”, “family”, “abortion”, “pedo”, “women”, “nationalist”, “loyalist”, “gender”, “immigration”, “islam”, “africa”, “housing”). These seeds were used to automatically label posts with topics, generated either from the seed terms or from representative posts. We used the embedding model bge_en_md_v1.5 to represent the text in numerical form and the Python library BERTopic for the modeling, setting the minimum cluster size to 15 posts in order to get meaningful clusters. The clustering also generated keywords associated with each topic, and we counted how many times each keyword from a given topic appeared in the posts labelled with that topic.
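A minimal sketch of this step with the BERTopic API follows. The sentence-transformers checkpoint BAAI/bge-base-en-v1.5 is assumed here as a stand-in for the bge_en_md_v1.5 model named above, and the seed terms are passed through BERTopic's guided topic modelling option.

```python
# Sketch of guided BERTopic modelling; "BAAI/bge-base-en-v1.5" is assumed as a
# stand-in for the "bge_en_md_v1.5" embedding model named in the text.
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer

seed_terms = ["plantation", "invasion", "voting", "agenda 2030", "crime", "climate",
              "globalism", "family", "abortion", "pedo", "women", "nationalist",
              "loyalist", "gender", "immigration", "islam", "africa", "housing"]

posts = ["..."]  # de-duplicated post texts

embedder = SentenceTransformer("BAAI/bge-base-en-v1.5")
topic_model = BERTopic(
    embedding_model=embedder,
    min_topic_size=15,             # minimum cluster size of 15 posts
    seed_topic_list=[seed_terms],  # guided topic modelling from the seed terms
)
topics, probs = topic_model.fit_transform(posts)

# get_topics() maps each topic ID to its keywords and c-TF-IDF scores, which can
# then be counted in the posts labelled with that topic.
topic_keywords = topic_model.get_topics()
```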

Sentiment Analysis Protocol

As part of the data analysis project on Irish far-right Telegram channels, we conducted a sentiment analysis to understand the emotional tone and narrative dynamics within the extracted content. This analysis aimed to categorize and refine the emotional nuances in posts, helping to identify the predominant sentiments in far-right discussions.

To identify the most effective tools for sentiment classification, we tested several approaches. Initially, we employed the Hugging Face transformers sentiment-analysis pipeline. During evaluation, however, we found that this pipeline exhibited low accuracy, limiting its reliability on our specific dataset. We therefore moved to a zero-shot classification approach using the facebook/bart-large-mnli model, which allowed us to classify content without prior task-specific training.

For the initial sentiment classification, we used the Hugging Face transformers zero-shot classification pipeline with the facebook/bart-large-mnli model. Posts were classified as "Positive," "Negative," or "Neutral." The model processed posts in batches and assigned each post the label with the highest confidence score. The resulting dataset included two columns: sentiment (the assigned label) and scores (the confidence level).

To enhance granularity on this task, posts labeled "Positive" or "Negative" were analyzed further. Negative posts were categorized into sub-labels such as "Scepticism," "Threat," "Mistrust," "Aggressivity," "Hate," "Fear," "Anger," "Resentment," and "Victimhood." Positive posts were analyzed with labels like "Superiority," "Pride," "Empowerment," "Belonging," "Optimism," "Fun," "Solidarity," "Gratitude," "Confidence," and "Hope." This step allowed for a detailed breakdown of emotional nuances in far-right narratives.
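A minimal sketch of this two-stage classification is shown below, using the Hugging Face transformers pipeline and the label sets listed above; batching and dataframe handling are omitted.

```python
# Sketch of the two-stage zero-shot classification with facebook/bart-large-mnli.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

COARSE = ["Positive", "Negative", "Neutral"]
NEGATIVE_SUBLABELS = ["Scepticism", "Threat", "Mistrust", "Aggressivity", "Hate",
                      "Fear", "Anger", "Resentment", "Victimhood"]
POSITIVE_SUBLABELS = ["Superiority", "Pride", "Empowerment", "Belonging", "Optimism",
                      "Fun", "Solidarity", "Gratitude", "Confidence", "Hope"]

def classify(post: str) -> dict:
    """Return the coarse sentiment, its confidence score and (if not neutral) a sub-label."""
    coarse = classifier(post, candidate_labels=COARSE)
    label, score = coarse["labels"][0], coarse["scores"][0]
    result = {"sentiment": label, "score": score, "sublabel": None}
    if label == "Negative":
        result["sublabel"] = classifier(post, candidate_labels=NEGATIVE_SUBLABELS)["labels"][0]
    elif label == "Positive":
        result["sublabel"] = classifier(post, candidate_labels=POSITIVE_SUBLABELS)["labels"][0]
    return result
```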

Image clustering

We isolated all still images in the dataset after cleaning out non-Irish and suspected bot channels, creating a common still-image dataset of 3,685 images. We tried different clustering methods on this dataset, using 4CAT's PixPlot and an OpenAI CLIP model (Contrastive Language-Image Pre-training). Within 4CAT we used the PixPlot visualisation: an explorable map of images algorithmically grouped by similarity.
As the full dataset was too large to handle with the CLIP model running on Google Colab, we used this model to cluster a smaller subset of 265 images from political figures and parties. The OpenAI CLIP model uses AI to analyse images and create an interactive map where images described similarly appear closer together.
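A sketch of how the CLIP embedding and clustering step might look is given below, assuming the Hugging Face checkpoint openai/clip-vit-base-patch32 and k-means with an arbitrary number of clusters; the project's actual Colab notebook and the 4CAT/PixPlot pipeline are not reproduced here.

```python
# Hypothetical sketch; the CLIP checkpoint, k-means and cluster count are assumptions.
import torch
from PIL import Image
from sklearn.cluster import KMeans
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def embed(paths: list[str]) -> torch.Tensor:
    """Embed images into CLIP's shared image-text space (unit-normalised)."""
    images = [Image.open(p).convert("RGB") for p in paths]
    inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        features = model.get_image_features(**inputs)
    return features / features.norm(dim=-1, keepdim=True)

image_paths = ["..."]  # the 265 images from political figures and parties
embeddings = embed(image_paths).numpy()

# Group visually/semantically similar images; the number of clusters is a guess.
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(embeddings)
```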

Findings

Sentiment analysis
The sentiment analysis revealed a complex emotional landscape within far-right Telegram posts. Key findings included the predominance of negative sentiments such as "Scepticism" and "Threat," alongside occasional expressions of "Superiority" and "Fun".

Figure 1 illustrates the most frequently discussed topics, which we define as those with the highest volume of mentions. In duplicated posts, the topics of elections, immigration, censorship, and sentiment toward the government appear most often, indicating coordinated efforts to amplify these discussions. In unique posts, the focus shifts to nationalism versus globalism, elections, and military or geopolitical conflicts, reflecting more organic public discourse. Notably, elections are prominent in both groups, suggesting they are a pivotal topic in public discussions. This overlap may imply that channels exhibiting duplicative behavior are strategically attempting to steer or reinforce narratives around elections within the broader public discourse. As shown in the graphs, the peak in election-related post volume within the duplicated group occurs 12 days after the peak in the unique post group, highlighting a possible time lag in coordinated amplification efforts compared to organic discussions.



Figure 1. (a) Duplicate posts (b) Unique posts.

We identified the common topics mentioned in both groups and measured the average time gap and deviations in time gaps for each topic. This analysis provides insights into how often and how quickly these topics are disseminated. Figure 2 presents the results for all topics. In most cases, duplicate posts are shared less frequently and more slowly than unique posts. The topics of censorship and the housing crisis deviate from this trend, appearing more frequently and spreading faster in duplicate posts. This indicates that these topics may be strategically targeted for amplification by actors using coordinated posting behaviors, likely aiming to heighten their visibility and influence public perceptions more effectively. The prioritization of censorship and the housing crisis in duplicate posts suggests these topics may serve as focal points for rallying support, reinforcing specific narratives, or discrediting opposing views. These efforts could reflect broader political or ideological strategies aimed at dominating discussions on issues that resonate strongly with audiences.





Figure 2. Results for all topics.

Figures 3 and 4 show the pattern of public reaction to all the posts, including view counts and forwards. Approaching the election debate and election day, most posts in the unique group show declining view counts, while duplicate posts show more peaks around those key days. This could indicate that organic discussions generate early interest but lose momentum closer to major events, possibly due to audience saturation or a shift in focus to other sources of information. In contrast, the distinct peaks in duplicate posts suggest effective impact after deliberate amplification efforts, where coordinated actors strategically increase their activity to maximize visibility and influence during critical moments.

The housing crisis garners the most passive (view counts) and active (forwards) attention at the beginning of the election month, indicating it is a pressing and widely resonating issue early in the campaign cycle. However, as the election debate and election day approach, collective attention shifts away from this topic. This suggests a temporal nature in public interest, where certain issues dominate discussions early but lose traction closer to key events, likely as new topics or priorities emerge.

The prominence of human trafficking and modern slavery as the most forwarded topic during the election debate highlights a surge in public interest and active engagement with this issue at a critical moment. This could reflect the debate's influence in spotlighting these topics or the audience's heightened sensitivity to emotive and urgent issues during high-stakes events.

More broadly, we observe inconsistent patterns across topics and engagement in unique posts. This observation underscores the diversity of public discourse when regular users drive the conversation. This heterogeneity reflects organic participation, where various issues resonate differently with audiences, depending on timing, context, and individual priorities.

In duplicate posts, conspiracy-related content stands out as a consistently low-performing topic in terms of view counts and forwards before and after the election debate. However, the rapid acceleration in attention during the debate suggests deliberate amplification efforts to draw attention to conspiracy theories at a pivotal moment, likely to sway perceptions or inject doubt into public discussions. The sudden increase in engagement with conspiracy content during the debate may indicate coordinated actors timing their efforts to exploit heightened public attention and shape narratives during critical phases of the election cycle.

Figure 3.

Figure 4.

Conspiracy theory topics

We obtained 72 topics, 28 of which were associated with at least 40 posts. Some of these were clearly unrelated to Ireland or the Irish elections; their associated keywords included, for example, “vaccination” and “pandemic”, “zionist” and “Israel”, “Trump”, or “Russia” and “Ukraine”. These are in line with some of the topics identified in the previous section. We zoomed in on two topics that we renamed “plantation” (86 posts) and “LGBTQIA+” (43 posts). By examining the frequency of keywords in the posts associated with these topics and manually reviewing their content, we can clearly observe hate speech towards minorities alongside unhinged conspiracy speech (Figure 5).

Figure 5

Image clustering

(trigger warning: homophobic, racist and antisemitic images)

In order to explore potential patterns in the pictures shared, we clustered 3,685 images posted by Irish right-wing channels using the CLIP model. Channels driven by bots and non-Irish channels were excluded from the dataset.

The clustering shows that the majority of the images have text overlaid on top or along the side, and that the people portrayed in the images are mostly men.

We highlight five images (Figure 6) that illustrate the report’s general findings about discrimination, antisemitism, fake news, anti-immigration sentiment, and conspiracy theories. Some of the images that clearly depict hate speech or antisemitism are used in debates about platform moderation policies, celebrating the fact that these images are allowed on Telegram while being banned on other platforms.

Figure 6: five images that illustrate the report’s general findings about discrimination, antisemitism, fake news, anti-immigration sentiment, and conspiracy theories.


Images shared by recognized political figures. Strikingly, when focusing on a subset of data from political parties and politicians, the images are still rife with hate speech and conspiracy content, despite the fact that such individuals and parties have wider publicity and may therefore be subject to higher levels of content moderation.




The clustering shows that political posters and campaign images, some featuring conspiracies and others featuring political issues such as housing, are grouped next to images with antisemitic and hate-speech content.

Discussion

The analysis of the number of posts, posting frequency, view counts, and forwards across unique and duplicate posts offers significant insights into the dynamics of public discourse and coordinated amplification during the election cycle. The results reveal contrasting patterns between organic and coordinated activity, highlighting the strategies and implications of different actors in shaping narratives.

Unique posts exhibit natural fluctuations in activity, reflecting the diverse and evolving concerns of regular users throughout the election period. The presence of multiple topics with inconsistent patterns demonstrates the heterogeneity of public discourse, where users are influenced by individual priorities and external events.

In contrast, duplicate posts display more systematic patterns, often concentrated around key moments such as the election debate and election day. The spikes in posting activity for certain topics, such as conspiracy theories, suggest deliberate amplification efforts designed to shape public discourse during high-attention periods.

The strategic timing of duplicate posts underscores the use of coordinated behaviors to influence narratives, particularly around divisive or manipulative topics. Conversely, the organic nature of unique posts reflects the breadth and variability of public engagement when individuals contribute without external coordination.

For example, topics like the housing crisis attracted the highest passive attention (view counts) early in the election cycle, underscoring their salience at that time. However, view counts for these topics declined as the election debate and election day approached, suggesting a shift in audience focus, likely due to emerging priorities or the saturation of existing narratives. In contrast, duplicate posts showed peaks in view counts around key events, particularly for topics like censorship and conspiracy theories, pointing to deliberate attempts to exploit heightened public attention to amplify specific narratives.

The disparity in view count trends between unique and duplicate posts suggests that coordinated actors work actively to maintain and amplify attention on certain topics when organic engagement wanes. This behavior raises concerns about the potential for distorting public discourse and disproportionately elevating particular narratives.

Forwarding behavior also reflects this divide. Unique posts, such as those addressing human trafficking and modern slavery, received the highest forwarding rates during the election debate, indicating the audience's heightened engagement with emotive or urgent issues at pivotal moments. The variability in forwarding patterns among unique posts further highlights the diversity and spontaneity of public discourse. In contrast, conspiracy theories in duplicate posts saw a sharp increase in forwards during the debate, despite low engagement before and after. This suggests a strategic effort by coordinated actors to amplify and propagate divisive narratives at critical moments of high visibility.

The differences between unique and duplicate posts reveal distinct dynamics in organic and coordinated activities. Unique posts demonstrate the fluidity and diversity of public discourse, where topics gain or lose traction based on audience interest and external events. Duplicate posts, however, exhibit calculated patterns of amplification, with actors strategically selecting topics and timing their efforts to maximize influence. The acceleration of attention for conspiracy theories and the amplification of topics like censorship and the housing crisis illustrate the deliberate shaping of public narratives by these actors.

On a broader scale, the coordinated amplification observed in duplicate posts raises significant concerns about the manipulation of public discourse during critical moments in the election cycle. By flooding the narrative space with specific topics, coordinated actors can distort perceptions, undermine trust, and sway public opinion. These findings emphasize the need for increased public awareness of coordinated behaviors and their impact. Media literacy and critical engagement with content are essential tools to help users navigate the information landscape and reduce the influence of manipulative narratives.

Moreover, this study underscores the importance of platform policies that can effectively distinguish between organic and coordinated activities. Proactive measures, such as enhanced detection of duplicative behaviors and timely interventions, are crucial for maintaining the integrity of public discourse and ensuring a fair and transparent information ecosystem.

References

Nicas, J., Isaac, M. & Frenkel, S. (2021, January 13). Millions Flock to Telegram and Signal as Fears Grow Over Big Tech. The New York Times. https://www.nytimes.com/2021/01/13/technology/telegram-signal-apps-big-tech.html

Al-Rawi, A. (2021). Telegramming hate: Far-right themes on dark social media. Canadian Journal of Communication, 46(4), 821-851.

Lanigan, M. (April 30, 2021). Telegram is the Far-Right’s Weapon of Choice in Ireland. Vice. https://www.vice.com/en/article/telegram-is-the-far-rights-weapon-of-choice-in-ireland

Healy, P. (July 19, 2024). Face of man charged over sinister video threat to kill Mary Lou McDonald. Irish Mirror. https://www.irishmirror.ie/news/irish-news/pictured-face-man-charged-over-33277315

O’Toole, M. (July 17, 2024). Far-right ‘plotted to kill Leo Varadkar’ and ex-soldier was intended shooter. Irish Mirror. https://www.irishmirror.ie/news/irish-news/far-right-plotted-kill-leo-33259771

McCann, D. (June 10, 2024). ‘I am ready to kill’ — Chilling threat of far-right agitator. Extra.ie. https://extra.ie/2024/06/10/news/irish-news/far-right-agitator

O’Connor, C. (June 7, 2024). Elections 2024: The seeds of an Irish ‘Stop the Steal’ campaign are being sown. The Journal. https://www.thejournal.ie/readme/local-and-european-elections-6401193-Jun2024/

Gallagher, A. & O’Connor, C. (2021). Layers of Lies: A First Look at Irish Far-Right Activity on Telegram. ISD. https://www.isdglobal.org/wp-content/uploads/2021/04/Layers-of-Lies.pdf

Institute for Strategic Dialogue. (2024). Analysing Claims of Electoral Interference During the Irish Local and European Elections. https://www.isdglobal.org/wp-content/uploads/2024/11/Analysing-claims-of-electoral-interference-during-the-Irish-EU-elections.pdf

Gallagher, A. & McDonald, N. (June 9, 2024). Dozens of incidents of political violence, intimidation and threats detected during Irish election campaign. ISD. https://www.isdglobal.org/digital_dispatches/dozens-of-incidents-of-political-violence-intimidation-and-threats-detected-during-irish-election-campaign/

Posters
