The goal of this project is to map the most common controversial narratives about the war in Ukraine in the Polish social media sphere. We also attempt to discover whether the actors disseminating these narratives are ordinary participants in online culture wars, for whom the conflict is simply another divisive topic, malicious actors and instigators acting on behalf of foreign governments, or some other kind of participant. The analysis is conducted for three platforms: Facebook, Twitter and Telegram. For each platform, the most frequent and persuasive narratives are identified and mapped using elements of OSINT and digital methods, including platform engagement metrics, textual analysis (such as word trees) and network analysis (of actors and shared URLs).
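Word trees group every occurrence of a keyword together with the phrases that precede and follow it, which makes recurring narrative framings visible. As a rough, non-authoritative approximation of the data step behind such a visualisation, a keyword-in-context (KWIC) extraction can be sketched in Python as follows; the keyword and sample message are illustrative, not drawn from the dataset:

```python
import re

def kwic(messages, keyword, window=5):
    """Return keyword-in-context snippets: up to `window` tokens on each
    side of every occurrence of `keyword` (case-insensitive)."""
    snippets = []
    for text in messages:
        tokens = re.findall(r"\w+", text.lower())
        for i, tok in enumerate(tokens):
            if tok == keyword:
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                snippets.append((left, keyword, right))
    return snippets

# Illustrative call; 'uchodźcy' (refugees) stands in for a narrative keyword.
for left, kw, right in kwic(["Uchodźcy rzekomo przenoszą groźne choroby"], "uchodźcy"):
    print(f"... {left} [{kw}] {right} ...")
```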
The controversial narratives we found can be divided into seven thematic areas, in order of engagement: refugees, historical massacres, national and economic insecurity, disease, corruption, coronavirus and conspiracy theories. Most of these narratives recur across all platforms, though each platform emphasises some over others. Conspiracy narratives (concerning, for instance, biolabs, Covid-19, and the authenticity of the war itself) are present mainly on Telegram and largely absent from Facebook. Offensive language about incoming refugees is found on Telegram. Facebook is predominantly used to alert the public to the financial and security consequences Poland supposedly faces if it continues to help Ukraine, but also to remind Poles of historical massacres orchestrated by the UPA (a former Ukrainian nationalist paramilitary group). On Twitter, the most persuasive narratives concern diseases allegedly spread by Ukrainian refugees. Apart from bringing instability, the incoming Ukrainians are said to carry a series of conditions and diseases: they are portrayed as Covid-19 carriers who also have measles, drug-resistant tuberculosis and HIV. One post also discusses how polio has not been eradicated in Ukraine.
Participants who actively debate the anti-Ukrainian content can be divided into a few distinctive groups. (1) Superspreaders, who share copious links to external content, encouraging people to check out Youtube videos, articles or posts from other social media groups or platforms, thus injecting controversial narratives or trying to establish new destructive behaviours. Some users in this group share posts in foreign languages (German, English, Russian) and in incorrect Polish, and they often link to similar sources. (2) Users who post frequently and exclusively about controversial narratives. Their active participation on these platforms does not generate much engagement, and the response to the content they share is usually minimal. (3) Internet personalities with significant reach, such as right-wing politicians, populists, conspiracy theorists, amateur historians, novelists, professional news outlets, satirical news sites and more, who take part in internet culture wars. Most of them are politicians and lawyers associated with right-wing political movements (most often the Konfederacja party), or public figures turned Youtubers with channels dedicated to “natural” medicine, divination, magic or coronascepticism. (4) Users engaged in “organic pushback”, who join the divisive conversations about the invasion in order to point out false claims and warn people not to fall for misleading content or disinformation.
Fig.1 Conspiracy Theories narrative cluster (data sampled from Telegram, Twitter, Facebook, date range: 24 February to 28 March, 2022)
Fig.2 Refugees narrative cluster (data sampled from Telegram, Twitter, Facebook, date range: 24 February to 28 March, 2022)
Fig.3 Diseases narrative cluster (data sampled from Telegram, Twitter, Facebook, date range: 24 February to 28 March, 2022)
Fig.4 Historical massacres / UPA / Banderites narrative cluster (data sampled from Telegram, Twitter, Facebook, date range: 24 February to 28 March, 2022)
Fig.5 Economy / national security narrative cluster (data sampled from Telegram, Twitter, Facebook, date range: 24 February to 28 March, 2022)
Fig.6 Corruption, weakness narrative cluster (data sampled from Telegram, Twitter, Facebook, date range: 24 February to 28 March, 2022)
Fig.7 Covid narrative cluster (data sampled from Telegram, Twitter, Facebook, date range: 24 February to 28 March, 2022)
2. Actors
Having established the most frequently shared controversial narratives (per platform), we sought to identify those responsible for disseminating such content and to inquire into their online behaviour. Specifically, we are interested in the online identity of important nodes that intentionally inject conspiracy theories and disinformation into online debates.
Using textual analysis in the form of word trees, we singled out a number of personal Telegram accounts from our sample that spread conspiracy theories across all analysed channels. We then checked whether there was anything unusual about the frequency with which these accounts were posting, selecting the top 20 most active, non-anonymous accounts by the number of messages they posted, which together amounted to over six million messages. Among these 20 most active accounts were also accounts sharing controversial narratives.
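A minimal sketch of this frequency filter, assuming the Telegram messages were exported to a flat table; the file name and column names (author) are our assumptions, not the study's actual schema:

```python
import pandas as pd

# Hypothetical export: one row per Telegram message; anonymous or deleted
# authors appear as missing values in the 'author' column.
df = pd.read_csv("telegram_messages.csv")

# Keep non-anonymous accounts and rank them by message volume.
top20 = df.dropna(subset=["author"])["author"].value_counts().head(20)
print(top20)
```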
We observed that some Telegram accounts post not only frequently but also in different languages. Since there have been press accounts of possible malicious involvement by foreign actors, we thought it important to verify these accounts' command of the Polish language.
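One way to operationalise this check is automated language detection, for instance with the langdetect package (our tool choice for illustration, not necessarily the one used in the study). Forwarded messages should be profiled separately from original ones, since, as noted below, forwards in a foreign language say little about the account holder's own command of Polish:

```python
from collections import Counter

from langdetect import DetectorFactory, detect
from langdetect.lang_detect_exception import LangDetectException

DetectorFactory.seed = 0  # langdetect is stochastic; pin the seed for repeatability

def language_profile(messages):
    """Count the detected language of each message in an account's history."""
    counts = Counter()
    for text in messages:
        try:
            counts[detect(text)] += 1
        except LangDetectException:  # empty or undecidable text
            counts["unknown"] += 1
    return counts

# Accounts with a high share of non-Polish originals get flagged for review.
profile = language_profile(["To jest wiadomość po polsku.", "Das ist ein deutscher Satz."])
non_pl = 1 - profile.get("pl", 0) / sum(profile.values())
print(profile, f"non-Polish share: {non_pl:.0%}")
```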
Among the accounts flagged as suspicious, five forwarded or wrote messages in languages other than Polish: Z***ia, E***a, T***da, A***el, and G***rd. It must be noted, however, that accounts were often flagged as using foreign languages because they forwarded a lot of content from other Telegram accounts run in languages other than Polish. All of these accounts, with the exception of one that posted only in German, showed a good command of Polish.
All of these accounts actively share conspiratorial and hateful messages, sometimes with links to domains registered in Russia. They seem to favour forwarding messages from other accounts or from their own closed channels rather than posting directly. Such indirect posting can simply be a particular style of Telegram activity, but it is also a clever technique for avoiding the manual ID checks that can be performed with Telegram bots such as userinfobot.
In addition, these accounts also seem to understand Russian well and find content in Russian useful for advancing their agendas. We also saw that some of these accounts, such as Z***ia and A***el, were later deleted. Checking the most active Telegram accounts and verifying their language skills and messaging patterns allowed us to make initial assumptions about their motives, as we observed that actors, especially in the Grupa Sympatyków Konfederacji and Konfederacja PL channels, appear to amplify such content. The accounts we found suspicious on Telegram show patterns of synchronised sharing: they shared similar content, mostly videos on Youtube and links to the blog section of a controversial legal advisory website, legaartis.
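The synchronised-sharing pattern can be screened for mechanically. Below is a minimal sketch, with hypothetical column names of our own, that flags distinct accounts posting the same URL within a narrow time window; the 10-minute threshold is illustrative, not a value from the study:

```python
import pandas as pd

# Hypothetical export: one row per shared link, with 'author', 'url', 'timestamp'.
links = pd.read_csv("telegram_links.csv", parse_dates=["timestamp"])
links = links.sort_values("timestamp")

WINDOW = pd.Timedelta(minutes=10)  # illustrative co-sharing window
co_shares = []
for url, grp in links.groupby("url"):
    grp = grp.reset_index(drop=True)
    for i in range(1, len(grp)):
        prev, cur = grp.iloc[i - 1], grp.iloc[i]
        if cur["author"] != prev["author"] and cur["timestamp"] - prev["timestamp"] <= WINDOW:
            co_shares.append((url, prev["author"], cur["author"], cur["timestamp"]))

print(f"{len(co_shares)} near-simultaneous co-shares of the same URL")
```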
On Twitter, narratives about Ukrainian nationalism and the UPA, but also about the financial implications of Poland's willingness to accept refugees, are published primarily by conservative accounts that share many posts about historical topics. Almost all accounts publishing controversial narratives were conservative and alt-right in nature, and linked to problematic news sites (such as legaartis). The most active and most retweeted account belongs to a defence journalist, Jaroslaw Wolski. Other accounts with high numbers of retweets belong to users with conservative sentiments, such as pro-life advocates or amateur students of Polish history and historical memory who publish from a nationalist angle, oftentimes using derogatory language. None of these accounts, however, was guilty of sharing the controversial narratives that we discovered with word trees. The problematic narratives in our sample were primarily shared by one account which, although active, has few followers, and most of its retweets come from one person. Its posts discuss narratives that fit into all seven of the clusters we distinguished (refugees, historical massacres, national and economic insecurity, diseases, corruption, coronavirus and conspiracy theories).
Among the top 20 most active accounts on Facebook in our sample, we did not see any non-mainstream, problematic accounts that were not already known to the public. The biggest reach on Facebook was attained by the right-wing politician Sławomir Mentzen (109,056 total interactions), who in relative terms generated almost the same number of interactions as the high-profile Polish magazine OKO.press (116,384 interactions) in the same time frame. The accounts that primarily publish narratives about Ukrainian nationalism and the UPA on Facebook are Ukrainiec nie jest moim bratem (A Ukrainian is not my brother), together with a few right-wing historical portals. Wróżbita Maciej (Fortune Teller Maciej) primarily posts about tax increases and economic problems attributed to the increased presence of Ukrainian refugees in Poland, as did Leszek Samborski. Michal Urbaniak, on the other hand, publishes posts about the weakness and corruption of Ukraine that are in line with Sputnik's narrative.
The most alarming Facebook account in our sample was Ukrainiec nie jest moim bratem, which openly encouraged hateful feelings towards Ukrainians. This account's reach, however, was minuscule compared to other Facebook users in our sample: it had only 2,726 total impressions, and no piece of content was shared more than 166 times. In the final stages of this research we also learnt that the administrator of this group (a primary school teacher) had been identified and is believed to have been referred to the prosecutor by the Sosnowiec city council.
The sample of problematic accounts sourced from Facebook, along with the controversial narratives they shared, appears innocent in comparison to content sourced from Telegram. It also pales in comparison to some press accounts about the involvement of potential malicious actors on the platform. The six narratives present on Facebook came from discussions mostly held by a mix of internet personalities who were involved in online culture wars but whom we believe not to be foreign actors: right-wing and populist politicians, conspiracy theorists, influencers, professional news outlets, amateur historians, nuclear energy researchers, novelists, well-known NGOs, satirical news sites, travel and lifestyle bloggers, debunking non-profit organisations and celebrities.
In addition, we gathered evidence that some of these actors engage in activity we describe as organic pushback, openly disputing incorrect or deliberately misleading stories about the Russian invasion of Ukraine and the exodus of Ukrainian refugees. Interestingly, these pushbacks are almost proportional to the controversial narratives, by which we mean they debunk the most prominent anti-Ukrainian narratives on Facebook, such as economic or national security concerns. For example, Własny punkt widzenia writes: 'At the moment, most of the refugees who came to us are women with children, and there is hardly any desire in them to seize the eastern parts of Poland as far as the Vistula line, as some people draw it in their fantasies'. Also disputed are arguments against helping Ukrainian refugees on the basis of historical massacres. As Spotted Tarnów writes: 'Stop judging people through the prism of their great-grandparents' actions. Poland wasn't a saint either. The above points are a message for people who fell victim to Russian trolls and disinformation'.
The pushback could be the effect of an extensive educational campaign alerting people in Poland to Russian disinformation, combined with efforts by Meta to curb problematic content. Last but not least, auxiliary initiatives from the private sector that turned passive social media users into active debunkers of misinformation, such as zglostrolla.pl (aimed at Twitter and Facebook in particular), might also have helped to clean up problematic content from the platform. It should be noted, however, that in our Telegram data we saw a number of links pointing to Facebook and Instagram Stories; so when we observe that there is less problematic content with visible engagement on Facebook than on Telegram, we are referring to written content.
3. Networks
Fig.8 Network visualisation created with Gephi: analysis of actors and the external links they shared on Telegram.
To better understand the context in which these controversial narratives appear, we also looked into the types of external URLs these actors were sharing. External links show which sources are referenced, the extent to which posts reference the same or similar sources, and whether there are clusters of accounts and (particularly problematic) sources, which then invites further scrutiny of the accounts.
To do so, we plotted a network of all non-anonymous (at the time of the research), non-empty messages from Telegram that contained external links (2,891 in total). We shortened each link to its domain, and for links to social media posts or video sharing platforms we used generic domain names (all Youtube links were grouped as youtube.com and all Facebook posts as facebook.com). We then ran the modularity algorithm in Gephi (gephi.org) to detect groups. Nodes were also weighted and sized by out-degree ranking, so that we know which nodes link most frequently within each cluster. (ForceAtlas2 and Label Adjust were used to lay out the graph.)
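A minimal sketch of that pre-processing step: reduce each URL to a bare domain, merge platform mirrors under one generic name, and write a weighted Source,Target,Weight edge list that Gephi's spreadsheet importer accepts. The alias table and the sample rows are illustrative, not the study's actual rules or data:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Illustrative grouping rules for platform mirrors and short links.
ALIASES = {"youtu.be": "youtube.com", "m.youtube.com": "youtube.com",
           "m.facebook.com": "facebook.com", "fb.watch": "facebook.com"}

def normalise(url):
    """Shorten a URL to its domain, merging known platform mirrors."""
    host = urlparse(url).netloc.lower().split(":")[0]
    if host.startswith("www."):
        host = host[4:]
    return ALIASES.get(host, host)

# (account, domain) pairs weighted by frequency; the rows here are made up.
shared = [("E***a", "https://www.youtube.com/watch?v=abc"),
          ("E***a", "https://legaartis.pl/blog/some-post")]
weights = Counter((author, normalise(url)) for author, url in shared)

with open("edges.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Source", "Target", "Weight"])
    for (source, target), weight in weights.items():
        writer.writerow([source, target, weight])
```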
For network analysis on Twitter and Facebook we used CorText (www.cortext.net), building networks from the collected data. We analysed both the links provided by the accounts we highlighted as publishing controversial narratives and the links published across the database. Looking only at the linking activity of actors involved in messaging on the analysed channels, we could see that some were much more active in referencing other sources and used cross-sharing techniques (forwarding messages rather than writing their own).
In all, modularity analysis allowed us to subdivide the network into the following clusters (a sketch for reproducing such a partition outside Gephi follows the list):
Pink nodes – posting frequently but providing a much lower number of external links
Green nodes – posting a large number of links, encouraging people to check out articles, Youtube accounts or other groups on Facebook or Telegram, and linking frequently to legaartis.pl
Blue nodes – suspicious accounts, but with activity different from the Pink, Green or Raspberry Pink clusters
Raspberry Pink nodes – suspicious accounts, but with activity different from the Pink, Green or Blue clusters
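The modularity partition and out-degree sizing above were computed in Gephi's interface; a rough cross-check of the same idea can be scripted with networkx, reading the edge list exported in the earlier sketch. Greedy modularity maximisation is not identical to Gephi's Louvain implementation, so the resulting clusters are comparable rather than equal:

```python
import csv

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Rebuild the account -> domain graph from the edge list written earlier.
G = nx.DiGraph()
with open("edges.csv", newline="") as f:
    for row in csv.DictReader(f):
        G.add_edge(row["Source"], row["Target"], weight=float(row["Weight"]))

# Community detection on the undirected projection (cf. Gephi's modularity).
communities = greedy_modularity_communities(G.to_undirected(), weight="weight")
for i, nodes in enumerate(communities):
    print(f"cluster {i}: {sorted(nodes)[:5]}")

# Weighted out-degree identifies the most prolific linkers in each cluster.
out_degree = dict(G.out_degree(weight="weight"))
print(max(out_degree, key=out_degree.get))
```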
Actors from the “Green” cluster were inviting people to watch Youtube videos, read articles and blog posts, or branch out to Facebook groups and stories as well as to other Telegram channels (Fig.8). That put them in direct opposition to the frequent messengers of the “Pink” cluster who, although posting more frequently, shared relatively less linked content (hence their scant visibility in the Fig.8 graph) and, when they did, drew on less varied sources than the “Greens”. Blue and Raspberry Pink accounts posted less frequently, to different suspicious external content, but we could not independently label them as superspreaders.
This finding was a first step in establishing that the active engagement of some participants in these channels stemmed from the casual nature of their conversations and did not revolve around the comprehensive “knowledge-sharing” seen in the “Green” cluster. That allowed us to narrow our focus to actors who were not only spreading controversial narratives but were also involved in frequent messaging with links to misleading, hateful or polarising content.
Accounts belonging to the “Green” cluster most often linked to videos on video sharing platforms such as Youtube (168 in total) and Rumble (29 in total). These were often videos disputing the authenticity of the war itself, blaming the Polish PM for providing the “Nazi” Azov battalion with Polish military uniforms, or discussing the presence of biolabs in Ukraine. Some of these videos are interviews with Zygfryd Ciupka, a QAnon follower and self-proclaimed expert on a range of difficult or scientific topics.
The second most frequent citation was the blog section of a supposed legal advisory website, legaartis.pl (273 in total), which itself engages in peddling controversial narratives around the Russo-Ukrainian war. The website had previously been flagged by multiple fact-checking units of media outlets in Poland, such as Konkret24.pl.
The rest of their linking choices were a combination of posts from blogging sites such as Wordpress and Blogspot (38 in total) and social media content from Facebook (31 in total), Twitter (24 in total) and Tiktok (20 in total). Many of their external links pointed to other channels on Telegram (181 in total), some of which carried messages only in Russian (nastikagroup, atodoneck), German (fufmedia, bleibtstark, NeuzeitNachrichten) or English (QNewsOfficialTV, SGTNewsNetwork, LauraAbolichannel). Importantly, where informational websites were concerned, these accounts linked most often to the Russia-registered sputniknews.com (27 in total) and only then to sensationalist, right-wing news portals with domains registered in Poland, such as nczas.com (12 in total), medianarodowe.com (12 in total), magnapolonia.pl (8 in total) and dorzeczy.pl (7 in total). The “Green” accounts linked 9 times to stronazycia.pl, an anti-abortion website that does not disclose its DNS information.
A large number of the domains to which the “Green” cluster was linking were registered in Poland (387), with most of those links going to the blog section of the problematic legaartis.pl website (273).
The remaining external links to PL domains led to right-wing, conservative sites: nczas.com, medianarodowe.com, magnapolonia.org, dorzeczy.pl, naszapolska.pl, tvp.info, narodowcy.net.
There were also a number of links to addresses registered in the United States (65) and the Russian Federation (42), as well as Canada (34), the last driven mostly by links to the video sharing platform Rumble.com. The most linked U.S.-registered domains were alt-tech platforms such as bitchute.com and the alternative news site beckernews.com, but also brondlapolakow.pl, a petition website created by a Polish Christian Culture Association on 24.02.2022 that pressures the Polish government to simplify gun ownership regulations in the wake of the Russian-Ukrainian war. As for domains registered in .ru, the most often shared were links to sputniknews.com (27) and the German translation of the RT website (4). For 60 domains we were not able to find any registry data in the WHOIS database, and 6 domains had their addresses protected.
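Registry lookups of this kind can be scripted, for example with the python-whois package (our assumption; any WHOIS client would do). Fields differ across TLD registries and many .pl registrants are redacted, which is consistent with the missing and protected records we report:

```python
import whois  # the python-whois package; field names vary across TLD registries

def registration_country(domain):
    """Best-effort WHOIS country lookup; returns None when the registry
    withholds the field or the lookup fails (common for .pl domains)."""
    try:
        return whois.whois(domain).get("country")
    except Exception:  # network errors, rate limits, unsupported TLDs
        return None

for domain in ["sputniknews.com", "rumble.com", "legaartis.pl"]:
    print(domain, registration_country(domain))
```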
On Twitter, actors who publish controversial narratives link primarily to other Twitter accounts (which also share similar narratives) and to Youtube. The account from which the greatest number of controversial narratives spreads (despite its small number of retweets and followers) also shares the greatest number of external links, primarily to Youtube (see Visualisation 3). This account particularly often shares links to the “ARIOWIE” channel, which revolves around corona-sceptic narratives and alternative medicine, as well as to the “ARMAGEDON” channel, which publishes videos of Wojciech Olszański, an actor producing extreme right-wing content under the pseudonym Aleksander Jabłonowski. Olszański is a corona-sceptic who openly supports the policies of Vladimir Putin and Alexander Lukashenko and actively participates in events organised by right-wing circles in Poland.
Following the logic of the network analysis on Telegram, we can also distinguish clusters on Twitter. The light-green cluster consists mainly of one account that posts often, with a large number of external links (mainly to Youtube or other Twitter accounts). The other colours mark the remaining accounts, which mainly post text messages, only occasionally linking to external sources.
Further analysis is necessary to determine the extent of foreign disinformation campaigning; it could rely on the methodology we adopted in this study for tracking suspicious actors, starting with narratives and then following accounts, external links and links to other platforms. We hope that this mode of analysis, which uses publicly available data (OSINT) and digital methods, will inspire others to apply it to the study of misinformation and problematic narratives, leading to the discovery of problematic actors and entire networks of problematic content.