The Rise of Antisemitism Online During the Pandemic

8th June 2021

Since the beginning of the COVID-19 pandemic, the economic uncertainties and anxieties around the virus have been weaponised by a broad range of extremists, conspiracy theorists and disinformation actors, who have sought to propagandise, radicalise and mobilise captive online audiences during global lockdowns. Antisemitic hate speech is a common feature of these diverse threats, with dangerous implications for public safety, social cohesion and democracy. The COVID-19 crisis has only served to exacerbate an already worrying trend in online antisemitism.

New research by ISD presents a data-driven snapshot of the proliferation of COVID-19-related online antisemitic content in French and German on Twitter, Facebook and Telegram. The study provides insight into the nature and volume of antisemitic content across selected accounts in France and Germany, analysing the platforms where such content is found, as well as the most prominent antisemitic narratives – comparing key similarities and differences between these different language contexts.

This Dispatch highlights the key findings to emerge from the study and lays out some of the policy implications of these findings.

 _________________________________________________________________________

The research covered the period from January 2020 until March 2021 to assess the impact of the COVID-19 pandemic on online antisemitism. ISD analysts used the International Holocaust Remembrance Alliance’s (IHRA) working definition of antisemitism to identify channels containing antisemitic content, before developing keyword lists to identify antisemitic expressions widely used on these channels.

Key Findings 

The research identified 272 French-language and 276 German-language accounts and channels spreading antisemitic messages related to the COVID-19 pandemic. Telegram was the most significant platform for the proliferation of antisemitism in German, with 200 channels, whilst in French, Twitter was the most prominent, with 167 accounts identified. Facebook was the second most popular platform for antisemitism in both languages.

Within a dataset of over four million posts collected from these accounts, over 180,000 posts (around 4%) were flagged as containing antisemitic references. This comprised over 17,000 Facebook posts, over 38,000 tweets and over 124,000 Telegram posts either containing antisemitic keywords or keywords associated with Jews in channels dominated by antisemitic references.

There was considerable growth in the use of antisemitic keywords during the pandemic. Comparing the first two months of 2020 (pre-pandemic) with the first two months of 2021 (during the pandemic), the French-language accounts showed a seven-fold increase in antisemitic posting, and the German-language channels studied showed over a thirteen-fold increase in antisemitic comments.

The data shows considerable audience engagement with antisemitic content across platforms. French antisemitic content on Facebook was engaged with through likes, comments and shares over half a million times during 2020 and 2021, and received over three million retweets and likes on Twitter. German-language antisemitic content on Telegram was viewed over two billion times in total. The German and French accounts had a combined followership of almost 4.5 million.

The study found that a small number of the noisiest accounts created an outsized share of antisemitic content. The top ten most active German-language channels (less than 5% of the total list of accounts) were responsible for over 50% of antisemitic posting. The three most prolific Telegram accounts were all chat groups associated with the QAnon movement.

Qualitative analysis revealed the proliferation of several significant antisemitic narratives related to the COVID-19 pandemic. These ranged from conspiracy theories presenting vaccines as a Jewish plot to sterilise or control populations, to representations of Jews as unhygienic or as a “virus” themselves.

A number of ‘classical’ anti-Jewish tropes have also proliferated online during the pandemic. The most dominant antisemitic narratives were conspiracy theories about Jews ruling international financial, political and media institutions, which comprised 89% of German antisemitic content and 55% of French, according to a manually coded sample of posts. Examples of overt Holocaust denial can still be found in French and German channels, despite Holocaust denial being a criminal offence in both countries.

Most of the antisemitic content which crossed the threshold of the non-legally binding IHRA working definition was non-violent and not obviously illegal under German or French law. Addressing the proliferation of such ‘legal but harmful’ antisemitic content poses a considerable challenge for tech companies and governments alike. The research showed that French- and German-language antisemitism online is often characterised by coded language and subtle, insidious tropes that are challenging both to detect and to categorise neatly.

Key Recommendations

The research findings demonstrate how online antisemitic content in French and German has echoed a global trend, which has seen the COVID-19 pandemic accompanied by a “virus of hate” directed against vulnerable communities.

The research comes at a critical juncture in the European policy debate around countering online hate speech. EU Member States, including Germany and France, have been at the forefront of devising legislative responses to compel social media companies to remove illegal hate speech from their platforms, through initiatives such as the Network Enforcement Act (NetzDG) in Germany and laws that have been proposed in parallel in France.

At the EU level, initiatives such as the Digital Services Act, the Code of Conduct and the European Democracy Action Plan present important opportunities for more systematic approaches to regulation and oversight of platforms. Based on the research findings, the report lays out a range of recommendations which include calls to:

Address online antisemitism as part of a comprehensive framework for digital regulation at a European level, aligning diverse EU efforts from tackling conspiracy theories and disinformation to promoting platform transparency on enforcement of terms of service.

Promote better understanding among users and platform moderators alike of the diverse manifestations of antisemitism contained within the IHRA working definition, to help recognise and address more insidious antisemitic content.

Beyond removing illegal hate speech, consider proactive measures to address the proliferation of ‘grey zone’ legal but harmful antisemitic content and behaviours prevalent across platforms, including moving beyond solely ‘content-based’ approaches towards broader ‘systems-based’ digital regulation which guarantees the safety of users while preserving rights of expression.

Support further research into antisemitism online aimed at better understanding the networks, behaviours and audiences that comprise the ecosystem of online antisemitism in order to inform effective responses. Approaches that consider image-based antisemitic content and incorporate an intersectional perspective on online hate speech are especially required.

Ultimately, ISD’s research demonstrates the need for an improved understanding of the relationship between online and offline antisemitism. Hate crimes have surged during the pandemic, and while causal links between online hate and offline attacks are difficult to establish, it is important for future research to map correlations around the real-world consequences of digital hate, as well as how offline activity can precipitate online harms. To effectively prioritise both digital and ‘real-world’ strategies to counter antisemitism, a joined-up strategy is required that is sensitive to the ambivalent relationship between online hateful content and offline hate crimes or incidents.

 

This article presents the key findings from a new research study by ISD. Commissioned by the European Commission, the report will help shape and evidence the EU’s upcoming strategy on countering antisemitism, which will be presented at the end of 2021. The full report, which provides an in-depth analysis, can be accessed here.
