4 December 2023
In this data briefing, ISD provides an analysis of the specific manifestations of online antisemitism seen in the wake of the Hamas attack on Israel and the subsequent conflict, with a breakdown of content according to the different examples presented in the International Holocaust Remembrance Alliance’s (IHRA) working definition of antisemitism. Based on manual coding of 1,000 antisemitic comments by domain experts, this briefing draws on a dataset of 15,720 antisemitic YouTube comments collected before and after the 7 October attacks, annotated as antisemitic by a bespoke classifier trained against the IHRA definition.
- Among antisemitic YouTube comments on videos relating to Israel/Gaza, the most common forms of antisemitism were conspiracy theories about Jews (39%), symbols and narratives of classical antisemitism (19%) and calls for extremist violence against Jews (12%).
- Conspiracy theories commonly referred to alleged Jewish financial or political greed, referencing the Rothschilds or other banks, and linking conspiracy theories about previous events, including 9/11 or the USS Liberty, to the October 7 attack to claim that Israel had carried out a false flag operation.
- Classical antisemitic tropes invoked the deicide myth or deliberately recontextualised a Bible verse to label ‘fake Jews’ the ‘Synagogue of Satan’.
- Many posts overtly called for violence against Jewish people or called for the destruction of Israel, often associated with Islamist ideology.
- 79% of antisemitic comments contained a single IHRA definition example, while 21% contained multiple examples and 2% contained three or more. The highest number of examples present in a single post was four (0.2% of antisemitic comments in the dataset).
- The most common intersections between examples were combinations of conspiracy theories about Jewish power and classical antisemitism (9% of all comments), or conspiracy theories about Jewish power combined with either extremist calls for violence against Jews, or Nazi comparisons (both 3% of all comments).
- Despite vigorous definitional debates about where to draw the threshold between anti-Israel and antisemitic narratives, our analysis showed that Israel-related antisemitism constituted a minority of the dataset, with most posts evaluated containing explicit and unambiguous antisemitism.
Following Hamas’ October 7 terrorist attack, Jewish communities globally experienced a dramatic upsurge in antisemitic incidents, including targeting and harassment in online spaces. In this context, ISD analysts investigated the scale and nature of antisemitism on social media platforms, drawing on a bespoke hate speech classifier. Deployment of this classifier onto YouTube comments on videos about the conflict identified over 15,000 antisemitic comments from 4 to 13 October. Comparing the three days before and after the attack, we saw over a 50-fold increase in the absolute volume of antisemitic comments, and a 242% increase in the proportion of comments which were antisemitic. Using distinct methodologies, ISD research also found a three-fold increase in anti-Jewish slurs on alt-tech social media platforms (including 4chan, Bitchute and Gab), and a major rise in threats against Jewish institutions and individuals.
This article builds on these initial findings to generate more granular insight into the nature and narratives of online antisemitism in the wake of 7 October, raising key definitional debates and identifying the overt nature of much of the antisemitic content on mainstream platforms.
First, a random sample of 1,000 antisemitic YouTube comments was generated from the output of the classifier developed in partnership with CASM Technology (the technical methodology is outlined in full in the previous piece). ISD expert analysts manually labelled the sample according to the 11 examples of antisemitism identified by the International Holocaust Remembrance Alliance (IHRA) working definition of antisemitism. Comments which clearly related to one example were labelled accordingly, while those deemed to be ‘edge cases’ – i.e. comments which could carry multiple meanings – were flagged separately and assigned to second coders for review and resolution. Given the provisional nature of this analysis, coding is unlikely to be uniformly consistent across coders; rather, it aims to demonstrate the rough proportions of the various expressions of antisemitism identified in this research.
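The two-stage coding workflow described above – random sampling of classifier-flagged comments, labelling against the 11 IHRA examples, and routing ambiguous ‘edge cases’ to second coders – can be sketched roughly as follows. This is a minimal illustration, not ISD’s actual tooling; the label names and function names are hypothetical.

```python
import random

# Hypothetical short-hand labels standing in for the IHRA working
# definition's 11 examples; ISD's actual coding scheme is not reproduced here.
IHRA_EXAMPLES = {
    "calls_for_violence",      # calling for/justifying harm to Jews (example 1)
    "conspiracy_theories",     # dehumanising/stereotypical allegations (example 2)
    "classical_antisemitism",  # classic antisemitic symbols and images (example 8)
    "nazi_comparisons",        # comparing Jews or Israel to the Nazis
    # ... remaining examples omitted for brevity
}

def sample_for_coding(flagged_comments, n=1000, seed=7):
    """Draw a reproducible random sample of classifier-flagged comments."""
    rng = random.Random(seed)
    return rng.sample(flagged_comments, min(n, len(flagged_comments)))

def record_annotation(comment_id, labels, edge_case=False):
    """Record an analyst's labels; comments flagged as edge cases are
    queued for a second coder to review and resolve."""
    unknown = set(labels) - IHRA_EXAMPLES
    if unknown:
        raise ValueError(f"unknown labels: {unknown}")
    return {
        "comment_id": comment_id,
        "labels": sorted(labels),
        "needs_second_coder": edge_case,
    }
```

Seeding the random generator makes the sample reproducible, which matters when multiple coders need to work from the same set of comments.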
As the most widely accepted and useful definition for measuring the diverse manifestations of contemporary antisemitism, the IHRA definition and its 11 accompanying examples are designed as guides to narratives which may, in context, constitute antisemitism. Labelling comments without the full context of the videos under which they were posted could introduce ambiguity (although all videos were broadly about the Israel/Gaza conflict). For example, a comment expressing celebration may appear innocuous in isolation, but when posted under a video specifically about the 7 October attack it could be viewed as celebrating terrorist violence. Future studies that address the complexities of determining such context at scale would likely yield further valuable analytical results.
These are the nuances and challenges faced by analysts identifying forms of antisemitism on social media, shared by all researchers attempting to understand its scale and nature. Since 7 October, policymakers and law enforcement have struggled to apply abstract definitions of extremism and hate speech to phrases such as ‘from the river to the sea’ and calls for ‘jihad’ or an ‘intifada’. Such phrases demand context, and their connotations may well have changed in the post-7 October environment. There exists a clear gap between how these phrases are sometimes intended and how they are received by Jewish communities.
Researchers in this project used a plausibility test, under which comments were only labelled against an IHRA example where they could reasonably be deemed relevant to it. Phrases such as ‘Israhell’ were considered potentially offensive but not inherently antisemitic, depending on context, and were only included where linked with specific indicators of intent. The considerations analysts gave to such ‘edge cases’ are brought out in the analysis below. The data included in this study therefore likely constitutes a conservative estimate.
The volumes of antisemitic YouTube comments coded against each of the IHRA working definition’s examples are shown in the graph below. The most common forms of antisemitism were conspiracy theories about Jews (39%), symbols and narratives of classical antisemitism (19%) and calls for extremist violence against Jews (12%).
The average antisemitic post contained 1.24 examples from the IHRA definition: 79% of antisemitic comments contained a single example, 21% contained multiple examples, and 2% contained three or more, with the highest number found in a single post being four (0.2% of antisemitic comments).
The posts which demonstrated the greatest number of examples from the IHRA definition all contained conspiracy theories about Jewish power, classical antisemitism and comparisons between Israel and the Nazis – all among the most common IHRA examples within the dataset. The most common intersections between examples were conspiracy theories about Jewish power combined with classical antisemitism (9% of all comments), and conspiracy theories about Jewish power combined with either extremist calls for violence against Jews or Nazi comparisons (each 3% of all comments). This is demonstrative of the underlying conspiratorial nature of a majority of antisemitic thinking.
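With multi-label annotations of this kind, the per-comment average and the pairwise intersections reported above reduce to straightforward counting. A minimal sketch, using illustrative label names rather than ISD’s actual code:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_stats(annotations):
    """annotations: one set of IHRA example labels per antisemitic comment.
    Returns the mean number of labels per comment, the counts of single- and
    multi-label comments, and a Counter of label pairs that co-occur."""
    pair_counts = Counter()
    single = multi = total_labels = 0
    for labels in annotations:
        total_labels += len(labels)
        if len(labels) == 1:
            single += 1
        elif len(labels) > 1:
            multi += 1
        # every unordered pair of labels on one comment is one intersection
        for pair in combinations(sorted(labels), 2):
            pair_counts[pair] += 1
    mean = total_labels / len(annotations) if annotations else 0.0
    return mean, single, multi, pair_counts

# Toy data: three single-label comments and one with two labels
toy = [{"conspiracy"}, {"conspiracy"}, {"classical"}, {"conspiracy", "classical"}]
mean, single, multi, pairs = cooccurrence_stats(toy)
# mean == 1.25, single == 3, multi == 1, pairs[("classical", "conspiracy")] == 1
```

Sorting each label set before pairing ensures that the same intersection is always counted under one canonical key, regardless of labelling order.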
Specific antisemitic narratives
The largest share of antisemitic comments (36% of all comments) was labelled as “mendacious, dehumanising, demonising or stereotypical allegations about Jews”, including the sharing of antisemitic conspiracy theories, as laid out in the second example of the IHRA working definition. In many cases, these comments used abusive language such as “zio” or other derogatory slurs about Jewish people.
“I had dealings with Jews when living in UK […] they were the most horrible greedy crooked uncaring people I had ever meet” – Example of antisemitic stereotypes about greed
A majority of these comments engaged in some form of antisemitic conspiracy theory, and multiple distinct theories were referenced. Many followed a common theme of Jewish manipulation of media, financial institutions or political spaces, rooted in allegations of Jewish greed or false claims of victimhood for the ostensible purpose of controlling populations. In some cases, these comments crystallised into more recognised conspiracy theories about the Rothschild family or banks perceived to be Jewish, such as JP Morgan, which were accused of “enslaving” populations or held directly at fault for Israel’s actions in Gaza. The Rothschild conspiracy theory appeared 24 times, while the USS Liberty conspiracy theory – which claims that Israel’s mistaken strike on an American warship during the Six-Day War was intentional – was referenced 11 times.
“9/11” or “911” was explicitly referenced 40 times, placing blame on Jews, Israel or Israeli agencies for the attack. This was sometimes merely alluded to through discussion of ‘dancing Israelis’, referencing a video which allegedly shows Israeli men dancing as the twin towers were attacked, used by conspiracy theorists as evidence of Israeli culpability for the atrocities.
“Your daily reminder to look up “the dancing jews” on 9/11. Jews arent your friends…” – Example of antisemitic 9/11 conspiracy theory
A highly common conspiracy theory was the allegation of “fake Jews”: the claim that Jewish people are not the true descendants of the biblical forefathers but are merely pretending to be. The term “fake Jew” was used 40 times, and the Khazar conspiracy theory – which alleges that Jewish people are descended from medieval Khazars – was referenced an additional 18 times. These conspiracy theories often contained biblical or classical antisemitic language and tropes. Other comments implied that the October 7 attack was a false flag or cover-up operation.
“this whole situation is a false flag operation to give Netanyahu an excuse to genocide more Palestinians while the western Zionist run media shills full scale for the Rothschilds” – Example of antisemitic ‘false flag’ conspiracy
Due to varying use of language, it was not always possible to discern the antisemitic nature of comments in the grey area between legitimate criticism of Israel and antisemitic conspiracy theories, particularly where wider context was unavailable. For example, the term “Israel lobby” is often used euphemistically to refer to alleged Jewish financial or political interests, transplanting classic conspiracy theories onto Israel as a collective of Jews. However, in some specific cases the term may simply refer to US aid to Israel under President Biden, or to Israel’s representation in the international arena, and therefore may not necessarily constitute antisemitism.
Classical antisemitism is defined in IHRA’s eighth example as the use of “symbols and images associated with classic antisemitism…to characterise Israel or Israelis”. In practice, many of these tropes are rooted in medieval or Christian religious antisemitism, including the blood libel or the deicide myth. Classical antisemitism constituted the second largest proportion of the sample, at 22% of comments.
A significant proportion of the comments labelled as classical antisemitism specifically related to the antisemitic application of a Bible verse, Revelation 2:9: “I know thy works, and tribulation, and poverty, (but thou art rich) and I know the blasphemy of them which say they are Jews, and are not, but are the synagogue of Satan”. Commenters either quoted the verse directly or referred to Jews as the “synagogue of Satan”, with 76 mentions of the phrase in the sample. The decontextualisation of this verse and its use under videos about the Israel-Gaza war repeats the classical slur that Jewish people are Satanic or anti-Christian. The phrase “synagogue of Satan” is also commonly used by extremist Black Hebrew Israelite sects, who commonly believe that they are the true descendants of the twelve tribes of Israel and that Jewish people are in fact “edomites who hate the true Israelite”. Multiple comments also repeated the accusation that Jewish people killed Christ, promoting hatred against 21st century Jewish communities on this basis.
Scholars of antisemitism often refer to the long tail of antisemitism throughout history, including its roots in medieval Christian thought, which ultimately led to the expulsion of Jews from multiple European countries. In the contemporary landscape, antisemitism is often considered to be old wine in new bottles, with the same structures of antisemitism re-imagined in the context of current affairs. The proportion of classical antisemitism in this data demonstrates not only the diversity of contemporary antisemitic thought, but the persistence of its classical iterations and religious roots in the online environment.
Calls for extremist violence
The third most common antisemitic narrative was identified in the IHRA definition’s first example, of “calling for, aiding, or justifying the killing or harming of Jews in the name of a radical ideological or an extremist view of religion”. While it is known that antisemitism exists across the political mainstream, this example specifically references incitement to antisemitic violence and other antisemitic forms of political extremism. Analysts identified very few uses of codes or phrases associated with the extreme-right, which may indicate either their lack of presence or their successful moderation. Many such calls to violence used Islamist language.
“Allah’s wrath descent down upon the Yahud” – Example of extremist calls for violence against Jews
Such extremist calls for violence constituted 14% of antisemitic comments. Comments labelled under this example included specific calls for destruction of Israel or advocating the killing of Jews. One such comment read “we will crush you, you criminal Jews, you murderers of children”, and another, “Jews are the biggest deceiving cowards. If I ever get my hands on you, I’ll show you what life ending is all about you litter snake coward”. The phrase ‘death to Israel’ was used 24 times in this sample, often accompanied by religious extremist expressions.
Some comments in this example celebrated the October 7 attack or justified it in the name of religious violence. Such extremist calls to violence and celebration of Hamas’ attack represent highly overt forms of antisemitic expression.
“This brings a big smile to my face , the racist Zionist invading coward being humbled . What a joy this has really made my day .” – Example of celebration of extremist violence against Jews
Beyond the IHRA examples found in more than one in ten comments, the fourth most common narrative was comparisons of Jews or Israel to Nazis or Hitler, making up 8% of the sample. This often took the form of the phrase “zionazi” used as a term of abuse against Israelis or Jews. Other comments accused Jewish people of becoming their former oppressors, for example calling Israelis “the Nazis of our world” or accusing Israelis of projecting Holocaust trauma onto Palestinians. In some cases, comments suggesting that Jews should have stayed in Europe after the Holocaust instead of moving to British Mandate Palestine – framing Jews as a European “problem” – may be considered a soft form of Holocaust revisionism, given their clear lack of understanding of the situation of Jews in Europe during the Holocaust.
Much public discussion surrounding the IHRA working definition of antisemitism relates to a number of examples of Israel-related antisemitism, namely those which deny Jewish people self-determination, label Israel a racist endeavour or hold Jewish people collectively accountable for the actions of the state of Israel. In the data sample, these examples were less commonly apparent, collectively constituting only 4% of all antisemitic comments. This finding is particularly notable due to the direct relevance of these examples to the context of the war.
Further examples which constituted a small minority of content were those denying the facts of the Holocaust or accusing Jewish people of exaggerating the Holocaust for political or financial gain. While research demonstrates the continued proliferation of Holocaust denial and distortion on social media, this sample of comments specifically on videos regarding the Israel/Gaza conflict did not typically resort to Holocaust denial to express antisemitic ideas.
This article has deconstructed the various narratives of antisemitism in YouTube comments about the Israel/Gaza conflict using the IHRA’s 11 examples, showing the most common narrative by far to be conspiracy theories, followed by classical antisemitism and subsequently extremist calls to violence.
Antisemitism on social media is often conceptualised as a ‘grey area’ harm which may not specifically break laws in its expression and can be hard to define. However, this study has identified that, in its three most common narratives, overt forms of antisemitism continue to proliferate on mainstream social media platforms and have experienced a marked increase since October 7. Indeed, comments which specifically incite violence in the name of extremist causes may well cross the threshold of criminality. Such findings mirror previous ISD research on the proliferation of branded terrorist content on X and violent content available to minors on Instagram, pointing to failures of mainstream platforms to implement their terms of service in the wake of the October 7 attack.
A limitation of any antisemitism content classifier on social media is its need to integrate context, along with code words and dog whistles, into its identification algorithms. While this bespoke classifier has a high degree of accuracy in identifying antisemitism, it is impossible to know how much relevant content has been omitted. This study therefore does not claim that grey area content does not exist, merely that it is not the only form of antisemitism on social media, nor even the most prominent.
This study highlights the need for urgent action by social media platforms to tackle the current surge in overtly antisemitic content, which manifests in numerous guises. Under the emergent regulatory frameworks in Germany, the United Kingdom and the European Union, platforms now have a legal responsibility to limit illegal incitement and hate speech. Online antisemitism does not just sit in the grey zone; as this study demonstrates, it is clearly visible in its purest historical and violent forms.