Mis- and disinformation and conspiracy theories about the October 7 Hamas attack on Israel

7 October 2024

One year on from the October 7 attack, in which around 1,200 people were killed and some 250 taken hostage, this article takes stock of narratives that seek to deny, distort or justify the events surrounding the attack. Through the collection of a unique dataset, a network map of key actors, and qualitative analysis of the most shared content, this research provides data-driven insights into wide-reaching mis- and disinformation and conspiracy theory content about the beginning of the Israel-Gaza conflict.


Key Findings

  • Mis- and disinformation related to the conflict and October 7 was shared by a host of actors to promote existing conspiracy theories, including antisemitic tropes and anti-establishment narratives. 
  • A common manipulation tactic among users seeking to distort or justify the October 7 attack was the seemingly deliberate misrepresentation of credible reporting from news outlets in order to promote conspiracy theories.
  • Mis-contextualised conflict footage and misrepresented findings from media investigations were used to imply that the Israeli military killed its own civilians on October 7 in numbers surpassing the deaths caused by Hamas.
  • Network mapping of conversations on X about the October 7 attacks shows tightly interconnected communities, with opposing influencers in regular conversation with each other. 
  • Violence against civilians was justified through the ‘myth of the Israeli civilian’, which characterises Israeli civilians as legitimate military targets. There was active discussion about this concept, with the exact phrase “no such thing as an Israeli civilian” posted on X 1,333 times by 1,213 unique authors between 1 October 2023 and 31 July 2024. 
  • The focus of online conversation changed over the course of the data collection period in response to real-world events, such as the release of an Israeli media investigation into the use of the Hannibal Directive on October 7 and Israeli allegations against UNRWA; such events were typically used to reinforce pre-existing beliefs.

Introduction

Previous ISD research has mapped the widespread proliferation of mis- and disinformation, extremism and targeted hate across social media in the aftermath of Hamas’ attack on Israel on October 7 2023 and the subsequent war in Gaza. However, there has been less focus to date on the origins or impact of conspiracy theories which have distorted facts about the events of October 7 itself. Adjacent content has sought to justify violence against civilians on October 7, either by referencing the actions of the Israeli state or by characterising victims as legitimate military targets. Narratives which excuse or celebrate violence against civilians, visible on both sides of the conflict, constitute an alarming disregard for international humanitarian law.  

This analysis examines the digital ecosystem on X relating to the distortion of facts related to – and justification of – the attack on October 7. Network mapping will first present the nature of online conversation, highlighting actors and communities of particular influence. Then, the core tactics and themes of mis- and disinformation, conspiracy theories and justification of violence relating to October 7 are explored through a set of case studies. 

Forthcoming research from ISD investigates a parallel discourse which dehumanises Palestinians, particularly in Gaza, and frequently advocates for collective punishment and other violations of international law. The parallels between that discourse and the narratives analysed in this investigation are significant. 

Methodology

ISD analysts collected posts on X (formerly Twitter) containing one or more keywords [1] relevant to the distortion of discourse around, or the justification of, the violence of October 7. While this problem is not unique to X, researchers focused solely on the platform both because of its evident centrality in online conversations relating to October 7 and because of limits on data access on other platforms. The keyword list was based on existing ethnographic monitoring insights and previous research on the conflict.

In total, ISD collected 1.8 million posts on X from 1 October 2023 to 31 July 2024, posted by nearly 600,000 unique authors. The 150 posts with the highest reach at each of six peaks during this period were then analysed qualitatively to understand the most popular narratives and tactics employed by X users.
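The sampling approach described above can be sketched in a few lines of code. The mini-dataset, thresholds and dates below are illustrative stand-ins only (the actual study drew on 1.8 million posts, six peaks and 150 posts per peak), and real analyses would treat a peak as a window around a date rather than a single day:

```python
from collections import Counter
from datetime import date

# Illustrative mini-dataset of (post_id, day posted, reach);
# the real corpus held ~1.8 million posts.
posts = [
    (1, date(2023, 10, 27), 9_000_000),
    (2, date(2023, 10, 27), 5_000_000),
    (3, date(2024, 7, 8), 16_000),
    (4, date(2024, 7, 8), 2_000_000),
    (5, date(2023, 11, 9), 13_000),
]

NUM_PEAKS = 2   # the study identified six peaks
TOP_N = 2       # the study sampled the 150 highest-reach posts per peak

# 1. Volume over time: count posts per day (the basis of a chart like Figure 1).
volume = Counter(day for _, day, _ in posts)

# 2. Peaks: the days with the highest posting volume.
peak_days = [day for day, _ in volume.most_common(NUM_PEAKS)]

# 3. For each peak, take the highest-reach posts for qualitative review.
sample = {
    day: sorted((p for p in posts if p[1] == day),
                key=lambda p: p[2], reverse=True)[:TOP_N]
    for day in peak_days
}
```

This is a sketch of the selection logic, not the study's actual pipeline; reach here is treated as a single precomputed number per post.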

Figure 1: The volume over time of posts on X discussing distortion and justification of the October 7 attacks.

Network Mapping: Nature of Online Conversation

To generate a snapshot of activity seeking to distort information surrounding the October 7 attack, and in some cases even to justify it, analysts created a network map based on X data from 31 May to 31 July 2024. This period was chosen to incorporate the most recent peak in relevant discussion on the topic.

In Figure 2, each node represents a user, and links are drawn between nodes when users share, reply to, or mention each other. Accounts whose posts attracted high engagement (determined by the total number of links directed at them) appear larger in the graph, indicating influence within the network and, by extension, within the online conversation. As such, the structure of the map reveals the network’s central and fringe actors, highlighting the flow of engagement around key narratives and the clusters of users interacting most frequently with one another.
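As a rough sketch of how such a map is assembled, the snippet below builds a directed interaction graph from hypothetical records. The account names, edge direction (from the engaging account to the account engaged with) and sizing formula are illustrative assumptions, not details drawn from the dataset:

```python
from collections import defaultdict

# Hypothetical interaction records (source, target, kind); the account
# names are illustrative and do not come from the dataset.
interactions = [
    ("user_a", "influencer_x", "reply"),
    ("user_b", "influencer_x", "mention"),
    ("user_c", "influencer_x", "share"),
    ("influencer_x", "news_outlet", "mention"),
    ("user_a", "news_outlet", "share"),
]

in_degree = defaultdict(int)  # links directed *at* each account
edges = set()

for src, dst, _kind in interactions:
    edges.add((src, dst))  # edge from engager to engaged-with account
    in_degree[dst] += 1

# Node size for rendering: heavily engaged-with accounts appear larger
# (the base size and multiplier are arbitrary rendering choices).
nodes = {account for edge in edges for account in edge}
node_size = {n: 10 + 5 * in_degree[n] for n in nodes}
```

A visualisation tool such as Gephi or a force-directed layout would then position densely interlinked accounts near each other, producing the clusters visible in Figure 2.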

Figure 2: Network graph illustrating user interactions on X based on a keyword search of English-language discourse between 31 May and 31 July 2024.

While social network maps about contentious issues often display as a ‘polarised crowd’ – with tightly focused clusters of users engaging primarily with like-minded individuals – this network graph indicates a more complex dynamic. Our network mapping shows that in this case, X users promoting opposing narratives are tightly interconnected, often engaging with each other through replies or mentions. 

Discourse about the October 7 attack, and the ensuing actions taken by both Israel and Hamas, has been deeply polarised, with users presenting contradictory information and interpretations of the same events. The map above highlights that influencers across the political spectrum are key drivers of this conversation, frequently directing their comments towards official Israeli government accounts, media outlets or journalists, and those with differing views. These interactions have produced a network that spans diverse ideologies and types of accounts. 

One notable trend is the connection between those sharing mis- and disinformation content and media outlets; it is within these spaces that we observed information from traditional news sources being distorted to promote misleading narratives. Similarly, we also observed significant overlap in interactions between highly followed accounts with starkly opposing viewpoints, as they are often engaged in disputes over basic facts related to the October 7 attack and its aftermath.  

For example, while quadrant A includes several large pro-Israel X accounts, it also includes smaller nodes that represent the accounts of international bodies such as UNRWA and that of the US President, demonstrating regular tagging of those accounts. Additionally, two large nodes representing the Israeli government and the IDF are intermingled with pro-Israel influencers, demonstrating the central role of government sources in discussion about the conflict on X. 

Meanwhile, users positioned on the fringes of the network (quadrants C and D), particularly those justifying the attack, tended to function more as broadcasters than engagers. These accounts disseminated messaging to their followers without participating as much in debates, in contrast to the more interactive behaviour observed within the network’s core.

Strategies of Denial and Justification

A central tactic used to spread mis- and disinformation was the misrepresentation or selective use of reputable reporting. This is evidenced in the network map by the size of nodes relating to media sources, and their proximity to accounts promoting mis- and disinformation or conspiracy theories. At times throughout the data collection period, users shared a piece from a reputable source while making inaccurate claims about its contents. Research has shown that users often share (and receive) news posts without accessing the linked article, making misrepresentative links in a social media post an effective tool for spreading mis- and disinformation among those unlikely to engage with the shared source content.

For example, in one case, self-described “MAGA communist” influencer Jackson Hinkle misrepresented reporting from Israeli news outlet Haaretz. Hinkle is known for spreading mis- and disinformation about the Russia-Ukraine conflict, for example by falsely claiming that Ukraine was behind the Crocus City Music Hall attack in March 2024.  

In relation to October 7, Hinkle shared a link to a Haaretz article on X, with text stating that only 900 Israelis died, that 50% were soldiers, and that most of the fatalities were settlers (Image 1). Four hours later, these claims were refuted by Haaretz as “blatant lies” with “absolutely no basis in Haaretz’s reporting”. Despite this refutation, Hinkle’s post from October 2023 had garnered over 5 million views by the time of analysis and remains available on X. Haaretz’s refutation, by contrast, has 2.6 million views and is therefore unlikely to reach all users who saw Hinkle’s false representation of its content.

False representation is a particular issue when the influencer has a larger following than the news outlet, or if they have distinct follower bases. Even if the outlet refutes the claims and the refutations receive high engagement, it is possible that many users who consumed the original misinformation will never see this correction. 

Image 1: Post shared by Jackson Hinkle incorrectly referencing a Haaretz article.

While Hinkle’s post remains available on X, it now features a Community Note regarding Haaretz’s refutation. Community Notes are a crowdsourced platform feature on X and, while recent studies have found them largely accurate on topics like the COVID-19 vaccine, it is possible that in politically charged and rapidly changing situations such as conflict, users who write Community Notes could introduce their own biases.

This case demonstrates that, even for high-traction posts, there are limited options available to news outlets which know that their reporting is being misrepresented and reaching large audiences in the process. Even when the available measures are implemented, such as Community Notes and public refutations, their reach and effect are limited. Such misrepresentation can cause reputational damage to the news outlet and contributes to the overall muddying of the information environment around the conflict.

Key Trends

When analysing the misrepresentation of media and other reputable news sources, three key trends emerge: the dissemination of false or misleading information; the justification of violence against civilians; and the spread of conspiracy theories. 

Dissemination of false and misleading information

A key false narrative disseminated on X about the October 7 attack was that the majority of the victims were Israeli soldiers. In some cases, users claimed that Israeli forces deliberately lied about the military background of victims in order to gain sympathy and support on the international stage for their offensive in Gaza. Another common narrative was the exaggeration of Israel’s role in Israeli civilian deaths on October 7.   

Image 2: An article shared by The Grayzone suggesting Israel shelled its own civilians on October 7.

Central to the propagation of this narrative was an article shared by The Grayzone on 27 October, which formed the basis for subsequent high-traction posts on X. The Grayzone is an outlet known to have previously shared disinformation about other conflicts and is alleged to have ties to both Russian and Iranian state media.

The Grayzone article suggests that Israeli security forces were responsible for a high number of civilian casualties on October 7, shifting the blame from Hamas to the Israeli military. However, the article misrepresented the reporting on which this claim was based. For example, it claimed that “at least 340 active soldiers and intelligence officers were killed on October 7, accounting for close to 50% of total confirmed Israeli deaths.” This does not align with the Haaretz source article, which put total confirmed Israeli deaths at 1,200; against that figure, 340 accounts for roughly 28%, not 50%.

The article also used a clip from frequently mis-contextualised footage of a helicopter (discussed in more detail below), implying it was targeting Israelis on October 7. At the time of analysis, The Grayzone article had been shared 49,200 times on X and had received 7,900 engagements on Facebook.

Image 3: A post with 9 million views highlighting The Grayzone’s article.

Articles which include mis- and disinformation, or misrepresented reporting from news outlets, also formed the basis for other high-traction content which sought to exaggerate Israeli culpability for civilian casualties on October 7. Alleged evidence for this culpability included footage supposedly shot from a military helicopter in Gaza, released by the IDF on October 9. The video was misrepresented as footage from October 7 and falsely claimed to show the IDF shooting at the Nova Festival in southern Israel – one of the sites of the Hamas attack.

Across ISD’s dataset, this footage of the Apache helicopter was mentioned over 13,000 times by over 10,000 unique authors. While it was refuted by journalists as early as 9 November, users continued to share the footage throughout the following year, reaching millions of views collectively.   

Images 4 & 5: Posts by the same account 10 months apart using the mis-contextualised Apache helicopter footage to claim that Israel killed its own civilians on October 7. The two posts have nearly 16 million combined views.

Justification of violence against civilians  

A key feature of posts justifying the October 7 attack and associated atrocities was the designation of civilian victims as either “settlers” or “soldiers”. The implication of this argument is that the civilian victims of October 7 were in fact legitimate military targets. The “no Israeli civilians” discourse is notably paralleled in the Israeli and Western context by a “no innocents” discourse, which rejects the description of Palestinians in Gaza as civilians and asserts collective complicity for October 7. In both cases, proponents frequently attempt to portray enemy civilians (including minors) as active or potential military threats, including through allegations of association with Hamas or the IDF, or by pointing to the presence of weapons in civilian homes.

For example, the most shared post in the dataset (Image 6) includes images of Israeli settler attacks on Palestinians in the occupied Palestinian territories, which it describes as the “myth of Israeli ‘peaceful’ civilians”. As such, the post uses images of violence by Israeli civilians to imply that all Israelis are violent and therefore legitimate targets.

Image 6: A post with 5 million views alleging the ‘myth of Israeli “peaceful” civilians’.

Other high-traction posts sought to justify the targeting of civilians by Hamas on October 7 with references to Israeli actions and perceived injustices against Palestinians. 

Conspiracy theories about October 7 and related reporting 

In some cases, the idea that the Israeli military deliberately killed its own citizens was linked to the ‘Hannibal Directive’ – a controversial Israeli military policy that directs the use of force to prevent soldiers from being taken captive, even where this endangers the captive’s life. The Hannibal Directive has been central to false claims that Israeli security forces killed as many or more civilians than Hamas did, and to the downplaying of well-documented war crimes against civilians. Corresponding with a peak of relevant posting activity in the second week of July 2024, a Haaretz article on the use of the Hannibal Directive at three army facilities on October 7 was mentioned 16,513 times on X, the most of any URL shared in our dataset.

Image 7: High traction post uses the Haaretz article on the Hannibal Directive to shift responsibility for Israeli civilian casualties from Hamas to Israel.

The events of October 7 fuelled a diverse range of online conspiracy theorists, particularly those with anti-establishment or anti-globalist views. For example, one right-wing nationalist UK account suggested that October 7 may have been a UN operation, reflecting a growing trend of conspiracy theories targeting NGOs, international organisations and reputable news outlets. Additionally, in a livestream on October 8, misogynist influencer Andrew Tate implied that the Israeli state may have instigated the war as a pretext for a tightening of government control over its population.  

Conclusion and recommendations

Dehumanising tropes and the proliferation of mis- and disinformation in the context of the Israel-Hamas war have a toxic effect on the wider online ecosystem. These themes are then used to draw connections between the conflict and escalating tensions amongst domestic communities in Western countries. The post-October 7 increase in anti-Muslim and antisemitic hate online has been accompanied by a significant increase in reported hate crimes in European countries, the US and Canada.  

This research demonstrates how public figures and influencers on X misrepresent reputable news outlets as a tactic to spread mis- and disinformation online and contribute to conspiracy theories about the events of October 7.  

In response, the EU’s Code of Practice on Disinformation requires platforms to “adopt, reinforce and implement clear policies regarding impermissible manipulative behaviours and practices on their services, based on the latest evidence on the conducts and tactics, techniques and procedures (TTPs) employed by malicious actors”. The misrepresentation of source media and reporting constitutes a clear tactic in the propagation of online disinformation which platforms have a duty to prevent and should incorporate into the training of their content moderation teams. 

Social media platforms should set up dedicated teams and develop policies that allow trusted flaggers, including journalists, to report content that demonstrably misrepresents their work so this content can be properly labelled and contextualised. They should also implement further measures to make sure that this misinformation is not simply reposted. Such policies would protect the reputation of credible news outlets as well as limit malicious efforts to exploit their name to spread false and misleading content.  

In the absence of comprehensive enforcement of such policies, conspiracy theorists can continue to misrepresent credible reporting in the knowledge that users may fail to verify source information. Platforms should also adequately fund their trust and safety and content moderation teams to ensure that reports of misrepresented work can be addressed efficiently to minimise further spread.

To further help users to contextualise the information they are seeing, platforms should implement user transparency tools, such as the ability to see the age of and significant changes to accounts and posts. Finally, the effect of Israel-Hamas war discourse on conspiracist or misinformation news outlet accounts should be the subject of ongoing monitoring to inform efforts to prevent polarisation amongst platform users.  

The sole responsibility for any content supported by the European Media and Information Fund lies with the author(s) and it may not necessarily reflect the positions of the EMIF and the Fund Partners, the Calouste Gulbenkian Foundation and the European University Institute. 

 

End notes

[1] (“October 7” OR “7/10” OR “10/7” OR “Oct 7” OR “7 October” OR “7 Oct” OR (“Al-Aqsa*” AND (“Storm” OR “Flood” OR “Deluge”)))

AND

(“fake*” OR “forged” OR “lies*” OR “psy-op” OR “conspiracy” OR “believe” OR “own people” OR “trust” OR “Hannibal” OR “no such thing as” OR “settlers” OR “deserved*” OR “open air prison” OR “Warsaw” OR “prison breakout” OR “starved” OR “inmates” OR “settler colonial” OR “justified*” OR “is behind” OR “cover up” OR “involved”)
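As an illustration of how this filter might be applied to post text, the sketch below implements the AND of the two keyword groups, treating a trailing `*` as a wildcard suffix, as is common in social-listening query syntax. The term lists are abbreviated subsets of the full query above, and the matching logic is an assumption about how a collection tool evaluates such queries, not a description of the tool actually used:

```python
import re

# Abbreviated subsets of the two keyword groups in end note [1].
EVENT_TERMS = ["October 7", "7/10", "10/7", "Oct 7", "7 October", "7 Oct"]
DISCOURSE_TERMS = ["fake*", "psy-op", "Hannibal", "no such thing as",
                   "own people", "cover up"]

def term_pattern(term: str) -> re.Pattern:
    """A trailing '*' matches any word suffix, e.g. 'fake*' -> 'faked'."""
    if term.endswith("*"):
        return re.compile(r"\b" + re.escape(term[:-1]) + r"\w*", re.IGNORECASE)
    return re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)

def matches(post: str) -> bool:
    """True if the post contains a term from BOTH groups (the query's AND)."""
    return (any(term_pattern(t).search(post) for t in EVENT_TERMS)
            and any(term_pattern(t).search(post) for t in DISCOURSE_TERMS))
```

Commercial platforms have their own operator semantics (case handling, punctuation, phrase matching), so this should be read as an approximation of the query's boolean logic rather than a reimplementation.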