Violent and graphic content from the Gaza conflict served to minors’ accounts

18 October 2023

By: Isabelle Frances-Wright and Moustafa Ayad

Content warning: This dispatch contains graphic descriptions of violence, including to children.


Over a 48-hour period, ISD analysts surfaced more than 300 posts and videos across Instagram, TikTok, and Snapchat portraying extremely graphic, distressing, or violent imagery of the conflict between Hamas and Israel, all accessible to accounts registered to 13-year-olds. This content surfaced despite all three platforms having clear policies and features designed to protect younger users from violent and graphic content. A clear majority of these distressing posts (239 of 305) were hosted on Instagram.

The content accessible to these minor accounts included the corpses of infants, children, and adults, as well as severely injured children. Graphic examples included naked, uncovered corpses being subjected to degrading acts (urination) and the cracked skulls of infants less than six months old.

Across all platforms assessed, graphic and violent content was easily accessible to minors. To assess this, analysts created profiles on each platform registered with an age of 13 and a UK location. Content related to the conflict was easily surfaced using prominent hashtags, including #Gaza and #Gazaconflict, as well as through content geotagged in Gaza.

While violent and graphic content was available to minors on all platforms assessed, it appeared most prevalent and accessible on Instagram. When searching simply for “Gaza” from a minor’s account, 16.9% of the initial sample of posts on Instagram was graphic and/or violent, compared with 3.07% on TikTok and 1.53% on Snapchat.

On TikTok, the volume of graphic content appeared lower, and some videos were removed during the analysis period. However, the platform’s search auto-suggest feature populated suggestions including “Gaza dead children”, “Gaza dead kids”, and “dead woman Gaza”.

Snapchat displayed a comparatively lower volume of violent and graphic content, partly attributable to the lower quantity of publicly available content in its Spotlight feature. However, as this feature gains popularity, Snapchat may face increased content moderation challenges.

Minors who view this type of content without adequate contextualisation may be at heightened risk of trauma, which could lead to violent behaviour or other negative outcomes. According to a review of numerous studies on violent online content and children’s mental health, “exposure to violent content can decrease empathy and cause increased aggressive thoughts, anger, and aggressive behavior”.

The platforms assessed have safeguards in place for accounts held by 13- to 16-year-olds, with policies intended to limit the discoverability of graphic and mature content. Meta states that it restricts the ability of minors (under 18) to view graphic content, either by removing the content or by placing a warning label/interstitial over it. TikTok goes further than its Violent & Shocking content policy applicable to adults, also prohibiting the depiction of human and/or animal blood in content accessible to minors.

During the course of the analysis, ISD analysts enabled additional content filtering tools beyond the platforms’ standard safety functionality for minors: the “Sensitive Content Control” feature on Instagram and “Restricted Mode” on TikTok. Despite these safeguards, content containing extreme gore remained accessible.

Policy implications

While all platforms assessed have content policies that prohibit violent and graphic content, how these policies apply to minors is particularly unclear in the case of corpses (both covered and uncovered) that have not been subjected to throat slitting, severe burns, and/or dismemberment.

While the EU’s Digital Services Act compels VLOPs (Very Large Online Platforms, including TikTok, Instagram, and Snapchat) to prevent psychological harm and safeguard the well-being of young people, no equivalent federal legislation currently exists in the US. This leaves researchers who identify this type of content accessible to minors with little to no legal recourse.
