As graphic posts on the Israel-Hamas conflict surge online, ISD researchers found that minors’ accounts are being exposed to highly distressing material despite platform safeguards. Isabelle Frances-Wright, ISD’s Head of Technology and Society, and Moustafa Ayad, ISD’s Executive Director for Africa, the Middle East and Asia, found over 300 pieces of violent and highly graphic content on Instagram, TikTok and Snapchat over a 48-hour period, using accounts set up as those of British 13-year-olds.
Our study on the accessibility of graphic and violent material to these accounts is covered by the New York Times. In the article, report co-author Isabelle Frances-Wright tells the outlet: “In times of conflict, where misinformation and disinformation run rampant, it becomes even more critical to safeguard young people from the potential emotional impact of such material, and provide the support necessary to process and contextualise this type of content.”
Of the 305 posts surfaced, 239 were found on Instagram. Additionally, TikTok’s auto-suggest feature populated suggested searches within the platform such as “Gaza dead children”, “Gaza dead kids” and “dead woman Gaza”. The content found included the corpses of infants, children and adults, as well as severely injured children. Researchers went a step further and enabled additional filtering tools on the platforms, such as Instagram’s Sensitive Content Control feature and TikTok’s Restricted Mode, but content containing extreme gore remained accessible.
The full Dispatch is available on our website.