The ‘Content Blitz’: Social Media Response Times to Terrorist Content

12th November, 2020

By Moustafa Ayad

Neither its military defeat in 2017/2018 nor the current Coronavirus pandemic has stopped the Islamic State of Iraq and Syria (ISIS) from archiving its content and deploying it on social media as a recruitment strategy when opportunities arise.

On 18 October 2020, ISIS spokesman Abu Hamza al-Qurayishi released a 32-minute audio speech reiterating calls for attacks against a slew of enemies, both near and far. In the hours after its release, ISD researchers tracked bands of ISIS supporters flooding social media feeds with links to the speech across platforms such as Twitter, YouTube and Facebook (where supporters overlaid it on images, turning it into video). To understand each platform’s strengths and weaknesses in removing terrorist content, ISD manually timed the platforms’ takedowns of egregious violations of the content regulation policies set out in their respective community guidelines.
In the past, ISIS supporters have often used Twitter as a staging ground for what researchers call a “content blitz”, the tentacles of which extend into a range of social media platforms and anonymous file sharing applications. By tracking their Twitter accounts, ISD was able to establish that these supporters do indeed mobilise small bands of accounts across platforms in the wake of a release of ISIS propaganda. They do this in order to seed and spread ISIS content as far as possible, through a mixture of newly created and clearly ‘sockpuppet’ profiles.

The ways in which these accounts post content to each platform serve distinct functions for supporters. Twitter, for instance, acts as a content aggregator, used to spread links to other content hubs such as stand-alone ISIS websites, ISIS support accounts on other platforms, and anonymous file sharing applications. Small bands of ISIS-supporting accounts appear on the platform daily, latching onto multiple key trending hashtags in order to mainstream their content. The hashtags used include current trends in Arabic such as #Corona_Virus or #Islamic_State. While YouTube remains a difficult platform for ISIS supporters to gain a foothold on, Facebook remains a key platform not only for seeding content but also for creating communities of terror group supporters around that content.

Platform response times

The platform safety nets established to detect terrorist content, through automated video matching systems (pre-upload) and manual detection (post-upload), varied in effectiveness across the platforms in this study. YouTube acted most quickly, restricting and later taking down the full 32-minute speech that supporters had uploaded within 35 minutes. Uploaded by an account named after al-Furqan, the ISIS outlet that produced the content, the video garnered only 33 views in that time.

Twitter lagged behind YouTube by several hours, with accounts exploiting hashtags and sharing links to and/or clips of the content remaining active for up to six hours. The content was shared with hashtags such as #Islamic_State in combination with other trending hashtags to attract views and ultimately game the platform’s algorithm into boosting ISIS content as ‘top’ content under those hashtags. Facebook content shared by the Twitter set of accounts remained active days after the release. ISD research has previously demonstrated that ISIS content can remain live on Facebook for months after release, despite the platform developing an extensive hash-based detection system for terrorist content.

Facebook video posts and comments

In addition to primary source content, the Twitter network linked to key accounts on Facebook. Using only a sample of the latest release, ISD found that ISIS supporters had posted five video versions of the speech on the platform. One carried the original group branding as its primary image, while the others used distinct, non-official branding created for the release. Together, the officially and non-officially branded videos garnered 10,984 views and 2,482 shares. In one instance, supporters held a watch party for the content. Ultimately, only one of the five pieces of content identified was taken down. The same account then published a new post, sharing a series of links to the remaining videos in its comments section.

This small case study provides insight into the capacity and capabilities of different platforms to pinpoint terrorist content in the wake of new releases. It highlights gaps in moderation and detection that can be overcome by understanding just how terrorist exploitation of the platforms has shifted over the past few years. As the key lines of the speech noted, the Islamic State is “bakiya”, or “staying”. Adapting to this reality requires an understanding of the tactics of terrorist group supporters in order to build mechanisms that more effectively combat their exploitation of major social media platforms.
Moustafa Ayad is the Deputy Director of International Technology, Communications and Education at ISD, and currently leads the relaunch of the Against Violent Extremism Network — the largest and oldest global network of former extremists and survivors of extremist attacks — in the U.S. and Canada. Moustafa is a strategic communications professional with more than a decade’s worth of experience, whose previous work has included designing and deploying youth, elections, and alternative narrative creative campaigns in conflict and post-conflict environments across the MENA region.