Meming 9/11: A new generation of Salafi-jihadists terrorposting the September 11 attacks

12 September 2023

By: Moustafa Ayad


The anniversary of the terrorist attacks on September 11 has become a perennial shitposting rite of passage for Salafi-jihadist meme networks on mainstream social media platforms such as Facebook and TikTok, as well as on encrypted messaging applications such as Telegram. Twenty-two years on, online networks dedicated to sharing content linked to al-Qaeda and the Islamic State have turned the anniversary of the largest terror attack on United States soil into a social media spectacle used to mock the United States, its military and intelligence establishment, and Muslims who’ve denounced the attacks.

While the use of September 11 as a crowning achievement of the global Salafi-jihadist movement is part and parcel of this online extremist ecosystem – especially in communities that support al-Qaeda – there have been major shifts in the behavior, aesthetics, and tactics of these communities to communicate the event’s significance. Though Islamic State communities online similarly venerate the attacks, their focus is typically more muted when compared to discussion of the rise of the group’s so-called ‘caliphate’.  

Legacy and official online channels linked to a range of Salafi-jihadist groups have often memorialized the attacks as part and parcel of their outreach strategies, consistently using the anniversary as a rallying point. However, as with everything on the internet, this content, and its primary message, has also merged with other digital communities engaged in not just supporting Salafi-jihadist causes, but other internet subcultures.  

Since 2020, the Institute for Strategic Dialogue (ISD) has monitored specific shifts in violent extremist content and narratives between established networks of al-Qaeda and Islamic State supporters and a younger network of supporters, who have used the anniversary of the September 11 attacks to celebrate the attackers and their ideology, and to ridicule the United States, the victims, and Muslims who’ve denounced terrorism. In 2020 and 2021, ISD tracked an al-Qaeda support network on Facebook that ran a September 11 meme contest, which included undisclosed prizes for the most-shared meme. Many of these memes were shared in public Facebook groups and pages whose names riffed on Islamic State and al-Qaeda content, such as “The Clanging of the Memes,” which drew its inspiration from the Islamic State video series titled “The Clanging of the Swords.”

Established networks dedicated to the official media outlets of al-Qaeda and the Islamic State have long proliferated on messaging applications such as Telegram and RocketChat, promoting official stand-alone websites linked to the groups, which are continually targeted by governments, technology companies, and hacking collectives. In parallel, a digitally native generation of jihadist fanboys and ‘edgelords’ is forming collectives consisting of a rogues’ gallery of trolls from a range of communities, infusing new aesthetics and language into the propaganda shared about the September 11 attacks. These communities mock the impact of the attacks using both Arabic-language films and memes and established English-language memes that have circulated on online message boards for years.

Compatriots in content: Analyzing a broad range of account types

To understand this shift in celebratory aesthetics, researchers at ISD identified accounts, pages, groups, and channels dedicated to sharing Salafi-jihadist content in some shape or form on three platforms – Facebook, TikTok, and Telegram – and then classified them into legacy and support categories.

The legacy category comprised accounts, pages, groups, and channels that specialize in sharing official content from al-Qaeda and Islamic State outlets. In contrast, support accounts, pages, groups, and channels often shared content produced by sources other than official outlets, aggregating media from non-official sources or producing new content. Much of the content drawn on by support accounts comes from pop culture reference points either in the Middle East and North Africa or in the US, making for an odd amalgamation that ranges from Subway sandwich shop memes about the September 11 attacks to popular Egyptian “electro shaabi” music references.

The analysis centered on 95 individual accounts on Facebook (drawn from supporter accounts tagged in posts by an al-Qaeda supporting account displaying seemingly automated behavior), four groups (shared by al-Qaeda or Islamic State supporters on Facebook), and 10 pages (liked by al-Qaeda or Islamic State accounts on Facebook). Researchers also monitored 10 Telegram channels (shared by al-Qaeda or Islamic State accounts in Facebook comments sections) and 20 TikTok accounts producing, resharing, and linking to al-Qaeda or Islamic State content. From there, researchers monitored the accounts, pages, groups, and channels for content linked to September 11. This content was then used to determine the shifts between more established, legacy communities and emergent communities creating and aggregating content about the September 11 attacks using different aesthetics and sources.

Key narratives: “Happy 9/11”

Researchers collected 100 pieces of September 11 content from both legacy and support accounts, pages, groups, and channels, and noted the shifts in content between the two typologies. The most notable shift in how the September 11 attacks were venerated in Salafi-jihadist networks supportive of either al-Qaeda or the Islamic State is the use of multilingual memes built on characters made infamous in ‘chan culture’, combining official al-Qaeda and Islamic State content with the aesthetics of the international far right. Content in this category typically includes Gigachad video memes of World Trade Center Towers One and Two, Soy Boy Face memes of the towers, and content that uses fashwave and ‘Dark Foreigner’ aesthetics to celebrate the ideologues and ideology behind the attacks. ‘Dark Foreigner’ is the alias of Patrick Gordon MacDonald, a Canadian graphic designer affiliated with the neo-Nazi accelerationist group Atomwaffen Division, who popularized the brooding design style used by a range of neo-Nazi accelerationists and was arrested and charged with terrorism offenses in July 2023.

Similarly, researchers found that trolling content (“for the lulz” content) is very popular with users in these communities. For instance, one piece of content depicting World Trade Center Towers One and Two wearing leather straps and bondage gags, as if engaged in bondage, discipline, dominance, and submission (BDSM), came with a caption that read “blow me, break me, make feel like an inside job,” indicating both a disregard for the attacks and the dark humor present in many of these spaces online. One Telegram channel with just under 300 subscribers produced more than 150 September 11 memes and meme videos and was linked to in Facebook comments so that other accounts could use the content for their own profiles, pages, and groups. The channel functioned as a meme pipeline for users looking for September 11 memes in groups and pages on Facebook.

In contrast to the spectrum-straddling aesthetics of support accounts, legacy accounts promoted official September 11 content, such as Osama bin Laden speeches, al-Qaeda magazine content, and official videos from the group’s outlets, to venerate the attacks and the attackers. Distinct to this group of accounts, groups, and channels are hashtag campaigns such as “#breakthecrusaders” and “#americasnightmare,” as well as specific campaigns around one of the architects of the attacks, Khalid Sheikh Muhammad. Content in these circles rarely deviates from official channels; in many cases, posts are carbon copies of material from official al-Qaeda and Islamic State channels and forums on Telegram or RocketChat. This included sharing links to the newly released al-Qaeda One Ummah magazine dedicated to the 22nd anniversary of the September 11 attacks.

Intersection with a wider community of online actors

Support accounts in the dataset also spread beyond Salafi-jihadist communities and included gore posters, many of whom take joy in wanton violence. Recent analysis for ISD from Human Digital has shown how ‘gore sites’ – repositories of extremely violent content – have been used to host terrorist materials. These gore accounts similarly take part in Salafi-jihadist communities, siphoning off violent content to share in their own distinct communities. In one group on Facebook, researchers found September 11 meme videos being shared alongside a meme featuring the attack footage of the Buffalo mass casualty attacker Payton Gendron. This same group was as comfortable venerating Adolf Hitler as it was supporting Osama bin Laden, while embedding violent al-Qaeda and Islamic State content into anime memes.

In monitoring September 11 content from Salafi-jihadist legacy and support accounts, groups, and channels on Facebook, TikTok, and Telegram, the distinctions between accounts that promote official content and those that imbue official content with aesthetics culled from a range of internet subcultures become abundantly clear.

This shift can likely be attributed to several factors, but primarily to the rise of a younger generation of supporters online, who are as comfortable using trolling content from a range of platforms and online communities as they are using official al-Qaeda and Islamic State content. As this younger generation of Salafi-jihadist supporters comes of age in a post-September 11 world, the shifts in the aesthetics, context, and composition of their content not only become more prominent, but also offer a more compelling way of breaking away from the formulaic legacy content of their forebears, infusing it with the ever-evolving counter-culture content of the internet.
