Minors exposed to mass shooter glorification across mainstream social media platforms
24 January 2024
By: Moustafa Ayad, Isabelle Frances-Wright
Primary Findings
Using social media accounts set up as minors’ accounts, ISD examined the availability of content glorifying mass casualty shooters for minor users across TikTok, Discord, Roblox, Telegram and X. In this investigation, our researchers surfaced 127 videos glorifying a range of mass casualty attackers on TikTok and another 54 posts on X over the past four months, and tracked this content as it spread across Roblox, Discord and Telegram. The findings are as follows:
- A content ecosystem glorifying mass casualty shooters exists and is readily accessible to minors across platforms including TikTok, Discord, Roblox, Telegram and X, with additional evidence of mass casualty shooters being glorified via account names and avatars on Snapchat.
- The Institute for Strategic Dialogue’s (ISD) research found that content glorifying the Christchurch attacker, Brenton Tarrant; the Charleston church attacker, Dylann Roof; the Columbine attackers, Eric Harris and Dylan Klebold; and the Buffalo shooter, Payton Gendron, garnered over 1.7 million views in just the last four months.
- Content glorifying mass shooters is freely available to minors on TikTok, despite policies that prohibit it. The top three most viewed videos in the sample collected on TikTok included fictional Pixar-style poster versions of the Columbine shooters, a video using the livestreamed footage of the Christchurch attacks that lauded the attacker as a “very nice man”, and a Roblox videogame recreation of the Columbine school shooting. The three videos combined received more than 682,000 views.
- There is a clear pipeline from mainstream platforms like TikTok to more opaque and less moderated platforms like Discord and Telegram, where overt mass shooter glorification, hate speech and gore can (according to users) be more freely shared, and where indoctrination and radicalization may therefore be more likely to occur. Young mass shooting fans, or those with an initial interest in mass shootings, find each other on more open and mainstream platforms such as TikTok. There they exchange information about how to communicate on more opaque platforms such as Discord, Telegram and, in some cases, Instagram direct messaging, for instance by sharing message group invites or account usernames for entry into those channels.
- Much of this content is either created by minors or made with the intent to appeal to minors. Content venerating the attackers analyzed by ISD researchers included songs popular with teenagers, love hearts, cartoons, images of mass shooters interspersed with pop culture figures such as Lana Del Rey, and anime scenes.
- The content shared on TikTok and Discord is often reposted from, or centered around, Roblox games, where users create avatars of mass shooters or recreate mass shootings in their entirety via gameplay. In some instances, Roblox users are creating virtual school shooting training modules to “educate” users, promoting the playable versions of the simulations on TikTok and sharing updates on Discord, illustrating the linkages between platforms and the ability of the content to proliferate across open and closed social media spaces. Basic ban evasion tactics allow these communities to persist and grow despite moderation efforts. These tactics include using contradictory hashtags and directing followers to back-up and secondary accounts across a range of platforms.
Mass Shooter Glorification Available to Minors Spreads Across Platforms
The glorification of mass casualty shootings and the perpetrators of these heinous acts has been a perennial problem for social media platforms. Despite numerous reports about the prevalence and impact of mass casualty shooting glorification on Meta, TikTok and X (formerly Twitter), there remain significant gaps in these companies’ ability to curtail the proliferation of content dedicated to the perpetrators of a range of mass casualty attacks globally. Central to many of these reports are the effects of the videos on minors, who may be interacting with, and even producing, content supportive of these mass casualty perpetrators, and specifically the role social media may play in radicalizing young people.
Using social media accounts set up as minors’ accounts, ISD surfaced 127 videos glorifying a range of mass casualty attackers over the past four months on TikTok, and another 54 posts on X. The content, and the users who produce and share it, are part of a pipeline of mass casualty attack content that stretches across TikTok and X, passes through Roblox, and usually ends up in closed online spaces such as Discord, where users share instructions and tactics for building out similar mass casualty simulation games, featuring specific modifications (“mods”) such as gaming avatars that match the attire of mass casualty shooters.
This pipeline is typically composed of users who are steeped in gore communities, where gruesome videos of suicides, murders, accidents and war casualties are a form of online cultural currency, and who are linked to, or are part of, white supremacist ecosystems rife with antisemitism, Islamophobia, racism and xenophobia. These users often adopt avatars of infamous mass casualty attackers such as Brenton Tarrant of the Christchurch mosque shootings in New Zealand, Payton Gendron of the Tops supermarket shooting in Buffalo, Dylann Roof of the Charleston church shooting in South Carolina, or Eric Harris and Dylan Klebold of the Columbine school shooting.
The Online Ecosystem of Mass Shooter Content: Made and Consumed by Minors
In many respects, the Columbine shooting is considered a foundational event by supporters and content producers of mass casualty attacks. Klebold and Harris are commonly referenced and admired across an online ecosystem that has remained active on a range of social media platforms, with references extending to the music they listened to and the clothing and haircuts they wore. There have been 394 school shootings since Columbine, and more than 360,000 students have experienced gun violence in schools, yet many of these online users revere the Columbine shooting as an inspirational, generational event. ISD found that the most watched video in the dataset was a TikTok video using fictional Disney posters of “Eric and Dylan”, coupled with a similar poster of a Brazilian school shooter. The video amassed more than 399,000 views in three months.
While the Columbine shooting remains, and will continue to be, a mainstay in these communities, the Christchurch mass casualty attack continues to inspire a significant amount of content and engagement, often containing language from the “manifesto” written by the perpetrator, Brenton Tarrant. ISD found 73 recent Tarrant glorification videos on TikTok, and 32 photos and posts celebrating him and his “legacy” on X, including the livestreamed video of the attack interspersed with anime scenes for effect. The videos celebrating Tarrant on TikTok garnered more than 790,000 views and were tied to users who also shared fervently homophobic, xenophobic, antisemitic, Islamophobic and racist content. The Christchurch shooting, and the content developed in its wake, prompted more pointed exchanges in the comment sections of the videos, such as goading content producers to commit similar attacks, or linking to more egregious and graphic content in closed spaces such as Discord.
ISD researchers identified an abundance of gamified Roblox “school shooting” recreation videos on TikTok, with accounts set up to share updates on “fictional shooting” games, some of which were named after a popular refrain from the neo-Nazi accelerationist Terrorgram collective. The games included recreations of the real-world mass shooting at a grocery store in Buffalo, complete with environments that mimicked the actual location, such as the Tops supermarket. One of the videos promoting the game was a recreation of the livestream of the Buffalo shooting. The video of the user playing the game featured the main character, presumably representing Gendron, shooting minorities as a blood-dripping font flashed on screen reading “ethnic cleansing.” The same user created a Columbine first-person shooter game, which was banned from the Roblox platform, illustrating the connection between Columbine and the mass casualty attacks that followed. In the comments sections of the Roblox videos, researchers found other users goading content producers into attempting similar attacks, as well as calls to release the games more widely so they would be available to larger audiences. One user commented on a Columbine-inspired Roblox game that it was the best game they had played, but that it could use more “screaming” and “begging”.
ISD researchers also found five videos on TikTok, with more than 43,000 views, that glorified the actions of the Buffalo mass shooter Payton Gendron. These included Roblox versions of the shooting described as “educational.” Similarly, the hashtag #ericharris_dylan, which features content overtly glorifying the Columbine shooters (mixed in with legitimate documentary content), received 19.2 million views, with a large amount of the content labeled by creators as “truecrimecommunity”.
By presenting content as educational, users attempt to blend content which glorifies mass shooters with legitimate educational content, and thereby to circumvent platform guidelines via public interest exceptions. This and other types of evasion tactics were present across multiple platforms, featuring a cross section of mass casualty shooters. Examples across TikTok and Discord include using hashtags and phrases such as “blacklivesmatter,” “truecrimecommunity,” “ilovejews,” and “fake,” which users believe will mislead platform moderation teams as to the intent of the content. Additionally, TikTok’s autosuggest feature within search would often proactively suggest the misspellings of mass shooter names used by mass shooting fans (when the accurately spelled names, along with some variations, were blocked from search).
On TikTok, 20.4% of videos were removed during the course of analysis, at both the video and account level. While this shows that enforcement is occurring, account holders frequently have back-up accounts that they have already communicated to their followers in the event of a ban. In some instances, accounts brag in their bios and within comments about how many times they have been banned and which “account number” they are currently on. Ban evasion tactics contribute to the multi-platform spread of this content, with users exchanging communication methods (group names, account handles for messaging) across multiple platforms such as Discord, Telegram and Instagram.
The most egregious content identified was found on Discord. Within Discord servers identified by ISD during the course of analysis, which were accessible to minors, content included full-length videos of mass casualty shootings, Nazi iconography and instructions to “troll muslims” and “troll jews”.
While Discord has launched a “family center” similar to those of other platforms, it does not allow parents to view the content of the groups (“servers”) their teen has joined, or what the teen has written; it provides only the names of the servers and accounts the teen has interacted with. Given that these servers often use benign or deceptive names relative to their content, it is unclear how helpful the “activity snapshot” Discord provides will be for parents attempting to protect their children from harm.
Conclusion
This research evidences major failings in platforms’ enforcement of content policies relating to the protection of minors and the proliferation of terrorist or violent extremist content (TVEC), as well as a gap in the efficacy and enforcement of existing mechanisms designed to support cross-platform information sharing and response around TVEC.
It also illustrates the trajectory of such content and engagement from mainstream platforms to less restrictive fringe online environments, in which there may be a higher risk of radicalization to violence. While content was often produced for a specific platform, users were frequently linked to, or ushered into, closed online spaces such as Telegram and Discord, where there was an expectation of less stringent moderation.
Gamified versions of real-world mass casualty attacks have added an extra layer of complexity to the challenge platforms (and regulators) face: “fictional” representations of real-world events are passed off as benign games that are not to be taken seriously, blurring the lines between what is considered entertainment and what constitutes glorification and incitement of violence, particularly in the context of minors.