One click away: Foiled plot targeting Taylor Swift concert highlights minors’ access to terrorist content online
7 August 2024
By: Isabelle Frances-Wright, Moustafa Ayad and Ellen Jacobs
In the wake of a foiled Islamic State (IS)-inspired plot to conduct a mass casualty attack on Taylor Swift's Vienna concerts this week, the social media footprint of the accused has become central to the question of how the teenage suspects were radicalized. Media reports, as well as authorities, have indicated that the suspects were radicalized online, thrusting the role of encrypted messaging applications and social media platforms once again into the limelight. ISD research has long followed IS' use of social media platforms, tracking the group's recruitment and propaganda tactics. Both mainstream and fringe platforms contribute to the proliferation of terrorist content, and to minors' exposure to it.
Following reports of the teenage Vienna suspects' alleged radicalization, ISD assessed just how easily terrorist content could be accessed by minors across Instagram, YouTube and TikTok, with a specific focus on IS and al-Qaeda (AQ) content naming prominent terrorist ideologues. Using a handful of English search terms, analysts found 56 videos or posts, accessible via accounts set up for minors, that appeared to violate platforms' terms of service (see below). Of the searched terms, only 25 percent were blocked on TikTok, while none were blocked on YouTube or Instagram. In many cases, simple workarounds were available to reach the terrorist content through these accounts. Had analysts searched beyond the most easily accessible videos or used common misspellings of these figures' names (a typical workaround for those seeking to evade platform policies), this number would almost certainly have been higher.
Background
Of the three teenagers arrested for the plot, authorities stated that a 19-year-old and a 17-year-old "radicalized themselves on the internet" ahead of the plot to "kill as many people as possible" at the now-cancelled concerts. Authorities also detained and questioned a 15-year-old in connection with the case. While academics have noted that the internet provides the opportunity for self-radicalization, the influence of offline social networks, such as friends, family and acquaintances, cannot be dismissed.
News reports in both the US and Europe, citing unnamed sources, noted that the 19-year-old and the 17-year-old had pledged allegiance to the current caliph of IS, with the 19-year-old reportedly pledging allegiance to the Islamic State Khorasan Province (ISKP). German-language outlets such as Bild have cited unnamed German intelligence sources indicating that the German Salafi preacher Abdul Baraa (real name Ahmad Armih) played a role in radicalizing at least one suspect. Baraa's TikTok, YouTube, Spotify and Instagram channels have followers in the tens of thousands, and German authorities have previously expressed concern that his content acts as an "accelerant" for radicalization.
Since January 2024 there has been a marked increase in the number of minors arrested across Western Europe for IS-inspired plots and for sharing IS-linked propaganda. The arrests also coincide with a significant increase in propaganda from IS and AQ calling for attacks across Europe, especially in France and Germany. IS support groups produced posters depicting stadiums in France, Germany and the US as desired targets for attacks. This propaganda began circulating following the Moscow Crocus City Hall attack, for which IS claimed sole responsibility, which killed more than 140 people and wounded hundreds more.
In the UK, a recent analysis co-authored by ISD found that 16 individuals were convicted of Islamist terrorism offences committed as minors, a figure that paled in comparison to the number of minors engaged in extreme-right terrorism. The research also found seven attempts by children to travel independently to Syria and Afghanistan to engage in terrorism, while three minors sought to carry out a domestic terrorist attack on UK soil.
Over the years, the Islamic State has sought to tailor some of its content to youth, including through specialty magazines, cartoons and mobile video games.
ISD has repeatedly found that many of these communities use popular social media platforms such as TikTok, Facebook and X (formerly Twitter) to further spread their content. They also often use encrypted messaging applications such as Telegram to share more graphic content and discuss concrete ways to support terrorist groups. Part and parcel of the issue are the algorithmic architecture of popular social media platforms and lapses in moderation, which effectively serve users terrorist content affiliated with IS or AQ.
Terrorist content easily accessible to minors through basic search terms
ISD analysts assessed just how easily minors could access terrorist content across Instagram, YouTube and TikTok, with a specific focus on IS and AQ content naming prominent terrorist ideologues. Using eight English search terms referencing extremely well-known figures associated with IS, analysts found 56 videos or posts in the search results that appeared to be in violation of platforms' terms of service. Had analysts continued to search beyond the most easily accessible videos or used basic misspellings of these figures' names (a common workaround for those seeking to evade platform policies), this number would almost certainly have been higher.
Auto-suggest and ineffective search blocking remain problems across platforms
Of the eight terms assessed by ISD, only two (25 percent) were blocked on TikTok, while none were blocked on YouTube or Instagram.
On Instagram, search results for names of terrorist figures at times came with a warning: "the term that you searched for is sometimes associated with activities of dangerous organizations and individuals, which isn't allowed on Instagram". However, the platform still allowed a minor to click through to "see results anyway".
While some terms were blocked in TikTok's search feature, the same content could still be accessed by searching for the term via hashtags. Other workarounds, such as misspelled terms, were suggested by TikTok's auto-suggest search function. On YouTube, searches for known terrorist figures yielded content such as 'lectures' as suggested results for minors.
The majority of unblocked terms generated content likely violating platforms' own terms of service within a single search. This included content praising known terrorist leaders or individuals who promote violence or hateful ideologies both online and offline. Examples included videos of known terrorist figures praising suicide bombings that killed civilians, YouTube content featuring Anwar al-Awlaki, the former leader of Al-Qaeda in the Arabian Peninsula (AQAP), and IS videos created using Roblox.
While there are clearly gaps in mainstream platforms’ enforcement of their own safety policies, their safety measures still vastly exceed those of more fringe platforms such as Telegram. Research has repeatedly shown that these alternative platforms remain an effective tool to spread extremist propaganda and radicalize individuals. One of the suspects in the plot used Telegram to pledge allegiance to ISKP.
Regulation remains critical in light of inadequate platform policy enforcement
While mainstream platforms such as TikTok and Instagram have more 'robust' policies on children's safety and terrorist content, enforcement remains an issue. Beyond enforcement, greater data access is required for researchers, legislators and the public to understand the prevalence and severity of extremist content available to minors.
Amidst an industry trend of restricting data access for researchers, journalists and civil society groups, there have been some promising regulatory developments. These include the EU's Digital Services Act (DSA), which mandates greater transparency from, and access to, the platforms. However, legislators elsewhere, including in the US, have yet to enact safeguards for minors equivalent to those in the DSA, or data access provisions for researchers.
Conclusion
As authorities continue to assess the situation in Vienna, it is clear that both mainstream and fringe social media platforms play a role in the online radicalization process, with minors' access to terrorist content just a click away. The simple searches conducted by ISD show that this content is available at surface level on TikTok, Instagram, YouTube and beyond. Amid the rise in minors being arrested for IS-inspired plots and for sharing propaganda in Western Europe, and the increase in IS and AQ calls for attacks, platforms should be alert and proactive about the tactics used to post terrorist content and keep it active online. While platforms like TikTok and Instagram have robust policies on children's safety and terrorist content, enforcement gaps remain. Greater data access for researchers and stronger regulations, like the EU's Digital Services Act, are crucial, with the US lagging behind on similar safeguards.