UK Terrorgram Proscription: A useful but limited tool to combat an online network

2 May 2024


On 22 April, the Home Office announced that the UK will proscribe the Terrorgram Collective, an online network of neo-fascist accelerationists who produce and share propaganda encouraging adherents to carry out terrorist attacks. Following the collapse of Iron March and Fascist Forge, Terrorgram became the organising venue for a number of accelerationist groups, including the Injekt Division, Feuerkrieg Division, and other self-described Atomwaffen Division spinoffs, which began distributing their propaganda materials through the network. While the Terrorgram network is one of the most influential violent extremist propaganda networks of its type, relatively few attacks have been publicly attributed to or openly inspired by its output. Notable exceptions include the 2022 killing of two people outside an LGBTQ+ bar in Bratislava by a man dubbed ‘Terrorgram’s first Saint’ and the 2023 plot by Brandon Russell and Sarah Clendaniel to attack electrical substations surrounding Baltimore, Maryland. The decentralised nature of the network allows influencers in less conspicuous private channels to evade attention despite their connections to those who plot or carry out attacks.

ISD’s digital analysis unit has been monitoring Terrorgram for years: in 2020, we published a report analysing a network of 208 channels distributing white supremacist and pro-terrorist content on Telegram. This research demonstrated how these channels glorify terrorism, call for violence, spread extremist ideological material and demonise minority groups.

The UK’s first proscription of an online terrorist network will criminalise support for or membership of Terrorgram. Such amorphous extremist ecosystems have traditionally challenged policy responses geared towards groups and organisations, and this proscription will provide UK authorities with a powerful tool to both limit the spread of Terrorgram propaganda and prosecute Terrorgram members in the UK.

The proscription has the potential to impact any UK-based members of the network involved in the production or distribution of branded Terrorgram propaganda. However, the nature and structure of Terrorgram pose challenges that will likely limit this impact, owing to the network’s loosely defined membership and its use of non-compliant platforms.

Group membership in the post-organisational era

When considering Terrorgram, it is important to distinguish between the small core of leaders who produce propaganda and guide the network and the large number of adherents who are part of a wider online milieu. There are certainly core affiliates of the Terrorgram Collective, and the UK’s Independent Reviewer of Terrorism Legislation, Jonathan Hall, stated in a thread on X that the proscription decision indicates “Terrorgram is assessed to be an organisation with a core human leadership that directs and coordinates online activity.” However, much of the network consists of a constantly shifting carousel of radicalised individuals from around the world, and defining ‘membership’ may pose an obstacle to UK authorities.

The core members of Terrorgram – i.e. those involved in the production and publication of branded products – may be targeted under the proscription; however, it is unclear if any of that small cadre reside in the UK. Proving “membership” for the much larger number of Terrorgram affiliates is harder, as there is no formal membership process. Even allegedly decentralised neo-fascist accelerationist groups such as Atomwaffen maintained a process for “patching”, or formally admitting an individual to the organisation. Despite the previous proscription of a number of decentralised networks, there has been only one related membership conviction: that of a teenager for belonging to the Feuerkrieg Division. Therefore, the enforcement of this proscription will likely focus on the distribution of Terrorgram propaganda rather than membership in the network.

The impact of proscription on group materials

While there are components of the network that primarily focus on radicalising others online, distributing non-official propaganda, and lionising “saints,” the core members of Terrorgram remain focused on the periodic production of branded propaganda magazines such as “Hard Reset” and “Do It For The ‘Gram.” While much of this material would already meet legal thresholds for the collection and distribution of terrorist publications (s58), its association with a now-proscribed group will clarify these thresholds for social media companies and create a strong legal impetus for its removal.

It remains to be seen which items will qualify as Terrorgram propaganda under this action. In addition to the branded publications, Terrorgram members and affiliates produce a large volume of informal and unofficial media products. Enforcing the proscription against these products would require establishing their provenance as Terrorgram-related, which would be difficult and could generate legal challenges. However, other unbranded pieces of propaganda, including Saint calendars and cards [1] produced by known Terrorgram members or affiliates, are regularly distributed and are key to Terrorgram recruitment and messaging. It is likely that UK authorities will pursue criminal action against those who possess or distribute branded Terrorgram products, but it is unclear whether they will do so for unbranded materials.

Limiting cross-platform pollination

Finally, this proscription will allow UK authorities to limit the spread of Terrorgram propaganda on other social media platforms, though this will likely have a limited effect. Terrorgram (as its name implies) operates primarily on Telegram, which has historically been ineffective at enforcing content restrictions. Even if it did enforce them, Telegram has little ability to geo-fence content or determine where a user is located. Terrorgram publications are circulated on mainstream social media platforms such as Instagram, Facebook, and X; however, the accounts that do so are already swiftly banned under existing content guidelines. Terrorgram affiliates plan for this, and it is unclear whether stricter enforcement would have a significant effect on the spread of propaganda.

Conclusion

With the decentralisation of far-right online ecosystems, policies rooted in the post-9/11 counter-terrorism framework have struggled to appropriately address a new generation of threats. The UK Government’s decision to define the Terrorgram Collective as a group and ban it as such raises a multitude of questions about how effectively existing legislation can pivot. Simultaneously, the upcoming implementation of the Online Safety Act framework will require clear guidance on the illegality of online materials that seek to promote terrorism and violent extremism. The success of this new measure, including its use for membership- or promotion-related convictions and the banning of online material, remains to be seen. While a first step towards addressing nebulous online terrorist sub-cultures is welcome, the urgent need remains for a comprehensive framework addressing all forms of online hate and extremism, rather than the shoehorning of new threats into old policies.

Endnotes

[1] See ISD’s Saints Culture Explainer for more details on these products.
