Beyond the Collective: Understanding Terrorgram’s efforts to infiltrate the mainstream on Telegram

24 August 2024

By: Steven Rai


ISD conducted a network analysis of Telegram channels that disseminate propaganda from the Terrorgram Collective, a disparate online network of neo-fascist accelerationists that has been linked to arrests or attacks in the US, Canada, and Slovakia, and which was proscribed by the UK government in April. Although most of the focus has been on the group’s overtly violent content, ISD analysis has identified three large channels that are not officially branded as part of the Terrorgram Collective, but which nevertheless appear to be controlled by its supporters. All three push subscribers to join associated group chats where content supporting mass violence and societal collapse is rife. Telegram bots are also used to synchronize the spread of this material across groups. This reveals a coordinated strategy to indoctrinate less ideologically committed viewers into violent extremist positions, and allows the Terrorgram Collective to broadcast its messaging to a larger audience than would otherwise be possible. Ultimately, the sprawling nature of the Terrorgram Collective and its mixture of overt and covert online operations challenge the ability of group-based proscription approaches to adequately address this threat.

Key Findings 
  • ISD analysis shows how three Telegram channels which collectively have over 70,000 subscribers are serving as a gateway to far-right terrorist content associated with the Terrorgram Collective.
  • None of these channels are officially aligned with the Terrorgram, nor do they disseminate Terrorgram-branded materials. Instead, two primarily function as news aggregators, while the other claims to be associated with Steve Bannon’s ‘War Room’ podcast. These channels are far more successful in attaining large followings than the overt Terrorgram spaces, which typically only have several hundred subscribers.
  • All three channels were connected to group chats which were filled with violent material, including Terrorgram propaganda. Further, a bot account was found in all three chats and in more overt Terrorgram groups, allowing users to anonymously funnel extremist content to subscribers. There is strong evidence that the prevalence of Terrorgram content in the group chats is not coincidental and that the main channels are linked to the Collective. This includes administrators espousing racist views, posts in explicit Terrorgram channels encouraging members to join the non-explicit channels, the aforementioned bot account, and an overlap in membership between the group chats and those of the more overt Terrorgram channels. Members of the Collective have also previously called for the infiltration of non-Terrorgram spaces.
  • So far, ISD has not found evidence that the Terrorgram’s efforts have been successful in radicalizing or recruiting through this ‘mainstreaming’ method. However, such tactics serve to expose a wider audience to overt terrorist content, and risk drawing unsuspecting subscribers of these larger channels into more violent extremist communities.

The following analysis is based on a qualitative review of over 60 Telegram channels that espouse accelerationist rhetoric, or which post material from the Terrorgram Collective. Several of these were observed sharing content from three seemingly non-violent channels, which led ISD to conduct further research to determine the nature of their relationship with the Collective. ISD also examined a single bot connected to the Collective which was present in multiple group chats and was used to disseminate content across various groups, including the three discussed below. 

Examining the Evidence 

The Terrorgram Collective, an online network of neo-fascist accelerationists who produce and share propaganda encouraging adherents to conduct terrorist attacks, primarily operates on Telegram, where they control a sprawling web of channels and group chats. Channels function as feeds, wherein one or more administrators broadcast messages to their subscribers. Administrators have the option to attach a group chat to their channel, wherein subscribers can interact with each other and engage in discussions. Channels and group chats can be public, meaning anyone can search for and join them, or they can be private, which requires an invitation to view or post content. 

The three Telegram channels analysed in this piece, all of which are public and connected to openly viewable group chats, appear to be co-opted or controlled by supporters of the Terrorgram Collective. However, they do not outwardly identify themselves as such. Two primarily recycle content from mainstream and fringe media outlets and have a combined following of over 7,200 subscribers. Notably, this tactic of masquerading as a news network to spread violent content has been practiced by other extremists, including supporters of the Islamic State. 

The other channel examined by ISD purports to be officially associated with Steve Bannon’s War Room podcast. In its corresponding discussion group, the administrator claimed that Bannon himself was involved with the channel. Despite these claims, ISD has not discovered any evidence of an official connection to Bannon or his podcast, and the official War Room website advertises a different Telegram channel. Nevertheless, the channel has amassed almost 63,000 subscribers, just a few hundred shy of the legitimate War Room channel. 

Figure 1. A channel that is believed to be linked to the Terrorgram Collective advertises itself as the “official home of the War Room Posse”, and in the associated group chat, the administrator claims that Bannon himself is involved with the channel.

Various pieces of evidence suggest that these three channels are part of a coordinated endeavour by adherents of the Terrorgram Collective to infiltrate and control more mainstream spaces. Members of the Collective have previously prescribed this strategy: an article in a 2021 Terrorgram publication states that “covert infiltrators” play as much of a role in their revolution as “men of direct action”. Another speculates on how to best “strike the Jew in his ivory tower”, calling on the reader to “propagandize the system against itself”.  

ISD conducted a thorough examination of the group chats connected to each of the three channels and found significant overlap between the members of these groups and those within the more overt Terrorgram channels. Furthermore, ISD identified the presence of the same bespoke bot account in each of these three groups as well as the Terrorgram-branded groups. Telegram bots can be built by anyone and function as automated accounts through which users can perform a wide variety of tasks. In this case, the custom-made bot is typically used to disseminate violent material, including official Terrorgram propaganda. By sending content to this bot, users can anonymously funnel material to any group in which the bot is present. This allows users to coordinate their messaging and post content in multiple groups with the click of a single button. By examining the groups to which the bot is connected, ISD mapped the constellation of channels that fall under the Terrorgram Collective umbrella (including those that do not outwardly identify as part of the Collective). 

Figure 2. In a group chat claiming to be officially associated with Steve Bannon’s War Room podcast, a bot account shares a download link to a video game wherein the player character must kill minorities and political enemies to save the President.

An analysis of overt Terrorgram channels found that most of them heavily repost content from the three more innocuous channels. Their administrators often direct ideologically aligned viewers to participate in conversations in these non-explicit channels. On one occasion, a channel operated by a known Terrorgram propagandist counselled subscribers to push back against users in the fake Bannon War Room group who were espousing pro-Israel views. Notably, the discussions in this group were rife with hate speech and violent rhetoric, which was primarily driven by users present in the overt Terrorgram groups. Further, the aforementioned bot inundated the group with messages glorifying mass killings and terrorism. 

The operators of the two channels that are primarily geared towards aggregating news stories are even less subtle. There are numerous instances of their administrators openly espousing racist views and aligning themselves with the ‘Saints Culture’ popularised by the Terrorgram. However, these views are only revealed through the administrators’ comments in the associated group chats, rather than through the channels themselves. When a user in one of the news aggregator groups criticised the Charleston church shooter, the administrator promptly responded by describing him as a “hero” for killing a Black politician and activist. In another conversation about an article covering the Terrorgram’s reaction to a 2022 attack on the North Carolina power grid, the same administrator referred to two Terrorgram-affiliated individuals named in the piece as if on a first-name basis. Likewise, when a member of the other news aggregator channel asserted that the Terrorgram Collective did not exist, several users pushed back, with one claiming that it was “everywhere”. In various instances, users in this group insinuated that they had intimate knowledge of the Collective’s history and inner workings and characterised the group chat as “accelerationist”. The administrator of this group, despite being an active participant in the discussions, has not attempted to distance themselves from the Terrorgram Collective. This further suggests that the administrator either supports the network or is involved in furthering its activities. 

Figure 3. In a Terrorgram-aligned news aggregator channel, users defend the actions of the perpetrator of the 2019 Christchurch attacks.

Evaluating the Strategy 

It is easy to view accelerationists, including those associated with the Terrorgram Collective, as operating in a vacuum in which they produce materials for like-minded individuals and largely preach to the converted. However, a closer look at their online activities presents a more complex picture. As evidenced in this dispatch, disciples of the Collective also broadcast their beliefs in more popular spaces that they assess may be susceptible to radicalization. While associates of the Collective insist that “there is no political solution” and claim to operate outside of democratic processes, these activities reveal their eagerness to exploit and manipulate seemingly less extreme ecosystems, fomenting chaos by injecting their beliefs into more ideologically diverse online spaces. This approach is typified in the words of an administrator of the fake Bannon channel: “All mainstream platforms are bad, but that doesn’t mean they shouldn’t be used to win people over”. 

At first glance, it may seem like the Terrorgram Collective’s co-opting of more mainstream spaces has been effective, particularly when it is measured in reach and viewership. The core Terrorgram channels that explicitly align themselves with accelerationism remain on the fringes of Telegram and have relatively small followings compared to other extremist or conspiratorial channels, such as those affiliated with QAnon. These core channels typically have several hundred subscribers and struggle to attain a larger viewership. Conversely, the three channels examined in this piece have collectively attracted around 70,000 subscribers and their content has been amplified in spaces that are even more popular.  

For example, a Telegram channel for Turning Point USA founder Charlie Kirk, which has more than 166,000 subscribers, has reshared several messages from the fake Bannon channel, resulting in more than 100,000 additional views of those posts. Similarly, in 2022, one of the Terrorgram-aligned news aggregator channels spearheaded a campaign to take down more than 5,000 “Antifa” accounts on X (then known as Twitter). At least two major media outlets published articles about this campaign and directly identified the channel in which it originated, resulting in a significant boost in subscribers. 

Figure 4. The administrator of a Terrorgram-aligned news aggregator channel indicates that news coverage helped boost their number of subscribers.

It remains to be seen whether the Terrorgram Collective’s messaging has significantly influenced these less extreme communities. ISD research did not uncover strong evidence suggesting that members of the more mainstream groups have adopted an extremist or violent ideology following their exposure to Terrorgram-branded content. By and large, members of these groups who adhere to a violent ideology seem to have started as participants in Terrorgram and other accelerationist-inspired communities rather than as followers of the less extreme channels. However, it is important to avoid dismissing the Terrorgram Collective’s strategy as ineffective based simply on the lack of observable indicators on Telegram. The Terrorgram Collective’s use of non-overt spaces serves to expose a wider audience to terrorist content, which risks drawing unsuspecting subscribers of these larger channels into more overtly violent extremist communities.  

As outlined in previous ISD publications, the UK’s proscription of the Terrorgram Collective as a terrorist entity is an important recognition of the dangers of this network. However, it is unclear whether group-based proscription approaches will be effective in addressing the diffuse, online nature of this network. Given Terrorgram’s focus on the creation and dissemination of violent propaganda, including in more mainstream ideological communities, authorities will need to think creatively to mitigate the threat. As a first step towards formulating these ideas, policymakers and practitioners must look beyond Terrorgram-branded spaces and understand how the Collective operates more broadly, including by infiltrating and perhaps even controlling seemingly peaceful communities. 
