TikTok and White Supremacist Content

Authors: Ciarán O’Connor and Jared Holt

Published: 12 September 2024

This report is part of a series examining hate speech and extremist content on TikTok, which includes a methodology overview, policy recommendations, and analysis of anti-migrant and anti-refugee content on the platform.

Once considered a mere novelty app, TikTok is now a certified force in the information ecosystem.  

The short-form video giant is now used by 14% of Americans as a news source, according to a 2023 Pew Research Center survey, roughly four times the share who did so in 2020. The impact of the platform, once best known for dance crazes and for setting online trends, cannot be ignored.

To better understand TikTok's impact, in 2023 ISD analysts gathered and analyzed data on trends in hate speech and extremist content on the platform, and on how effectively that content was being moderated. The results, which capture a particular moment in time, have come to inform a series of studies, the first two of which focus on white supremacist content and on anti-migrant and anti-refugee content.

While TikTok appears to have taken steps to improve its content moderation practices since ISD’s 2021 study on extremism and hate speech on the platform, this new series demonstrates that TikTok still fails to remove violative content effectively. Data for the white supremacist content study, for example, was collected during one week in mid-August 2023 and indicates that such content was alive and well on the platform: 70 of the 108 video samples studied had been uploaded to TikTok within the three months prior to collection. Across those 108 videos, the median number of views at the time of analysis was 6,097, a significant increase from ISD’s 2021 report, in which the median across 1,030 videos was 503 views.

The last nine months have been tumultuous for TikTok as a company. In April 2024, President Joe Biden signed a bill that could result in a nationwide ban of the app should TikTok’s parent company, the Beijing-based ByteDance, not sell the platform within 12 months. As part of the ongoing legal fight over the possible ban, the Justice Department alleged this summer, according to the Associated Press, that TikTok was gathering bulk information on users’ “views on divisive social issues like gun control, abortion and religion” and harvesting data in violation of children’s online privacy law.

As TikTok’s future remains undecided, content moderation issues on the platform persist. In July 2024, ISD published a report detailing the millions of views garnered by a network of neo-Nazi accounts on the platform. Just a month earlier, however, TikTok had published an updated transparency report in which it claimed that, in the first four months of this year, moderators proactively removed 97.7% of violative content. Of that same sample, 89.8% was removed within 24 hours, down 0.1% from the same period in 2023.

Despite TikTok’s statements, ISD and similar organizations consistently find content in clear violation of the platform’s own policies.  