New research points to failures by YouTube in protecting children from harmful content

18 June 2024

By: Aoife Gallagher

A four-part ISD investigation reveals misogynistic content, and videos related to self-harm and suicide, are being recommended to young users. 


On February 18, 2023, a video titled “Motivational speech by Andrew Tate” was recommended to a YouTube account registered as a 13-year-old boy. Tate, a notoriously misogynistic influencer, is currently awaiting trial in Romania on charges of rape and human trafficking; after this, he faces extradition to the UK to face charges of sexual aggression. He was banned from YouTube in 2022, but his content remains on the site, posted by accounts with hundreds of thousands of subscribers.

During the 13-minute video, Tate attacks the education, judicial and medical systems as “scams”, encouraging his listeners to “control [their] minds” to “escape the Matrix”. He says that “the true importance of being a woman is procreation” and claims being a man is “so much harder than being a woman”. To “conquer the world” and make money, Tate encourages those listening to join his “university”, a venture described as both a “pyramid scheme” and a “cult”. The video was recommended to the same teenage account again five days later, on February 23, 2023.

On February 28, 2023, YouTube recommended a video titled “JJ and Mikey HANGED THEMSELVES – in Minecraft” to an account registered as a 14-year-old girl. The thumbnail of the video featured two Minecraft characters from a popular account aimed at teens, hanging by nooses with blood around their necks. Upon playing the video, the characters from the thumbnail can be seen hanging above a tipped-over chair (see image 1).

Image 1: Screenshot from a Minecraft video recommended to a teenage user.

Problematic recommendations

These are just some of the discoveries from a four-part investigation conducted by ISD in which analysts created a total of eight US-based YouTube accounts registered with varying ages, genders and interests. The recommendations shown to these accounts over the course of one month were then analysed.

While these four investigations surfaced a variety of noteworthy trends, the standout finding was YouTube’s failure to adequately safeguard teenage users from harmful and problematic content.

Teen accounts served suicide videos and sexual content

The account of the 14-year-old girl that was recommended suicide-themed Minecraft content was part of Investigation One. It was one of two teenage accounts – one male, one female – with an interest in popular gaming channels and videos. Analysts also found another video, recommended four times to both accounts, which had received a content warning from YouTube for discussing topics related to self-harm or suicide. This video featured two YouTubers exploring a haunted house and discussing rumours that visitors to the house had found marks on their necks. According to them, this is a sign of being haunted by the ghosts of people who had died by hanging there.

Image 2: An example of a channel recommended to teenage YouTubers where the videos feature “sex mods”.

In other instances, ISD found videos recommended to teen viewers that contained “sex mods”, with Minecraft characters presented fully, or almost fully, nude. A closer look at the channels these videos came from shows that sexual content was a feature of the thumbnails in almost all cases (see image 2). Although several employed a “bait and switch” tactic, where sexual thumbnails were used to entice users but the videos lacked explicit material, those recommended to ISD’s accounts did feature partially nude characters.

The teenage gamer accounts also received recommendations for videos related to guns and ballistics. The common theme of the videos involved testing the strength of different materials by shooting them with a range of firearms. While these videos were produced in an entertaining style, they did glorify the use of weapons. ISD could not find any indication that YouTube was placing warnings on such videos for those under 18.

YouTube accounts fed Tate content despite ban

The 13-year-old who was recommended Andrew Tate content held one of two accounts in Investigation Two with an interest in the world of male lifestyle gurus; the other was a 30-year-old male. During their persona-building stage, both accounts watched and subscribed to content from podcaster Joe Rogan, academic turned commentator Jordan Peterson and FreshandFit – a channel that often posts misogynist content and has hosted extremists including Groyper Nick Fuentes.

Neither the teenage nor the adult account showed any interest in Andrew Tate-related content. Nevertheless, both were served recommendations for content that prominently featured the banned influencer. The teen account was recommended 12 videos featuring Tate, while the adult account was recommended 10.

In one such video, Tate blamed women for all the problems men face in modern times. He said women are not “self-accountable” and that men are held to different standards. Tate also rated women’s characteristics, assigning points based on his perception of their ability to cook or provide sexual gratification to a man.

Outside of Tate content, misogynistic themes were also present in a significant amount of content recommended from the channel FreshandFit. In one such video, a panel of women were interviewed about their dating lives. The hosts interrogated them about their sexual activity and about what they could offer to their partners, berated some for not providing enough through cooking or cleaning, and attacked others over their appearance.

“Love the fact that you guys make it okay for me to be me man and promote it. I love being an a******, but I do it in a good way, a loving way,” said one commenter on a video. Others celebrated the fact that one of the women cried and said other women should be more accommodating to their male partners like the panelists.

While both accounts subscribed to the channel during the persona-building stage, the high volume of FreshandFit videos recommended to the 13-year-old user (67 in total) was a cause for concern.

Lack of safeguards

These examples point to a failure by YouTube to protect young users from harmful content. When ISD analysts were creating the Google/YouTube accounts for the investigations, parental/guardian approval was not required for teenagers. While Google states that the minimum age for creating an account in the US is 13, there are no identity checks or safeguards to stop users simply lying about their age.

The results of these investigations suggest that YouTube does not differentiate between the recommendations sent to a teenage user of the platform and those sent to an adult if they share interests. This puts young users at risk of exposure to harmful content, often without their ever searching for it.

Additionally, these findings indicate that YouTube is not effectively using the tools available to it, as evidenced by the lack of content warnings and age restrictions on videos unsuitable for younger users.

The Executive Summary, along with the four investigations examining the algorithmic recommendations served to users with interests in gaming, male lifestyle gurus, Mommy vloggers, and Spanish-language news, is available on our website.
