Authors: Elise Thomas, Kata Balint
Published: 27 April 2022
This research documents how YouTube's algorithms promote misogynistic, anti-feminist and other extremist content to Australian boys and young men. Using experimental accounts, the research tracks the content that YouTube, and its newer 'YouTube Shorts' feature, routinely recommends to boys and young men.
This short-term, qualitative study analysed the algorithmic recommendations and trajectories served to 10 experimental accounts. As the study progressed, each account was recommended videos with messages antagonistic towards women and feminism. Following the recommendations, then viewing and liking the suggested content, resulted in more overtly misogynist 'Manosphere' and 'incel' content being recommended.
The study found that while the main YouTube interface recommended content broadly similar to the topics the accounts had originally engaged with, the shorter-video feature, YouTube Shorts, appears to operate quite differently. Shorts seems to optimise more aggressively in response to user behaviour and to surface more extreme videos within a relatively brief timeframe. On Shorts, all accounts were shown strikingly similar, and sometimes identical, content from right-wing and self-described 'alt-right' content creators. The algorithm made no distinction between the underage and adult accounts in the content it served.