COVID-19 Disinformation Briefing No.2 – 9th April 2020

Briefing paper

This is the second in a series of briefings from ISD’s Digital Research Unit on the information ecosystem around coronavirus (COVID-19).

These briefings expose how technology platforms are being used to promote disinformation, hate, extremism and authoritarianism in the context of COVID-19. They are based on ISD’s mixture of natural language processing, network analysis and ethnographic online research.

This briefing focuses on the way far-right groups and individuals are mobilising around COVID-19 in the US. The first briefing in the series can be found on ISD’s website.

Disinformation briefing: Narratives around Black Lives Matter and voter fraud

This short briefing details the methodology and key findings of a study conducted jointly by the ISD team and Politico. Leveraging data from across social media platforms, the investigation seeks to understand online discussions around the Black Lives Matter (BLM) movement and the issue of voter fraud ahead of the US Presidential election. The research was designed to shed light on the volume and nature of disinformation related to these two issues online, and on how this disinformation may be weaponised to influence attitudes ahead of the election.

QAnon and Conspiracy Beliefs

The findings from this study provide important context for understanding the relationship between QAnon and the broader problem of belief in conspiracy theories. A majority of Americans know nothing about QAnon, and fewer than one in ten view it favourably. Yet a majority of those who recognise and believe in conspiracy theories associated with QAnon are not QAnon supporters; most said they had not even heard of QAnon.

Hosting the ‘Holohoax’: A Snapshot of Holocaust Denial Across Social Media

This briefing paper examines the extent to which Holocaust denial content is readily accessible across Facebook, Twitter, Reddit and YouTube. It also demonstrates that appropriately applied content moderation policies can effectively deny dangerous conspiracy theorists a public platform, as evidenced by the significant decrease in Holocaust denial content on YouTube over the past year.