The Long Road to the Capitol: A Hybrid Threat Landscape

26th January 2021

The Capitol attacks represented a perfect storm of disinformation, conspiracy theories, weaponised hate and extremism, whose convergence ISD analysts had tracked closely in the months leading up to the election.

In the summer of 2020, the US saw large-scale social justice mobilisation in the wake of the killing of George Floyd. These events, combined with the disruption caused by the COVID-19 pandemic, made disinformation a constant hazard. Commercial disinformation networks, including established for-profit purveyors of disinformation and extremist content, seized on the turmoil to publish a deluge of written, audio and video content.


Large scale platform manipulation

The intersection of disinformation and extremist narratives on standalone websites with social media platforms like Facebook, Twitter and YouTube – all of which play central roles in hosting and amplifying such narratives – has marked the emergence of an increasingly hybridised threat landscape. One prominent node in this landscape studied by ISD was NaturalNews, a commercial enterprise and disinformation empire presided over by Mike Adams, a businessman connected to Infowars’ Alex Jones and to the anti-government militia group the Oath Keepers.

In June 2020, ISD research examined the scale and nature of NaturalNews’ largely unchecked activity on Facebook, identifying over 18,000 links to NaturalNews-affiliated websites shared in public Facebook groups and pages over a nearly three-month period at the start of 2020. These affiliated websites promoted an array of conspiracy theories, ranging from health disinformation and climate change denial to anti-immigration narratives. Articles written by Adams and his associates demonised liberal politicians, voters and institutions, and in some cases urged action against these “malign” actors.

The broad range of topics promoted by NaturalNews, combined with its expert use of social media to amplify content, enabled it to reach a vast audience. When this content was shared on Facebook, it may have acted as a gateway into NaturalNews’ network of extremist websites and narratives. These narratives, in turn, often used violent and dehumanising language to discuss the ‘evils’ of liberals, government and big tech companies, and appeared to be aimed at individuals already engaging with anti-government, militia and “prepper” ideologies. Through this wide dissemination, NaturalNews’ sentiments fed into the online ecosystem of right-wing extremists, which includes individuals who entered the Capitol building on January 6.

The persistent threat of such large-scale platform manipulation was illustrated in a separate ISD investigation, also from June 2020, which revealed how spam-like networks on Facebook were – and still are – being used to distribute potentially harmful and divisive content across the platform at scale. That investigation found such content being amplified to Facebook users in groups and pages focused on US right-wing politics, right-wing politics in other countries such as the Philippines and Canada, and political wedge issues.

The monetisation dynamic

Beyond stoking division and sowing discord, commercial actors such as NaturalNews have a monetary incentive to make the content on their sites as sensational and engaging as possible, while remaining vague enough to attract the broadest possible audience. Unsurprisingly, ISD saw allegations of election fraud and interference ramp up after election day (November 3, 2020), when increased coverage of hot-button topics correlated directly with increased ad revenue.

But it is not only commercial actors who profit from these large-scale disinformation networks. This monetisation dynamic was highlighted in our assessment of “coordinated inauthentic behaviour” on Facebook, which noted that between July 2018 and July 2020, Facebook made over $23 million in advertising revenue from inauthentic networks that violated the platform’s policies.

Major inconsistencies in the platform policy landscape have been exploited by hate actors. Research published by ISD in October 2020 found that tech platforms such as PayPal, Stripe, Facebook and Amazon had accepted payments directed to a number of US-based hate groups, including groups and individuals known to have been present in Washington on January 6. Of the 54 platforms and funding mechanisms examined, 38% had no policies explicitly prohibiting hate groups from using their services. Hate groups were nonetheless able to use the services of 83% of the platforms that did have such policies, pointing to an overwhelming failure by platforms to implement and enforce their own rules.

It remains clear that a sizable minority of platforms have failed to put in place policies that explicitly prohibit hate actors from using – and profiting from – their services, and that even platforms with such policies routinely fail to enforce them. These failings must be addressed as a matter of urgency: so long as they persist, radicalisation on these platforms risks fuelling future violence like that seen on January 6.


This is the final piece of a three-part Digital Dispatches series that looks back at how a year of online extremist mobilisation precipitated a violent assault on the heart of American democracy on January 6th 2021.

The first two pieces discuss the themes of extremism and disinformation throughout 2020 and the key actors responsible for pushing these narratives online.
