Extremists are Flourishing Behind Well-Built Websites Thanks to Open Source Tech

28th September 2021

By Elise Thomas

While researchers and journalists have placed a necessary focus on the interplay between extremism and social media platforms in recent years, new analysis from ISD highlights how websites also continue to play a central role in the digital presence of many extreme-right organisations and communities.

These websites serve as repositories of propaganda; avenues for funding through donations and merchandise sales; discussion forums; and directories for the new accounts of far-right actors who resurface after suspensions from various social media platforms. 

_________________________________________________________________________

ISD’s analysis of 100 far-right, neo-Nazi and white nationalist extremist websites has found that rather than being custom-built by professional developers or web designers, the majority are built using open source technologies, which are freely available tools and services. Specifically, 63% of the websites were built using WordPress and 18% were using WooCommerce, WordPress’s e-commerce and payments plugin. This has significant implications for our understanding of and response to the presence of extremists online.

Extremist actors use open source technologies for a number of reasons. In part, it is for the same reason that over 37% of all websites do: open source technologies are good quality, easy to use and free. In just a few clicks, extremist actors can create slick, well-designed websites which lend a veneer of professionalism to their propaganda, wrapping aesthetically pleasing templates around profoundly ugly ideas. Another reason lies in the difficulty of moderating open source services, which are very hard to take down over content such as hate speech or incitement to violence. For example, the open source nature of WordPress.org (distinct from the commercially hosted WordPress.com) means that it is not possible for the volunteer community which maintains it to remove websites, even if they wanted to.

While this does pose challenges, it also creates an opportunity for thinking in a more nuanced way about moderation and counter-measures to extremism online. It forces the conversation to move beyond a reductive leave up/take down binary.

Focusing on the example of the open source WordPress software, there are a number of different equities at play. The WordPress volunteer community strongly embraces the principles of free speech. There are profound and undeniable benefits which flow from their commitment to letting any individual use the WordPress software for any purpose, in particular for users living in authoritarian contexts where freedom of speech is restricted.

Yet, WordPress is the product of thousands of hours of labour by a community of largely unpaid volunteers and creators. These people lend their time, talent and creativity to turning WordPress into the wildly successful tool that it is today. They include many people who are non-white, women, LGBTQ+, or members of other groups targeted by white nationalists, neo-Nazis or far right extremists. They, as well as others in the WordPress community, would likely not want to see their freely-given skills and labour used to help promote hateful ideologies.

It is worth giving thought to whether there is a middle ground for such products and services: one that would allow them to support freedom of speech for everyone, whilst preventing hate-based groups from free-riding on the work of people who are profoundly opposed to the ideology they seek to promote.

Many of the sites identified in ISD’s analysis which belonged to far-right and neo-Nazi organisations are beautifully designed. This is no credit to the extremists themselves; it is the work of the WordPress designers and theme creators whose templates have been used. Nonetheless, the visual appeal of these sites makes the ideas expressed seem more palatable, in exactly the same way that other forms of propaganda rely on art and aesthetic design to make the message being conveyed more compelling. This raises a question tangential to that of freedom of speech: does the freedom to espouse hateful ideas also imply a right to free access to tools which make those ideas more aesthetically appealing, better optimised for search engines, and so on? And if the answer is no, what can be done about it?

There is precedent for open source communities to push back against those who seek to use their tools to promote hate speech. One such case is Mastodon, an open source, decentralised social media platform whose community has waged a long-running war of attrition against an influx of far-right users from Gab – a war which, as of December 2020, they appear to have won. Another example is the Organisation for Ethical Source (OES), which works to promote a series of ethical values including “safeguards to minimize the risk of abuse or harm to others through [the] use or misuse” of open source tools. As of 2021, the OES also manages the Contributor Covenant, a voluntary project which seeks to make open source communities welcoming and harassment-free for all contributors.

While these are useful examples, none of the measures would be directly applicable to WordPress. The WordPress community would need to find its own measures which are both technically feasible and sustainable. Interventions at the theme or plugin level might be worth exploring, for example an opt-in system for individual creators which would allow them to block a list of extremist sites from making use of their products. This would allow the extremist sites to continue operating, thus not contravening their freedom of speech, while still returning agency to the hands of creators, who could choose whether or not they want their work to be used to promote extremism.
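To make the opt-in idea concrete, one minimal sketch of how such a mechanism might work is shown below. This is purely illustrative and not part of WordPress or any existing tool: it assumes a creator maintains a denylist of domains, and that some distribution point (say, a hypothetical theme update server) checks a requesting site's domain against that list before serving the creator's work. All names and the example domains are invented for illustration.

```python
# Hypothetical sketch of an opt-in creator denylist, as described above.
# Nothing here reflects an actual WordPress API; it only illustrates the
# kind of check a theme/plugin distribution service could perform.

from urllib.parse import urlparse


def normalise_domain(site_url: str) -> str:
    """Extract a lowercase hostname from a URL, stripping any 'www.' prefix."""
    host = urlparse(site_url).hostname or site_url
    host = host.lower()
    return host[4:] if host.startswith("www.") else host


def is_blocked(site_url: str, creator_denylist: set[str]) -> bool:
    """Return True if the site's domain, or any parent domain, is denylisted."""
    domain = normalise_domain(site_url)
    parts = domain.split(".")
    # Build the domain plus each parent domain (excluding the bare TLD),
    # so 'shop.example.org' is caught by a denylist entry for 'example.org'.
    candidates = {".".join(parts[i:]) for i in range(len(parts) - 1)}
    return bool(candidates & creator_denylist)


# Illustrative usage with made-up domains:
denylist = {"example-extremist-site.org"}
print(is_blocked("https://www.example-extremist-site.org/store", denylist))  # True
print(is_blocked("https://ordinary-blog.net", denylist))                     # False
```

The point of the sketch is that the intervention sits with the individual creator (who populates the denylist) rather than with the platform, which is what distinguishes this approach from a centralised leave up/take down decision.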

The goal of this analysis is not to prescribe how open source communities ought to respond, but rather shed some light on the issue and spark a conversation within open source communities. Ultimately, it rests in the hands of those creators to decide whether the use of their tools to promote extreme and hateful ideologies is a problem they want to tackle – and if it is, what they are prepared to do about it.

 

Elise Thomas is an OSINT Analyst at ISD. She has previously worked for the Australian Strategic Policy Institute, and has written for Foreign Policy, The Daily Beast, Wired and others.