Digital Services Act: Europe’s Disinformation Community Calls for Urgent Action

7th September 2021

________________________________________________________________________

ISD is one of over 50 organisations and individuals to have recently co-signed an open letter on behalf of the disinformation expert community. In this letter, we call upon EU policy-makers to amend the draft EU Digital Services Act (DSA) to tackle disinformation head-on via stronger measures on platform accountability and more democratic oversight over our online environment.

The signatories call on the DSA’s negotiators to take into account recommendations aimed at limiting arbitrary actions and inactions by the largest platforms; introducing stricter transparency measures for online platforms, such as mandating a searchable archive of terms & conditions or ensuring the availability of disaggregated ad spend data; and providing more flexible provisions on data access for accredited researchers.

The open letter highlights the need to frame the DSA debate around tackling online harm to all of our fundamental rights, in addition to free expression and access to information. It calls on policy-makers to think about the problem in terms of tackling malicious activity carried out by threat actors, rather than moderating individual pieces of content.

ISD’s Head of Digital Policy and Strategy Chloe Colliver has said: “The DSA provides an opportunity to put transparency and fundamental rights at the heart of Europe’s efforts to tackle disinformation, and we call on MEPs and the Commission to take up the recommendations provided by civil society and independent researchers.”


Open Letter to EU Policy-Makers

How the Digital Services Act (DSA) can Tackle Disinformation

The Digital Services Act (DSA), the EU’s draft law on internet safety and accountability, would introduce sweeping change to our online environment. Unfortunately, the law does not tackle disinformation head-on.

Time to legislate on disinformation

The EU has been vigilant to the challenges of disinformation but has avoided taking action that would deliver meaningful platform accountability, relying instead on self-regulatory initiatives, coordination mechanisms, and capacity building. We need a new approach. We cannot sustain the status quo of arbitrary platform decision-making, differing from month to month and country to country [1]. Nor can we address information disorders through a purely content moderation approach that relies on automation riddled with false positives and negatives, unable to grasp the nuances or the tentacular design of disinformation campaigns.

There is a tendency in Brussels to frame the question of legislating on disinformation as a simple trade-off between free expression and total control of our online environment. Some fear attempts to create more platform oversight will be co-opted by authoritarian governments to censor their citizens. Authoritarian laws adopted on the pretext of tackling disinformation are a serious concern and must be condemned [2]. But this is no reason for democratic societies to refrain from passing democratic laws.

Information disorders endanger our fundamental rights

Examples of dangerous misinformation, disinformation and malinformation can be found on a daily basis [3,4,5]. Added to the mix are individuals trying to profit from lucrative disinformation, and well-resourced information operations outsourced to cybersecurity professionals and military units to interfere in our democratic process, silence dissidents or polarise our politics. We are not talking about individual utterances or pieces of content, but malicious activity carried out by threat actors with concrete implications on our rights, our democracies, and our free press. Disinformation also endangers independent media both when it is used to attack journalists outright, and when it is used to inundate us with inaccuracies, sowing confusion and doubt.

Free expression is clearly at stake when it comes to tackling disinformation. Disinformation confounds our right to freedom of expression by making it harder for us to access timely, relevant, and accurate information. Disinformation is also deeply connected to our civil and political rights, like the right to assembly and association: strategic disinformation campaigns blur the line between organic grassroots activism and manufactured deception. Meanwhile a growing body of research documents the worrying trend of disinformation affecting the participation of women and minorities in political space [6].

How to improve the DSA

Through the Digital Services Act, the EU has a golden opportunity to pursue a rights-based approach to tackling disinformation, without dramatically changing the scope of the proposal.

The DSA already contains a number of positive elements: safeguards to prevent over-removals of content and explicit language on the need to tackle systemic risks tied to “information manipulation”. But this is not nearly enough. That is why we identify six areas that should be addressed to meet the objectives of greater platform accountability and more democratic oversight. The areas below have all been proposed as amendments in some form by the European Parliament’s committees on Internal Market & Consumer Protection (IMCO) and Legal Affairs (JURI), or included by Member States in the Council’s compromise text dated 4 June 2021. They can therefore be treated as realistic improvements to the current draft proposal of the European Commission.

Greater platform accountability

1 – Accountability for arbitrary actions and inactions by the platforms: The internal complaint-handling system foreseen by the DSA (art. 17) must allow individuals or entities to lodge a complaint if platforms do not take action to remove or disable access to a piece of content. Similarly, the text should ensure individuals or entities can invoke not only the illegality of the content but also an incompatibility with the terms & conditions of the provider of the hosting service when submitting a notice (art. 14) for the piece of content that is to be taken down.

2 – Meaningful transparency: Searchable archives of the terms & conditions (T&Cs) of online platforms (art. 12), including all past versions and date of application of their T&Cs. Granular reporting requirements for online ads visible through online ad repositories where multi-criterion queries can be performed per advertiser with aggregated data on the amount spent, the target of the ad, and the audience the advertiser wishes to reach (arts. 23, 24). The data in the ad repository should be made available to researchers for at least five years (art. 30).

3 – Risk assessments: Ensuring all systemic risks in the regulation are reported by the Very Large Online Platforms (VLOPs), not just the most prominent and recurrent risks (art. 27).

More democratic oversight

4 – Data access: Ensure civil society and journalists can qualify as vetted researchers in order to access platform data (art. 31).

5 – Establish a European Oversight Board as an independent and well-resourced body to oversee implementation of the DSA with experts drawn from civil society.

6 – Special recognition for the role of whistleblowers in holding platforms to account.

We hope the DSA’s legislators will seize this opportunity to be a global standard setter when it comes to legislating on disinformation with an approach grounded in fundamental rights.

We, the undersigned organisations, share a common vision to improve the health of our online environment and call on the co-legislators in the European Parliament and Council of the EU to take the above considerations into account as they negotiate the Digital Services Act.


EU DisinfoLab
Alliance for Healthy Infosphere* (alliance of 16 organisations)
AMO.cz
Antibodies to Misinformation
Avaaz
Check First
Check My Ads
Citizen D
Conspiracy Watch
Counter Extremism Project
Danae Tsabouraki, Consultant, Athens Technology Center
Dangerous Speech Project
Dare to be Grey
DebunkEU
Defend Democracy
Delfi
Democracy Reporting International
Demos
Digital Rights Foundation (Pakistan)
Dr. Courtney C. Radsch, Fellow, Center for Media, Data and Society at Central European University
Dr. Georgios Terzis, Associate Professor of Political and Global Communication, Vrije Universiteit Brussel
Dr. Giovanni de Gregorio, Centre for Socio-Legal Studies, Faculty of Law, Oxford University
Dr. Nikos Sarris, Senior Researcher, CERTH / Media Technologies Advisor, Athens Technology Center
Dr. Symeon Papadopoulos, Multimedia Knowledge and Social Media Analytics Laboratory (MK Lab) of the Centre for Research and Technology Hellas – Information Technologies Institute (CERTH-ITI)
DROG
FakeNews.PL
Fondation Hirondelle
Future Eins
Global Disinformation Index (GDI)
Globsec
GONG
Graham Brookie, Director & Managing Editor, DFRLab
Hate Aid
Institute for Strategic Dialogue
#JeSuisLà
Lie Detectors
Memo98
Nicolas Quénel, freelance journalist
Open Society European Policy Institute (OSEPI)
Peter Pomerantsev, SNF Agora Senior Fellow, Johns Hopkins University
Prof. Kalina Bontcheva, Professor of Text Analysis, Department of Computer Science, University of Sheffield
Prof. Stephan Lewandowsky, Chair in Cognitive Psychology, School of Psychological Science, University of Bristol
Prof. Trisha Meyer, Research Director of the Centre for Digitalisation, Democracy and Innovation at the Brussels School of Governance, Vrije Universiteit Brussel, (VUB)
Rappler
Reporters Foundation Poland
Reporters Without Borders (RSF)
Savoir*Devenir
#ShePersisted Global
Stiftung Neue Verantwortung (SNV)
Stéphane Duguin, Chief Executive Officer, CyberPeace Institute
SumOfUs
The Daphne Foundation
Tom Southern, Project Director, Open Information Partnership
Tracking Exposed
Transparency International
Who Targets Me


1 EU DisinfoLab (April 2021), “Bulgaria, the Wild Wild East of Vaccine Disinformation”.

2 First Draft (August 2020), “‘Fake news’ laws, privacy & free speech on trial: Government overreach in the infodemic?”.

3 A campaign seeking to suppress voter turnout in an election by alleging that legal voting practices are illegal (disinformation), disproven fears about the risk of a vaccine (misinformation), racist slurs spreading online after a football match (malinformation). See Wardle & Derakhshan, First Draft, “Information Disorders: Towards an Interdisciplinary Framework for Research and Policy-Making”; see also Camille François, Graphika and Harvard University’s Berkman Klein Center (Sep 2019), “Actors, Behaviors, Content: A Disinformation ABC”, Testimony to the US House of Representatives.

4 EU DisinfoLab (April 2021), “The Good, the Bad, and the Ugly: How Platforms are Prioritising Some EU Member States in their COVID-19 Disinformation Responses”.

5 Politico (July 2021), “FA condemns racist abuse of England’s football players after Euro 2020 loss”.

6 Heinrich Böll Stiftung (July 2021), “Gendered disinformation: 6 reasons why liberal democracies need to respond to this threat”.
