Published: 26th June 2020
Written By: Chloe Colliver
In October 2018, Facebook, Google, Twitter, Mozilla and a selection of advertising industry companies signed up to the newly drafted EU Code of Practice on Disinformation (CoPD). This assessment evaluates the enforcement of the CoPD during the 2019 European parliamentary elections. It finds that the CoPD prompted progress from tech companies in specific areas of disinformation risk, most notably transparency around political advertising. However, the Code's effectiveness in achieving substantive change was fundamentally limited by its self-regulatory set-up and the absence of enforcement mechanisms for non-compliance.
This report provides a set of recommendations intended to inform continuing efforts to counter disinformation and online harms at the EU level through the upcoming Digital Services Act and European Democracy Action Plan in 2020 and beyond. The Institute for Strategic Dialogue (ISD) calls on EU policymakers to design and enforce systemic transparency for advertising, content moderation, appeals and redress systems, and algorithmic design and output in order to address the risks posed by disinformation in the European context.
This assessment is based on research conducted by ISD, together with additional insights from partner research organisations that evaluated the presence and scale of disinformation targeting the European parliamentary elections of May 2019 on the social media platforms that signed up to the CoPD. The full methods and findings of that research are available in the accompanying report, Click Here for Outrage: Disinformation in the European Parliamentary Elections 2019.