EU Digital Services Act

By: Terra Rolfe

This Explainer examines the EU’s Digital Services Act, focusing on its relevance to the risks ISD works on – risks to elections and from terrorism, extremism and hate – and to tech accountability through transparency and data access.

Introduction

The European Union’s Digital Services Act (DSA), which fully came into force on 17 February 2024, is the world’s first systemic online safety law that takes a comprehensive approach to a range of online services and types of online risks. It aims to create a “safe, predictable and trusted online environment that facilitates innovation” while protecting fundamental rights enshrined under the Charter of Fundamental Rights of the European Union, such as rights to freedom of expression and information, or non-discrimination (Article 1.1). It also creates new transparency and data access mechanisms for conducting public-interest research online. These mechanisms seek to enable academic and civil society researchers to better understand online risks and how they are facilitated by platform systems, internal processes, features or functionalities.

Many of the DSA’s provisions are relevant for preventing online risks related to democratic processes, extremism, disinformation and hate; these issues are the focus of ISD’s work. This Explainer therefore concentrates on these issues, as well as on transparency and data access, which are necessary for effective compliance monitoring, independent research on the impact of risk mitigation measures, and understanding the nature and scope of online risks themselves. The European Commission provides a broader overview of the DSA’s due diligence obligations and its impact outside these areas.

Contents

Glossary

Timeline, Structure, and Enforcement

Additional Obligations for VLOPs and VLOSEs

Electoral Processes, Misinformation and Disinformation

Extremism, Terrorism and Hate Speech

Recommender Systems

Transparency

Data Access

Further Reading

 

Glossary

The DSA introduced many new terms to the digital policy lexicon. The following definitions are key to accurately understanding the scope and applications of the DSA, as well as related EU technology law:

Digital Services Coordinators (DSCs): The authorities within each EU member state responsible for enforcing the DSA, alongside the bloc-level European Commission and Board for Digital Services. They have both investigative and enforcement powers. The detailed requirements for DSCs can be found in Section 1 of Chapter IV.

Disinformation: As defined in the European Democracy Action Plan, disinformation is false or misleading content that is spread with an intention to deceive or secure economic or political gain, and which may cause public harm.

Hosting services: Services that store information provided by or at the request of a recipient of the service. This includes cloud hosting, web hosting and online platforms.

Illegal content: Any information that, by itself or in relation to an activity, is not in compliance with the law of the EU or a member state. Content deemed illegal at the EU level includes child sexual abuse material, terrorist content and some forms of hate speech, including the trivialisation and denial of genocide. Illegality can also vary between member states: Holocaust denial, for example, is illegal in Germany but not in the Netherlands. The DSA itself does not define what content is illegal.

Intermediary services: Services offering network infrastructure such as internet access providers and domain name registrars. This is the broadest category of services covered under the DSA and includes all hosting services.

Misinformation: As defined in the European Democracy Action Plan, misinformation is false or misleading content shared without harmful intent. Even if shared in good faith, the effects can still be harmful.

Online platforms: Services that bring together sellers and consumers, such as online marketplaces, app stores, and collaborative economy and social media platforms.

Systemic risks: An undefined term within the DSA and other EU digital legislation, it refers to risks that are understood to have ‘pervasive effects’. The DSA requires VLOPs and VLOSEs (below) to identify, report and mitigate systemic risks that fall under four categories: the dissemination of illegal content; negative effects on fundamental rights; negative effects on civic discourse, electoral processes, public security or health; and negative effects related to gender-based violence, minors, and individuals’ physical and mental well-being.

Very large online platforms (VLOPs) and very large online search engines (VLOSEs): Online platforms and search engines with an average of 45 million or more monthly users in the EU (equivalent to 10 percent of the EU population). As VLOPs and VLOSEs pose particular risks regarding the dissemination of illegal content and societal harms, they are subject to the most stringent requirements under the DSA. Examples include services operated by Meta (Facebook, Instagram) and Google (Google Search, YouTube). A complete and updated list of designated VLOPs and VLOSEs can be found here.

Timeline, Structure, and Enforcement

Obligations for VLOPs and VLOSEs began in August 2023, prior to the full implementation of the DSA for all platforms and services in February 2024. As of August 2024, the European Commission has already launched several investigations and proceedings against VLOPs and VLOSEs, addressing issues from the dissemination of illegal content to potentially addictive design functionalities.

Even before any actions have concluded, platforms have amended or paused some changes to their products in response to interim measures or the prospect of them. For example, TikTok voluntarily suspended TikTok Lite’s rewards programme in the EU following the European Commission’s announcement of a non-compliance investigation and of its intention to impose interim measures relating to some of Lite’s functionalities, such as financial rewards for extra time spent on the platform.

As DSA sanctions include fines of up to 6 percent of a company’s annual turnover, the law is expected to significantly change platform practices and design in Europe, with potential knock-on effects worldwide (the so-called “Brussels Effect”).
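
As a purely illustrative calculation of scale (the turnover figure below is hypothetical and not tied to any specific company), a provider with an annual turnover of €50 billion could face a maximum fine of:

$$ 0.06 \times \text{€50 billion (hypothetical annual turnover)} = \text{€3 billion (maximum fine)} $$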

The DSA imposes broad due diligence obligations on all digital intermediary services operating in the EU, though these notably do not require ‘general monitoring’ (beyond what is needed to identify systemic risks) or active fact-finding of harmful content. The DSA and other EU technology legislation generally do not hold services liable for harmful content they host. Services classified as intermediaries include hosting services and online platforms. VLOPs and VLOSEs have the greatest number of obligations under a tiered system based on societal impact (see Figure 1 below).

Figure 1 via the European Commission.

VLOPs and VLOSEs are subject to the most stringent set of obligations due to their reach and role in facilitating the spread of ideas, access to information, public debate and economic transactions (Recital 75), increasing their potential to pose systemic societal risks.

For these reasons, this Explainer largely focuses on the DSA’s obligations for VLOPs and VLOSEs, as well as smaller services which have been linked to the spread of extremism, disinformation and hate. The following articles are particularly relevant for understanding the impact of the DSA on these issues, as well as transparency and data access:

  • Article 8: Establishes that services are under no general monitoring or active fact-finding obligations.
  • Article 9: Requires platforms to act promptly on orders to act against illegal content.
  • Article 14: Requires services to have clear and publicly accessible terms and conditions.
  • Article 16: Mandates notice and action mechanisms for users to report illegal content.
  • Article 22: Establishes a trusted flagger regime for reporting illegal content.
  • Article 24: Mandates transparency reporting obligations for online platforms.
  • Article 26: Sets transparency and accountability standards for advertising on online platforms.
  • Article 27: Mandates minimum levels of recommender system transparency.
  • Articles 33-40: Mandate additional obligations for VLOPs and VLOSEs, including identification and mitigation of risks, recommender systems, crisis response mechanisms, transparency and data access.
  • Article 45: Establishes codes of conduct, including laying the groundwork for mechanisms to take action against ‘legal but harmful’ content, such as disinformation.
  • Articles 65-74: Set requirements for supervision, investigation, enforcement, and monitoring of VLOPs and VLOSEs by DSCs and the European Commission.

The majority of the DSA’s rules are enforced at the member state level by designated DSCs, who are responsible for intermediary services, hosting services, and non-VLOP platforms registered in their territory. Designated DSCs are a mix of new bodies and existing communications, broadcast, consumer rights and/or competition regulators, with varied backgrounds and existing forms of expertise.

As of early August 2024, not all member states have designated a DSC, although all are required to do so. Member states that failed to designate a DSC, or to grant it the necessary powers, by the February 2024 deadline faced proceedings from the Commission in April 2024.

The Commission has the right to supervise, enforce and monitor the compliance of VLOPs and VLOSEs, with support from the DSC of the member state in which a service is established. As VLOPs and VLOSEs are particularly concentrated in certain EU member states, especially Ireland, some DSCs are expected to have a more significant influence on the DSA’s enforcement.

In addition to the main text of the DSA, secondary legislation (referred to as ‘delegated’ and ‘implementing’ acts) supplements or specifies conditions on specific topics related to the main text. Few acts have been adopted so far, but several are expected to be released in the coming year, including a Delegated Act on Data Access for Research.

These acts will be supported by voluntary codes of conduct (Article 45), which provide a pathway for services to address specific public interest objectives. Existing examples that are already, or are expected to become, formally connected to the DSA (albeit potentially with revisions) include the 2022 updated Code of Practice on Disinformation and the 2016 Code of Conduct on Countering Illegal Hate Speech Online. More codes are also expected to be adopted, for example on child safety.

The DSA’s goals are also complemented by other digital legislation (both existing and incoming) including the Digital Markets Act, AI Act and the Regulation on the Transparency and Targeting of Political Advertising.

Lastly, the DSA is unique amongst nascent digital legislation for explicitly addressing content moderation across a diversity of languages. It requires VLOPs and VLOSEs to report on the human resources allocated to each of the 24 official EU languages as part of larger transparency reports (Article 42.2). This is a positive step towards ensuring that content moderation resources are allocated to languages other than English, where they have historically been lacking. However, it does not cover all languages spoken within the EU, especially by minority and diaspora communities. Platform under-resourcing and under-moderation of content in these languages can leave communities vulnerable to hate speech, misinformation, disinformation, and foreign information manipulation and interference.

Additional Obligations for VLOPs and VLOSEs

The DSA requires VLOPs and VLOSEs to assess several types of systemic risks, taking into consideration their severity and probability. These include the dissemination of illegal content; negative effects on civic discourse, electoral processes and public security; negative effects on the exercise of fundamental rights; and risks relating to gender-based violence and the protection of public health (Article 34).

Instead of requiring all services to implement the same measures, companies can take the approach they see as most appropriate, provided it sufficiently mitigates the identified risks. Companies must conduct an annual risk assessment and share their results with the Commission and their local DSC.

Risk assessments should consider how particular elements of a service may influence systemic risks across the EU, while considering regional and linguistic differences (Article 34.2). Considerations must include:

  • The design of algorithmic systems, including those used to recommend information,
  • Content moderation systems,
  • Terms and conditions, as well as their enforcement,
  • Systems for selecting ads and presenting them to users,
  • Data-related practices.

These elements must be assessed for how they are used, including the potential for bad actors to intentionally manipulate them, for example through automated means. Companies must also assess the risks of illegal content and information (such as terrorist content, discussed in more detail below) being amplified and widely disseminated via their service.

Once VLOPs and VLOSEs have conducted a risk assessment, they must use their findings to mitigate any identified risks (Article 35). Mitigation measures should be targeted to specific risks. Examples include changing the way that misleading information related to elections is labelled or testing and adapting a content recommendation system. The adoption of codes of conduct by VLOPs and VLOSEs can also be used as evidence of actions to mitigate risks.

All VLOPs and VLOSEs must also undergo annual independent audits (Article 37). Audits assess services’ compliance with their obligations under the DSA, including on transparency. They also contribute to transparency and accountability by providing an external assessment of services’ practices and performance.

Electoral Processes, Misinformation and Disinformation

VLOP and VLOSE obligations to mitigate systemic risks to electoral processes and civic discourse were supplemented in March 2024 by the Guidelines on the Mitigation of Systemic Risks for Electoral Processes, which recommend best practices for VLOPs and VLOSEs to mitigate risks before, during and after elections.

While not legally binding, the guidelines further clarify the DSA text and provide practical, contemporary examples of how services can meet their obligations to mitigate systemic risks under the DSA. For example, the guidelines outline measures regarding online political advertising and generative AI based on emerging research and legislative developments.

Codes of conduct, which may include revised versions of existing codes of practice such as the 2022 Code of Practice on Disinformation, provide additional measures to combat risks to electoral processes. At the time of writing, the 2022 Code of Practice has been signed by 34 companies and civil society organisations. It provides 44 commitments and 128 specific measures to reduce disinformation, such as increasing recommender system transparency and providing resources to support users’ media literacy. While signing up to a code of conduct is evidence of VLOP or VLOSE action to mitigate risks, existing codes are also open to smaller signatories. For example, the video service Vimeo is a signatory to the Code of Practice.

The DSA also introduces rules for political and commercial advertising, both of which can spread misinformation and disinformation on salient social and political topics, such as electoral processes, conflict, climate and public health. All online platforms are obliged to present ads in a clear and easily identifiable fashion. They must also provide information on how advertising is funded, on whose behalf an ad is presented and the main parameters that determine why it is shown to a particular user (Article 26).

VLOPs and VLOSEs must also compile all ads shared on their services in a publicly accessible, searchable repository with details about ad publication and performance (Article 39). Such ad libraries are particularly useful to journalists and researchers monitoring misinformation, disinformation, and foreign interference, particularly around elections.

Extremism, Terrorism and Hate Speech

Illegal Content

The DSA sets out several obligations regarding illegal content. Illegal content is not defined in the DSA itself, but in other laws of the EU and its member states. Some of these laws set requirements beyond those in the DSA. The Terrorist Content Online Regulation (TCO), for example, requires hosting service providers to remove terrorist content from their services within an hour of notification by a member state. The TCO also defines what constitutes terrorist content online in the EU.

Member states can also have additional laws regarding illegal content, which the provisions of the DSA support within that country. Examples of content that would be clearly considered illegal at the EU-wide level include content from proscribed terrorist organisations, or some forms of hate speech. By contrast, some forms of Holocaust denial could meet the threshold of illegality in Germany but not in other member states.

The DSA does not require intermediary services to proactively search for illegal content (Article 8); services are also not liable for illegal content on their services in most circumstances (Articles 4-6). However, once ordered to act against illegal content by a national authority, services must do so quickly (Article 9). The narrower category of hosting service providers is also required to have mechanisms that easily allow users to report illegal content on their services, which must be acted on quickly (Article 16). Such submissions limit services’ intermediary liability protections under Article 6, as they make providers aware of the presence of illegal content on their services. Hosting services must also notify relevant authorities if they become aware of a threat to life or personal safety on their services (Article 18).

Lastly, online platforms must also prioritise notices submitted by ‘trusted flaggers’, which are DSC-designated independent entities with expertise regarding illegal content and established links to certain platforms (Article 22).  VLOPs and VLOSEs must also take proactive measures to mitigate systemic risks posed by the spread of illegal content over their services (see Additional Obligations for VLOPs and VLOSEs above).

‘Legal but Harmful’ Extremist or Hate Content

For VLOPs and VLOSEs, the DSA includes additional obligations that could also contribute to mitigating risks posed by ‘legal but harmful’ hate or extremist content, for example via the obligations to mitigate risks to civic discourse, electoral processes, public security, gender-based violence or fundamental rights (Article 34). Other obligations, such as the requirement for platforms to have clear terms and conditions and to enforce them consistently (Article 14), should also have a positive impact on legal but harmful content linked to extremism or hate on platforms that prohibit these types of content or activities.

Smaller High-Risk Platforms

The DSA’s primary focus on VLOPs and VLOSEs has been criticised by some counter-extremism practitioners and researchers for insufficiently addressing the risks that micro, small and medium-sized platforms can pose regarding illegal hate and terrorism. Such platforms can play a key role in the dissemination of illegal content, as they typically have fewer content moderation resources or, in some cases, appear to welcome these types of online communities. For example, Telegram is a medium-sized platform with an estimated user base slightly under the threshold for designation as a VLOP, but it has consistently been linked to harmful activities, including extremist content dissemination and calls to violence.

Crisis Response

Lastly, the DSA also introduces two crisis response provisions, with a relatively vague definition of a crisis as a “serious threat to public security or public health in the Union or significant parts of it” (Article 36), such as acts of terrorism or armed conflicts, natural disasters, and pandemics (Recital 91).

The crisis response mechanisms have been criticised by some civil society organisations for being vague and for granting unilateral responsibility to the European Commission, with the potential to infringe on the rule of law. VLOPs and VLOSEs are subject to a binding crisis response mechanism, which introduces further obligations in the event of a crisis (Article 36). If the Commission declares that a crisis has occurred, it can require VLOPs and VLOSEs to:

  • Assess whether their service significantly contributes to the situation,
  • Take measures to prevent, eliminate or limit that contribution,
  • Report to the Commission regarding their assessment.

Measures to limit contributions to a crisis could include adapting content moderation or promoting authoritative information (Recital 91). These binding measures are accompanied by voluntary crisis protocols under Article 48, which can be initiated by the Commission in response to the “rapid spread of illegal content or disinformation” or when there is a need for the “rapid dissemination of reliable information” (Recital 108). Voluntary protocols extend beyond VLOPs and VLOSEs to include smaller platforms and search engines.

Recommender Systems

The DSA also introduces obligations for online platforms to use their recommender systems responsibly. These are the algorithms that rank and suggest content in users’ news feeds and in other areas of a platform’s interface where recommendations are provided. The rules aim to address the “significant impact” recommender systems can have “on the ability of recipients to retrieve and interact with information online” (Recital 70). This is a key obligation given the role of algorithmic systems in shaping online discourse.

All online platforms, not just VLOPs and VLOSEs, must clearly and accessibly describe the parameters used within their recommender systems to order the display of content to users (Article 27). If available, any options for users of the online platform to change or influence parameters must be indicated.

The parameters described in the terms and conditions must explain why particular content or information is suggested to the user of a service. At a minimum, this should include the most significant criteria used in presenting information to a user, as well as the reasons for the relative importance of those parameters. Relevant parameters can include demographic information (e.g., age, gender, cultural background), as well as users’ interests and the content they engage with on the platform.

When multiple recommender systems are available, platforms should also allow users to select which system they would prefer to use. A common alternative is the reverse-chronological newsfeed, which shows users content from other users they follow, based solely on how recently it was posted. In addition, VLOPs and VLOSEs must provide at least one option for their recommender system that is not reliant on profiling based on user behaviour or characteristics data (Article 38).
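
To make these distinctions concrete, below is a minimal, purely illustrative Python sketch. All data structures, field names and signals are invented for this example and are not drawn from any platform; it simply contrasts a profiling-based ranking with the reverse-chronological, non-profiling alternative of the kind Article 38 requires VLOPs and VLOSEs to offer.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical, highly simplified representations. Real recommender systems use
# far richer signals; Article 27 requires platforms to disclose the most
# significant of these parameters in their terms and conditions.

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    topics: set[str] = field(default_factory=set)

@dataclass
class UserProfile:
    followed_authors: set[str]
    inferred_interests: dict[str, float]  # topic -> inferred affinity (profiling)

def profiled_feed(posts: list[Post], user: UserProfile) -> list[Post]:
    """Rank by inferred interest affinity: a profiling-based recommender."""
    def score(post: Post) -> float:
        return sum(user.inferred_interests.get(t, 0.0) for t in post.topics)
    return sorted(posts, key=score, reverse=True)

def reverse_chronological_feed(posts: list[Post], user: UserProfile) -> list[Post]:
    """Show only followed accounts, newest first: no profiling involved."""
    followed = [p for p in posts if p.author in user.followed_authors]
    return sorted(followed, key=lambda p: p.posted_at, reverse=True)

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    posts = [
        Post("news_account", "Election results explained", now, {"politics"}),
        Post("friend", "Holiday photos", now, {"travel"}),
    ]
    user = UserProfile(followed_authors={"friend"},
                       inferred_interests={"politics": 0.9, "travel": 0.1})
    print([p.author for p in profiled_feed(posts, user)])               # ['news_account', 'friend']
    print([p.author for p in reverse_chronological_feed(posts, user)])  # ['friend']
```

In this sketch, the “parameters” a platform would need to disclose under Article 27 correspond to the inputs of the ranking function (here, inferred topic affinities), while the reverse-chronological option relies only on who a user follows and when content was posted.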

Transparency

Transparency is key to building trust between the public, governments, regulators, the private sector and online platforms, and ensuring accountability. Transparency is also crucial to the evidence-gathering process that informs debates and policy on terrorism, extremism, hate speech, disinformation and foreign interference. For an in-depth explanation of the importance of transparency, see ISD’s Explainer on the topic.

A few of the DSA’s most important provisions to address transparency are the following:

  • All intermediary services must implement clear terms and conditions (Article 14). Services must include information on any restrictions they impose on users of their service, including regarding content moderation and internal complaints handling processes.
  • All intermediary services must publish transparency reports on at least an annual basis (Article 15). The details of what these reports must include differ depending on whether the service is classified as a hosting service or is simply an intermediary service. However, at a minimum they will include information on orders or notices received regarding illegal content, as well as on content moderation and any use of automated means for removing content. There will be mandatory categories which platforms must report on, although at the time of writing these have not been finalised.
  • All hosting services must have notice and action mechanisms in place for users to report illegal content, including transparency measures on how services process reports (Article 16). Similarly, users whose content is removed from a service due to illegal conduct or breaching terms and conditions must be provided with a clear and specific statement of reasons for the removal. These measures increase transparency for users, as they provide visibility on how and why platforms respond to their actions.
  • All services will be subject to regulatory oversight from DSCs, with the Commission providing additional oversight of VLOPs and VLOSEs (Articles 51, 56). These oversight mechanisms increase transparency by allowing regulators to better understand platforms and hold them accountable.

Data Access

‘Meaningful’ access to data for public-interest researchers is key to enabling them to provide evidence for policy decisions and to ensure platform accountability. A broader overview of the importance of data access is available in ISD’s Data Access Explainer.

The DSA is one of the first pieces of legislation to mandate that large platforms provide access to data for public interest research. This is done through three mechanisms:

  • Article 40.4 allows DSCs to request data, which may not be publicly available, from VLOPs and VLOSEs. This is done on behalf of researchers that are vetted by a DSC (see Article 40.8 on the vetting process). Data must be used solely for research on the impact of risk mitigation measures, or that contributes to the detection, identification and understanding of systemic risks within the EU.
  • The Delegated Act on Data Access will implement Article 40.4, further detailing the new framework for vetted researchers to access data from VLOPs and VLOSEs. As of August 2024, it is in the final stages of development by the Commission.
  • Article 40.12 requires VLOPs and VLOSEs to give access “without undue delay” to data that is publicly accessible via their online interfaces, ideally in real time. This access is intended for researchers affiliated with academic or nonprofit organisations that are independent, transparent regarding funding and capable of sufficient data protection. Researchers must use the data exclusively for research that contributes to detecting, identifying and understanding systemic risks within the EU. A third-party overview of platforms’ data access application forms is also available.
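
By way of illustration only, the sketch below shows what programmatic collection of publicly accessible data under Article 40.12 could look like from a researcher’s side. The endpoint, parameters and response fields are entirely hypothetical: the DSA does not prescribe a specific API, and each VLOP or VLOSE defines its own access route and application process.

```python
import json
import urllib.parse
import urllib.request

# Entirely hypothetical endpoint and fields, for illustration only. Actual
# access routes differ by platform and require an approved research application.
RESEARCH_API = "https://api.example-platform.test/public-research/v1/posts"

def fetch_public_posts(query: str, api_token: str, limit: int = 100) -> list[dict]:
    """Fetch publicly accessible posts matching a keyword query."""
    url = f"{RESEARCH_API}?query={urllib.parse.quote(query)}&limit={limit}"
    request = urllib.request.Request(url, headers={"Authorization": f"Bearer {api_token}"})
    with urllib.request.urlopen(request) as response:
        return json.load(response)["data"]

if __name__ == "__main__":
    # Example vetted use case: monitoring publicly visible election-related
    # narratives as part of research into systemic risks to electoral processes.
    posts = fetch_public_posts("election", api_token="YOUR-RESEARCH-TOKEN")
    print(f"Collected {len(posts)} public posts for analysis")
```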

Further Reading

For those interested in learning more about specific provisions and mechanisms within the DSA, the following resources offer greater depth or cover areas of the legislation not discussed in this Explainer.

General resources:

  • The Digital Services Act Package (European Commission): The homepage for the DSA and DMA, including many of the links below.
  • DSA FAQs (European Commission): An accessible list of answers to frequently asked questions, including how the DSA covers illegal content, disinformation, and more.
  • Overview of EU technology legislation (Interface): Provides an overview of the wider technology legislation landscape in the EU.
  • List of designated VLOPs and VLOSEs and enforcement actions (European Commission): Lists all designated VLOPs and VLOSEs, their EU user numbers and the EU country where they are legally established, and links to Commission requests for information and, for some platforms, ‘opening of proceedings’ notices to date. These notices list the types of risks and potential areas of non-compliance under investigation.
  • The impact of the DSA on digital platforms (European Commission): A public-facing, high-level and non-comprehensive summary of areas in which the Commission claims progress to date.
  • Digital services coordinators (European Commission): A summary of the roles of DSCs, as well as an up-to-date list of the DSCs within each member state.

Transparency and data access resources:


This Explainer was uploaded on 13 August 2024.
