Explainer: DSA
A guide to the Digital Services Act, the EU’s new law to rein in Big Tech
Everything you need to know about the Digital Services Act (DSA), Europe’s new law to make powerful tech platforms like YouTube, TikTok, Facebook, and Twitter more transparent and accountable for the risks they pose to society.
If you want to learn more about our policy & advocacy work on the DSA, get in touch with us.
What is the DSA, and why do we need it?
The Digital Services Act (DSA) is a new set of regulations that aims to force major internet platforms like Facebook, YouTube, TikTok, Twitter and others to do more to tackle the spread of illegal content and other societal risks on their services in the EU—or else risk billions of Euros in fines. Together with its sister legislation, the Digital Markets Act, it establishes a single set of rules that will apply across the whole EU and sets a potential global standard in platform governance.
The DSA aims to end an era in which tech companies have essentially regulated themselves – setting their own policies on how to moderate content and issuing “transparency reports” about their efforts to combat harms like disinformation that have been practically impossible for third parties to scrutinize. The DSA promises to change this status quo by forcing platforms to be more transparent about how their algorithmic systems work and by holding them to account for the societal risks stemming from the use of their services.
The DSA was published in the Official Journal of the European Union in October 2022 and entered into force shortly thereafter. The law will become applicable across the EU in February 2024—but the new rules already kick in on 25 August 2023 for the largest platforms and search engines operating in Europe (19 of which were officially designated by the European Commission in April). These so-called Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) meet the threshold of having over 45 million active users in the EU.
How does the DSA change the status quo?
The final version of the DSA is a legal document of more than 300 pages, with complex rules detailing tech companies’ new legal obligations as well as the responsibilities of the EU and Member States with regard to its enforcement. It includes:
- Clear rules for dealing with illegal content: The DSA updates the process by which digital service providers must act to rapidly delete illegal content based on national or EU law. It also reinforces an EU-wide ban on general content monitoring, so that providers won’t be forced to systematically police their services to the detriment of free speech.
- New rights for users to challenge content moderation decisions: Platforms must provide affected users with detailed explanations whenever they block accounts or remove or demote content. Users will have new rights to challenge these decisions with the platforms and to seek out-of-court settlements if necessary.
- More transparency on recommender systems and online advertising: Platforms must clearly lay out how their content moderation and algorithmic recommender systems work in their terms of service, and they must offer users at least one option for an alternative recommender system (or “feed”) not based on profiling. They must also give users clear information about why they were targeted with an ad and how to change ad targeting parameters.
- Limited restrictions on targeted advertising and deceptive designs: The DSA establishes a ban on targeting advertisements to children and profiling individuals based on “sensitive” traits like their religious affiliation or sexual orientation. The DSA will also introduce limits on design practices that deceive and manipulate users, aka “dark patterns.”
- General transparency and reporting requirements: Platforms will be required to produce annual reports on their content moderation efforts, including the number of orders from Member State authorities and notices from “trusted flaggers” to take down illegal content, as well as the volume of complaints from users and how these were handled. The transparency reports must also describe any automated systems used to moderate content and disclose their accuracy and possible error rates.
- Obligations for the largest platforms to rein in “systemic risks”: EU lawmakers recognized that the largest platforms pose the greatest potential risks to society—such risks include negative effects on fundamental rights, civic discourse, and elections, as well as gender-based violence and threats to public health. That’s why the DSA will obligate platforms with over 45 million users in the EU, like YouTube, TikTok, and Instagram, to formally assess how their products, including algorithmic systems, may exacerbate these risks to society and to take measurable steps to prevent them.
- Legally-mandated data access for external scrutiny: Platforms’ self-assessments and risk-mitigation efforts won’t simply be taken on faith. Platforms will also be forced to share their internal data with independent auditors, EU and Member State authorities, and researchers from academia and civil society, who can scrutinize these findings and thereby help identify systemic risks and hold platforms accountable for their obligation to rein them in.
- New competencies and enforcement powers for the European Commission and national authorities: Enforcement will be coordinated between new national and EU-level bodies. The Commission will have direct supervision and enforcement powers over the largest platforms and search engines, and can impose fines of up to 6% of their global turnover. The Commission may also charge platforms supervisory fees to help finance its own enforcement tasks.
DSA’s next steps: Compliance for VLOPs and VLOSEs
On 25 August 2023, VLOPs and VLOSEs were obligated to submit their first systemic risk assessments to the European Commission and to independent auditors, as well as to implement the new rules on content moderation. Many of these new features should therefore become visible and/or accessible to users of the relevant platforms in the short term, such as an option to use a reverse-chronological recommender feed instead of the current standard algorithmically curated feed.
However, the systemic risk assessments, which are the lynchpin of the DSA’s accountability structure, are being produced without official guidance from the European Commission and will remain largely beyond public view. This raises the concerning possibility that social media platforms will adopt risk assessment metrics and methodologies that serve to protect their bottom line rather than the public interest. That’s why AlgorithmWatch has advocated for meaningful data access for public interest scrutiny and for a formal advisory mechanism enabling civil society to contribute independent expertise and act as a check on audit-washing.
- What to expect from systemic risk assessments: The DSA is remarkably quiet on how, precisely, VLOPs and VLOSEs should conduct risk assessments and what legislators expect from them. It is also unclear what the Commission will release about their contents, and when (the DSA timeline promises only that the public will see the first official audit reports from December 2024, and these disclosures will likely be heavily redacted by the platforms). Given the current lack of formal guidance, AlgorithmWatch has developed a framework for identifying platform risks to freedom of speech and media pluralism, which can help inform what different stakeholders can and should expect from a risk assessment and how it could be done in practice.
- New user-level transparency and controls: If VLOPs and VLOSEs are complying with the DSA, users should already be noticing changes on major social media platforms and search engines; for example, features that make it easier to flag illegal content, as well as updated Terms and Conditions that should be clear and intelligible to all users, even minors. Meta, for example, released a new transparency report in June describing its content ranking systems, improvements to users’ control over their feeds, and tools for public interest research. Although the report has its shortcomings, it is an apparent signal of Meta’s attempt to comply with some of the DSA’s transparency requirements.
DSA’s next steps: Empowering Digital Services Coordinators (DSCs) and enforcement
On 17 February 2024, the DSA will become fully applicable across the EU for all entities in its scope. By that time, each EU Member State will need to appoint its own Digital Services Coordinator (DSC) — an independent regulator responsible for enforcing the rules on smaller platforms established in its country, as well as for liaising with the Commission and other DSCs in broader enforcement efforts. With this deadline approaching, EU countries and the Commission must continue to build up the necessary capacities and human resources to adequately enforce the law (the Commission has, for example, launched a European Centre for Algorithmic Transparency to aid in its enforcement efforts).
The enforcement battles ahead are not an abstraction, as the turmoil at Twitter under new owner Elon Musk may present a major early test for EU watchdogs looking to enforce the DSA. The company’s haphazard rollout of a paid verification scheme, for example, flooded the platform with misinformation, causing real-world damage. A similar move in the future could trigger an investigation and the possibility of fines under the DSA, given that the law forbids VLOPs from implementing any major design change without conducting a prior risk assessment.
Meanwhile, because the DSA establishes one set of platform regulations for the entire EU, lawmakers across the Union are now in the process of revising or adopting laws to prepare for national implementation. In Germany, the legal predecessor for platform regulation (which focused on social media platforms), the Netzwerkdurchsetzungsgesetz, will likely be replaced by a new piece of legislation, the so-called Digitale-Dienste-Gesetz. While the law is still in the drafting phase, it seems likely to designate the Bundesnetzagentur as Germany’s future DSC, while also providing for a designated research budget and appointing an advisory body meant to advise the DSC on research efforts and enforcement measures.
Beyond the open questions on enforcement and national-level implementation, there is a slew of delegated acts, implementing acts, potential codes of conduct, and voluntary standards referenced in the DSA, some of which have yet to be fully developed. These will eventually clarify certain aspects of the law, such as the technical conditions for data sharing between platforms and the external researchers who may serve as a check on platforms' systemic risk assessments and audit reports.
Read more on our policy & advocacy work on the Digital Services Act.