A guide to the Digital Services Act, the EU’s new law to rein in Big Tech

Everything you need to know about the Digital Services Act (DSA), Europe’s new law to make powerful tech platforms like YouTube, TikTok, Facebook, and Instagram more transparent and accountable for the risks they pose to society.

What is the DSA, and why do we need it?

The Digital Services Act (DSA) is a new set of regulations that will force major internet platforms like Facebook, YouTube, and others to do more to tackle the spread of illegal content and other societal risks on their services in the EU—or else risk billions of euros in fines. Together with its sister legislation, the Digital Markets Act, it establishes a single set of rules that will apply across the whole EU and sets a potential global standard in platform governance.

The DSA aims to end an era in which tech companies have essentially regulated themselves – setting their own policies on how to moderate content, and issuing “transparency reports” about their efforts to combat harms like disinformation that have been practically impossible for third parties to scrutinize. The DSA promises to change this status quo by forcing platforms to be more transparent about how their algorithmic systems work, and by holding them to account for the societal harms stemming from the use of their services.

What’s new in the DSA?

The final version of the DSA is an over 300-page long legal document with complex rules detailing tech companies’ new legal obligations, as well as the responsibilities of the EU and member states with regard to its enforcement. It includes:

  • Clear rules for dealing with illegal content: The DSA updates the process by which digital service providers must act to rapidly delete illegal content based on national or EU law. It also reinforces an EU-wide ban on general content monitoring, such that platforms won’t be forced to systematically police their platforms to the detriment of free speech.
  • New rights for users to challenge content moderation decisions: Platforms must provide affected users with detailed explanations whenever they block accounts or remove or demote content. Users will have new rights to challenge these decisions with the platforms and to seek out-of-court settlements if necessary.
  • More transparency on recommender systems and online advertising: Platforms must clearly lay out how their content moderation and algorithmic recommender systems work in their terms of service, and they must offer users at least one option for an alternative recommender system (or “feed”) not based on profiling. They must also give users clear information about why they were targeted with an ad and how to change ad targeting parameters.
  • Limited restrictions on targeted advertising and deceptive designs: The DSA establishes a ban on targeting advertisements to children and profiling individuals based on “sensitive” traits like their religious affiliation or sexual orientation. The DSA will also introduce limits on design practices that deceive and manipulate users, i.e. “dark patterns.”
  • General transparency and reporting requirements: Platforms will be required to produce annual reports on their content moderation efforts, including the number of orders (received from Member States or “trusted flaggers”) to take down illegal content, as well as the volume of complaints from users and how these were handled. The transparency reports must also describe any automated systems used to moderate content and disclose their accuracy and possible error rates.
  • Obligations for the largest platforms to rein in “systemic risks”: EU lawmakers recognized that the largest platforms pose the greatest potential risks to society—such risks include negative effects on fundamental rights, civic discourse and elections, gender-based violence, and public health. That’s why the DSA will obligate platforms with over 45 million users in the EU, like YouTube, TikTok, and Instagram, to formally assess how their products, including algorithmic systems, may exacerbate these risks to society, and to take measurable steps to prevent them.
  • Legally-mandated data access for external scrutiny: Platforms’ self-assessments and risk-mitigation efforts won’t simply be taken on faith—platforms will also be forced to share their internal data with independent auditors, EU and Member State authorities, as well as researchers from academia and civil society who may scrutinize these findings, and thereby help identify systemic risks and hold platforms accountable for their obligation to rein them in.
  • New competencies and enforcement powers for the European Commission and national authorities: Enforcement will be coordinated between new national and EU-level bodies. The Commission will have direct supervision and enforcement powers over the largest platforms and search engines, and can impose fines of up to 6% of their global turnover. The Commission may also charge platforms supervisory fees to help finance its enforcement tasks.

What are the DSA’s next steps?

The DSA was published in the Official Journal of the European Union on 27 October 2022, and entered into force shortly thereafter (on 16 November). The law will become applicable across the EU in February 2024—but the new rules kick in earlier for the largest platforms and search engines, which will have four months to comply with the DSA once designated by the European Commission. Meanwhile, because the DSA establishes one set of platform regulations for the entire EU, national digital regulations like Germany’s NetzDG will have to be fundamentally revised.

Here are some notable upcoming steps for putting the DSA into action:

  • Transparency on user numbers to determine VLOPs & VLOSEs
    By 17 February 2023, platforms and search engines must report the number of their active end users to the EU Commission. The Commission will then assess which services meet the threshold (over 45 million active EU users) of Very Large Online Platform (VLOP) or Very Large Online Search Engine (VLOSE).
  • Compliance for VLOPs & VLOSEs
    The largest platforms and search engines will have four months—until June 2023—to comply with the rules in the DSA. This includes conducting and publishing their first annual risk assessments, as well as implementing rules on content moderation. Users should start to notice changes on the platforms by this time: for example, a feature that makes it easy to flag illegal content, plus updated Terms and Conditions that should be clear and intelligible to all users, even minors.
  • Empowering Digital Services Coordinators (DSCs)
    On 17 February 2024, the DSA will become fully applicable across the EU for all entities in its scope. By that time, each EU Member State will need to appoint its own Digital Services Coordinator (DSC) — an independent regulator responsible for enforcing the rules on smaller platforms established in its country, as well as for liaising with the Commission and other DSCs in broader enforcement efforts.

Now that the timeline for DSA implementation is set, EU countries and the Commission will need to build up the necessary capacities and human resources to adequately enforce the law. To that end, the Commission has recently launched a European Centre for Algorithmic Transparency to aid in its enforcement efforts, and has promised to thoroughly examine VLOP and VLOSE compliance starting in the summer of 2023.

The enforcement battles ahead are not an abstraction—indeed, the recent turmoil at Twitter under new owner Elon Musk may present a major early test for EU watchdogs looking to enforce the DSA. The company’s haphazard rollout of a paid verification scheme, for example, flooded the platform with misinformation that caused real-world damage. If Twitter were designated a VLOP, such a move would likely trigger an investigation and the possibility of significant fines under the DSA, given that the law forbids VLOPs from implementing any major design change without conducting a prior risk assessment.

Beyond the open questions on enforcement, there are a slew of delegated acts, implementing acts, potential codes of conduct, and voluntary standards referenced in the DSA, most of which have yet to be developed. These will eventually clarify certain aspects of the law, such as the technical conditions for data sharing between platforms and external researchers.

List of our publications relating to the DSA

Read more on our policy & advocacy work on the Digital Services Act.