What is the DSA, and why do we need it?
The Digital Services Act (DSA) is a new EU regulation that will force major internet platforms like Facebook, YouTube, and others to do more to tackle the spread of illegal content and other societal risks on their services in the EU – or else risk billions of euros in fines. Together with its sister legislation, the Digital Markets Act, it establishes a single set of rules that will apply across the whole EU and sets a potential global standard in platform governance.
The DSA aims to end an era in which tech companies have essentially regulated themselves – setting their own policies on how to moderate content, and issuing “transparency reports” about their efforts to combat harms like disinformation that have been practically impossible for third parties to scrutinize. The DSA promises to change this status quo by forcing platforms to be more transparent about how their algorithmic systems work, and holding them to account for the societal harms stemming from the use of their services.
What’s new in the DSA?
The final version of the DSA is an over 300-page long legal document with complex rules detailing tech companies’ new legal obligations, as well as the responsibilities of the EU and member states with regard to its enforcement. It includes:
- Clear rules for dealing with illegal content: The DSA updates the process by which digital service providers must act to rapidly delete illegal content based on national or EU law. It also reinforces an EU-wide ban on general content monitoring, such that platforms won’t be forced to systematically police their services to the detriment of free speech.
- New rights for users to challenge content moderation decisions: Platforms must provide affected users with detailed explanations whenever they block accounts or remove or demote content. Users will have new rights to challenge these decisions with the platforms and seek out-of-court settlements if necessary.
- More transparency on recommender systems and online advertising: Platforms must clearly lay out how their content moderation and algorithmic recommender systems work in their terms of service, and they must offer users at least one option for an alternative recommender system (or “feed”) not based on profiling. They must also give users clear information about why they were targeted with an ad and how to change ad targeting parameters.
- Limited restrictions on targeted advertising and deceptive designs: The DSA establishes a ban on targeting advertisements to children and profiling individuals based on “sensitive” traits like their religious affiliation or sexual orientation. The DSA will also introduce limits on design practices that deceive and manipulate users, i.e. “dark patterns.”
- General transparency and reporting requirements: Platforms will be required to produce annual reports on their content moderation efforts, including the number of orders (received from Member States or “trusted flaggers”) to take down illegal content, as well as the volume of complaints from users and how these were handled. The transparency reports must also describe any automated systems used to moderate content and disclose their accuracy and possible error rates.
- Obligations for the largest platforms to rein in “systemic risks”: EU lawmakers recognized that the largest platforms pose the greatest potential risks to society – such risks include negative effects on fundamental rights, civic discourse and elections, gender-based violence, and public health. That’s why the DSA will obligate platforms with over 45 million users in the EU, like YouTube, TikTok, and Instagram, to formally assess how their products, including algorithmic systems, may exacerbate these risks to society and to take measurable steps to prevent them.
- Legally-mandated data access for external scrutiny: Platforms’ self-assessments and risk-mitigation efforts won’t simply be taken on faith – platforms will also be forced to share their internal data with independent auditors, EU and Member State authorities, as well as researchers from academia and civil society who may scrutinize these findings, and thereby help identify systemic risks and hold platforms accountable for their obligation to rein them in.
- New competencies and enforcement powers for the European Commission and national authorities: Enforcement will be coordinated between new national and EU-level bodies. The Commission will have direct supervision and enforcement powers over the largest platforms and search engines, and can impose fines of up to 6% of their global turnover. The Commission may also charge platforms supervisory fees to help finance its enforcement tasks.
What are the DSA’s next steps?
The DSA is expected to be formally adopted by the Council of the European Union in October 2022. It will then be published in the EU Official Journal, enter into force twenty days later, and become applicable across the EU after fifteen months or from 1 January 2024 (whichever comes later). The new rules will kick in even earlier for the largest platforms and search engines subject to systemic obligations: They will have four months to comply with the DSA once they are designated by the European Commission.
In the meantime, EU countries and the Commission will need to build up the necessary capacities and human resources to adequately implement and enforce the DSA. Each Member State will need to empower a Digital Services Coordinator – an independent regulator responsible for enforcing the rules on smaller platforms established in their country, as well as rules concerning non-systemic issues for the largest platforms. And the Commission has promised to develop a “high-profile European Centre for Algorithmic Transparency” to aid in its enforcement efforts.
Because the DSA will create one set of platform regulations for the entire EU, national digital regulations like Germany’s NetzDG will have to be fundamentally revised as soon as the DSA comes into force. And beyond the open questions on enforcement, there are a slew of delegated acts, implementing acts, potential codes of conduct, and voluntary standards referenced in the DSA, most of which have yet to be developed. These will eventually clarify certain aspects of the law, such as the technical conditions for data sharing between platforms and external researchers.
List of our publications relating to the DSA
- Story | Facebook’s gutting of CrowdTangle: a step backward for platform transparency | 3 August 2022
- Op-ed | The Digital Services Act: It’s time for Europe to turn the tables on Big Tech | 5 July 2022
- Policy Brief | Our recommendations for strengthening data access for public interest research | 5 July 2022
- Blog | The Digital Services Act: EU sets a new standard for platform accountability | 25 April 2022
- Policy Paper | DSA trilogues in the endgame: Policymakers must prioritize platform transparency | 30 March 2022
- Joint Civil Society Briefing for the Digital Services Act Trilogues | 22 March 2022
- Joint Statement on Stakeholder Inclusion in the Code of Practice on Disinformation Revision Process | 24 February 2022
- Parliamentary question | EU Commission responds to MEP Breyer’s question concerning AlgorithmWatch | 11 January 2022
- DSA milestone | EU lawmakers have responded to our calls for meaningful transparency for big tech | 14 December 2021
- Open Letter to Members of the European Parliament IMCO Committee | Holding platforms accountable: The DSA must empower vetted public interest research to reign in platform risks to the public sphere | 29 November 2021
- Open letter to Members of European Parliament | Under Facebook’s thumb: Platforms must stop suppressing public interest research | 13 August 2021
- Position paper | The DSA proposal is a good start. Now policymakers must ensure that it has teeth | 16 December 2020
- Podcast | The EU Digital Services Act – Why data access matters | 9 December 2020
- Online Policy Dialogue with European Commission Executive Vice President Margrethe Vestager | Beyond the buzzwords: Putting meaningful transparency at the heart of the Digital Services Act | 3 November 2020
- Statement | Civil Society Coalition Led by AlgorithmWatch Calls for Binding Transparency Rules for Online Platforms | 30 October 2020
- Governing Platforms Project: Final Recommendations | Putting Meaningful Transparency at the Heart of the Digital Services Act | 30 October 2020
- Consultation submission | Our response to the European Commission’s planned Digital Services Act | 9 September 2020
- Joint statement | AlgorithmWatch joins call for ‘Universal Advertising Transparency by Default’ | 8 September 2020
- Story/campaign | Left on Read: How Facebook and others keep researchers in the dark | 9 July 2020
- Project | Governing Platforms | 2019-2021