
Explainer: DSA
A guide to the Digital Services Act, the EU’s law to rein in Big Tech
Everything you need to know about the Digital Services Act (DSA), Europe’s law to make powerful tech platforms like YouTube, TikTok, Facebook, and X more transparent and accountable for the risks they pose to society.

What is the DSA, and why do we need it?
The Digital Services Act (DSA) is a regulation that aims to force online services like Facebook, YouTube, TikTok, X and others to do more to tackle the spread of illegal content and other societal risks on their services in the EU – or else risk fines for non-compliance. It is directed at ‘online intermediaries’ such as social media platforms, search engines, and online marketplaces – more information can be found on the EU Commission website. Together with its sister legislation, the Digital Markets Act, it establishes a single set of rules that apply across the whole EU and sets a potential global standard in platform governance.
The DSA aims to end an era in which tech companies have essentially regulated themselves – setting their own policies on how to moderate content, and issuing “transparency reports” about their efforts to combat harms like disinformation that have been practically impossible for third parties to scrutinize. The DSA set out to change this status quo by forcing platforms to be more transparent about how their algorithmic systems work, and by holding them to account for the societal risks stemming from the use of their services. Yet enforcement of these rules has been slow to get underway.
The DSA was published in the Official Journal of the European Union in October 2022 and entered into force shortly thereafter. It has been fully applicable since February 2024, with certain rules applying since August 2023 to the largest platforms and search engines operating in Europe. At present, over 20 services have been formally designated by the European Commission as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), meeting the threshold of over 45 million monthly active EU users. They include social media platforms like TikTok, X, Snapchat, YouTube, and Meta’s Instagram and Facebook; search engines like Google Search and Microsoft Bing; marketplaces and app stores like Amazon, AliExpress, Shein, Booking.com, and Apple’s App Store; and some adult content platforms. The 45 million figure corresponds to 10% of the EU population and is adjusted as the population changes.
What does the DSA do?
The DSA is not a “censorship law”. It does not mandate the removal of particular types of content unless that content is illegal under existing national or EU law. On the contrary, it creates transparency around services’ moderation decisions, including when governments demand them (more here).
Tech companies’ legal obligations under the DSA, as well as the responsibilities of the EU and member states in enforcing it, include:
- Legally mandated data access for external scrutiny: Research organizations – which can include civil society organizations and public-interest journalists – that meet particular requirements can request data from VLOPs and VLOSEs. For publicly accessible data, such as public posts and engagement data, researchers apply directly to the relevant VLOP or VLOSE (see our guide here). There are ongoing legal discussions about whether collecting publicly accessible data without applying to platforms, e.g. by scraping, is also covered when an organization meets the requirements; more clarity is needed. Non-public data can be accessed by applying via a national regulator (see guide here). These rules are designed to ensure that public-interest research into systemic risks is not hampered by technical barriers or by threats of legal action from a VLOP or VLOSE.
- More transparency on recommender systems and online advertising: Platforms must clearly lay out in their terms of service how their content moderation and algorithmic recommender systems work, and VLOPs and VLOSEs must offer users at least one option for a recommender system (or “feed”) not based on profiling. Platforms must also give users clear information about why they were targeted with an ad, and VLOPs and VLOSEs must store ad information in a publicly accessible repository.
- Rights for users to challenge content moderation decisions: Platforms must provide affected users with a detailed explanation whenever they block accounts or remove or demote content. Users have the right to challenge these decisions with the platform and, if necessary, to seek out-of-court settlement through specially established dispute settlement bodies.
- Clear rules for dealing with illegal content: The DSA updates the process by which digital service providers must act to properly address illegal content under national or EU law. It also reinforces an EU-wide ban on general content monitoring, meaning that platforms are not required to systematically police everything their users post – a practice that would come at the expense of free speech.
- Limited restrictions on targeted advertising and deceptive designs: The DSA bans targeting advertisements at children and targeting based on profiling of “sensitive” traits like religious affiliation or sexual orientation. It also introduces limits on design practices that deceive and manipulate users, known as “dark patterns.”
- Obligations for the largest platforms to mitigate “systemic risks”: EU lawmakers recognized that the largest platforms pose greater potential risks to society. Such “systemic risks” include negative effects on fundamental rights, civic discourse, and elections, as well as gender-based violence and harms to public health. That’s why the DSA obligates VLOPs and VLOSEs to formally assess how their products, including algorithmic systems, may exacerbate these risks and to take measurable steps to mitigate them. While the concept of “systemic risks” remains vague, it gives researchers and civil society organizations opportunities to investigate and contribute to the implementation and enforcement of the DSA (for more details, see our publications: Researching Systemic Risks under the DSA & A Dual Track Approach to Systemic Risks). Most VLOPs and VLOSEs publish these reports in November each year, and a civil society response to the first reports can be found here.
- General transparency and reporting requirements: Platforms are required to produce annual reports on their content moderation efforts, including the number of orders to act against illegal content received from Member State authorities, the number of notices submitted by “trusted flaggers,” and the volume of complaints from users and how these were handled. The transparency reports must also describe any automated systems used to moderate content and disclose their accuracy and possible error rates.
- Competencies and enforcement powers for the European Commission and national authorities: The Commission, specifically Directorate-General CNECT, has direct supervision and enforcement powers over VLOPs and VLOSEs and can impose fines of up to 6% of a company’s global annual turnover. The Commission may also charge platforms supervisory fees to help finance its enforcement activities. Each EU member state also has a Digital Services Coordinator (DSC), which supervises smaller services established in that country, supports residents in exercising their rights under the DSA, and works with other regulators to gather evidence and enforce the rules.
DSA timeline – what has happened so far
- Overview: All European Commission activities, including various requests for information from VLOP/SEs, designations of new VLOP/SEs, and other enforcement actions, can be found here.
- April 2023: Launch of European Centre for Algorithmic Transparency (ECAT).
- August 2023: VLOPs and VLOSEs filed their first DSA-mandated risk assessments. Because no European Commission guidance existed at the time, the reports remain largely inaccessible to the public and may reflect platforms’ commercial interests more than public-interest concerns. For more information, see Call for Evidence; Audits von Risiken; Risikobewertungen und Audits im DSA; How to define platform risks.
- February 2024: The DSA becomes fully applicable, creating new powers and responsibilities for Digital Services Coordinators and rules for requesting publicly accessible data, among other changes.
- April 2024: Election Guidelines for VLOPs and VLOSEs are published. Find our response here.
- May 2024: The Bundesnetzagentur (BNetzA) is designated as the German Digital Services Coordinator (DSC).
- August 2024: European Commission makes TikTok’s commitments to permanently withdraw the TikTok Lite Rewards program from the EU binding.
- November 2024: First risk assessments and audits published. For more information, see CSO analysis of Risk Assessments, Guide to DSA Risk Assessments, Flaws of the first Audits.
- By end of 2024: Most national DSCs have been designated (see list of DSCs here).
- February/May 2025: Democracy Reporting International and the Gesellschaft für Freiheitsrechte filed a lawsuit against X in a Berlin court for refusing data access. Although X was not found to be in violation of the DSA, the judges underscored that researchers are entitled to claim access to data in the country where their work is carried out. See the full report here.
- June 2025: European Commission makes AliExpress’s commitments to improve transparency binding.
- October 2025: DSA Article 40(4) comes into effect, granting vetted researchers the right to request access to non-public data from VLOPs and VLOSEs to study systemic risks on platforms and their risk mitigation measures.
- December 2025: First fine against X for non-compliance with the DSA’s transparency requirements.
- Throughout 2025: Further risk assessments published, including from some newly designated VLOP/SEs.
Read more on our policy & advocacy work on the Digital Services Act.