Press release

Reality check and a special present: The Digital Services Act turns three.

2 October 2025


#dsa #eu

[Image: Laptop with vibrant, glowing display in blue, pink, and orange against a dark background. Photo: Joshua Woroniecki via Pixabay]

Berlin, October 2, 2025. On October 4, 2025, the Digital Services Act (DSA) celebrates its third anniversary. AlgorithmWatch considers the DSA, though not perfect, a landmark of digital regulation. Together with others, the organization has a special gift for the DSA.

Together with the Mozilla Foundation and the DSA40 Data Access Collaboratory, AlgorithmWatch has launched a new kind of “mass request” to mark the DSA's third anniversary. With it, several civil society organizations are asking various providers of very large online platforms for a daily overview of the most viral posts in each EU member state. The platforms include Facebook, Instagram, LinkedIn, TikTok, YouTube, and X. The decision to accept or reject the request lies with the platforms.

“We want to be able to quickly identify which content has the greatest potential impact and better understand what type of content is most heavily promoted by the algorithms. If the platforms reject our request, we are prepared to challenge that decision,” says Oliver Marsh, Head of Tech Research at AlgorithmWatch.

This request is made possible by the DSA, which requires platforms to provide access to public data “without undue delay.”

Data access for research, civil society, and media professionals is one of the most important elements of the DSA. This data is essential for identifying potential risks and harm on the internet. However, there is a gap between idea and reality: “Unfortunately, many companies pay little attention to the existing rules for access to public data. X, for example, regularly refuses to grant access to requested data. This is currently affecting an AlgorithmWatch project on the topic of ‘non-consensual sexualization tools’ (NSTs, often known as ‘nudifying apps’). Other platforms, including Meta and TikTok, have provided poor-quality data or created enormous hurdles in the past,” states Oliver Marsh.

The second key component of the DSA is risk assessment, which must be actively carried out by very large online platforms and search engines. However, this idea has yet to deliver on its promise. “The first risk assessment reports are of no real use, and it is still unclear how the DSA distinguishes systemic risks from other risks,” says Oliver Marsh.

For instance, generative AI summaries on widely used search engines are likely to pose a systemic risk to reliable media and journalism. “Traffic to websites will continue to decline. This poses a serious risk that reliable journalism will no longer be a viable business model – but this has not been taken into account in the DSA so far,” says Oliver Marsh.

The big tech companies will continue to test products like these AI summaries on the public. The increasing alignment of many tech CEOs with anti-democratic forces in the US – and with political forces in Europe that they see as role models – carries a growing risk that opaque algorithms, rather than transparent moderation decisions, will shape public debate. In view of these and other challenges, the DSA can provide many different groups with tools to exercise democratic control. “We are glad that the DSA exists. But we hope that its capabilities will grow by its next birthday,” summarizes Oliver Marsh.

More Information: