Press release

AlgorithmWatch and AI Forensics among the first organizations to request platform data under the DSA

Berlin, 15 February 2024. The EU’s Digital Services Act (DSA) is designed to give citizens new powers to protect their rights online. On 17 February, many of its most impactful provisions come into force: notably the requirement for a Digital Services Coordinator (DSC) in every EU Member State and new opportunities for researchers to access platform data.

Why does this matter for regular platform users? Here’s a real-life example: A Dutch teenager AlgorithmWatch spoke to built an Instagram presence that brought her over 20,000 followers over two years. Then, overnight, it was gone. Her account had been maliciously reported for the sixth time in a row. Many content creators, especially women, are regularly reported to Meta - either by criminals who want to take over their accounts or by online trolls. In theory, the DSA now gives them a powerful tool to protect their rights - but it may fall short.

A new sheriff in town: the Digital Services Coordinator

Under the DSA, platforms must put in place their own complaint-handling mechanisms to address problems like the one above. But if those are ineffective, what then? After 17 February, every EU Member State should have a Digital Services Coordinator (DSC). These bodies will have a range of powers; one is to certify specialist bodies (so-called “out-of-court” dispute settlement bodies) to which users in that country can complain about mistreatment by platforms. This gives users a straightforward and independent way to exercise their rights online.

New data available for researchers

Recent years have seen a worrying shift towards platforms closing off access to their data. Meta, for instance, has quietly deprecated CrowdTangle, the main tool researchers used to monitor Facebook and Instagram. This lack of data access makes external research into, and understanding of, these platforms extremely challenging.

With the DSCs comes an entirely new opportunity for data access and external scrutiny. DSCs can request specific data from platforms to support the monitoring and mitigation of systemic risks, such as risks platforms pose to democratic elections. Researchers can also access such data by submitting a request through a DSC. This data can include internal content and metrics that were previously visible only to the platforms themselves. But there are limitations to this new power. DSCs must approve (“vet”) applications and researchers, to ensure that the research is appropriate and that the data can be stored safely. Platforms can also challenge data access requests on security or trade-secret grounds, leading to negotiations with DSCs over alternative options.

Access to platform data: Will it work?

AlgorithmWatch and other organizations are concerned that there will be substantial delays before the DSCs are fully operational - in Germany, for instance, the DSC may not be legally appointed until April. With the extremely important European Parliament election and elections in three German states coming soon, such delays would substantially limit the powers described above.

AlgorithmWatch and AI Forensics are continuing their investigation into election misinformation on Microsoft’s search engine Bing. Previous research found that one third of Bing Chat’s answers to election-related questions about the Bavarian, Hessian, and Swiss elections in October 2023 contained factual errors, including wrong election dates, outdated candidate information, or even invented scandals concerning candidates.

“The opportunity to include previously inaccessible data in our research means we can explore the real scale of the issue, and the effectiveness of Microsoft’s mitigations. This is a really pressing concern given that there will be elections soon, where misinformation on online platforms could have a negative impact,” says Dr. Oliver Marsh, head of AlgorithmWatch’s project “Auditing Algorithms for Systemic Risks.”

“We hope that our data request will show how the DSA’s new provisions can be used to protect against online risks. If not, it might reveal administrative barriers that have to be resolved quickly - given the urgency of protecting citizens' rights in this crucial election year,” explains Raziye Buse Çetin, Policy and Communications Lead at AI Forensics.

If you would like to learn more about our data request, our research, or our work on the DSA, please get in touch with:

Oliver Marsh
Project Lead "Auditing Algorithms for Systemic Risks"
Clara Helming
Senior Advocacy & Policy Manager
