Policy Brief #6 of the Digital Autonomy Hub | Executive Summary:
Does the Instagram recommender algorithm encourage eating disorders in teenagers? How difficult is it for advertisers on YouTube to discriminate against racial and other minorities? Is Russian disinformation systematically evading Facebook’s content moderation systems in languages like Bulgarian? And when confronted with such risks on their platforms, are Big Tech companies taking the appropriate and necessary measures to deal with them? It is impossible for the public to have an informed debate about social media’s impacts on society so long as powerful platforms like Facebook, YouTube, Twitter, and TikTok remain opaque about the design and implementation of their decision-making algorithms. As revelations from Facebook whistleblower Frances Haugen and others have made clear, these platforms cannot be trusted to simply self-assess the risks their services may pose to the public – risks like the spread of disinformation, electoral interference, or harms to mental health. Rather, they require external scrutiny to hold them accountable.
In this policy brief, published with the Digital Autonomy Hub (in German), we explain why independent research analyzing platforms in the public interest is an essential component of platform accountability that must be ensured through comprehensive data access frameworks. These insights should form the basis for more evidence-based platform regulation, so as to strengthen users' digital self-determination in dealing with online platforms and help mitigate threats to individual health, social cohesion, and democracy.
An important step in this direction is the EU’s new Digital Services Act (DSA). The DSA creates a uniform set of rules on the duties and responsibilities of online platforms in the EU, and via Article 31 establishes a regulated data access regime that requires the largest online platforms to share data with vetted researchers. Passing the DSA is only the first step, however, as authorities must quickly shift their focus to ensuring its effective implementation.
Implementation of the DSA will be a complex task, given the new institutions and capacities that must be built up at both the national and European levels to effectively administer the law. In terms of data access, regulators under the DSA must be equipped to adequately vet researchers, safely facilitate data access, and compel platforms to cooperate. They will also have to determine how to balance the public interest in data access requests against the imperative to protect business secrets and the security of online services. And the European Commission must still clarify many technical details with important consequences for how the DSA’s data access rules will work in practice.
Our policy recommendations:
1. Trade secrets
We urge regulators not to allow platforms to abuse the “trade secrets” exemption in Article 31(2a) of the DSA to routinely undermine data access requests. Recital 64 in the DSA clarifies that the law must be interpreted such that platforms do not unduly invoke “trade secrets” as an excuse to deny vetted researchers access to data. The strict vetting criteria that researchers must meet to gain access to platform data (independence from commercial interests, transparent funding, high data security requirements, etc.) ensure that researchers handle legitimately sensitive information responsibly.
2. Ensure data access for civil society organizations
We urge the relevant regulatory authorities to reliably enforce data access under the DSA not only for academic researchers but also for civil society organizations. As the explicit mention of civil society organizations in Recital 64 makes clear, the question of what constitutes a research organization must not be interpreted too narrowly. Only consistent enforcement will allow civil society to exercise meaningful public scrutiny over large online platforms.
3. An independent intermediary body to support data access
We recommend that the Commission, in the proposed delegated acts, create or appoint an independent intermediary body to support the sharing of data between platforms and researchers. This institution should facilitate secure access to platform data and could play an advisory role in the vetting of researchers, which would particularly support Member States with limited capacities.
4. Sufficient resources for enforcement at the Commission level
While data access will mostly be administered by the national Digital Services Coordinators, other provisions in the DSA seek to address lessons learned from the GDPR by transferring enforcement powers over Very Large Online Platforms (VLOPs) from national authorities to the EU Commission. We stress that such independent EU-level enforcement powers must be matched with adequate resources. This would allow for deep and consistent checks of VLOPs’ compliance with due diligence measures from the outset.
5. Strong national Digital Services Coordinators and clarifying the relationship between the DSA and NetzDG
We call on the German government to support a strong national Digital Services Coordinator with extensive competencies and to quickly clarify the details of the relationship between the DSA and the NetzDG with regard to researchers’ access to platform data.
6. A shield for research in the public interest
A comprehensive and strong regulatory framework is essential to enable data access for research in the public interest. The DSA is a milestone on this path. The implementation of the DSA must now live up to its high expectations.
A full version of this policy brief (in German) is available here: