Policy Brief: Our recommendations for strengthening data access for public interest research

Image by Alan Warburton / © BBC / Better Images of AI / Social Media / Licenced by CC-BY 4.0

Policy Brief #6 of the Digital Autonomy Hub | Executive Summary:

Does the Instagram recommender algorithm encourage eating disorders in teenagers? How difficult is it for advertisers on YouTube to discriminate against racial and other minorities? Is Russian disinformation systematically evading Facebook’s content moderation systems in languages like Bulgarian? And when confronted with such risks on their platforms, are Big Tech companies taking the appropriate and necessary measures to deal with them? It is impossible for the public to have an informed debate about social media’s impacts on society so long as powerful platforms like Facebook, YouTube, Twitter, and TikTok remain opaque about the design and implementation of their decision-making algorithms. As revelations from Facebook whistleblower Frances Haugen and others have made clear, these platforms cannot be trusted simply to self-assess the risks their services may pose to the public – risks like the spread of disinformation, electoral interference, or harms to mental health. Rather, they require external scrutiny to hold them accountable.

In this policy brief, published with the Digital Autonomy Hub (in German), we explain why independent research analyzing platforms in the public interest is an essential component of platform accountability that must be ensured through comprehensive data access frameworks. These insights should form the basis for more evidence-based platform regulation, so as to strengthen users' digital self-determination in dealing with online platforms and help mitigate threats to individual health, social cohesion, and democracy.

An important step in this direction is the EU’s new Digital Services Act (DSA). The DSA creates a uniform set of rules on the duties and responsibilities for online platforms in the EU, and via Article 31 establishes a regulated data access regime that would force the largest online platforms to share data with vetted researchers. Passing the DSA is only the first step, however, as authorities must quickly shift their focus to ensuring its effective implementation.

Implementation of the DSA will be a complex task given new institutions and capacities that must be built up at both national and European levels to effectively administer the law. In terms of data access, regulators under the DSA must be equipped to adequately vet researchers, safely facilitate data access, and compel platforms to cooperate. They will also have to determine how to balance the public interest of data access requests with the imperative to protect business secrets and the security of online services. And the European Commission must still clarify many technical details with important consequences for how the DSA’s data access rules will work in practice.

Our policy recommendations:

1. Trade secrets

We urge regulators not to allow platforms to abuse the “trade secrets” exemption in Article 31(2a) of the DSA to routinely undermine data access requests. Recital 64 of the DSA clarifies that the law must be interpreted such that platforms do not unduly invoke "trade secrets" as an excuse to deny vetted researchers access to data. The strict vetting criteria that researchers must meet to gain access to platform data (independence from commercial interests, transparent funding, high data security requirements, etc.) ensure that researchers handle legitimately sensitive information responsibly.

2. Ensure data access for civil society organizations

We urge the relevant regulatory authorities to reliably enforce data access under the DSA not only for academic researchers but also for civil society organizations. The question of what constitutes a research organization must not be interpreted too narrowly, as the explicit mention of civil society organizations in Recital 64 makes clear. Reliable enforcement on this point is essential if civil society is to exercise meaningful public scrutiny over large online platforms.

3. An independent intermediary body to support data access

We recommend that the Commission create or appoint an independent intermediary body in the proposed delegated acts to support the sharing of data between platforms and researchers. This institution should facilitate secure access to platform data and could play an advisory role in the vetting process of researchers, which would particularly support Member States with limited capacities.

4. Sufficient resources for enforcement at the Commission level

While data access will mostly be administered by the national Digital Services Coordinators, other provisions in the DSA seek to address lessons learned from the GDPR by transferring enforcement powers over Very Large Online Platforms (VLOPs) from national authorities to the EU Commission. We stress that such independent EU-level enforcement powers must be matched with adequate resources. This would allow for deep and consistent checks of VLOPs’ compliance with due diligence measures from the outset.

5. Strong national Digital Services Coordinators and clarifying the relationship between the DSA and NetzDG

We call on the German government to support a strong national Digital Services Coordinator with extensive competencies and to quickly clarify the details of the relationship between the DSA and the NetzDG with regard to researchers’ access to platform data.

6. A shield for research in the public interest

In addition to the DSA, European and national policymakers should ensure that researchers conducting privacy-compliant research in the public interest are better protected from platform power. It must not be possible for private corporations to systematically shut down research on our democratic public sphere by citing their terms of use and by filing – or threatening to file – intimidation lawsuits.

A comprehensive and strong regulatory framework is essential to enable data access for research in the public interest. The DSA is a milestone on this path. The implementation of the DSA must now live up to its high expectations.

A full version of this policy brief (in German) is available here:

Read more on our policy & advocacy work on ADM in the public sphere.

John Albert (he/him)

Policy & Advocacy Manager

Photo: Julia Bornkessel, CC BY 4.0

As a Policy & Advocacy Manager at AlgorithmWatch, John Albert covers ADM in the public sphere, especially platform regulation and legislative processes in the EU (e.g. the Digital Services Act, DSA). Before turning his eye to digital governance, John worked as a visual journalist and produced documentary films. He holds master’s degrees in Public Policy from the Hertie School in Berlin and Journalism from Columbia University in New York.

Anne Mollen (she/her)

Senior Policy & Advocacy Manager

Photo: Julia Bornkessel, CC BY 4.0

As a Senior Policy & Advocacy Manager at AlgorithmWatch, Anne addresses ADM and sustainability, ADM at the workplace, and ADM in the public sector – the latter with a focus on AlgorithmWatch’s work related to German developments. She holds a PhD from the University of Bremen on technologies and people’s digital media practices in increasingly digital democracies. Anne has also been involved in many projects researching the interrelation between digital media technologies, society, and democracy.

Angela Müller

Head of Policy & Advocacy

Photo: David Bächtold

Angela is Head of Policy & Advocacy at AlgorithmWatch and AlgorithmWatch Switzerland and manages our activities on national and international levels. She also covers horizontal regulations – particularly the work of the Council of Europe (CAI) and the EU’s AI Act – ADM in the public sector, and all our policy-related work in Switzerland, having testified as an expert in various committees. Angela holds a Ph.D. (Dr.des.) in Law, for which she focused on human rights in the context of globalization and new technologies, and an M.A. in Political and Economic Philosophy. She was a Visiting Researcher at Columbia University New York and Hebrew University Jerusalem. Previously, Angela worked at a civil society think tank, at universities, for an innovation platform, and at the Swiss Foreign Ministry.