DSA milestone: EU lawmakers have responded to our calls for meaningful transparency for big tech

Over recent months, AlgorithmWatch – supported by dozens of civil society organizations and researchers, and over 6,000 individuals – has advocated for using the Digital Services Act (DSA) to enable meaningful transparency into the way online platforms influence our public sphere. The vote in the European Parliament today shows that our work has made an impact.

Today, the European Parliament reached a milestone in regulating Very Large Online Platforms via the Digital Services Act (DSA), as the finalized text of the DSA was approved in the Committee on the Internal Market and Consumer Protection (IMCO). This key committee vote marked an important moment for AlgorithmWatch and other civil society watchdogs, who have been working tirelessly to ensure that the DSA empowers vetted public interest research to rein in platform risks to the public sphere.

After Facebook forced AlgorithmWatch to shut down our Instagram Monitoring Project, we published an open letter in September, urging European lawmakers to use the Digital Services Act to stop platforms from suppressing public interest research. Then, together with Global Witness, we wrote an open letter in November addressed to all IMCO Committee Members, calling on them to widen data access in the DSA to vetted public interest civil society organizations and to remove the “trade secrets” exemption on the basis of which Very Large Online Platforms could deny requests for data access.

For these two letters, we received signatures from 50 civil society organizations and 38 international academics and independent researchers, as well as more than 6,000 public petitioners supporting our demands to lawmakers. Now we know that our voices have made an impact.

Crucially, and in line with our demands, the final DSA text approved in the IMCO extends data access and scrutiny by vetted third-party researchers via Article 31(4) beyond academia to encompass civil society organizations that represent the public interest. Furthermore, the problematic “trade secrets” language in Article 31(6), which had threatened to undercut the very purpose of the data access provision by effectively shielding platforms from data requests, was deleted from the Article.

Data access and scrutiny under Article 31 of the DSA, in concert with requirements for platforms to conduct risk assessments (Article 26) and to subject themselves to independent audits (Article 28), stand at the heart of the DSA’s transparency and oversight structure. While we celebrate the step forward taken in the IMCO today, we will need to strongly defend the integrity of the DSA’s provisions on data access and scrutiny in the plenary vote and in trilogue negotiations. Only if these provisions proposed by the IMCO remain intact can we ensure that EU regulators and civil society have the legal means to truly hold big tech platforms accountable.