After a marathon negotiating session that extended into the early hours of Saturday morning (April 23), EU negotiators finally announced that they had reached a deal on the Digital Services Act (DSA) – what will hopefully prove a landmark achievement in regulating major tech companies like Facebook, YouTube, TikTok and Twitter. The DSA introduces a complex set of rules designed to protect people’s fundamental rights online, and seeks to make platforms more transparent and accountable for how they recommend and moderate content.
AlgorithmWatch welcomes the political agreement on the DSA for its potential to usher in a new era of transparency and public scrutiny over Big Tech companies. In particular, we celebrate the DSA’s requirement that online platforms identify and tackle so-called “systemic risks” stemming from the design and use of their services – risks such as the dissemination of hate speech and intentional manipulation that may have negative effects on civic discourse and electoral processes. While such self-reporting requirements for platforms are necessary, they are insufficient on their own to ensure that platforms adequately identify and act on systemic risks. That’s why the DSA’s risk assessment includes two supporting pillars: independent audits, and data access and scrutiny for both regulators and researchers working in the public interest.
Core to our mission at AlgorithmWatch is advocating for public interest research to better understand automated decision-making systems and their impact on society. That’s why we have long advocated for a DSA that empowers watchdogs working in the public interest with the legal means to investigate systemic risks stemming from online platforms — a call that became even more urgent when Facebook forced us to shut down our Instagram Monitoring Project in 2021. Thanks to our demands to lawmakers, backed by significant support, we are encouraged by reports that the DSA’s transparency rules will open the door for civil society organizations seeking to access platform data. Although a political deal has now been reached on the DSA, however, the fine print on this and other issues will remain unknown until details are worked out at the so-called “technical level” and a final text is made available to the public (presumably sometime in May).
Given the opaque nature of the trilogue negotiations, we can’t yet evaluate exactly what was agreed upon on Saturday — and as the saying goes, the devil is in the details. Some of the contours of the agreement have already been outlined in press releases by the EU Parliament, Council, and Commission, indicating where compromises were reached on wedge issues like online advertising and “dark patterns”. An early analysis suggests that, thanks in part to the collective efforts of civil society, the DSA will limit platforms’ most egregious forms of tracking-based advertising and deceptive design practices, and introduce a strong EU-level enforcement regime for the largest platforms. We also welcome that the DSA introduces important safeguards for individual rights, such as improved “notice-and-action” procedures for users to flag potentially illegal online content, as well as redress mechanisms for users to dispute platforms’ content moderation decisions. Platforms, in turn, will be required to respond to such notices in a transparent way and without the threat of immediate legal liability — which is important to ensure that platforms do not over-block legal content. However, we suspect that some of the DSA’s final rules may not go far enough to protect users — a result that can be attributed to a massive lobbying campaign by Big Tech companies to influence EU regulations.
Although the EU now has a deal on the Digital Services Act, this story is far from over. To start with, we will only know whether the DSA keeps its promises once we see how it is enforced in practice. History has shown us that platforms will exploit every opportunity to prevent public scrutiny. For example, we remain concerned about the extent to which these companies will be able to invoke their “trade secrets” under the DSA to deny watchdogs’ data access requests. We can't understand the influence platforms have on our public debate if we don't understand how they algorithmically moderate it — the DSA offers a blueprint to help us do this, but it will only be meaningful if the law is effectively implemented and enforced.