Berlin / Brussels, 26 May 2020. An effective regulatory framework for intermediaries cannot be achieved without meaningful transparency in the form of data access for journalists, academics and civil society actors. This is the result of two new studies published by AlgorithmWatch in cooperation with the European Policy Centre and the Mainz Media Institute.
The COVID-19 pandemic has underscored previous concerns about the tremendous power that intermediaries like Facebook and YouTube have in determining how information is found, moderated, and disseminated online. We are convinced that the EU’s forthcoming Digital Services Act presents a unique opportunity for European policymakers to address the systemic failures of existing regulatory frameworks, whilst safeguarding user rights and reducing online harms.
Since October 2019, AlgorithmWatch and the European Policy Centre have coordinated Governing Platforms, a multi-stakeholder dialogue series with participants from civil society, academia, policy, and the private sector. At today’s second (online) stakeholder convening, we present findings from two new studies commissioned from academic partners at the Mainz Media Institute.
In their study, ‘Are Algorithms a Threat to Democracy?’, communications scholar Birgit Stark and her colleagues highlight some of the real, negative effects of online hate speech, especially for women, who are disproportionately impacted by online incivility. The study shows that while it is clear that intermediaries contribute to a growing climate of incivility online, the effects of challenges like disinformation are less clear, because research in this area is hindered by a lack of access to platform data.
“You can’t regulate what you don’t understand,” says Mackenzie Nelson, project manager at AlgorithmWatch and coordinator of Governing Platforms. “The Stark report highlights, among other findings, the importance of transparency and data access as a prerequisite for evidence-based regulation and accountability mechanisms.”
In recent weeks, civil society watchdogs have voiced concern about the spread and amplification of COVID-19-related dis- and misinformation, and some scholars have even gone so far as to warn of a social media-driven “infodemic.” At the same time, draconian emergency measures aimed at limiting the spread of coronavirus-related “rumors” and “fake news” reveal the dangers of state-imposed restrictions on potentially harmful but legal online content.
Wary of such dangers, and of the highly sensitive nature of communications regulation, Matthias Cornils, professor of media and public law at Mainz University, argues that public pressure from civil society and non-government actors is “a very important element of platform governance”. But in order to apply such pressure, civil society actors, journalists and users must be empowered, meaning that they are granted access to platform data.
“Decisions to regulate communications (both on- and offline) should be grounded in empirical experience,” writes Cornils, author of ‘Designing Platform Governance: A Normative Perspective on Needs, Strategies, and Tools to Regulate Intermediaries’. Therefore, “transparency obligations are the entry level of any imperative regulation”.
Prof. Dr. Birgit Stark and Daniel Stegmann, M.A.,
with Assoc. Prof. Melanie Magin and Dr. Pascal Jürgens
Prof. Dr. Matthias Cornils
Mackenzie Nelson (AlgorithmWatch)
Project Manager, Governing Platforms