Our response to the European Commission’s planned Digital Services Act

Position

9 September 2020


The consultation period for the European Commission’s “Digital Services Act” ended on 8 September 2020. Read AlgorithmWatch’s submission to the Commission below (also available as a PDF file).

Overview

I. Context
Internet platforms play a central and ever-expanding role in modern society
Towards accountability in the digital public sphere
II. Policy Recommendations
1. Maintain the limited liability regime
2. Develop approaches to effectively audit algorithmic systems
3. Introduce comprehensive data access frameworks for public interest research
4. Establish mandatory and comprehensive universal advertisement libraries

Submission by AlgorithmWatch

On the European Commission’s “Digital Services Act” (DSA)

September 2020 

As an organization committed to upholding democratic values and fundamental rights, we see an urgent need to commit internet platforms to a higher level of transparency. We welcome the European Commission’s proposed Digital Services Act (DSA) as an opportunity to hold powerful platforms to account, and we strongly urge EU policymakers to put effective transparency at the heart of this “Magna Carta for Digital Services” by (1) maintaining the limited liability regime outlined in the e-Commerce Directive; (2) developing approaches to effectively audit algorithmic systems; and (3) introducing binding data access frameworks for public interest research, including (4) comprehensive advertisement libraries.

I. Context

Internet platforms play a central and ever-expanding role in modern society

In the spring of 2020, AlgorithmWatch found clear indications that Instagram’s newsfeed algorithm was nudging female content creators to reveal more skin by making semi-nude photos more visible in their followers’ feeds[1]. When we confronted Facebook, Instagram’s parent company, with our findings, they declined to answer our questions and accused us of “misunderstanding how Instagram works.” They also refused all requests to back up this statement with their own data, making it impossible to validate their claims.

Instagram’s response to our investigation is emblematic of a much deeper problem: algorithmically-driven internet platforms play a central and ever-expanding role in modern society. They are inextricably linked to how we coordinate remotely at school or work, how we find and consume information or products, and how we organize our social movements or exercise key democratic rights. From vacation home rental services to social networking sites, large parts of our commerce and communications infrastructure are governed by opaque and proprietary algorithmic curation, optimization, and recommendation systems. Platforms rely on extremely intrusive data collection practices to power these systems, but when independent journalists, activists, or academics try to understand the effects of these practices, data for public interest scrutiny is a scarce resource[2].

In recent years, platforms have further restricted access to their public Application Programming Interfaces (APIs), making it nearly impossible to hold companies accountable for illegal or unethical behavior[3].

The trend is especially worrisome with regard to online platforms that serve as content hosting providers. Today, these intermediaries play a key role in democratic deliberation. Algorithmically-driven content curation, on social media platforms in particular, can introduce a host of risks that undermine the communication processes necessary for democracy[4], ranging from hate speech and extremist content to algorithmic curation that discriminates against certain groups in society.

Towards accountability in the digital public sphere

The e-Commerce Directive from 2000 cannot address the power that online platforms exercise in today’s digital society, let alone make the digital ecosystem fit for the future.

When updating the e-Commerce Directive, we need to move beyond approaches that rely on online platforms’ self-regulation. Self-regulatory transparency frameworks have been found “incomplete, ineffective, unmethodical, and unreliable”[5]. And although improved user-facing transparency can offer much-needed insight into the personalized results presented to individual users, it cannot provide insight into the collective influence of platforms. To assess and monitor how platforms apply their community standards, or to address collective societal risks like disinformation, polarization, and bias, we must rely on evidence from independent third-party audits.

From Germany’s Interstate Media Treaty to France’s Avia Law to the EU’s most recent proposal on preventing the dissemination of terrorist content online, previous approaches to tackling platform power have been overlapping, incoherent, or otherwise fragmented[6]. This member-state-driven patchwork has created legal uncertainty for businesses and weakened civil society’s collective power. Accountability can only be achieved through a consolidated European approach.

The Digital Services Act is a great opportunity to fix these regulatory shortcomings and pave the way for a more accountable digital public sphere.

II. Policy Recommendations

Below, we outline our key demands for the DSA. Obligations and enforcement powers should differentiate between major players and smaller intermediaries; we therefore propose that the scope of the following recommendations be limited to dominant platforms[7].

1. Maintain the limited liability regime

Public authorities bound by fundamental rights cannot ban “low-quality” content or demand its suppression as long as it is legal[8]. This is a fundamental basis of freedom of expression and must not be undermined. The limited liability framework outlined in the e-Commerce Directive is therefore the right approach to dealing with illegal user-generated content. However, this framework must be enhanced and refined. Instead of introducing measures that oblige or encourage platforms to proactively monitor speech, the existing limited liability regime should be strengthened through more rigorous and clearer procedural standards for notice and action. Such standards should include complaint management and redress mechanisms, including put-back obligations. When embedded in a co-regulatory approach, independent dispute settlement bodies, such as those proposed by European Digital Rights (EDRi)[9], can play an important complementary role in ensuring that users’ fundamental rights are upheld. Moreover, an updated limited liability regime needs to be complemented by transparency frameworks and systems to effectively audit algorithmic systems, which empower independent third parties to hold platforms to account and thereby relieve individual users, at least in part, of the burden of proof.

2. Develop approaches to effectively audit algorithmic systems

AlgorithmWatch calls on the EU Commission to initiate a stakeholder engagement process aimed at developing effective measures for the auditing of algorithmic systems. We understand auditing in this context in accordance with ISO’s definition as a “systematic, independent and documented process for obtaining objective evidence and evaluating it objectively to determine the extent to which the audit criteria are fulfilled.”[10]

Specifically, the Commission should leverage stakeholder input to determine appropriate audit criteria and procedures.

Depending on the purpose of the audit and the system audited, different approaches can be appropriate, including code reviews, conformity assessments, examinations of training data, “black box” testing/audit studies, or other means. Auditing can also take the form of risk/impact assessments, especially where algorithmic systems may produce negative externalities (e.g., a vacation rental marketplace driving the gentrification of communities, or an information intermediary’s automated content moderation system putting certain communities at risk).
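
To make the idea of an audit study more tangible, here is a minimal, hypothetical sketch in Python of a “black box” test in the data-donation style: it compares the share of a content category in donated feed data with its share among all posts published by the followed accounts. The file names, column layout, and category label are placeholders we invented for illustration; a real audit would require a carefully designed panel, informed consent, and a far more robust methodology.

```python
# Hypothetical sketch of a "black box" audit study: compare how often a
# content category appears in users' algorithmically curated feeds with
# how often it appears among the posts actually published by the accounts
# those users follow. A large gap hints at algorithmic amplification.

import csv
import math


def count_category(path, category):
    """Count rows labelled `category`, and all rows, in a CSV with
    columns: post_id, category (a layout we assume for illustration)."""
    hits = total = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += 1
            if row["category"] == category:
                hits += 1
    return hits, total


def two_proportion_z(k1, n1, k2, n2):
    """z statistic for the difference between two proportions,
    using the standard pooled standard error."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se


if __name__ == "__main__":
    # Both file names and the category label are placeholders for donated data.
    shown, shown_total = count_category("donated_feeds.csv", "semi_nude")
    posted, posted_total = count_category("followed_posts.csv", "semi_nude")
    z = two_proportion_z(shown, shown_total, posted, posted_total)
    print(f"share shown in feeds:  {shown / shown_total:.3f}")
    print(f"share among all posts: {posted / posted_total:.3f}")
    print(f"z statistic: {z:.2f} (|z| > 1.96 ~ significant at the 5% level)")
```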

Both criteria and procedures should be further developed following a multi-stakeholder approach that actively takes into consideration the disproportionate effect automated decision-making (ADM) systems have on vulnerable groups and solicits their participation. We therefore ask the Commission and Member States to make available sources of funding aimed at enabling participation by stakeholders who have so far been inadequately represented.

The Commission should also clarify how such audits would be implemented and enforced in practice.

3. Introduce comprehensive data access frameworks for public interest research

The Commission must urgently address the auditing questions outlined above. In the meantime, it is already clear that any successful accountability framework will require improved transparency and access to platform data. That is why AlgorithmWatch proposes that the DSA introduce mandatory data access regimes for public interest research that empower civil society and pave the way for true accountability.

Learning from best practices in privacy-respecting data-sharing governance models, the DSA should enshrine a binding data access framework for public interest research.
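
To make “privacy-respecting” concrete, the following sketch illustrates one safeguard commonly found in such data-sharing governance models: an aggregation layer that enforces a minimum cell size, so that small groups of users cannot be singled out from released statistics. The threshold, function names, and data layout are our own illustrative assumptions, not a proposal for specific DSA wording.

```python
# Minimal sketch of one safeguard found in privacy-respecting data-sharing
# models: release only aggregate counts, and withhold any group smaller
# than a minimum cell size so that individual users cannot be singled out.

from collections import Counter

MIN_CELL_SIZE = 20  # illustrative threshold, not a legal prescription


def aggregate_counts(records, group_by):
    """Group records by one attribute, dropping groups below the threshold."""
    counts = Counter(record[group_by] for record in records)
    return {group: n for group, n in counts.items() if n >= MIN_CELL_SIZE}


if __name__ == "__main__":
    # Hypothetical platform records a vetted researcher might query.
    records = [{"country": "DE"}] * 150 + [{"country": "MT"}] * 4
    print(aggregate_counts(records, "country"))
    # -> {'DE': 150}; the group of four Maltese users is withheld entirely
```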

4. Establish mandatory and comprehensive universal advertisement libraries

What kind of data access could such a framework help facilitate? A promising approach in the area of online advertising is advertisement libraries (“ad libraries”) for online platforms, which disclose relevant information about the advertisements running on a particular platform. At the moment, such libraries are provided by select platforms on a voluntary basis. But as many studies on the implementation of the EU Code of Practice on Disinformation have shown, false negatives and false positives were rife in the political ad libraries of the code’s signatories: non-political advertisements were erroneously included in the libraries, while many political ads were excluded. The lack of a comprehensive repository of all ads made it impossible to verify whether all political ads were included, and the political ad libraries and labelling missed a great deal of sponsored content.

In a situation where the labelling of political ads is difficult to police, it is ultimately necessary to ensure the transparency of all ads. This is why we join the European Partnership for Democracy (EPD) and many other partners[16] in calling for the introduction of comprehensive advertisement libraries. Together, we ask the European Commission to develop and legally enact a set of minimum technical standards and protocols with which any ad library has to comply, ensuring accessibility and compatibility across ad libraries. As guidance, we suggest considering the criteria outlined in our joint statement.
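
What might such minimum technical standards cover? Purely as an illustration, the sketch below defines one possible machine-readable record format for ad-library entries, with fields for the sponsor, spend and reach brackets, run dates, targeting criteria, and an archived copy of the creative. Every field name here is a hypothetical assumption of ours; the criteria outlined in the joint statement remain the authoritative reference.

```python
# Illustrative sketch of what a minimum, machine-readable record format for
# a universal ad library could look like. All field names are hypothetical;
# binding standards would have to be defined by the legislator.

from dataclasses import dataclass, asdict, field
from datetime import date
import json


@dataclass
class AdLibraryEntry:
    ad_id: str          # stable identifier, unique per platform
    platform: str
    sponsor: str        # the paying entity, as verified by the platform
    first_shown: date
    last_shown: date
    spend_eur_min: int  # spend disclosed as a bracket
    spend_eur_max: int
    impressions_min: int  # reach disclosed as a bracket
    impressions_max: int
    targeting_criteria: list = field(default_factory=list)
    creative_url: str = ""  # archived copy of the ad itself


if __name__ == "__main__":
    entry = AdLibraryEntry(
        ad_id="a1b2c3",
        platform="example-social-network",
        sponsor="Example Campaign Ltd.",
        first_shown=date(2020, 8, 1),
        last_shown=date(2020, 9, 1),
        spend_eur_min=1_000,
        spend_eur_max=5_000,
        impressions_min=50_000,
        impressions_max=100_000,
        targeting_criteria=["age:18-35", "region:DE"],
        creative_url="https://adlibrary.example.org/creatives/a1b2c3",
    )
    # One shared export format would let researchers combine ad libraries
    # across platforms without writing per-platform parsing code.
    print(json.dumps(asdict(entry), default=str, indent=2))
```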


[1] Judith Duportail et al. (2020): Undress or fail: Instagram’s algorithm strong-arms users into showing skin.

[2] Madeline Brady (2020): Lessons Learned: Social Media Monitoring during Elections: Case Studies from five EU Elections 2019-2020 (Democracy Reporting International).

[3] Researchers depend on private data-sharing partnerships and privileged access to platform data, which further entrenches platform power and creates chilling effects among researchers, who fear losing access to platform data. Researchers who depend on what little data is available complain about its poor quality: frequently, data is not available in machine-readable format, or it is clearly inaccurate. The consequences are twofold. The impact of platforms on society remains severely understudied at a systemic level, and the research that exists skews heavily towards the most transparent platforms, causing substantial distortions. For further details, see Nicolas Kayser-Bril (2020): For researchers, accessing data is one thing. Assessing its quality another; and Under the Twitter streetlight: How data scarcity distorts research.

[4] Birgit Stark et al. (2020): Are Algorithms a Threat to Democracy? The Rise of Intermediaries: A Challenge for Public Discourse.

[5] For further elaboration on the recommendations, see Jef Ausloos et al. (2020): Operationalizing Research Access in Platform Governance: What to Learn from Other Industries?

[6] Matthias Cornils et al. (2020): Designing Platform Governance: A Normative Perspective on Regulatory Needs, Strategies, and Tools to Enhance the Information Function of Intermediaries.

[7] For nuanced criteria that characterize dominant platforms/intermediaries, see EDRi (2020): Platform Regulation Done Right. EDRi Position Paper on the EU Digital Services Act, p. 16.

[8] Matthias Cornils et al. (2020): Designing Platform Governance: A Normative Perspective on Regulatory Needs, Strategies, and Tools to Enhance the Information Function of Intermediaries. At the same time, international human rights law (e.g., Art. 10 ECHR) imposes very strict conditions under which states can restrict freedom of expression and information, notably the principles of legality, legitimacy, necessity, and proportionality.

[9] EDRi (2020): Platform Regulation Done Right. EDRi Position Paper on the EU Digital Services Act.

[10] ISO (2018): Guidelines for auditing management systems (ISO 19011).

[11] In general, online services and platforms should be free to choose the terms by which they offer their services and the guidelines they use to moderate content. However, they have, for example, an obligation to treat all users equally, and this treatment needs to be accounted for.

[12] It is essential that disclosure rules remain flexible and subject to updates and revisions by the proposed independent institution.

[13] Beatrice de Graaf et al. (2017): The Dutch National Research Agenda in Perspective: A Reflection on Research and Science Policy in Practice.

[14] AlgorithmWatch (2020): Our response to the European Commission’s consultation on AI.

[15] See the Findata case study in Jef Ausloos et al. (2020): Operationalizing Research Access in Platform Governance: What to Learn from Other Industries?

[16] See our joint statement here; in a similar vein, Mozilla has called for ad archives.

Read more on our policy & advocacy work on the Digital Services Act.
