Joint statement

A diverse auditing ecosystem is needed to uncover algorithmic risks

The Digital Services Act (DSA) will force the largest platforms and search engines to pay for independent audits to help check their compliance with the law. But who will audit the auditors? Read AlgorithmWatch and AI Forensics’ joint feedback to the European Commission on strengthening the DSA’s independent auditing rules via a Delegated Act.

The European Commission is set to formalize its rules for mandatory second-party audits of very large online platforms and search engines under the Digital Services Act (DSA), Europe’s new law to make powerful tech platforms like YouTube, TikTok, Facebook, and Twitter more transparent and accountable for the risks they pose to society.

Article 37 of the DSA reserves the term “independent audits” exclusively for second-party auditors, i.e., commercial auditors contracted and paid by the largest platforms and search engines to carry out the mandatory annual evaluations. These paid auditors will have privileged access to their clients’ internal systems, personnel, and data in order to check whether the audited companies’ systemic risk assessments and risk mitigation efforts pass muster.

Such second-party audits within the DSA framework are a significant undertaking that will likely give a competitive advantage to large, multinational consulting firms, the so-called “Big Four.” These firms often lack expertise in human rights assessment and are accountable primarily to their shareholders rather than to the broader public. And because the DSA requires platforms to arrange and pay for their own audits, it could open the door to audit capture, wherein auditors end up catering to the interests of their clients in order to retain lucrative auditing contracts, as continues to happen in the case of financial audits.

Second-party auditing may thus result in audit-washing with self-adopted methodologies and standards, potentially undermining the effectiveness and reliability of the auditing process and calling into question whether such audits can truly be “independent.”

In response to the Commission’s draft delegated regulation on independent audits, AlgorithmWatch and AI Forensics have jointly submitted feedback to the Commission calling for rules that will strengthen a diverse auditing ecosystem for algorithmic risks. Our recommendations draw on our organizations’ firsthand experience conducting independent, third-party or “adversarial” audits of platforms’ algorithmic systems. Unlike second-party auditors, third-party, adversarial auditors have no contractual relationship with the audited provider and are thus less at risk of audit capture and audit-washing.

We believe empowering a diverse auditing ecosystem that includes independent, third-party auditors is crucial to exposing risks on social media, raising public awareness, and putting pressure on the platforms to make necessary changes for the good of their users and the public. This ecosystem must be protected given platforms’ track record of hostility toward external scrutiny, and nurtured to ensure that data and compliance reports provided by big tech companies are indeed independently verified.

Our recommendations to the European Commission include:

  • The Delegated Act should expand on its requirements for selecting competent auditors and consider providing for additional oversight
  • Evidence from non-vetted researchers should also be considered by auditors
  • Independent third-party auditors, especially non-profits, should have access to financial resources which are not tied to platform demand and potential conflicts of interest
  • Civil society organizations and other third parties invested in the DSA’s accountability ecosystem (particularly those with relevant expertise, e.g., in risk assessment or human rights impact assessment) must be given a window into the risk assessment and mitigation implementation process
  • Stronger oversight is needed to ensure that redactions from audit reports are indeed proportional to the risks associated with disclosure of potentially sensitive or harmful information
  • Vetted researchers should be able to gain access to un-redacted audit reports, whether in part or in whole, in order to fulfill their public interest mission and better scrutinize systemic risk assessment and mitigation efforts

You can read AlgorithmWatch and AI Forensics’ joint feedback to the European Commission here:

Read more on our policy & advocacy work on the Digital Services Act.