Joint statement

A diverse auditing ecosystem is needed to uncover algorithmic risks

The Digital Services Act (DSA) will force the largest platforms and search engines to pay for independent audits to help check their compliance with the law. But who will audit the auditors? Read AlgorithmWatch and AI Forensics’ joint feedback to the European Commission on strengthening the DSA’s independent auditing rules via a Delegated Act.

The European Commission is set to formalize its rules for mandatory second-party audits of very large online platforms and search engines under the Digital Services Act (DSA), Europe’s new law to make powerful tech platforms like YouTube, TikTok, Facebook, and Twitter more transparent and accountable for the risks they pose to society.

Article 37 of the DSA reserves the term “independent audits” exclusively for second-party auditors, i.e., commercial auditors contracted and paid by the largest platforms and search engines to fulfill mandatory annual evaluations. These paid auditors will have privileged access to their clients’ internal systems, personnel, and data to check whether the audited companies’ systemic risk assessments and risk mitigation efforts pass muster.

Such second-party audits within the DSA framework are a significant undertaking that will likely give a competitive advantage to large, multinational consulting firms—the so-called “Big Four.” These firms often lack expertise in human rights assessment and are primarily accountable to their shareholders rather than the broader public. And because the DSA requires platforms to arrange and pay for their own audits, this could open the door to audit capture, wherein auditors end up catering to the interests of their clients in order to retain lucrative auditing contracts, as continues to happen with financial audits.

Second-party auditing may thus result in “audit-washing,” in which self-adopted methodologies and standards undermine the effectiveness and reliability of the auditing process, calling into question whether such audits can truly be “independent.”

In response to the Commission’s draft delegated regulation on independent audits, AlgorithmWatch and AI Forensics have jointly submitted feedback calling for rules that strengthen a diverse auditing ecosystem for algorithmic risks. Our recommendations follow from our organizations’ firsthand experience conducting independent, third-party or “adversarial” audits of platforms’ algorithmic systems. Unlike second-party auditors, third-party, adversarial auditors have no contractual relationship with the audited provider and are thus less at risk of audit capture and audit-washing.

We believe empowering a diverse auditing ecosystem that includes independent, third-party auditors is crucial to exposing risks on social media, raising public awareness, and putting pressure on the platforms to make necessary changes for the good of their users and the public. This ecosystem must be protected given platforms’ track record of hostility toward external scrutiny, and nurtured to ensure that data and compliance reports provided by big tech companies are indeed independently verified.

You can read our full recommendations in AlgorithmWatch and AI Forensics’ joint feedback to the European Commission.

Read more on our policy & advocacy work on the Digital Services Act.