Interview

New audits for the greatest benefits possible

Oliver Marsh is the new head of AlgorithmWatch’s project "Auditing Algorithms for Systemic Risks." He told us about his background and the goals he will be pursuing.

Oliver Marsh
Project Lead "Auditing Algorithms for Systemic Risks"

Welcome to AlgorithmWatch! Tell us about your background and what you’ll be bringing to the team.

It's really great to be a part of AlgorithmWatch! I've spent most of my career looking into how online platforms affect society, particularly politics and democracy. After my doctorate, which was on reason and emotion online, I moved into the UK government, helping set up the counter-disinformation Rapid Response Unit in 10 Downing Street and working on how the GDPR functions post-Brexit. Later on, I worked on countering online harms at various think tanks, including CASM Technology, the Institute for Strategic Dialogue, and the Tony Blair Institute for Global Change. I've spent so long looking at things that go wrong with online platforms that it's good to work on a project which tries to stop them before they happen.

You’ll be leading the project Auditing Algorithms for Systemic Risks. Please tell us about systemic risks.

Algorithms can create various risks by making decisions which affect people’s lives, in hiring, justice, and medical treatments, to name just a few examples. The algorithms behind online platforms are an important example of systemic risks, as they can be systemically biased towards or against certain kinds of content, or might promote false information or hate speech. In doing so, they can harm the internet as a tool for diverse, democratic, and positive discussion.

The EU’s new Digital Services Act (DSA) is creating a useful focal point for considering how such systemic risks can be addressed, particularly in relation to online platforms. But plenty of similar proposals covering other algorithmic systems are emerging in other parts of the world.

How does the DSA specifically address such systemic risks?

For example, it includes measures to empower external researchers with more funding and data access to study online platforms. It also aims to create transparency around recommender systems and content moderation. But most importantly, the DSA defines specific systemic risks that online platforms need to take into account: risks stemming from illegal content and risks which have negative effects on fundamental rights, public health, minors, civic discourse, electoral processes, and public security.

The DSA creates mechanisms by which such risks can potentially be uncovered and protected against. Firstly, the largest online platforms must conduct their own annual risk assessments. The deadline for the first of these, August 25th, has already passed, but the resulting assessments are sadly only available to the European Commission. Then, from February 2024, various additional measures begin, including independent audits of platforms and opportunities for regulators and vetted researchers to request relevant data from platforms. These are exciting developments, particularly given that the online world is becoming increasingly opaque and hard to research.

How does our project fit into these developments?

Firstly, although the DSA specifies some systemic risks, there are still a lot of questions about what makes these risks systemic and how to develop measures of risk. We want to develop ideas of what good audits and assessments could actually look like; AlgorithmWatch’s Michele Loi has already published some early thinking to prompt those discussions. I'm also planning to leverage my background in social media research to reflect on how the DSA can help protect the integrity of upcoming European elections, not least the European Parliament elections in 2024. Finally, we will work with researchers and civil society organizations to make sure these new audits and transparency requirements bring the greatest benefits possible: to citizens, governments, the platforms themselves, as well as to democracies and societies in general.
