Blog

Why we need to audit algorithms and AI from end to end

The full picture of algorithmic risks and harms is a complicated one. So how do we approach the task of auditing algorithmic systems? There are various attempts to simplify the picture into overarching, standardized frameworks, or to focus on particular areas, such as understanding and explaining the “black box” of models. While such work has its benefits, we need to look at systems from end to end to capture the full reality of algorithmic harms.

Conference

A Civil Society Summit on Tech, Society, and the Environment

At the “Civil Society Summit on Tech, Society, and the Environment” convened by EDRi, AlgorithmWatch and more than 100 civil society partners and digital rights organizations from around the world come together with EU policymakers to foster digital rights in the EU and create accountability for the public good.

In the run-up to the German federal state elections:

Chatbots are still spreading falsehoods

In September 2024, federal state elections will be held in Thuringia, Saxony, and Brandenburg. AlgorithmWatch and CASM Technology have tested whether AI chatbots answer questions about these elections correctly and without bias. The result: they are not reliable.

AlgorithmWatch is running a new round of its Algorithmic Accountability reporting fellowship

You can now apply to the fourth round of AlgorithmWatch's Algorithmic Accountability Reporting Fellowship. Running since January 2023, the program has connected journalists and researchers from across Europe and unveiled new stories about automated discrimination on the continent.

10 Questions about AI and elections

2024 is an important election year. While citizens all over the world get ready to cast their ballot, many people worry about AI: Are chatbots and fake images a threat to democracy? Can we still trust what we see online? We explain why the hype around AI and elections is somewhat overblown and which real risks we need to watch out for instead.

Campaign: ADM and People on the Move

Borders without AI

29,000 people have died in the Mediterranean over the past ten years while trying to reach the EU. You would think that the EU wanted this tragedy to stop and that scientists across Europe were working feverishly to make that happen with the latest technology. The opposite is the case: with the help of so-called Artificial Intelligence, the walls are being raised, financed with taxpayers' money.

EU’s AI Act fails to set gold standard for human rights

Following a gruelling negotiation process, EU institutions are expected to conclusively adopt the final AI Act in April 2024. Here’s our round-up of how the final law fares against our collective demands.

Yet to be delivered: labor rights in the gig economy

Digitally controlled platform work is fundamentally changing working conditions while current legislation is lagging behind this development. Our joint campaign "Liefern am Limit" is advocating for the rights of Lieferando drivers.

On our own behalf

Change of shareholders at AlgorithmWatch

Angela Müller, Head of AlgorithmWatch CH in Zurich, has become a shareholder of Berlin-based AW AlgorithmWatch gGmbH. Lorenz Matzat, co-founder of the organization and long-standing shareholder, has left the NGO to concentrate on his new company JETZT STUDIOS.

DSA Day and platform risks

Got Complaints? Want Data? Digital Service Coordinators will have your back – or will they?

The Digital Services Act (DSA) is the EU’s new regulation against risks from online platforms and search engines. It has been in effect since 2023, but 17 February 2024 marks “DSA Day,” on which many of the regulation’s most impactful provisions come into force.

Third cohort of AlgorithmWatch fellows will investigate discrimination in the financial sector

As part of AlgorithmWatch's algorithmic accountability reporting fellowship, seven journalists and researchers will work together to unveil discriminatory outcomes and practices arising from the use of automated decision-making systems in Europe's financial sector.

JobAlert: We are looking for a Head of PR & Outreach

AlgorithmWatch is looking for a Head of PR & Outreach for the Berlin office.

Social media

AlgorithmWatch suspends activities on X, formerly known as Twitter

Starting today, AlgorithmWatch ceases publication on X/Twitter. The decision follows the continuous and rapid disintegration of the social network since its new owner took over a year ago.

JobAlert: We are looking for a Senior Campaign Manager

Part-time or full-time with 30-40 hours per week

AI Safety Summit

Missed Opportunities to Address Real Risks

The UK did not need to throw its full weight behind the Frontier Risks narrative; there are other approaches it could have taken.

The 5 Best Podcasts on Algorithms and Work

Interested in how algorithmic systems affect us at work? Here are some well-researched podcast episodes to get drawn into.

Game

Can you break the algorithm?

AlgorithmWatch releases an online game on algorithmic accountability journalism. Players act as a journalist who researches the details of a social network’s algorithm.

Apply now for the next round of AlgorithmWatch’s algorithmic accountability reporting fellowship

Last year, AlgorithmWatch ran two successful rounds of the fellowship in algorithmic accountability reporting. Eleven extraordinary professionals from different spheres of civil society participated and gained new skills and new contacts.

Interview

New audits for the greatest benefits possible

Oliver Marsh is the new head of AlgorithmWatch’s project "Auditing Algorithms for Systemic Risks." He told us about his background and the goals that he will be pursuing.

The second group of AlgorithmWatch fellows is ready to go

We’re very happy to announce that another group of five fellows will work with us until December 2023 to research and publish stories about algorithmic accountability. They follow in the footsteps of our first six fellows.

Op-Ed on questionable Meta study

Social media algorithms are harmless, or are they?

New research published in Science and Nature suggests that Facebook and Instagram are not causing political polarization. But there are limitations in the research design that need to be discussed.

Help us fight injustice in hiring!

Donate your CV to fight together against automated discrimination in job application procedures!

Investigative journalism and algorithmic fairness

Investigating systems that make decisions about humans, often termed algorithmic accountability reporting, is becoming ever more important. To do it right, reporters need to understand concepts of fairness and bias, and to work in teams. A primer.

Platforms’ promises to researchers: first reports missing the baseline

An initial analysis shows that platforms have done little to “empower the research community” despite promises made last June under the EU’s revamped Code of Practice on Disinformation.

What does TikTok know about you? Data donations deliver answers!

Companies like Facebook, Instagram, Google, and TikTok often know about the harmful effects of their algorithmic systems and yet continue to prevent independent research on them. Data donations like DataSkop are one of the few ways to investigate opaque algorithms.