56 Results for "dsa"

Page 1 of 6
Photo by thom masat on Unsplash

Position, 31 May 2023

Open letter

DSA must empower public interest research with public data access

Access to “public data” is key for researchers and watchdogs working to uncover societal risks stemming from social media, but major platforms like Facebook and Twitter are cutting off access to important data analytics tools used to study them. The EU must now step in to ensure that researchers aren’t left in the dark.

Read more
Ravi Sharma | Unsplash

Position, 30 March 2022

DSA trilogues in the endgame: Policymakers must prioritize platform transparency

With the trilogue negotiations entering their final phase, key issues remain at stake that will determine the final text of the Digital Services Act (DSA). In this policy paper, we urge EU negotiators to prioritize issues that are central to the DSA’s accountability structure – including third-party data access for public scrutiny, independent audits, and increased transparency for online advertisements.

Read more
Maarten van den Heuvel | Unsplash

Blog, 27 January 2022

EU Parliament approves its negotiating position on the DSA

The plenary vote establishes the European Parliament's position ahead of the trilogue negotiations with the Council of the EU and the Commission, which will start next week. Despite progress by the Parliament on issues like platform transparency, it is far from guaranteed that this progress will be enshrined in the final law.

Read more
Dmitry Ratushny | Unsplash

Position, 14 December 2021

DSA milestone: EU lawmakers have responded to our calls for meaningful transparency for big tech

Over the last months, AlgorithmWatch – supported by dozens of civil society organizations and researchers, and over 6,000 individuals – has advocated for using the Digital Services Act (DSA) to enable meaningful transparency into the way online platforms influence our public sphere. The vote in the European Parliament today shows that our work has made an impact.

Read more
Luke Watkinson | Unsplash

Position, 29 November 2021

Holding platforms accountable: The DSA must empower vetted public interest research to rein in platform risks to the public sphere

The negotiations on the Digital Services Act (DSA) are now at a critical juncture. We have written an open letter to all IMCO Committee Members of the European Parliament asking them to empower a broad base of vetted public interest researchers whose independent scrutiny is vital to holding large tech platforms accountable. It has been signed by 22 international academics and independent researchers and 29 civil society organisations.

Read more
Krzysztof Hepner | Unsplash

Position, 16 December 2020

The DSA proposal is a good start. Now policymakers must ensure that it has teeth.

AlgorithmWatch reacts to the release of the EU's Digital Services Act (DSA).

Read more
Photo by Nathalia Segato on Unsplash

Story, 17 November 2023

Platform regulation

Not a solution: Meta’s new AI system to contain discriminatory ads

Meta has deployed a new AI system on Facebook and Instagram to fix its algorithmic bias problem for housing ads in the US. But it’s probably more band-aid than AI fairness solution. Gaps in Meta’s compliance report make it difficult to verify if the system is working as intended, which may preview what’s to come from Big Tech compliance reporting in the EU.

Read more
Khari Slaughter for AlgorithmWatch

Project, 5 October 2023

New research

ChatGPT and Co: Are AI-driven search engines a threat to democratic elections?

A new study by AlgorithmWatch and AI Forensics shows that using Large Language Models like Bing Chat as a source of information for deciding how to vote is a very bad idea. Their answers to important questions are partly completely wrong and partly misleading, which makes chatbots like ChatGPT a danger to the formation of public opinion in a democracy.

Read more

Blog, 19 September 2023

Interview

New audits for the greatest benefits possible

Oliver Marsh is the new head of AlgorithmWatch’s project "Auditing Algorithms for Systemic Risks." He told us about his background and the goals that he will be pursuing.

Read more
Yasmin Dwiputri & Data Hazards Project / Better Images of AI / AI across industries / Licenced by CC-BY 4.0

Publication, 1 August 2023

Making sense of the Digital Services Act

How to define platforms’ systemic risks to democracy

It remains unclear how the largest platforms and search engines should go about identifying “systemic risks” to comply with the DSA. AlgorithmWatch outlines a methodology that will serve as a benchmark for how we, as a civil society watchdog, will judge the risk assessments that are being conducted at this very moment.

Read more