Press release
Google AI risks to media pluralism investigated by AlgorithmWatch using brand new data access rules
AlgorithmWatch is requesting data from Google regarding the impact of their “AI Overviews” service on traffic to external websites. The NGO is concerned that AI Overviews pose a risk to media pluralism and freedom of information and is one of the first organizations to use new EU rules to gain access to internal data from major online services.

Google rolled out AI Overviews earlier this year, meaning that many users now see by default an AI-generated short answer to their search above the ‘traditional’ search results. However, there are concerns about the quality and accuracy of these overviews, and about their effect on the wider information environment when users simply read the summaries rather than visiting the websites offered in the search results. Beyond the well-known risk of AI giving users unreliable answers, there may also be systemic risks to organizations that provide reliable information.
In September, AlgorithmWatch was part of an alliance of NGOs, associations, and media industry organizations who sent a complaint to the Bundesnetzagentur, Germany’s regulator for the DSA, against Google's AI Overviews service, calling them a “Traffic Killer” for independent media.
Dr. Oliver Marsh, Head of Tech Research at AlgorithmWatch, said: “We are today requesting data from Google about how many people still visit websites after searches, versus those who simply stay in Google Search and read AI Overviews. Traffic to sources like newspapers and other websites is already drying up – a severe risk that providing reliable information will no longer be a feasible business model. What will replace such websites? An error-prone, opaque AI tool.
Google claims to have carried out risk assessments of the tool, as required under the DSA. But the details remain undisclosed. This is unacceptable for such a massive risk to our information environment.”
New Internal Data Access Rules
From today, under Article 40.4 of the EU’s Digital Services Act, non-commercial research organizations may apply to national regulators (called Digital Services Coordinators) for access to internal data from Very Large Online Platforms and Search Engines. These are services with more than 45 million users in the EU, amongst them Google Search, Facebook, and TikTok, as well as large app stores, e-commerce services such as Temu, and some large adult entertainment platforms.
The data requested must be in the service of investigating so-called ‘systemic risks’, such as risks to freedom of speech and information, public health, electoral processes, or other risks listed in the DSA. If the service refuses to give the data, for instance on grounds of commercial secrecy, the national Digital Services Coordinators can mediate between the company and researchers to find a solution.
Dr. Marsh said: “The new data access rules are an exciting addition – they go beyond allowing researchers to use publicly available data, giving vetted researchers access to internal data so we can conduct more detailed analyses of the impacts of technologies. In particular, companies have continually asked us to trust that they have good data on risk assessments – now we may actually see that data ourselves. These rules also involve new assessment processes to ensure recipients will handle the data appropriately, and we look forward to working with the German regulators to test their processes.”
Background
Article 40.4 builds on an already implemented Article of the DSA, 40.12, which allows researchers to request data that is already publicly available in order to research systemic risks – such as social media posts on a particular topic, or information on products available in e-commerce stores. At the beginning of October, AlgorithmWatch, alongside the Mozilla Foundation and the DSA 40 Data Access Collaboratory, released a “mass data access request” asking six large platforms to deliver regular lists of their most-viewed content in EU Member States to a coalition of over 20 research organizations.