Left on Read: How Facebook and others keep researchers in the dark

Internet platforms routinely deny researchers' requests for data, or accede to them only arbitrarily, hampering social science in the process.

Nicolas Kayser-Bril

Users of online services routinely denounce double standards and abuse. Instagram has been accused of censoring People of Color, TikTok of limiting the reach of videos featuring people with disabilities, and Facebook of helping con artists find new victims. But because the data that could test such hypotheses is often inaccessible, definitive evidence is lacking. This prevents researchers from establishing the factual truth society could use as a basis for political action.

Facebook, Google and others “hardly ever cooperate with independent research projects,” as Mario Haim and Angela Nienierza, from the University of Munich, wrote in an academic article last year. Requests from researchers, be they from academia or other institutions, are often “left on read,” internet parlance for when a conversation partner has received a message but not answered it.


Not only are researchers unable to carry out in situ observations or access internal data; platforms also hamper their work when they use official application programming interfaces (APIs) to extract material for their research.

Philip Kreißel, a researcher who is part of the #ichbinhier campaign against hate speech, told AlgorithmWatch that he started monitoring Facebook in 2016. By looking at who commented and liked comments, he showed that a minority of far-right users, probably organized in groups, were responsible for much of online hate speech.

But in 2018, in the wake of the Cambridge Analytica scandal, Facebook blocked the interface that he used to monitor comments, in what some researchers dubbed an “APIcalypse”. Multiple requests by Mr Kreißel to be granted access again were denied.

Social science muzzled

In response to mounting criticism, Facebook launched Social Science One, an industry-academic partnership that promised to grant select researchers access to some data. Judith Möller, an assistant professor at the University of Amsterdam, told AlgorithmWatch that although she was granted access to the program, the partnership was marred by delays and communication issues.

Axel Bruns, a professor at Queensland University of Technology, argued in a 2019 article that the terms of Social Science One strongly discourage criticism of Facebook. Because Facebook retains the power to terminate a relationship at any time, researchers have a strong incentive not to explore issues that might displease the social network. Ulrike Klinger, a professor at the Free University of Berlin and co-organizer of a conference on data access for research, called Social Science One “a good idea that turned into a disaster”.


In spite of the hurdles set up by online giants, researchers keep trying to measure their effects. One possibility is to ask third parties for data. Simon Kruschinski of Mainz University obtained, under a non-disclosure agreement, detailed insights from German political parties on how they advertised on Facebook, although Facebook's terms of service prevent the parties from sharing such information. (Facebook has already threatened to sue researchers over terms of service infringements.)

Others, such as Munich University’s Mr Haim and Ms Nienierza, built browser plug-ins to access data from volunteers. To explore mobile apps, Jack Bandy and Nicholas Diakopoulos of Northwestern University asked workers recruited on Amazon’s Mechanical Turk to take screenshots with their devices at regular intervals.

We, at AlgorithmWatch, are often left on read as well. A request to access the API of Google’s Perspective, a service for automated content moderation, was never answered. We ended up collecting data manually, a time-consuming process that prevented us from doing a large-scale audit of the service.
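For readers unfamiliar with what such access would look like in practice: Perspective is queried over a standard REST endpoint. The minimal sketch below assembles a toxicity-scoring request as described in Google's public documentation; the API key is a placeholder, and actually sending the request requires access that, as noted above, must first be granted.

```python
# Sketch of a Perspective API toxicity request.
# Endpoint and field names follow Google's public documentation;
# API_KEY is a placeholder -- a real call requires granted access.
import json

API_KEY = "YOUR_API_KEY"  # placeholder
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def build_request(comment_text):
    """Assemble the JSON payload Perspective expects for a toxicity score."""
    return {
        "comment": {"text": comment_text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }

payload = build_request("Example comment to score.")
print(json.dumps(payload))

# To actually send it (only works with a valid, granted key):
# import urllib.request
# req = urllib.request.Request(URL, data=json.dumps(payload).encode(),
#                              headers={"Content-Type": "application/json"})
# response = json.load(urllib.request.urlopen(req))
# score = response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
```

Without that key, each comment has to be scored by hand through the web demo instead, which is what made our manual audit so slow.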

Have you been left on read?

As the European Commission is asking for input on the issue for the upcoming Digital Services Act, we would like to better understand the scale of the problem, and perhaps to identify companies that found good ways of cooperating with researchers.

If you have been left on read, or if you have been granted access to data, we would be very grateful if you shared your story with us. You can do so anonymously or leave your name and email for a possible follow-up.

Learn more about our 'left on read' campaign.

Did you like this story?

Every two weeks, our newsletter Automated Society delves into the unreported ways automated systems affect society and the world around you. Subscribe now to receive the next issue in your inbox!
