Instagram algorithm: Süddeutsche publishes results of data analysis

Using thousands of data donations from AlgorithmWatch’s Instagram monitoring browser plug-in, German newspaper Süddeutsche Zeitung showed that posts from the far-right appear higher on users’ timelines.


Nicolas Kayser-Bril
Reporter

Between 13 April and 13 July, hundreds of data donors installed a browser plug-in built by AlgorithmWatch to share data about their Instagram timelines. They were asked to follow a selection of politicians or political parties. The plug-in automatically sent information about how posts by these politicians appeared on their timelines.

The period encompassed the first half of the campaign for Germany’s general election, due to take place on 26 September. The data collection process was interrupted in July after Facebook pressured AlgorithmWatch into shutting down the project.

AfD higher up

An anonymized dataset was shared with the data journalists of Süddeutsche Zeitung, who published their findings on 15 September. They found that posts from politicians of the right-wing Alternative for Germany (Alternative für Deutschland, AfD) appeared much higher, on average, in the newsfeeds of data donors than those of other parties’ politicians.

An analysis of the likelihood that a post appears in a user’s timeline, taking into account its recency and popularity, showed that some topics, such as crime and rent prices, performed better than others. You can read the whole story (in German) at sueddeutsche.de (paywall).
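Süddeutsche has not published its exact method, so the following is only a minimal sketch of one plausible approach, run on synthetic data: regress a post’s timeline position on its recency and popularity, then check whether posts on a given topic sit systematically higher than those two factors alone would predict. All variable names and numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical donated-post records: for each observed post, its position in
# the donor's timeline, its age in hours, its like count, and a topic label.
# Everything here is synthetic illustration data, not SZ's dataset.
rng = np.random.default_rng(0)
n = 500
age_hours = rng.uniform(1, 48, n)
likes = rng.poisson(200, n).astype(float)
topic = rng.choice(["crime", "rent", "climate"], n)

# Synthetic timeline position: newer and more-liked posts rank higher
# (a lower position number), plus a topic effect we then try to recover.
topic_boost = np.where(topic == "crime", -3.0,
                       np.where(topic == "rent", -1.5, 0.0))
position = (10 + 0.3 * age_hours - 0.01 * likes
            + topic_boost + rng.normal(0, 2, n))

# Fit position ~ intercept + recency + popularity, then compare residuals
# by topic: a topic whose posts sit systematically higher than recency and
# popularity alone would predict shows a negative mean residual.
X = np.column_stack([np.ones(n), age_hours, likes])
coef, *_ = np.linalg.lstsq(X, position, rcond=None)
residuals = position - X @ coef

for t in ["crime", "rent", "climate"]:
    print(t, round(residuals[topic == t].mean(), 2))
```

In this toy setup, "crime" and "rent" posts end up with negative mean residuals, i.e. they appear higher in the timeline than their recency and popularity would explain; the real analysis will have used richer data and likely a different model.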

These findings do not prove that Instagram favors the far-right, far from it. It could be that data donors were genuinely more interested in far-right content (a very unlikely scenario). Or it could be that far-right politicians leverage Instagram’s algorithm better, for instance by posting more pictures of faces and fewer pictures of text; one of our previous investigations showed that the newsfeed algorithm favors pictures of faces over pictures of text.

In that investigation, which used the same technical set-up in the run-up to the Dutch general election in March 2021, AlgorithmWatch did not find that Instagram favored the far-right. However, we monitored far fewer far-right politicians, and the sample was dominated by a couple of very large and very active accounts, which skewed the data. In Germany, Süddeutsche avoided this pitfall by monitoring the same number of accounts for each political party.
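The skew described above can be shown with a toy example: when one party’s observations come mostly from a couple of hyperactive accounts, a naive per-post average reflects those accounts rather than the party as a whole. Averaging per account first removes that distortion. All numbers below are made up for illustration.

```python
from statistics import mean

# Synthetic per-post timeline positions (lower = higher placement).
# Party A's data is dominated by two very active, well-placed accounts.
party_a = {
    "a1": [2, 3, 2, 3, 2, 3, 2, 3],  # hyperactive account, high placement
    "a2": [2, 2, 3, 3, 2, 2, 3, 3],  # another hyperactive account
    "a3": [9],                        # typical account, few posts
}
party_b = {
    "b1": [5, 6],
    "b2": [6, 5],
    "b3": [5, 6],
}

def pooled_mean(accounts):
    # Naive per-post average: dominated by the most active accounts.
    return mean(pos for posts in accounts.values() for pos in posts)

def per_account_mean(accounts):
    # Average each account first, so every account counts equally.
    return mean(mean(posts) for posts in accounts.values())

print(pooled_mean(party_a), per_account_mean(party_a))
print(pooled_mean(party_b), per_account_mean(party_b))
```

For party A the naive pooled average is noticeably lower (higher placement) than the per-account average, because the two hyperactive accounts contribute sixteen of seventeen observations; for party B, whose accounts post evenly, the two measures agree. Monitoring the same number of accounts per party, as Süddeutsche did, is another way of keeping any one account from dominating.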

More data needed

According to Facebook, which owns Instagram, 28 million residents of Germany, roughly one in three, used the platform last month. Any bias in the way it prioritizes content can affect many users and, in turn, voters. (By comparison, the country’s leading news show, Tagesschau, averaged about 12 million viewers in 2020.)

Despite their limitations, Süddeutsche’s findings show that not all politicians perform equally on Instagram. Why this is so remains an open question, which can only be answered by investigating its algorithm more thoroughly.

In the current context, investigating Facebook from the outside will not yield enough evidence to answer the question. Civil society researchers and academics face pressure from Facebook, which uses its might to bully them, as happened to AlgorithmWatch, or to break their projects by suspending their Facebook accounts, as happened to a team at New York University. Academics who chose to accept Facebook’s terms and work with data provided by the firm realized last week that the data they were given was erroneous. Thousands of hours of work were wasted.

To shed light on platforms’ algorithmic choices and on their impact on our democracy, public interest researchers need access to meaningful platform data. Only government regulators, equipped with strong enforcement capacity, could enable such access to investigate large platforms. It is high time that they start.
