In a 2018 opinion piece in the New York Times, sociologist Zeynep Tufekci accused YouTube of being the ‘Great Radicalizer’. Watch a video on the platform, she wrote, and before long its recommendation engine will show you conspiracy theories, far-right memes and other extreme political content.
Since then, Google, which owns YouTube, seems to have taken steps to reduce the amount of extremist content recommended to users. In 2019, the platform said that “consumption on authoritative news partners’ channels has grown by 60 percent” as a result of algorithm changes, but without providing a baseline or a methodology.
5096 data donations
With the DataSkop project, AlgorithmWatch set out to verify such claims and shed light on YouTube’s algorithms. Over five thousand data donors downloaded a program that explored their YouTube recommendations and shared anonymous data with researchers. DataSkop is financed by the German Federal Ministry of Education and Research and brings together AlgorithmWatch, Europa Universität Viadrina, Mediale Pfade, FH Potsdam and Universität Paderborn. The collected data will be used for a series of articles. Der Spiegel had access to the data and published its own analysis (paywall).
Between 15 July and 25 August, 5096 data donors downloaded the program and shared their data. DataSkop first checked the “Top stories” section of YouTube’s “News” channel, an automatically generated selection of videos that is recommended to all users when they open the app or the platform’s homepage. The program then automatically clicked on one of the highlighted videos and collected the recommendations YouTube displayed next to it.
Whether data donors were logged in or not, YouTube showed mostly established news sources. All of the top 10 most-shown channels belong to traditional broadcasters, except for wetternet, a weather channel, and Ultralativ, a channel that mentioned the data donation and from which many data donors came.
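A channel ranking like the one above can be computed with a few lines of code. The sketch below assumes a simplified, hypothetical donation format in which each record is just the list of channel names a donor was shown; the field layout and the sample records are illustrative, not DataSkop’s actual schema or data:

```python
from collections import Counter

def top_channels(donations, n=10):
    """Tally how often each channel appears across all donors'
    recommendation lists and return the n most-shown channels."""
    counts = Counter()
    for recommended_channels in donations:
        # each donation is assumed to be a list of recommended channel names
        counts.update(recommended_channels)
    return counts.most_common(n)

# illustrative records only, not real DataSkop data
donations = [
    ["WELT Nachrichtensender", "tagesschau", "WELT Nachrichtensender"],
    ["WELT Nachrichtensender", "ZDFheute Nachrichten"],
]
print(top_channels(donations, n=2))
# [('WELT Nachrichtensender', 3), ('tagesschau', 1)]
```

Counting channel occurrences rather than unique donors is one of several possible metrics; a real analysis would also need to control for donors arriving via a single channel, as with Ultralativ above.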
One brand, Welt, outranked all others.
Clicking on a video by Die Welt generates more recommendations from WELT Nachrichtensender, as every video from YouTube’s news playlist comes with further recommendations from the same channel.
It remains unclear why the WELT Nachrichtensender channel appears so prominently in the news section. Are the videos simply well optimized for YouTube, or is it due to the channel’s enormous output? More research is needed to answer such questions.
Welt is the broadest news brand of media giant Axel Springer (since 2018, the brand encompasses the former news channel N24 as well as the newspaper Die Welt). Springer is known for its flagship tabloid Bild and its closeness to the right-wing party CDU. Angela Merkel, the outgoing German chancellor and herself a CDU member, is believed to be friends with members of Springer’s top brass. Christian Nienhaus, head of print at Springer, is even a candidate in next Sunday’s election. For the CDU, of course.
That Springer’s videos appear twice as often as the next most-shown channel on YouTube in the run-up to the most disputed election in decades should alarm Germany’s media regulators.
Google did not answer our specific questions regarding the choice of channels displayed by YouTube, stating only that it had no official cooperation with Axel Springer or Welt.
In mid-2018, the French media regulator carried out an experiment in which 40 employees scraped YouTube recommendations over ten days. Of the 20 most recommended channels, only one was a YouTuber; the others were shows from mainstream television stations or news outlets.
In Germany, the media regulator of Berlin and Brandenburg (Mabb) published similar results in early 2021. On the one hand, it lauded the platform for the limited amount of falsehoods that were algorithmically pushed to users. On the other, it regretted that the recommended channels lacked diversity and were heavily skewed towards traditional television stations.
While it seems clear that YouTube does not massively recommend extremist content to all users, many questions remain. All of the above-mentioned experiments, including AlgorithmWatch’s, suffer from biased sampling. Recommendations might differ on mobile devices, where most of the viewing occurs, or change based on a user’s past behavior. A more diverse sample of donors would be needed. In particular, fine-grained sampling could assess whether people who are close to becoming extremists can be pushed over the edge by YouTube’s algorithm, as a New York Times investigation showed in 2019.
Regulators must carry out this continuous monitoring, and Google must be made to allow it. Efforts by AlgorithmWatch and civil society are vital to bring facts to the public debate, but organizations such as ours are ultimately too small to scrutinize in detail a platform that 36 million Germans use at least weekly.
This article was edited on 5 April 2022.