Germany’s new media treaty demands that platforms explain algorithms and stop discriminating. Can it deliver?

Facebook can’t decide if it’s a tech company, a media company, a telecoms company, or something else entirely. Ahead of talks with European regulators, CEO Mark Zuckerberg said it’s something in between. Now, German regulators might have decided for Facebook: The company will be subject to the country’s newly expanded media regulation regime. 

In December, Germany’s federal states approved the Interstate Media Treaty (Medienstaatsvertrag, or MStV for short), draft legislation setting rules for TV and radio that, as an agreement among the states, functions much like a federal law. The new Media Treaty expands on the powers outlined in the Interstate Broadcasting Treaty, which until now has been the principal regulatory framework for public-service and commercial broadcasting in Germany. For the first time, the rules will also cover social media platforms, search engines, and video portals, subjecting them to independent, non-governmental oversight by Germany’s state media authorities (Landesmedienanstalten). The MStV will likely take effect this fall, after years of contentious legislative debate and two public consultations.

The MStV is Germany’s latest attempt to rein in the de facto gatekeeping power that companies such as Facebook, YouTube, and Google wield over millions of citizens. With its focus on media pluralism and safeguarding the diversity of information, it represents a significant conceptual departure from other regulatory approaches, which tend to focus exclusively on content removal.

From “broadcast” treaty to “media” treaty

The new media rules expand and update long-standing German broadcasting regulation. In Germany, the federal states are in charge of developing rules for media regulation, which are implemented and enforced by the independent state media authorities. The media authorities have strong expertise in broadcasting law, and develop and enforce regulations on a range of media issues, including advertising standards, accessibility of audiovisual services, and protection of minors. Using the powers granted to them under the Interstate Broadcasting Treaty, and working with affiliated bodies such as the Commission on Concentration of Ownership in the Media (KEK), the media authorities are also tasked with issuing TV broadcast licenses in a manner that prevents market concentration and promotes media pluralism.

The transition away from a treaty focused exclusively on broadcast media to one that encompasses “media” more generally reflects the states’ and the media authorities’ push to modernize broadcast regulation. This modernization is also somewhat self-serving, as the media authorities have an interest in staying relevant amidst a rapidly changing digital media landscape.

The German proposals go much further than the EU requirements laid out in the Audiovisual Media Services Directive (AVMSD), on which the MStV is partly based. The EU rules focus on the protection of minors and on giving prominence to European films and series, and exhibit none of the ambitious regulatory goals seen in the MStV.

Social media and search regulation in the MStV: transparency and non-discrimination rules

While the MStV also introduces provisions for media platforms and streaming services like Netflix, our analysis focuses on the MStV’s approach to dealing with what it calls “media intermediaries.” Media intermediaries are defined as “any telemedia that aggregates, selects, and presents third-party journalistic/editorial offers, without presenting them as a complete offer.” While the exact scope is still being worked out, an earlier draft of the MStV included a list of digital offers ranging from search engines to social networks and news aggregators, suggesting the rules would likely cover platforms like Google, Facebook, and Apple News.

For these US-based, globally operating companies, the MStV introduces new rules on transparency and non-discrimination. Under the transparency provisions, intermediaries will be required to provide information about how their algorithms operate, including the criteria that determine how content is aggregated, selected, and presented.

This information must be made available in easy-to-understand language, and any changes to the criteria must be made public immediately. Media intermediaries that act as social networks will also be required to identify and label “social bots.”

The provisions on non-discrimination prohibit media intermediaries from discriminating against journalistic and editorial offers or treating them differently without “appropriate justification.” According to the treaty, discrimination amounts to “systematically hampering” one offer in favor of another. If the provider of journalistic/editorial content believes that their content has been discriminated against, they can file a claim with their state media authority.

The devil in the details 

At first glance, the treaty’s goals are praiseworthy. Academic literature critiquing “black box” gatekeepers and the power they wield over the public sphere abounds, and it is both logical and necessary that media policy be updated to reflect technological change.

Yet, despite pressure from civil society and experts in the field of media law, many of the treaty’s provisions remain fuzzy and at times contradictory. For instance, what exactly qualifies as a media intermediary? How does the draft deal with overlaps between definitions such as media intermediaries and user interfaces? What algorithmic decision-making criteria should be disclosed, and how would these disclosures differ from the information that platforms like Google already self-report? Finally, what is a “social bot,” and how can platforms identify one?¹

Without precise yet dynamic specifications of which information which platforms are required to provide, and to whom, there is a risk that these transparency measures will be less effective than anticipated.

The lack of clarity around such questions is exacerbated by the fact that the transparency requirements serve as the basis for assessing compliance with the non-discrimination provisions. In light of this, it is difficult to imagine how systematic (non-)discrimination could be proven or disproven in practice.

This is especially true on a platform like Google, which offers services specifically designed to discriminate against or in favor of relevant content. As Wolfgang Schulz, Director of the Hans Bredow Institute, notes in a 2018 interview, the whole point of Google’s search service is to discriminate, as “a search engine that sorts information at random would be useless.”

Even if the media authorities are able to identify and prove discrimination, should providers of journalistic or editorial content really be the only ones able to file a complaint? YouTube content creators already complain that the platform’s trending algorithm is biased in favor of publishers and traditional media, and such a narrow provision could end up shutting out independent creators who also make valuable contributions to the digital public sphere.

We reached out to the state media authorities to ask for clarification on these issues. While they did not respond to our specific questions, they indicated that they are still in the process of refining definitions and statutes.

What does this mean for Europe?

The German media treaty comes at a critical time for Europe. EU policymakers are preparing to take up the issue of platform regulation through their review of the decades-old E-Commerce Directive, and it wouldn’t be the first time they looked to Berlin for inspiration. Because the recommendations from Germany’s Data Ethics Commission have already shaped the EU Commission’s thinking on AI policy, it is reasonable to expect that the ideas the Ethics Commission proposed for tackling media intermediaries (more or less copy-pasted from the media treaty) could make their way onto the European stage. Tobias Schmid, head of one of Germany’s biggest media authorities, was recently elected chair of the European Regulators Group for Audiovisual Media Services (ERGA). Under Schmid’s leadership, ERGA will take up the issue of “media diversity and findability, focusing on both the transparency of access and ease of retrieval of media content on platforms.”

But given some of the open questions, it is crucial to carefully consider which provisions, if any, could or should be translated to other European contexts. Iva Nenadic, a media policy expert at the Centre for Media Pluralism and Media Freedom, notes that levels of democratic checks and balances, as well as media landscapes, differ widely across Europe. While Germany’s media regulators enjoy a high level of independence, the same cannot be said of other member states. According to the Centre’s research for the Media Pluralism Monitor (MPM) project, more than half of EU member states lack safeguards for political independence in appointment procedures. “Before entrusting media authorities with additional regulatory powers, we need to make sure that merit-based staffing and independence standards are met,” Nenadic said.

Ready or not, here it comes 

Despite its shortcomings, the MStV has important symbolic value. It signals regulators’ willingness to take on big tech, and it pushes the policy debate beyond questions of content moderation and deletion. The core goals of the MStV, such as improving transparency and protecting people’s ability to freely form their opinions in the digital (media) age, are laudable. Its success will depend on whether the media authorities succeed in clearly defining the rules of the game, and whether they can muster the resources to enforce them.

¹ This particular provision has been the subject of much debate. Researchers have shown that many accounts identified as social bots were actually operated by human beings, raising doubts about the practicability of this measure.
