Mastodon could make the public sphere less toxic, but not for all

The open-source social network gained millions of new users following Twitter’s takeover. While some of its features could improve the quality of public discourse, disadvantaged communities might be excluded.

Photo: Jack Sharp / Unsplash

Nicolas Kayser-Bril
Reporter

Social media companies face a tricky balancing act. On the one hand, they must keep users active on their apps or websites for as long as possible in order to show them advertisements. Divisive, emotional, or hateful content works best to that end. On the other hand, they need to maintain some level of online safety, if only to appease their advertisers. Social networks therefore encourage aggressive user behavior while suppressing the most egregious content, often with heavy-handed algorithmic detection systems.

Design choices

These automated systems have decidedly not improved the quality of the public sphere. Yes, academics still debate the precise role of technology in the rise of the generalized distrust that pervades societies in Europe and the United States—after all, the newspapers of Rupert Murdoch or Axel Springer also stoked fear and anger to sell ads decades before YouTube and TikTok were branded as “great radicalizers”. It’s possible that these platforms simply host people who are already radicalized; that they do little more than hold up a mirror to society.

However, new research published this month indicates that technology does play a role. A controlled experiment showed that Facebook and Twitter users who saw toxic content were more likely to publish toxic content themselves. In other words, toxicity is contagious. Toning down toxic content on a platform could allow for more meaningful conversations.

Another experiment, conducted in lab conditions, showed that people were adept at identifying and moderating false and potentially inflammatory information. By some measures, this collaborative approach works better than centralized algorithmic filters. Keeping discourse civil might be better achieved by giving agency to users than by deploying algorithms that censor content that makes advertisers uncomfortable.

The two experiments are just a fraction of the research conducted on radicalization and misinformation. But they show that design choices probably influence the quality of the discourse produced on a given platform.

Mastodon

As Twitter continues its erratic course in the hands of its new owner, millions of users have moved to other services. One of them is Mastodon, an open-source, decentralized piece of software that is part of the “fediverse” (a portmanteau of “federated universe”). The service gained two million users in November.

Unlike for-profit social media services, Mastodon has no algorithmically curated timeline seeking to maximize emotion. It has no automated filters that remove content considered too violent or too immodest for advertisers. Mastodon servers, the gateways through which users access the service, do not even have advertisers. Most rely on small donations. Moderation is not centralized. Each server can have its own rules. Some might allow the discussion of sex work, for instance, while others do not. Regular users can be co-opted to work as moderators.

These features could, in theory, enable people to share constructive arguments online. But Mastodon is no utopia, either. Manuel Biertz is a doctoral researcher in political science who works on argumentation theory. He was one of the few academics who regularly used Mastodon before the current wave of migration. He told AlgorithmWatch that “Mastodon will not so much foster argument-based conversation as it will attenuate polarisation”. Mastodon users are not exposed to as much emotional content as on other services, but without algorithmic selection, timelines rush by faster than before. “Fostering deliberation would require platforms that encourage long-term and in-depth interaction with arguments and less ‘communicative plenty’”, he added.

Elitist

Jon Bell, a former designer at Twitter, said that only 3% of its users opted for non-algorithmic timelines. While the number cannot be independently verified, Bell argued that users feel lost without an automated selection of content. This claim could be disputed (Instagram was successful with its non-algorithmic timeline), but it is, in the end, irrelevant to the question of how technology shapes the public sphere.

Twitter’s importance did not result from its millions of users alone. As Biertz said, it has been—in most European countries—“a very elitist medium used mainly by politicians, journalists, and academics”. Tongue-in-cheek, he added that Mastodon does not need millions of users to be relevant, “it only needs Joe Biden”.

While the White House is not (yet) active on Mastodon, some governments are. Germany’s federal institutions can be found at social.bund.de and European institutions at social.network.europa.eu. Several political parties, such as the German Greens, the German Social Democrats, and the Czech Pirates, also run their own Mastodon instances.

At least among decision-makers and academics, Mastodon could become an alternative to Twitter. For them, the calmer interactions of the fediverse could translate into a less inflamed public sphere.

Unequal participation

But the features that make Mastodon calmer for some also exacerbate inequalities. Because each server is autonomous, each is vulnerable. Servers for communities that are discriminated against could suffer attacks without other Mastodon users noticing. This is already happening. Some Black users have reported levels of racist abuse unseen on commercial platforms. While some did move to dedicated servers, these users bear the extra cost of defending against attacks alone.

Johnathan Flowers, a philosopher of technology at California State University Northridge, argued that Mastodon inherited “structures of whiteness” from the people who created it. This makes the service especially unsuited for groups that have been minoritized, he said.

Besides moderation, the cost of running an instance keeps most internet users out of the fediverse. Running a social media service costs a few euros per user per year. The amount may be relatively benign for middle-class Europeans, but it is far from negligible for others. There are only a handful of Mastodon servers outside of Europe, the United States, and Japan. And although content moderation in languages like Swahili, Oromo, or Telugu might be poor on for-profit social networks, it is virtually absent from Mastodon.

For all its flaws, Twitter was the closest thing to a global public sphere, Kenyan writer Nanjala Nyabola argued. Several global movements, from #BlackLivesMatter to #BringBackOurGirls, started on the service.

Whether Mastodon can play the same role remains to be seen.

Edited on 30 Nov 10:30 to better reflect the findings of Stalinski et al.
