Meta’s elections dashboard: A very disappointing sign

On 3 June, Meta released an EU election monitoring dashboard, responding to investigations by the EU Commission under the Digital Services Act. It is riddled with basic errors, raising severe concerns about Meta’s engagement with risks of electoral interference.

Oliver Marsh
Project Lead "Auditing Algorithms for Systemic Risks"

Building election monitoring tools can be complicated. Nonetheless, there are basic markers of quality. I used to lead a team building dashboards for potential crisis situations in Number 10 Downing Street in the UK. If my team had come up with the product Meta has published, I would have asked them to redo it, because it falls short of basic quality standards. It is unclear whether Meta will redo its product.

Before I outline these issues, it is important to acknowledge: platforms must be allowed to share flawed “beta tests” with wider communities for critique and iteration. Transparency and cooperation from platforms are welcome, even if proposed solutions are imperfect. But my concern is that the substantial issues with Meta’s new product suggest minimal engagement with the problem at hand – and little use of expertise that could have benefited both Meta and society at large. There are clear and simple improvements that Meta could make quickly. But I am not confident that even strong criticism will lead to improvements – certainly not before the elections next week. It feels like a PR distraction, not real assistance. This is particularly concerning given that Meta is also planning to close its Crowdtangle monitoring tool before the US elections this year, despite strong appeals from researchers and civil society. Crowdtangle has been invaluable for monitoring online risks, so shutting it down is not a sign of genuine engagement.

Why is it bad?

To explain the issues first requires a brief social media monitoring 101. There are two main approaches. The first is to search for particular keywords. The second is to follow particular accounts. Let’s consider keywords first. For instance, you can ask Crowdtangle to provide a feed of public posts which use words or phrases like “EU elections,” “European Parliament,” “Europawahlen” in German, and so on. Those terms might be too specific, so you could also search for words like “vote,” but then you risk getting lots of irrelevant posts about other elections, game-shows which have votes, etc. It’s always a trade-off. Your dashboard might pick up some irrelevant content, but it shouldn’t be flooded with it.
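The keyword trade-off described above can be sketched in a few lines of Python. The posts and keyword lists here are invented for illustration; they are not taken from Meta's or Crowdtangle's actual term lists:

```python
# Minimal sketch of the keyword trade-off: specific terms vs. broad terms.
# All post texts and keyword lists are invented for illustration.

SPECIFIC_TERMS = ["eu elections", "european parliament", "europawahlen"]
BROAD_TERMS = ["vote"]

posts = [
    "Polls open tomorrow for the EU elections",
    "Die Europawahlen finden im Juni statt",
    "Vote for your favourite act in tonight's game-show final!",
    "Remember to vote in the local school board election",
]

def matches(text, terms):
    """Return True if any keyword appears in the post text."""
    lowered = text.lower()
    return any(term in lowered for term in terms)

specific_hits = [p for p in posts if matches(p, SPECIFIC_TERMS)]
broad_hits = [p for p in posts if matches(p, BROAD_TERMS)]

# The specific list catches only the two genuine election posts; the broad
# term instead pulls in the game-show and the unrelated local election.
print(specific_hits)  # the two EU-election posts
print(broad_hits)     # the game-show and school-board posts
```

The trade-off is visible directly: neither list is "correct", and real monitoring tools tune term lists so the feed picks up some noise without being flooded by it.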

Meta’s Instagram keyword dashboards are flooded with irrelevant material. For example, Austria shows a lot of content for Germany, and so does Luxembourg, because the keyword list copy-pastes the same German-language terms across all of them. The German list also contains Lithuanian words, and speakers of other languages would probably spot further errors. The Ireland search is full of English-language content about every election; Ireland itself barely appears. Many keywords are very short, generic words like “ID” and “MP” – very poor practice in social media monitoring – which is probably why posts about Ethiopia and South Africa appear in the Sweden dashboard, to give just one example.

For the Facebook search, a country-specific location filter is applied which cannot be removed. It is unclear why Instagram lacks this filter entirely, or why it cannot be removed for Facebook. The filter makes the content more specific to the given country, which is helpful. But location filters also remove a lot of potentially relevant material, as Meta’s own Crowdtangle documentation notes. From a quick check of Austrian content, I’d estimate that at least 50% of probably relevant material is cut out. This raises the very important question of whether foreign interference from outside the country – one of the key concerns of election monitoring – would be caught at all.

What could have been done?

These problems – which, make no mistake, drastically reduce the usefulness of the boards – could have been easily solved. It should be acknowledged that Meta's keyword lists run to hundreds of terms, so clearly they have spent time compiling them. But many election monitoring organizations maintain their own lists of keywords, or could be asked to produce them, drawing on their local expertise. Working with these partners instead would have reduced the work for Meta – and also produced much better outcomes. Meta could also have added simple keyword filter options to the boards, allowing users to cut out irrelevant material themselves and focus on their topic of interest. This ability to filter and focus is vital for proper monitoring and analysis – otherwise you basically rely on luck. It is unclear why such approaches were not taken.
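The kind of user-side keyword filter argued for here is not complicated to build. As a hypothetical sketch (the feed structure, country codes, and terms are assumptions for illustration, not Meta's actual API):

```python
# Sketch of a user-side include/exclude keyword filter over a dashboard feed.
# The feed, country codes, and terms are hypothetical examples.

feed = [
    {"country": "SE", "text": "Debate on EU migration policy ahead of the vote"},
    {"country": "SE", "text": "Election results announced in Ethiopia"},
    {"country": "SE", "text": "Swedish MEP candidates discuss climate targets"},
]

def filter_feed(posts, include=None, exclude=None):
    """Keep posts containing any `include` term and none of the `exclude` terms."""
    kept = []
    for post in posts:
        text = post["text"].lower()
        if include and not any(term in text for term in include):
            continue
        if exclude and any(term in text for term in exclude):
            continue
        kept.append(post)
    return kept

# An analyst monitoring Sweden could cut out foreign-election noise themselves:
relevant = filter_feed(feed, exclude=["ethiopia"])
print(len(relevant))  # 2
```

Even this simple control would let analysts remove the flood of irrelevant content described above without waiting for Meta to fix its term lists.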

"One of the richest companies in the world, with unique technical access to the Crowdtangle tool, has produced something which is of lower quality than an understaffed fact-checking organization would accept."

For the second general monitoring approach – following particular accounts – Meta has dashboards showing content from European Parliament candidates, European Political Parties, and Institutions. You can sort the posts by most recent, most “engagements” (likes, shares, etc.), or other options. But again, this relies on a lot of luck. Unless the top posts happen to also be disinformation, this doesn’t serve the purpose of protecting election integrity. Again, a simple keyword filter would mean that analysts could dig into what accounts are actually saying, focus on topics of their expertise, and find actual disinformation narratives. But in their current form, we will just get "which parties get most engagement" news stories, not defenses against election interference.
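To illustrate the difference, here is a hedged sketch of why engagement sorting alone falls short: sorting surfaces whatever is popular regardless of topic, while a keyword filter lets an analyst target a specific narrative. The accounts, engagement figures, and terms are all invented:

```python
# Engagement sorting vs. keyword filtering over candidate posts.
# Accounts, engagement numbers, and narrative terms are invented examples.

candidate_posts = [
    {"account": "party_a", "engagements": 5400, "text": "Thanks for your support!"},
    {"account": "party_b", "engagements": 1200, "text": "Postal votes will be discarded, they claim"},
    {"account": "party_c", "engagements": 300, "text": "Join our rally on Saturday"},
]

# Sorting by engagement surfaces the top post, whatever its topic...
top = sorted(candidate_posts, key=lambda p: p["engagements"], reverse=True)
print(top[0]["account"])  # party_a

# ...whereas a keyword filter targets a disinformation narrative of interest.
NARRATIVE_TERMS = ["postal votes", "discarded"]
flagged = [p for p in candidate_posts
           if any(term in p["text"].lower() for term in NARRATIVE_TERMS)]
print(flagged[0]["account"])  # party_b
```

In this toy example, the most-engaged post is harmless, while the post pushing a false narrative sits further down the feed – exactly the content an analyst would miss without a filter.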

To step back again to the broader issue: such errors run deeper than a few mistakes or bugs. One of the richest companies in the world, with unique technical access to the Crowdtangle tool, has produced something of lower quality than an understaffed fact-checking organization would accept. These dashboards could have been shared as a proof-of-concept, inviting (much needed) improvements. Instead they have been offered, one week before the elections, as Meta’s response to serious concerns about election integrity. These elections don’t come as a surprise: Meta has had time to prepare, and it has worked with European partners before. The extent of the issues makes you wonder whether Meta actually expected the tool to be used, and checked it accordingly, or whether this is simply a PR move. Civil society, researchers, and others stand ready to work together – including with those good, dedicated people we know are within companies wanting to help address the problems their platforms create. We cannot do that work well if this is the standard of engagement we can expect from platforms.

Read more on our policy & advocacy work on the Digital Services Act.