Making sense of digital contact tracing apps for the next pandemic

In an interview with AlgorithmWatch, Prof. Susan Landau discusses why we need to resist fear in the face of pandemic uncertainty and the normalization of health surveillance technologies — and why the time to have a broad democratic discussion about their future uses is now.

Fabio Chiusi
Research Associate

In her book, ‘People Count: Contact-Tracing Apps and Public Health’ (MIT Press), Prof. Susan Landau provides the first comprehensive overview of the digital contact-tracing (DCT) apps deployed in response to the COVID-19 pandemic. Questions around their efficacy, whether and how they have been fruitfully integrated with public health policies, and the overall health surveillance infrastructure being built in many countries—in Europe and beyond—are tackled in her book with historical and conceptual sophistication, and with plenty of examples. AlgorithmWatch spoke with Prof. Landau over a Zoom call as a complement to the early evidence-based analysis of DCT apps and their efficacy gathered over the course of the pandemic so far for the ‘Tracing The Tracers’ project. Susan Landau is Bridge Professor in Cybersecurity and Policy at the Fletcher School of Law and Diplomacy at Tufts University.

AW: Prof. Landau, in your book you write that the “deployment” of DCT “during the COVID-19 pandemic occurred at record speed.” However, “the public discussion of whether this is an appropriate public policy step did not. It’s time to start.” Where from, though? What’s the most pressing issue highlighted by the deployment of technological solutions to the pandemic, in your opinion?

SL: One of the things that I read, as I was writing the book, was a National Academy study on the social impact of AIDS. It was published in 1993—so a dozen years after AIDS began being studied in the US. It said, an epidemic or pandemic is both a social and a medical occurrence. In the early days, we were looking at it entirely as a medical occurrence. But anything you do, whether it's the apps or testing, or how you do testing, any kind of health care intervention, changes how people are treated.

And so that's the part that we didn't think about. It seems fairly clear from the data that the apps can be useful in lowering exposure, and, therefore, spread; it’s less clear now that the apps are going to be useful against the Delta variant. But that issue aside, what we haven't done is say: “Okay, we've got this medical intervention, that's going to decrease spread, but it's also going to change who gets ill and who gets medical attention.”

So, I can see from our interview: you're working from your home, I'm working from my home. If I get an exposure notification, pre-vaccine, it means that instead of going to the supermarket once a week, I call somebody, I log online, I get somebody to go to the supermarket; that's the entire way my life has changed. I'm not a bus driver. I'm not a food service worker. I'm not anybody who can't go to work if they've potentially been exposed; I can continue to live my life.

But imagine I get an exposure notification. I would then call up my doctor, and ask: where do I get a test? And she’d say: “I want you to get a test today. And I want you to get a test again, in three days.” Even if the tests are negative, this means using medical resources, and medical resources are finite. And it just points out, the same way that the whole disease pointed out, how inequitable medical care is. And I'm not even talking about “first world” versus “third world”: I'm talking within the United States, and presumably within pockets in Europe as well.

AW: In the book, you argue that digital contact tracing apps can further entrench such inequalities. In fact, you say, “the current generation of contact tracing apps was not built to address the specific needs of marginalized communities.” This reminds us of the many social justice dimensions involved in an issue—health surveillance—that already disproportionately affects minorities. Is it possible to imagine—and deploy—DCT apps that instead address those needs?

SL: I'll go back to something I said a moment ago, which is: if you decrease the spread, that benefits everybody. But the spread decreases differently in different communities, and when you're designing the app, if your consideration is “how do we do public health?”, well, then you spend time talking to public health experts. Public health experts will say: “Here are the communities of greatest need. You're helping over here (Prof. Landau points at a different place). And that's useful, because it's going to cut the number of cases. But I have to be careful about where I put my resources.” The app has to be part of a holistic healthcare solution. It can't be this thing sitting on its own.

AW: On the contrary, you write that apps developed within the widely adopted Google/Apple Exposure Notification (GAEN) framework “effectively circumvented the public health system”, and were “not intended to work with contact tracers”—rather, to “bypass” them. In what sense?

SL: Well, they didn’t have to circumvent the health system. The Irish app, for example, tells you: “When you register for the app, you can register anonymously, or you can provide your phone number. If you provide your phone number, then if you get an exposure notification we will be aware of it, because it comes to your phone, and we will call you as part of contact tracing.” So for me, if I were to get an exposure notification, I would immediately call the doctor.

But there might be people who, for whatever reason, aren't sure they are able to isolate, are scared, or want to wait a couple of days and see whether they really get sick. If they happened to give their phone number when they registered for the app, then they will get a call from a contact tracer, who doesn't start with: “Who are your contacts?” They start with: “How are you feeling? Have you gone to get tested? Do you want to get tested? Do you need food? Are you safe?”—all the kinds of human touch that are really important. And, especially for the communities who are most affected, who are socially marginalized, often in economic distress, and so on, having that support is really critical.

When Google and Apple did their ‘Exposure Notification Express’, which is essentially a checklist, they eliminated that possibility: providing your phone number was no longer an option the health care app designers could offer. From the point of view of privacy, that's better, although it was the user's choice whether to provide the phone number or not. But from the point of view of helping certain types of communities, it's not clear that that was better—I would argue that it probably was not. And I'm not a health care provider. I'm not a public health expert. But what I hear from the contact tracers is how important that human communication and human support is. I heard it whether I was talking to people who did it in Liberia, for Ebola, or people who did it in the United States for AIDS, or people doing it for SARS-CoV-2.

AW: Is that a lesson we will need to learn if and when we have to adapt DCT apps for the next pandemic, or possibly even the next waves of this pandemic: will app developers have to talk to people on the ground more?

SL: I would say that, but I would phrase it slightly differently: this is a public health intervention, and you have to think of it as a public health intervention. This means you have to ask: how is the app changing the dynamic on the ground? And what compensatory things have to be done? Oh, and you have to come in humbly to the public health officials, because they know things that we don't.

AW: And yet, much of the debate is focused on how crucial it would be to adopt a DCT app, even—as claimed by many institutional and mainstream figures in Italy—at the expense of privacy and fundamental rights.

SL: Which is why we should have this conversation. Now, another aspect of all of this is that I made a timeline for something that I may write later, looking at how quickly we understood things. And it was in January (2020) that we understood person-to-person transmission. We understood in March that there were “super-spreaders”, and that enclosed spaces were really problematic. It was around that time that we began to understand that aerosols mattered, not just droplets. Once you say aerosols matter, that changes the whole way of thinking about the apps, because aerosols can remain in the air for a very long time.

AW: Still, that hasn’t been part of the conversation. There’s a broader consideration you make in the book around the use of DCT apps, though: “If apps aren’t efficacious, then there is no reason to consider them further.” At AlgorithmWatch, we tried to provide an early evidence-based overview, but only found contradictory results, and methods that make them hardly comparable, if at all. Do you think that these apps are actually “efficacious”? And should we then “consider them further” or not?

SL: I think we should, because the issue will come up again. I come at this work from having worked on encryption policy since the mid-1990s. And it's very attractive to say: “Oh, if we could listen to conversations,” or “Oh, if we could track cell phones, we could cut crime,” and so on. But then you have to ask the efficacy questions. And you have to ask them on the other side: what's the cost of making it easier to surveil somebody's phone—the currently breaking NSO story (the “Pegasus Project”) really plays that out.

But when there's something unknown and scary, everybody reaches for anything that can help protect them. And we know that given the way we live—dense populations encroaching right at the edge of wild areas—we're going to have more of these in the future. And so, we need to have that conversation now—to tease apart all the pieces. I mentioned the difference between aerosols and droplets, which hasn't been part of the app conversation—when it should be. I mentioned the difference between providing phone numbers or not. I mentioned the issue of testing, and how ensuring that people can test frequently bears on uptake and on whether the app is valuable to the community. That hasn't been part of the conversation.

I don't doubt that if there is another pandemic, there will be more “we need to use these apps,” “we need to use other kinds of surveillance tools to cut the spread,” and so on. So a dispassionate time, when we can get all the facts and issues out, is the right time to have the conversation.

AW: Surveillance is, of course, one of the main issues raised by tracing technologies deployed in response to COVID-19. Many of the fears in this respect, in Europe at least, have now moved from DCT apps to apps that couple a check-in function with some kind of COVID certificate or “pass” scheme. We’ve already seen something similar in places like Singapore…

SL: Singapore’s check-in apps, because they're connected to TraceTogether (the national DCT app), completely identify you. I was working at Sun (Microsystems) in 2000, when we began doing identity management systems. And what you want in an identity management system is to be able to show, via an app on your phone, that you're over 18 and can go into the bar—you don't want to show anything else. You don't want to show what your birth date is, and so on. You want to be able to show that, whatever it is, you're safe to go into a building. What you want is an app that does it in a de-identified way. The Singapore app does not do that.
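To make that idea concrete, here is a minimal sketch of such a minimal-disclosure check in Python. It is illustrative only: the token format and function names are our assumptions, it relies on the third-party ‘cryptography’ package, and real deployments use anonymous-credential schemes that additionally keep separate presentations from being linked to one another.

```python
# Hypothetical minimal-disclosure credential: the issuer signs only the
# predicate "over 18", never the birth date itself.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The issuer (e.g., a government ID authority) holds the signing key.
issuer_key = Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

def issue_over_18_token(birth_year: int, current_year: int):
    """Issue a signed assertion of the predicate only: no name, no birth date."""
    if current_year - birth_year < 18:
        return None
    claim = json.dumps({"over_18": True}).encode()
    return claim + b"." + issuer_key.sign(claim)

def verify_token(token: bytes) -> bool:
    """The venue learns exactly one bit: the holder is over 18."""
    claim, _, signature = token.partition(b".")
    try:
        issuer_public.verify(signature, claim)
    except InvalidSignature:
        return False
    return bool(json.loads(claim).get("over_18"))

token = issue_over_18_token(birth_year=1990, current_year=2021)
assert token is not None and verify_token(token)
```

The design point is simply that the verifier checks the issuer's signature on a single predicate and learns nothing else about the holder; a check-in system tied to a fully identifying national app discloses far more than that.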

AW: Which brings us to what is arguably the main issue in this field of research, especially given your past studies and work: the risks of normalizing health surveillance. “If we’ve normalized the idea of collecting proximity data,” you write, “the idea that such data could be used to make us feel safe from bad things—criminals, terrorists, kidnappers—might not seem so outlandish.” After all, “One has only to reflect on the 9/11 attacks to recall how rapidly surveillance powers can expand during an emergency.” And, it’s not just proximity data: it’s vaccination data, it’s the digital identity infrastructure we’re building in Europe, which might also be relevant here. It’s the broad and mostly unchecked deployment of biometric databases, something we have increasingly seen even over the course of the pandemic, for example in India and many African countries. What to do about this? How to de-normalize surveillance?

SL: So, a little bit of history. In the 1970s, as we began investigating Richard Nixon in the United States, the Senate put together a committee that looked at wiretapping and government surveillance. It was run by Senator Frank Church, and it became known as the “Church Committee”. Its findings formed the basis of our Foreign Intelligence Surveillance Act, among many other things. And one of the important quotes from that report is: “Persons most intimidated” by surveillance “may well not be those at the extremes of the political spectrum, but rather those nearer the middle. Yet voices of moderation are vital to balance public debate and avoid polarization of our society.”

What I think you have to do—and this is the good thing that DP-3T (Decentralized Privacy-Preserving Proximity Tracing, the open protocol on which Google/Apple exposure notification apps are based) did—is say: can we build this in a privacy protective way? Can we build the vaccine passports or certificates in a privacy protective way? We then have a much bigger long-term problem of building a surveillance infrastructure in ways that are much too convenient for people. And society makes it hard for them not to use it.

The classic example is the parents who don't want to be on Facebook, but all the notices about soccer practice being canceled for the kids because it's raining too hard only come on Facebook. The kids get mad each time they get driven to the field and nobody's there and they go home and say, “Mom, why can't you be on Facebook like everybody else?”

Another is the fact that you can't not have a mobile phone, because the payphones have disappeared. That was already absolutely the case 15 years ago. So how do you design the technology so as to minimize the surveillance? Because you have to have both the laws and the technology. You want to have the laws and the policy; you can't not have the laws and the policy. But, as a safety check, you also want the technology to be built in a way that is really privacy protecting. And there I have to applaud the DP-3T people, as well as the groups in the US and elsewhere.
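For readers wondering what that privacy-protecting design looks like, here is a deliberately simplified Python sketch of the core DP-3T mechanism. The constants and helper names are illustrative, not the protocol's actual parameters (DP-3T expands a day key with a PRF and stream cipher into 16-byte ephemeral IDs); only the structure follows the published design: hash-chained day keys, ephemeral IDs derived and matched entirely on the device, and a server that merely relays the keys of diagnosed users.

```python
# Simplified sketch of the DP-3T "low-cost" design; parameters are illustrative.
import hashlib
import hmac

EPHIDS_PER_DAY = 96  # e.g., one rotating broadcast ID per 15-minute slot

def next_day_key(day_key: bytes) -> bytes:
    # Day keys form a hash chain, so a key published for one day reveals
    # nothing about earlier days.
    return hashlib.sha256(day_key).digest()

def ephemeral_ids(day_key: bytes) -> list:
    # Derive the day's short-lived broadcast IDs from the day key.
    # HMAC-SHA256 per slot stands in for the protocol's PRF/PRG expansion.
    return [
        hmac.new(day_key, f"slot-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(EPHIDS_PER_DAY)
    ]

# Normal operation: phones broadcast these IDs over Bluetooth and record
# the IDs they hear. Nothing identifying leaves the device.
day1 = hashlib.sha256(b"alice-initial-secret").digest()
day2 = next_day_key(day1)
heard_by_bob = set(ephemeral_ids(day2)[:4])  # Bob was near Alice on day 2

# On diagnosis, Alice uploads only her day keys for the contagious window.
# Bob's phone recomputes her ephemeral IDs locally and checks for a match;
# the server never learns who met whom.
published_keys = [day1, day2]
exposed = any(
    eph in heard_by_bob for key in published_keys for eph in ephemeral_ids(key)
)
assert exposed
```

Because the matching happens on the phone, the server never sees the social graph; that locality is the property being applauded here.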

AW: Did they succeed?

SL: They have designed a privacy protective technology. What they didn't do—partially, I'm sure, because of the pressure of time, and partially probably because they're not so accustomed to interfacing at that rate of speed in a public forum—is ask: “What are the changes this is going to engender? How do we think about that?” And, of course, public health experts and epidemiologists were ridiculously busy a year ago. It's not to say they're not busy now, but what was happening a year ago was beyond crazy.

AW: At AlgorithmWatch, we have been investigating automated decision-making systems that have a social impact for years now, both within and outside the context of the pandemic. A consistent finding, one that is shared among all of our researchers, is that an ideological premise seems to lie at the heart of the deployment of such systems: namely, technological solutionism. Pragmatically, this means, for example, conceiving of DCT apps as the ironclad “solution” to the pandemic, which therefore becomes a purely technological problem. And when a device is conceived of as a “solution”, questions around transparency, efficacy, and the balancing of the rights at play tend to suddenly disappear, our experience shows. Did you find this ideological framing to be relevant in your experience as well?

SL: This is something Bruce Schneier, who works in cryptography policy, talks about a lot; and Ross Anderson too. Fear is perhaps our most powerful emotion: it trumps everything. And so, when somebody says “I have a way to slow spread and find out if you're exposed,” that seems like a really useful thing. Also, here we faced fear in the face of uncertainty. And it's one thing to be fearful when you understand what the enemy is, and another when you don't even understand what the enemy is—which is where we were in January, February, March, and April. I still walk into shops that ask me to spritz my hands with some sort of sanitizer. Guys, it doesn't spread that way! So, I think this comes from fear, which is just a tremendously powerful emotion. And that's exactly why we need to think carefully, before the next pandemic, about what the balance is, and how you institute this the right way, if you're going to institute it, and in what situations it's useful.

AW: You mention fear. Trust is also a powerful bond that is necessary, you argue, to have effective public health interventions. And yet, both feelings are constantly mediated by the media and our political and institutional figures, which in many cases have been shown to thrive on sensationalism and even conspiratorial thinking— powerful drivers of fear and distrust. If those are the roots of solutionism, then it looks like we’re in for more of that.

SL: So, a long time ago, in the middle of the crypto wars in the 1990s, the FBI was arguing that they needed to be able to listen to encrypted conversations in cases of kidnapping. There are a couple of things wrong with what the FBI was saying. One was that when you don't know who the kidnapper is, you can't listen to the conversation. The other thing is, after a lot of effort, I got numbers from the FBI that showed that there were 450 kidnappings a year, in which wiretaps were used four times. So: efficacy? But what the FBI started asking when they went around to Congress people was: “You have kids, don't you? How would you feel if they were kidnapped?” Because every congressman or woman has kids in their district, fear is incredibly powerful. And that's why we have to think about these things when we're not in a fearful time.
