Privacy and a Return to Normal Part 2: Contact Tracing Apps Are Doomed
Part 2 of a series exploring privacy concerns about COVID-19-related contact tracing. Read Part 1 here.
I knew contact tracing apps were doomed on May 15.
On May 15, Washington Governor Jay Inslee retracted his requirement that restaurants keep a daily log of people who decided to test the brave new world of in-restaurant dining during Phase 2 reopening. (He still requested that restaurants and patrons do so voluntarily.)
Critics from all sides objected to asking people to give their local restaurant their name and phone number so they could be contacted if they had been in the restaurant at the same time as someone known to have COVID-19.
Here’s my question: if people are not willing to trust a restaurant with their name and phone number, who might they trust? A uniformed public-health representative? An elected governor? A police officer? An app?
To ask these questions is to realize that widespread adoption of a contact tracing app—heralded as a key to returning us to normality after months of shut-down—cannot happen in our current political and social climate of distrust.
We Have a Problem with Trust
We live in a world where we are deeply confused about who to trust.
It’s a world where we sign up for any free service and share shockingly intimate secrets about ourselves, but where we don’t answer the phone if we don’t recognize the number.
A world where we offer some identifying data very easily—credit card numbers, for example—but hold tight to our phone numbers, mistakenly believing that one can be used to contact us and the other can’t.
We mistake what identifies us and are mistaken again about how easy it is to re-identify us from seemingly anonymous data points.
In short, when it comes to understanding how to protect our personal information, we don’t know what the hell we are doing.
Amid such confusion, how can we expect people to judge whether a contact tracing app actually protects their identity? What technical guarantees could be made that would be sufficient? Which policies would signal appropriate practices? What laws would have to exist to assure us we were protected? (That’s a question being asked on Capitol Hill—the virtual version—as I write.)
It’s not just confusion about what constitutes personal data or how the technology would work that will keep contact tracing apps from working. It’s a more fundamental suspicion of information gathering in general, such that there is no public agency, governmental body, or elected official whose sponsorship would inspire sufficient trust to enlist broad public support.
The Promise of Technology...
Many tech people smarter than I am have spent a lot of energy considering the technical solutions to the contact tracing problem. Like many, I was excited about the options, and I continue to use a de-identified symptom tracking app today.
But debate has raged on whether such contact tracing technology will even work and be secure enough to protect private data. Here are just a few of the concerns I’ve seen:
- Bluetooth may not be accurate enough to provide precise tracking
- The apps may generate too many false positives and negatives to work efficiently
- The Google/Apple API effort is too private to be of any use to public health officials (Want it direct from the tech giants? Here’s their FAQ)
Even then, such tech tools will only help on the margins of the effort to stop the spread of disease.
...Runs Aground on Trust
But all these questions run aground on the problem of trust.
Put simply, if politicians, technologists, and health officials can’t trust each other enough to get behind a single usable contact tracing solution, not enough people will trust whichever solutions finally make it to market.
And if not enough people elect to use the apps, then the entire system just won’t work. (The most commonly quoted source cites 60% as the required rate of adoption to allow the insight needed to stop transmission.)
Already, we’ve seen adoption rates that are far too low to ensure the apps actually achieve their goals. In Iceland, a “small, socially-cohesive, and geographically isolated nation,” only 38% of citizens signed up for and used a contact tracing app.
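The arithmetic behind these adoption figures is worth making explicit: a contact is only visible to the system if *both* people involved are running the app, so under the simplifying assumption of uniform, independent adoption, the fraction of traceable contacts scales with the *square* of the adoption rate. A minimal back-of-the-envelope sketch (the model and function name are my own illustration, not from any official analysis):

```python
# Toy model: a contact between two people is only "visible" if BOTH
# parties have the app installed. Assuming adoption is uniform and
# independent across the population, coverage falls off quadratically.

def contact_coverage(adoption_rate: float) -> float:
    """Approximate fraction of contacts where both parties run the app."""
    return adoption_rate ** 2

for rate in (0.60, 0.38):
    print(f"{rate:.0%} adoption -> ~{contact_coverage(rate):.0%} of contacts traceable")
```

Even at the oft-cited 60% adoption threshold, only about a third of contacts would be traceable; at Iceland’s 38%, roughly one contact in seven.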
What makes us think that, if this is how contact tracing fares in smaller, higher-trust cultures, it will work any better in the sprawling, decentralized, and politically divided culture of the United States? Recent polling makes it clear that Americans are not yet in agreement on any of the key issues that would drive adoption.
Imagine we’ve solved the technical problems of building a secure, accurate, and privacy-protecting app. Which public figure would have the kind of credibility to rally the public around its use?
You don’t have to subscribe to any particular political party to acknowledge that there is no one in public life today with the credibility to rally Americans around mass adoption—though there are certainly some with the power to rally their supporters against adoption! Which leads me back to a slightly more precise restatement of my original claim: widespread adoption of a contact tracing app is doomed.
A Local Solution
I had hoped the release of contact tracing apps—built collaboratively by tech giants and known public health agencies—might ignite a positive public debate about how privacy protections get built into technology.
Naively, I thought we might even get to consider things like data minimization, re-identification, etc. (If you hadn’t yet pegged me as a hopeless idealist, you will have now.) But in our hyper-partisan culture, there’s no room for such debate at the state or national level.
Short of some paradigm shift in the way we handle data (which is not out of the question, if you look at the work of companies like Oasis Labs), the only way I see people adopting contact tracing with today’s technology is if it’s done at the level of the community organization, the workplace, or even the small community. Only there do you see the levels of trust and open discussion needed to bring the risks and the benefits of contact tracing into the light.
We may yet see such community-based solutions come into play—but only after we watch our infatuation with technical solutions run aground on the rocks of distrust.