Hurdles to making digital contact tracing work

Digital contact tracing, or “exposure alerting”, has been touted as one of the key weapons in fighting coronavirus, and as a potential route out of lockdown for countries currently keeping their populations at home and indoors to flatten the curve of infections and coronavirus deaths.

Expert opinion is that 50% of the population needs to be signed up to a digital contact tracing platform for it to be effective. This is a huge number – the contact tracing system in Singapore, which has been touted as a potential model for the system in the West, had just a 12% uptake.

By contrast, contact tracing in South Korea has been a huge factor in the rapid reduction of cases whilst also being decried by privacy and civil rights campaigners as being incredibly invasive. The system, which uses cellphone location data, CCTV, and credit card records, provides the government with an incredible amount of data according to a report in Nature:

In some districts, public information includes which rooms of a building the person was in, when they visited a toilet, and whether or not they wore a mask

Mark Zastrow, reporting in Nature

So, in light of the concerns that many privacy campaigners have, how achievable is the required 50% uptake and what are the hurdles we need to overcome to get there?

How might Contact Tracing work?

This graphic from the FT sums it up nicely…

Now we’ve got the theory down… what are the issues?

Phones are designed not to do this

The current design strategy is to use Bluetooth technology to exchange pseudo-random anonymous keys between phones when they are in close proximity with each other.

This requires an app to have background access to the Bluetooth “stack” while the phone is idle. Because that is a security risk, it’s something that phone operating systems don’t normally allow apps to do.

This is why a collaborative approach between Apple and Google is required – an interoperable standard needs to be agreed that will circumvent this security restriction on both the host phone and the phone(s) it wants to collect IDs from. The latest iteration of this is decentralised and never collects geographic data, but it is just an API – a means for application developers to build applications that can use the contact tracing functionality. It’s not a contact tracing app in and of itself.
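To make the key-exchange idea concrete, here is a simplified sketch loosely modelled on the published Apple/Google design (the real specification uses HKDF and AES rather than this HMAC construction, and the names `daily_key` and `rolling_id` are illustrative, not the actual API): each phone generates a fresh secret key every day and derives from it a short-lived anonymous ID to broadcast, rotating the ID every few minutes so that observers cannot link broadcasts together.

```python
import hashlib
import hmac
import os

def daily_key() -> bytes:
    # One fresh random key per day; it never leaves the phone unless
    # the user later reports a positive diagnosis.
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    # Derive the short-lived ID broadcast over Bluetooth from the daily
    # key and the interval number (e.g. one interval per 15 minutes).
    # Without the daily key, successive IDs look unrelated.
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

key = daily_key()
ids = {rolling_id(key, i) for i in range(96)}  # 96 fifteen-minute intervals per day
assert len(ids) == 96  # every interval produces a distinct broadcast ID
```

Nearby phones simply record the rolling IDs they hear; because the IDs change constantly, the recorded log reveals nothing on its own.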

And this isn’t a quick fix.

Some reports indicate that this isn’t going to be an easy change for either Apple or Google to make. There are concerns about other apps being able to access this functionality, concerns about how the change will be rolled out (especially on Android devices, where operating system changes are not controlled by Google but by individual phone vendors), and questions about how it will affect things like phone battery life.

There may be as many as 2 billion phones that lack the necessary chipset or operating system version to be able to use the API, predominantly in the possession of older users who are most at risk from Covid-19 infection.

The underlying technology limitation is around the fact that there are still some phones in use that won’t have the necessary Bluetooth or latest operating system … If you are in a disadvantaged group and have an old device or a [basic] feature phone, you will miss out on the benefits that this app could potentially offer.

Ben Wood, analyst at CCS Insight, reporting in the Financial Times

Interestingly Huawei, the Chinese phone maker banned from using Google services by the US government, have confirmed that most of their handsets will receive the update but the position for other manufacturers is less clear.

But, let’s assume Apple and Google can get this to work…

Who holds this data?

There are competing models in terms of how a contact tracing system will store data.

The option favoured by privacy advocates stores data on the phone, and the phone only. If a user reports that they have been diagnosed with COVID-19, their ID is sent to a central server, which either broadcasts it out to all subscribed devices or lets those devices periodically download an “infection list”; each phone then compares the list against the keys it has stored locally.
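Under this decentralised model, the matching step might look something like the sketch below (`rolling_id` and the 96-interval day are the same illustrative assumptions as before, not the real API): the phone re-derives every ID each published daily key could have broadcast and checks, entirely locally, whether it ever heard one of them.

```python
import hashlib
import hmac

def rolling_id(day_key: bytes, interval: int) -> bytes:
    # Same hypothetical derivation used at broadcast time:
    # daily key + interval number -> short-lived anonymous ID.
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

def match_exposures(observed_ids: set, infected_day_keys: list,
                    intervals: int = 96) -> bool:
    # Re-derive every ID each infected key could have broadcast and
    # check whether this phone ever saw one. Nothing about the phone's
    # own contact log is uploaded anywhere.
    for day_key in infected_day_keys:
        for i in range(intervals):
            if rolling_id(day_key, i) in observed_ids:
                return True
    return False

# Example: this phone logged an ID that a now-infected phone broadcast.
infected_key = b"\x01" * 16
observed = {rolling_id(infected_key, 40), b"\xaa" * 16}
assert match_exposures(observed, [infected_key])
assert not match_exposures({b"\xbb" * 16}, [infected_key])
```

The design choice is that the server only ever learns the keys of people who self-report; who met whom is computed on the handset.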

As Apple and Google are only providing an API for contact tracing, this means that somebody, somewhere, has to provide and maintain the server that will store the IDs of anyone who self-reports. Trust is a massive factor here; although the IDs are anonymous, there are still huge questions about what a “bad actor” could do with this sort of data.

Apple takes a strong position on privacy, but has historically been the victim of serious security breaches. Google has also suffered serious data breaches and has a less than stellar record when it comes to respecting users’ privacy.

In my opinion, Apple and Google may be ducking their responsibility by creating an API and leaving the utilisation of the API to others, a move that creates significant risk for the end-user, as privacy expert and campaigner Jaap-Henk Hoepman explains:

However any decentralised scheme can be turned into a centralised scheme by forcing the phone to report to the authorities that it was at some point in time close to the phone of an infected person. In other words, certain governments or companies — using the decentralised framework developed by Apple and Google — can create an app that (without users being able to prevent this) report the fact that they have been close to a person of interest in the last few weeks. The platform itself may be decentralised. But the app developed on top of it breaks this protective shield and collects the contact information centrally regardless. This effectively turns our smartphones into a global mass surveillance tool

https://blog.xot.nl/2020/04/11/stop-the-apple-and-google-contact-tracing-platform-or-be-ready-to-ditch-your-smartphone/

Contact Tracing Malware is Inevitable

Even if we do trust Apple, Google, and our government with this data, it seems inevitable that malware will be created that can use this API to track users without their knowledge or consent. At the moment, there is a hard wall around the Bluetooth stack – we’re about to punch a hole in it to make a door. Even a locked door is not going to be as secure as that wall used to be, and that should be a concern for any smartphone user.

Problems don’t have to start on the user’s cellphone either. Android can be installed on a wide range of devices – including Bluetooth beacons that could be installed in any location. CCTV and other surveillance technology could take a massive, and dangerous, leap forward with Android-based contact tracing applications able to track the movements of individuals.

It won’t matter that this data is anonymous. Given enough data points, anyone’s identity could be deduced even from a changing anonymous ID.
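A toy linkage attack illustrates why. Assume (and this is the key assumption) that an observer running a network of beacons can link a device’s successive rotating IDs into one movement trace, for example via overlapping broadcast windows or radio fingerprinting. The IDs themselves stay anonymous, but the trace of places and times matches only one person:

```python
# Hypothetical scenario: beacons log (place, hour) sightings of one
# linked device. Comparing the trace against known movement patterns
# (home address, workplace, gym membership) identifies the individual
# even though every broadcast ID was anonymous and ever-changing.
trace = [("home", 8), ("office", 9), ("gym", 18)]

known_schedules = {
    "alice": {("home", 8), ("office", 9), ("gym", 18)},
    "bob":   {("home", 8), ("cafe", 9), ("office", 10)},
}

matches = [name for name, schedule in known_schedules.items()
           if set(trace) <= schedule]
assert matches == ["alice"]  # three data points were enough
```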

Trolls will target Contact Tracing Apps

But, let’s assume that Google and Apple find a way to provide this new API in a very secure fashion. There is no malware, only highly secure and rigorously approved applications. That would be OK, right?

Sadly, I think the final area in which contact tracing will fall down is when people start to realise that it’s wide open to abuse. Human beings, as a whole, have a history of behaving very badly once they know that they are anonymous. The more secure the contact tracing API is, the more anonymous and untraceable we become – and that leaves the system vulnerable.

Earlier this year, a German artist caused a traffic jam by faking slow traffic using 99 cellphones connected to Google Maps. Google Maps saw the slow moving phones connected to its system, assumed there was a traffic bottleneck, and people who received this updated traffic information started to avoid the road in question (which happened to run right outside Google’s offices).

Given that the whole point of a contact tracing app is to make people aware that they have, potentially, been exposed to someone with Covid-19 so that they can go into self-isolation, the potential for using a contact tracing app to cause disruption and mayhem is obvious.

Anyone who’s worked on abuse will instantly realise that a voluntary app operated by anonymous actors is wide open to trolling. The performance art people will tie a phone to a dog and let it run around the park; the Russians will use the app to run service-denial attacks and spread panic; and little Johnny will self-report symptoms to get the whole school sent home.

Security expert Professor Ross Anderson, of the University of Cambridge

Got a problem with a business? Hang around outside the offices for a few days, make sure you go to the same Greggs as some of the people who work there, then self-report with Covid-19. Got a dispute with your local council? Take a wander around the council offices and then self-report with Covid-19.

Any centralised registration of reports, or requirement for an official “Covid-19 number”, defeats the idea of keeping the system anonymous; but without it, the risk of malicious and erroneous self-reporting is high.
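That trade-off can be made concrete with a sketch of one commonly proposed mitigation (all names here are hypothetical, not any real system’s API): a health authority issues a one-time verification code with each confirmed diagnosis, and the server rejects any key upload that lacks an unused code. This blocks casual false self-reports, but ties every upload to an authority-issued record, which is exactly the loss of anonymity described above.

```python
import secrets

# Codes the health authority has issued alongside confirmed diagnoses.
issued_codes = {secrets.token_hex(4) for _ in range(3)}

def accept_upload(code: str, day_keys: list) -> bool:
    # Accept a self-report only if it carries a valid, unused code.
    if code in issued_codes:
        issued_codes.discard(code)  # single use: replays are rejected
        return True
    return False

code = next(iter(issued_codes))
assert accept_upload(code, ["key-1"]) is True
assert accept_upload(code, ["key-1"]) is False   # replayed code rejected
assert accept_upload("made-up", ["key-2"]) is False  # unissued code rejected
```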

If it’s so broken, why do it?

With whole countries on lockdown, economies under immense pressure, and people struggling to comply with social distancing measures long term, the desire for a way out of the current situation is high.

Would people trade privacy and civil liberties for the more tangible and urgent freedom of being able to move outside their own home, return to work, see friends and relatives? It’s certainly tempting.

The question each and every one of us will face is: what is my privacy worth? You may think the answer is simple. You may not care if the government tracks you, or if Google and Apple know where you are (chances are they already do), especially when you weigh this against the ability to leave your house, do your job, and so on.

If you think that way, I will leave you with this final thought – one of the wisest things ever said to me, by one of the wisest people I’ve ever met:

Never make the mistake of assuming the system will always be benevolent.

Wes Packer
