Can COVID-19 Contact Tracing Apps Work Without Opening the Door to Major Security Vulnerabilities and Privacy Concerns?

Several months into the COVID-19 pandemic, it’s starting to look like there are a few different formulas for beating the disease. Countries like Taiwan and Japan used their previous experience and cultural cohesion to flatten the curve without resorting to lockdowns or wide-scale testing; New Zealand went into lockdown early and closed its borders to let the infection burn itself out. But one key ingredient that has made a difference in countries as different as Germany and South Korea has been relentless, effective contact tracing of infections.

Performing contact tracing at scale is a massive undertaking, traditionally requiring boots on the ground and door-to-door and telephone contact with sometimes hundreds of individuals who may have been in contact with or in proximity to a confirmed COVID-19 case. It’s the kind of problem that seems like it should have an inexpensive and effective technical solution, and, courtesy of the powerful portable computing devices we all carry around with us constantly, it does: app-based contact tracing.

By Definition, Contact Tracing Is Anti-Privacy

But smartphone manufacturers and app makers were already in hot water in cybersecurity circles for previous privacy abuses stemming from location tracking. Strava’s 2017 data visualization of user running and biking routes, for example, set off an uproar when it exposed secret military bases and individuals in rural areas.

So the big question in cybersecurity right now is: can effective electronic COVID-19 contact tracing happen without major privacy and security compromises?

In the most extreme cases, tracing applications clearly undermine privacy and security. Hong Kong’s StayHomeSafe app comes with a paired wristband that uses geofencing technology to ensure users complete home quarantine periods. South Korea’s Corona 100m offers real-time alerts whenever a diagnosed patient comes within 100 meters of a user, simultaneously populating the screen with the patient’s diagnosis date, nationality, age, gender, and location history.

Solutions being contemplated in Western nations are not so extreme and have more security built into the proposals. But it’s not clear that will be enough to satisfy cybersecurity experts.

At the heart of the problem is that contact tracing, by definition, must uncover and disseminate identity in order to work. Individuals have to be identified and contacted to be tested or warned.

Apps can automate much of this by using the location or proximity data already available on individual devices to record encounters, then flag users who were recently near anyone who goes on to test positive for COVID-19. Those users can be warned early to take extra precautions against spreading the virus themselves, and prompted to get tested immediately.
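
To make that matching step concrete, here is a minimal sketch in Python of how a decentralized tracing app could check its own contact log against a published list of identifiers from confirmed cases. The class, function names, and thresholds (15 minutes, 14 days) are illustrative assumptions, not taken from any particular app.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    # Illustrative sketch: each phone logs the anonymous identifiers it hears
    # nearby, then checks that log locally against identifiers later published
    # as belonging to confirmed cases. Thresholds are invented for the example.

    @dataclass
    class ProximityEvent:
        identifier: str        # rotating anonymous ID broadcast by a nearby phone
        seen_at: datetime      # when the broadcast was heard
        duration_minutes: int  # how long the devices stayed in range

    def find_exposures(local_log: list[ProximityEvent],
                       positive_identifiers: set[str],
                       min_minutes: int = 15,
                       window_days: int = 14) -> list[ProximityEvent]:
        """Return logged contacts that match published positive identifiers."""
        cutoff = datetime.now() - timedelta(days=window_days)
        return [
            event for event in local_log
            if event.identifier in positive_identifiers
            and event.seen_at >= cutoff
            and event.duration_minutes >= min_minutes
        ]

    # Example: one qualifying contact within the last two weeks triggers a warning.
    log = [ProximityEvent("abc123", datetime.now() - timedelta(days=3), 20)]
    if find_exposures(log, {"abc123"}):
        print("Possible exposure: consider getting tested.")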

The concern with app-based contact tracing is two-fold, however:

  • Uninfected users may be identified simply because they were, at some point, in proximity to somebody who eventually tested positive
  • The data gathered may be retained and re-used for non-pandemic purposes

It’s not hyperbole to say that the fate of the economy and the lives of thousands hang on this question.

Can Private Industry Protect Users Better Than Government?

App development has been scattershot, with many incompatible versions being cobbled together by public health agencies with little expertise in either programming or security.

That has resulted in some basic and serious security mistakes. In England, the National Health Service put together an app that allowed alerts to trigger based on self-reported symptoms rather than a confirmed positive diagnosis, opening the door to floods of false alerts and, effectively, a denial-of-service (DoS) risk. In Germany, initial designs for a system that put all data into a centralized database raised concerns that a single compromise could deliver the personal location data of millions to hackers.

In an era where many governments face record low trust from citizens, can private industry offer assurances that people will put their faith in? Two titans of cell phone technology, Apple and Google, have put together their own proposal to find out.

The Apple/Google framework has a number of innovations to both prevent abuses of the collected personal data and maintain the effectiveness of the tracing protocols:

  • Most of the data collected is retained only on the individual phone, and is released only if the user tests positive and chooses to release it
  • Broadcast identifiers are cryptographically derived and regularly rotated, keeping them both compact and hard to link back to a person (see the sketch after this list)
  • Matching is proximity-based rather than location-based; no movement information is saved, simply whether the user was near someone who later tested positive
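
The sketch below illustrates the general shape of that rotating-identifier design: a secret daily key stays on the phone, and the values actually broadcast over Bluetooth are short-lived identifiers derived from it. This is a simplification using HMAC for brevity; the actual Apple/Google specification defines its own derivation (Temporary Exposure Keys and Rolling Proximity Identifiers), so treat the function names and parameters here as illustrative assumptions.

    import hashlib
    import hmac
    import secrets

    # Simplified illustration of rotating identifiers in a decentralized
    # exposure-notification design. Not the actual Apple/Google derivation;
    # HMAC is used here only to keep the example short and self-contained.

    def new_daily_key() -> bytes:
        """Random per-day secret; it never leaves the device unless the user
        tests positive and chooses to share it."""
        return secrets.token_bytes(16)

    def rolling_identifier(daily_key: bytes, interval: int) -> bytes:
        """Derive the identifier broadcast during one ~10-minute interval."""
        msg = interval.to_bytes(4, "big")
        return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

    # A phone that later downloads someone's published daily key can re-derive
    # all 144 ten-minute identifiers for that day and compare them against the
    # identifiers in its own local contact log.
    daily_key = new_daily_key()
    broadcast_ids = {rolling_identifier(daily_key, i) for i in range(144)}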

It’s important to note that the API is not a complete app; the companies do not propose to decide when notifications go out or to store any data themselves regarding positive cases. They do pledge to dismantle the system when the crisis is over, and given Apple’s track record, particularly in resisting government intrusions on individual privacy, some cybersecurity researchers find that pledge more credible than federal assurances.

But there remain some potential attacks.

For starters, some privacy advocates object to Bluetooth Low Energy beacon tracking itself, recommending that most users keep Bluetooth turned off except when it actually needs to be used. For contact tracing, “when it needs to be used” is all the time, which defeats that workaround.

Many beacon trackers used for advertising already log your device, so it would be a simple matter for them to also pick up your COVID-positive status from the published data. Depending on what and how much information they have otherwise collected about you, that status could be correlated with individually identifiable data.
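
As a hypothetical illustration of that correlation risk, suppose an advertising SDK has already been recording the rotating identifiers it hears alongside the ad profiles it maintains. Matching published positive identifiers against that log would attach a probable diagnosis to an otherwise identifiable profile. All names and data below are invented for the example.

    # Hypothetical illustration only: an ad-tracking log of previously observed
    # rotating identifiers, keyed to the profiles the tracker already maintains.
    observed = {
        "id_7f3a": {"ad_profile": "user-829", "place": "downtown gym"},
        "id_c41d": {"ad_profile": "user-102", "place": "coffee shop"},
    }

    # Identifiers re-derived from keys published after positive diagnoses.
    published_positive_ids = {"id_7f3a"}

    # The tracker can now attach a probable diagnosis to a known profile.
    flagged = {
        ident: profile
        for ident, profile in observed.items()
        if ident in published_positive_ids
    }
    print(flagged)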

Bluetooth really isn’t an ideal tool for the job anyway; it can miss contacts in crowded scenarios, creating false negatives.

Some cybersecurity professionals have also identified the positive-key publication mechanism as a path to denial-of-service attacks on phones with the app installed.

COVID-19 Presents Unprecedented Challenges to Cybersecurity Professionals

With every other industry and profession facing earth-shattering changes in patterns and assumptions of business, it would be too much to imagine cybersecurity was going to get a pass. COVID-19 may change some dearly held assumptions about privacy and security that have long been central to the profession.

Cybersecurity professionals always have to balance utility against privacy and security concerns. The case of COVID-19 trackers is no different, but the stakes are a lot higher.

It’s fairly clear that contact tracing in any form impacts privacy. Yet it may be a necessary impact, in the same way that we consider it important to reveal where convicted sex offenders live, or require mandatory reporting of HIV infection status. It also has to be weighed against the alternatives; if tracing such a huge outbreak is unachievable by traditional means, lockdowns and quarantine may be the only viable responses, with all the psychological and economic damage that comes with them. The fact that COVID-19 is an event on a far greater scale than those examples only strengthens the argument that the privacy trade-off is appropriate.

In some ways, however, it also makes it that much more important that security and privacy advocates can find a compromise they can support. The only way such an app can work at scale in a free society is with widespread public adoption. Addressing people’s concerns about their personal data so that they have the confidence to install and use those apps will be a critical element of their success. The bottom line is that cybersecurity pros will have to work with public health officials and app developers to find common ground if technology is going to have any chance of helping us beat COVID-19.
