Plans to introduce a Covid-19 contact tracing app in Australia have sparked concerns among privacy advocates and data security experts, with fears a lack of transparency in the development phase could undermine public trust in the system.
Due to be launched in a fortnight, the app will enable health authorities to trace the contacts of infected people through their smartphones and alert them of their potential exposure to the virus.
The app, modelled on Singapore’s TraceTogether app launched in March, uses smartphones’ Bluetooth function to determine the relative proximity of nearby users and to create a de-identified record of those encounters.
TraceTogether uses Bluetooth Relative Signal Strength Indicator readings across time to approximate the distance and duration of an encounter between devices.
Contact data is stored on individuals’ phones for 21 days on a rolling basis – anything beyond that is deleted, with no location data collected, according to the Singapore Government.
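The mechanics described above – approximating distance from Bluetooth signal strength over time, and deleting records on a 21-day rolling basis – can be sketched as follows. This is an illustrative model only: the `tx_power` and path-loss constants are common textbook assumptions for the log-distance model, not published TraceTogether parameters.

```python
import math
from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION = timedelta(days=21)  # Singapore's stated rolling retention window


def estimate_distance(rssi: int, tx_power: int = -59, n: float = 2.0) -> float:
    """Approximate distance in metres from an RSSI reading using the
    log-distance path-loss model. tx_power is the expected RSSI at 1 metre;
    both constants here are illustrative assumptions."""
    return 10 ** ((tx_power - rssi) / (10 * n))


@dataclass
class Encounter:
    temp_id: str       # the other device's temporary, rotating ID
    rssi: int          # observed signal strength
    seen_at: datetime  # time of contact


def prune(log: list, now: datetime) -> list:
    """Drop encounters older than the 21-day rolling window."""
    return [e for e in log if now - e.seen_at <= RETENTION]
```

In practice an app would sample RSSI repeatedly across an encounter and smooth the readings, since a single Bluetooth reading is a noisy proxy for distance.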
Speaking with FST Gov, Dr Katharine Kemp, senior lecturer at UNSW, said that while the Government has yet to reveal particulars of the Australian iteration’s architecture, it is understood the app will also use Bluetooth to ping nearby devices and capture proximity (but not location) signals between users.
Like the Singapore app, Kemp said Australia’s reporting network will be built on a centralised system – a point of contention for privacy advocates.
“In the interests of privacy, decentralised approaches are preferable. But the Government has indicated this will be a centralised system,” she said.
“That means that if you download the app and later test positive for Covid-19, you’ll be asked to contact the central registry, which will then be able to decrypt the list of app users you’ve been in contact with in the relevant period and notify them that there’s now a risk they’ve been infected, whereupon they’ll be warned to get tested and isolate.”
Kemp noted that the Government could hold onto potentially identifiable data – including on indirect contacts – for future use, without explicit consent.
“If the Government has these decrypted logs of people we’ve been in contact with over those days once we test positive, it could decide to use those details for other purposes – say by law enforcement or [data] analysts – to reveal other health or social issues.”
To retain public confidence in the system, Kemp urged the Government to “strictly limit” how the data is used and to ensure that it is deleted once subscribers opt out.
To make the contact tracing system viable, the Government has suggested a minimum uptake rate of 40 per cent for the app – a figure Kemp sees as little more than a shot in the dark, and one that, she believes, also serves to undermine the Government’s overall mission for its app.
“I don’t think there’s much science behind the Government’s suggestion that 40 per cent of people need to download the app for it to be effective. Others overseas have said 80 per cent or 60 per cent, and now the government is saying even 1 per cent is better than nothing.
“We do know that more coverage is better for tracing purposes. And maximum coverage requires maximum trustworthiness.”
Dr Hassan Asghar, a senior lecturer in computing at Macquarie University, further questioned the centralised architecture of the system, arguing the app may fail to provide sufficient privacy from a central authority – a situation that could leave captured data open to use by the Government even after the crisis abates.
“The server can store the private data of a user even if [individuals] are not infected,” he said in comments to FST Gov.
“The app can potentially be misused for surveillance.”
Ultimately, he believed the app was being implemented too hastily, without due security checks, potentially leaving the door open to nefarious cyber actors to mine captured user data.
“There is a possibility that security loopholes may exist due to a potential lack of thorough testing because of time constraints,” he said.
Data security, he said, will be predicated on how data is stored in servers and the Government’s efforts to enforce ‘best practice’ security measures.
“In that case, standard best security practices apply. This is similar to protecting sensitive data in any other application. For instance, consumer data stored within government agencies.”
It was also likely that new privacy concerns would emerge the longer the app remained active, Asghar added.
He thus urged the Government to “ensure the use and retention of [Covid-19-related] data only for a short period and for this specific purpose.”
Transparency in the process, he said, also remains key if the Government is to gain the confidence of privacy advocates and citizens.
Ultimately, however, Asghar questioned the whole conceit of the project.
“[An] important thing to consider is that there is no clear evidence whether the usage of the app is providing us benefits to ease the lockdown restrictions.”
“So, we may be paying a price for privacy for no real benefit. However, this stands currently as mere speculation.”
Meanwhile, the Office of the Australian Information Commissioner is working on a privacy report on the Government’s Covid-19 tracing app due for release later this week.
The app’s source code will also be made publicly available and scrutinised by the Australian Signals Directorate (ASD), Australian Cyber Security Centre (ACSC) and Cyber Security Cooperative Research Centre.
Turning on Bluetooth
The choice of the short-range Bluetooth standard, rather than cellular signals (4G or 3G networks, for instance), as the app’s data collection medium appears a deliberate effort by the Government to allay fears that it could track or geolocate users.
The Bluetooth wireless standard typically used on today’s smartphones supports only short-distance (roughly 10 metres) data exchanges between mobiles. Using Bluetooth’s limited signal strength, the app is expected to track only the distance between users, rather than home in on a particular mappable location, while also capturing data on times of contact and individually assigned user IDs, which will be temporary, random, and anonymised.
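A temporary, random, rotating ID of the kind described above could be sketched as follows. This is a hypothetical illustration – the 15-minute rotation interval is an assumption, not a published specification of the Australian app.

```python
import secrets
from datetime import datetime, timedelta

ROTATION = timedelta(minutes=15)  # assumed rotation interval, for illustration


class TempIdBroadcaster:
    """Rotates a random, anonymised temporary ID. Nearby devices record only
    this ID, a signal-strength reading, and a timestamp -- no location."""

    def __init__(self, now: datetime):
        self._rotate(now)

    def _rotate(self, now: datetime) -> None:
        # A fresh random token, not derived from any stable identity
        self.temp_id = secrets.token_hex(16)
        self.expires = now + ROTATION

    def current_id(self, now: datetime) -> str:
        if now >= self.expires:
            self._rotate(now)
        return self.temp_id
```

Because each ID is short-lived and random, an eavesdropper observing Bluetooth broadcasts cannot link sightings of the same phone across rotation windows without the central authority’s records.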
However, questions will be raised over whether captured data could be re-identified and pieced together by analysts, forming a cohesive picture of an individual’s whereabouts, habits, and even social circles.
Users’ trust in the app will be largely vested in the effectiveness of its in-built encryption and data anonymisation processes.
Lessons, ultimately, will be taken from Singapore’s nearly month-long live experiment with its own contact tracing app, TraceTogether.
While the TraceTogether system requires users to submit their mobile number to a centralised server at the Singapore Ministry of Health, on-device data is secured through encryption.
The mobile number is used to generate a temporary ID, which is secured on the Ministry’s server using a private key. If a user is diagnosed with Covid-19, they are asked for consent to upload the app’s encrypted data logs to the server.
The server, which holds the private key, then decrypts the data logs; the temporary IDs they contain can be used to contact other users who were in contact with the infected user.
Since these temporary IDs are generated by the server, it can look them up against its user ID–mobile number database to determine the identity of those users. The IDs are periodically refreshed and transmitted to the corresponding user.
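The centralised lookup described above can be sketched in miniature: the server issues batches of temporary IDs, retains the ID-to-mobile-number mapping, and resolves a consented upload into the numbers of users to notify. This is an illustrative simplification – TraceTogether additionally encrypts the on-device logs and the IDs themselves, and the class and method names here are hypothetical.

```python
import secrets


class HealthAuthorityServer:
    """Minimal sketch of a centralised contact-tracing registry."""

    def __init__(self):
        self._registry = {}  # temp_id -> registered mobile number

    def issue_temp_ids(self, mobile: str, count: int = 5) -> list:
        """Generate a batch of random temporary IDs for one registered user
        and remember which user they belong to."""
        batch = [secrets.token_hex(16) for _ in range(count)]
        for tid in batch:
            self._registry[tid] = mobile
        return batch

    def resolve_contacts(self, uploaded_log: list) -> set:
        """After a consented upload from an infected user, map the logged
        temporary IDs back to the mobile numbers of users to be notified."""
        return {self._registry[tid] for tid in uploaded_log
                if tid in self._registry}
```

The privacy trade-off critics raise is visible in the sketch: only the central server can resolve IDs to identities, but for the same reason it holds everything needed to re-identify every logged contact.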
A number of other countries have developed similar apps to track Covid-19’s spread.
Israel has released Hamagen (The Shield), which retains time and location information on mobile devices and cross-references these details with the Israel Ministry of Health’s updated epidemiological data, including the time and location of infected people.
The Massachusetts Institute of Technology’s Media Lab has also released an open-source app, called Safe Paths, that similarly uses location history.