Are the prospects of the app really that bright? And will it be the panacea that it is marketed to be?
The app itself
The government has touted COVIDSafe as its own, which is only partly true. It is based on the source code of the Singaporean Government’s TraceTogether app and was developed by Amazon Web Services (AWS) engineers. It is hosted, along with the data it captures, in Amazon’s Sydney data centre, which the Australian Signals Directorate has certified at the highest classification level for domestic cloud services.
The rationale for choosing AWS over an Australian-owned data centre was its capacity to provide end-to-end services, according to Foreign Affairs Minister Marise Payne: “The contract with AWS is a combination of hosting, development and operational services, which is more extensive than services provided by pure hosting providers,” she said at the time of the announcement.
Essentially COVIDSafe turns a smartphone into a proximity monitor that communicates with other smartphones using the app by using Bluetooth Low Energy (BLE) technology to measure the distance between users, and the time they were in contact.
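BLE apps of this kind typically infer distance from received signal strength (RSSI) using a log-distance path-loss model. This is a sketch of that standard approach, not COVIDSafe’s confirmed implementation, and the reference power and exponent values are illustrative assumptions:

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate distance in metres from a BLE RSSI reading using a
    log-distance path-loss model. tx_power_dbm is the expected RSSI at
    1 m; the exponent is ~2 in free space but higher (and highly
    variable) indoors, which is a key source of inaccuracy."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(estimate_distance(-59))  # 1.0 m at the reference signal strength
print(estimate_distance(-79))  # 10.0 m in free space
```

Because the path-loss exponent shifts with walls, bodies and even how the phone is held, the same RSSI reading can map to very different distances, which is the crux of the accuracy concerns discussed below.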
Initially, user privacy, data security and government surveillance were highlighted as potentially problematic, but with the passing of the Privacy Amendment (Public Health Contact Information) Bill 2020, the majority of these concerns were eased.
Yet, it seems, the privacy concerns are less of an issue than the app’s suitability and the effectiveness of its underlying technology, particularly BLE, and the way that it has been marketed to the Australian public.
For COVIDSafe to be effective, more than 40% of the population will need to download it, according to the official messaging. Yet, downloads only tell part of the story, according to Associate Professor Adam Dunn from the University of Sydney.
“If 40% of the population download the app and we optimistically assume that half of those people are using the app properly at all times, then the likelihood of registering a random contact or a contact in a random encounter is 4%.
“Fewer than one in 20 potential contacts will actually be captured by the app and that is making some big assumptions about the technology it is based on.”
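The arithmetic behind both of A/Prof Dunn’s figures is that a contact is only captured when both parties have the app installed and running correctly, so the per-person probability is squared:

```python
def capture_probability(adoption, correct_usage=1.0):
    """Probability that a random contact between two people is logged:
    both parties must have the app installed and using it properly."""
    p = adoption * correct_usage
    return p * p

# 40% adoption, half using it properly at all times -> 4% of contacts
print(round(capture_probability(0.40, 0.5), 2))  # 0.04
# 70% adoption with perfect usage -> roughly half of contacts
print(round(capture_probability(0.70), 2))       # 0.49
```

This also matches his later point that even 70% adoption with flawless usage would capture only about half of all contacts.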
A/Prof Dunn is the head of discipline for Biomedical Informatics and Digital Health at Sydney University with his primary expertise being in data driven methods of public health surveillance. In 2019, he and his team published an experiment analysing BLE signals to track proximity in indoor settings.
They found BLE problematic for accurately measuring proximity indoors.
“Even in relatively controlled conditions, it’s very hard to work out which two signals are closest together, or which room people are in. We found that even the direction in which you hold your device makes a big difference to whether or not the app thinks you’re closer or further away.”
Ultimately, the inherent inaccuracy of BLE for proximity measurement could lead to false positives, where users are incorrectly identified as being in contact with a COVID-19 positive user, and false negatives where users are not notified, according to A/Prof Dunn.
A/Prof Dunn questions the government claim that 40% or more of the population need to download COVIDSafe for it to be effective, saying it is a vast understatement.
“The effectiveness of the app is likely to be very low, however, if 70% of Australians downloaded the app, and all of them were using it properly, and it worked reliably well, then it would capture up to half of the contacts.”
However, no matter the actual effectiveness of COVIDSafe, A/Prof Dunn is keen to emphasise that any assistance that can be provided to human contact tracers is valuable.
“If the app helps to contact trace better and it’s more of a help than a hindrance, then there’s nothing wrong with downloading it.”
A major problem with COVIDSafe is how it operates on iPhones as Apple’s iOS limits apps in the background from using Bluetooth.
“There are certainly some issues with making sure that the app is actually useful on iPhones,” he said. “To use it properly, people really need to have their phone unlocked and have the app in the foreground.”
Although the underpinning technology may be flawed, in his opinion it is the messaging around the app that is most problematic.
“The Government has been communicating this in terms of the ‘app will keep you safe and our aim and target is to get as many downloads as we possibly can’. One of the problems with measuring downloads means that everyone just has to download it and then they can leave it off,” he said.
“We should not tie the downloading of the app to the relaxing of social distancing, or market the app as a form of sunscreen, because we need to remind people that they need to continue with the precautionary behaviours that have been so successful in Australia.
“This is to make sure that the contact tracers have as much time as we can to be able to get on top of outbreaks.”
Concerns about user privacy and data use have been partly addressed by the Singaporean Government’s release of the TraceTogether source code, and the Digital Transformation Agency has put the COVIDSafe source code on GitHub for public scrutiny (https://github.com/AU-COVIDSafe). However, the COVIDSafe server code was not released, despite the Singaporean Government releasing its server code. This would have provided further transparency, particularly on how the captured data is encrypted.
There are, however, aspects surrounding COVIDSafe and privacy that require clarification, according to Prof Dali Kaafar, Executive Director and Chief Scientist at the Optus Macquarie University Cyber Security Hub, New South Wales. Prof Kaafar is an expert in privacy enhancing technologies and risk analysis.
Prof Kaafar told Medical Forum there were several misconceptions about the app that needed to be de-mystified.
“The app needs permission to access location because that’s how Google’s Android and Apple’s operating systems are designed. The app needs to ask permission to access location but the GPS coordinates are never accessed.”
Another is that user information could be accessed by other devices, which Prof Kaafar says is not possible because each device is given a random, anonymised ID every two hours.
“There is no way to continuously monitor on a regular basis a particular individual from another peer device,” he said.
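The unlinkability Prof Kaafar describes comes from rotating the broadcast identifier. A minimal sketch of that idea follows; it is illustrative only (COVIDSafe’s real temporary IDs are issued by the server and encrypted with a key the device does not hold):

```python
import os
import time

ROTATION_SECONDS = 2 * 60 * 60  # a fresh ID every two hours, per the app's design

class TempIdRotator:
    """Illustrative sketch: broadcast a random 128-bit ID that is replaced
    every two hours, so a peer device cannot link successive sightings of
    the same phone into a continuous track."""

    def __init__(self, now=None):
        self._issued_at = now if now is not None else time.time()
        self._id = os.urandom(16).hex()

    def current_id(self, now=None):
        now = now if now is not None else time.time()
        if now - self._issued_at >= ROTATION_SECONDS:
            self._id = os.urandom(16).hex()
            self._issued_at = now
        return self._id
```

Within any two-hour window a peer sees a stable ID (so an encounter can be logged), but across windows the IDs share nothing that would let a snooper stitch them together.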
Prof Kaafar also addressed the potential for snoopers or hackers to compromise a device by accessing information collected by COVIDSafe, such as which users the device has been in contact with. He says this is not possible because information is uniquely encrypted to each device and can only be decrypted if permission is specifically given and, even then, it is complicated to extract it.
Another privacy limitation is COVIDSafe’s centralised approach to data management, which essentially means that data captured from devices is not private to the central authority (the Federal Government) in charge of it.
“When someone is diagnosed with COVID-19, the central authority will ask for consent from that individual to share information from their device … and will upload all the COVIDSafe app users they have been in contact with over the preceding 21 days, which are stored on the device,” he said.
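The 21-day upload step he describes amounts to filtering the locally stored encounter log by age. A hedged sketch, with a hypothetical record layout (`seen_at` timestamp plus the peer’s encrypted temporary ID):

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=21)  # only the preceding 21 days are uploaded

def contacts_to_upload(contact_log, now):
    """Once a diagnosed user consents, select the encounters stored on
    the device from the preceding 21 days for upload to the central
    authority. Record layout is assumed for illustration."""
    return [c for c in contact_log if now - c["seen_at"] <= RETENTION]

log = [
    {"seen_at": datetime(2020, 5, 20), "temp_id": "abc..."},  # recent
    {"seen_at": datetime(2020, 4, 1), "temp_id": "old..."},   # too old
]
print(len(contacts_to_upload(log, datetime(2020, 5, 22))))  # 1
```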
The hitch with this type of data is that it can also reveal who has been together at certain periods of time without their consent, which can lead to more information revealed than simply tracing COVID contacts.
For example: a user is diagnosed with COVID-19 (user A), the central authority requests their contact information, analyses it, and finds the user was at a café at a certain time with others (users B, C, D). Through user A’s data, the central authority now knows that users B, C and D were co-located at a particular time, yet B, C and D are unaware that the central authority knows this.
Now imagine another user is diagnosed with COVID-19 (user Z); the central authority analyses their data, finds that they were at a park at a certain time, and sees in their contact data that users B and C were co-located again.
“The central authority can infer information about users B and C as the app immediately reveals their identities to the central authority without their knowledge or consent,” Prof Kaafar said.
“This is a fundamental issue I have with this app – that implicit consent is given and a central authority knows a lot of information about people without their knowledge or consent.”
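The inference in Prof Kaafar’s example amounts to a set intersection over the contact lists uploaded by diagnosed users. A minimal sketch, with hypothetical user names:

```python
# Contact lists uploaded (with consent) by two diagnosed users:
contacts_user_a = {"B", "C", "D"}  # from user A's device: cafe encounter
contacts_user_z = {"B", "C"}       # from user Z's device: park encounter

# The central authority can intersect the two lists and learn that B and C
# were co-located on two separate occasions, without B or C's knowledge.
repeatedly_colocated = sorted(contacts_user_a & contacts_user_z)
print(repeatedly_colocated)  # ['B', 'C']
```

Neither B nor C consented to anything, yet the central authority has learned something about their association; that is the structural cost of the centralised design.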
Why is this important?
“It might not be important for many people. The fact that my wife and I were together in a coffee shop or a restaurant at some point in time is really not a problem at all from my privacy point of view.”
However, he said if it were a meeting that was sensitive, such as between a journalist and a contact or whistle-blower who wished to remain anonymous, the privacy of that information was more critical and could have safety and security implications.
Privacy is personal
Privacy is, by its nature, an individual consideration: what constitutes a privacy risk for one person may be completely different for another.
Ultimately, the success of an app such as COVIDSafe rests on trust in those responsible for running it to protect the privacy of users, their protocols in relation to data usage, and their transparency and competency to develop and deploy the technology.
“Underlying its success is trust,” says Prof Kaafar. “When we are designing privacy preserving technologies, the point is to ensure there is a level of trust being maintained and that it is being created between the users of a service and whoever is running and deploying the app.”
“This is incredibly important. If you build this trust, you will have more people adopting this technology and this app.”
The Department of Health was contacted for comment on the effectiveness of COVIDSafe and the privacy issues raised in this article. No response was received before publication.