
Giving Privacy a Bad Name

What is all the fuss with contact tracing apps about?

Contact tracing apps are a public health initiative to combat the spread of COVID-19. Users install an app on their mobile phone that broadcasts short, coded signals that are meaningless on their own. When someone tests positive for COVID-19, the app can determine whether another user has been in close contact with that person and send that user a notice, so that they can choose to self-isolate. Importantly, the system is purported to be feasible only if a large percentage of the population downloads the app. Deployment and implementation will therefore require an unprecedented level of trust in governmental authorities, and assurances that users who download the app will not regret handing the power of 24-hour monitoring and surveillance to the State.
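
To make the mechanism more concrete, the sketch below models a decentralised exposure-notification flow of the kind described above. It is a deliberately simplified illustration, not the design of any particular national app: the Phone class, the identifier format and the consent step are assumptions made for the example. Phones broadcast rotating random identifiers, remember the identifiers they hear, and the match against a positive user's published identifiers happens on the device itself.

```python
import secrets
from datetime import datetime

class Phone:
    """Simplified model of one handset running a decentralised tracing app."""

    def __init__(self):
        self.sent_ids = []    # rotating random identifiers this phone has broadcast
        self.heard_ids = {}   # identifier -> time it was observed nearby

    def broadcast(self) -> str:
        # The "coded signal": a short random identifier that reveals no name,
        # phone number or location.
        rolling_id = secrets.token_hex(8)
        self.sent_ids.append(rolling_id)
        return rolling_id

    def hear(self, rolling_id: str) -> None:
        # Observations are stored locally, on the device only.
        self.heard_ids[rolling_id] = datetime.utcnow()

    def check_exposure(self, published_positive_ids) -> bool:
        # Matching happens on the phone; the server only distributes the
        # identifiers that a positive, consenting user chose to upload.
        return any(i in self.heard_ids for i in published_positive_ids)


# Alice and Bob stand near each other; Bob's phone records Alice's identifier.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())

# Alice tests positive and consents to publishing her identifiers.
if bob.check_exposure(alice.sent_ids):
    print("Exposure notification: consider self-isolating.")
```

In this decentralised model the central server never learns who met whom; it only relays the anonymous identifiers of consenting, positive users for phones to check against locally.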

Understandably, governments are looking to technology to help fight the battle against COVID-19, and contact tracing has emerged as a tool that may help to do so. As discussions have heated up, the Netherlands and other EU Member States have been grilled by privacy advocates, who have expressed concerns about the compatibility of such apps with the human rights to privacy and data protection. There are also concerns that these apps will fail to comply with the European Union’s General Data Protection Regulation (GDPR). Much of this debate comes down to whether the data is stored locally on users’ devices (decentralised) or on a central server (centralised). So how these apps are designed matters a great deal to how much privacy we are willing to give up.

At the core is whether this obvious interference with human rights is justified, and whether the interference is proportionate to the aims of States faced with an unprecedented public health crisis. The debate sits somewhere between ‘we design the app for privacy’ (privacy-preserving) and ‘we design it in a way that ensures efficient tracking’ (designing for life-preserving functionality first). The latter rightly prompts serious questions about government surveillance.

The extent to which a contact tracing app is privacy-invasive depends very much on its in-built features, the system underlying the technology, and how it is actually deployed. But to comply with the GDPR, certain legal tests must be satisfied. The privacy impact assessment for a contact tracing application must entail a case-by-case analysis: does the app collect health data that is adequate, relevant and limited to what is necessary in relation to the purpose of combating the spread of the disease? Is the collected data to be used only for the specific purpose of contact tracing? What rules have been implemented to ensure erasure of the data within specified time-frames? Will the data be processed and stored only within the European Union? What privacy and data protection safeguards have been implemented? Are the GDPR’s privacy and data protection ‘by design’ principles built into the ‘code’ of the app? For example, is the data stored locally, on users’ phones? Or is it stored centrally by a government agency?
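
By way of illustration only, a ‘by design’ approach might hard-code such limits into the app itself rather than leaving them to policy documents. The sketch below is hypothetical: the field names, the 14-day retention window and the erasure helper are assumptions chosen for the example, not requirements drawn from any specific app or from the GDPR's text.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class PrivacyPolicy:
    """Illustrative 'by design' limits coded into a hypothetical tracing app."""
    purpose: str = "exposure notification only"   # purpose limitation
    store_locally: bool = True                    # decentralised, on-device storage
    collect_location: bool = False                # data minimisation: no location data
    retention: timedelta = timedelta(days=14)     # storage limitation (assumed window)

POLICY = PrivacyPolicy()

def erase_expired(observations: dict) -> dict:
    """Drop locally stored contact observations older than the retention window."""
    cutoff = datetime.utcnow() - POLICY.retention
    return {rid: seen_at for rid, seen_at in observations.items() if seen_at >= cutoff}

# Example: an observation from three weeks ago is erased, a recent one is kept.
observations = {
    "a1b2c3d4": datetime.utcnow() - timedelta(days=21),
    "e5f6a7b8": datetime.utcnow() - timedelta(days=2),
}
print(list(erase_expired(observations)))  # ['e5f6a7b8']
```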

There is still a lot to learn about this virus and how it spreads. Yet contact tracing apps could be a vital tool in keeping the right people at home as governments look to restart their economies. Deployment of these tools will therefore have to happen quickly, before all the evidence needed to justify their intrusion on our privacy is available. Importantly, we might not even know yet what functionality is needed to ensure efficiency and accuracy. Accordingly, privacy advocates would be unreasonable to insist that an app not be deployed until there is conclusive evidence that deploying it will help contain the virus. The UN Special Rapporteur on the Right to Privacy, Joseph Cannataci, is reported as asking:

“There are many confounding variables which can distort analysis, but what is going to happen if, in a year or two, we would realise that the introduction of such apps, or indeed even more invasive contact tracing using mobile phone data, did not significantly arrest the impact of COVID-19?”

But what if contact tracing did significantly arrest the impact of COVID-19, and facilitated a quicker return to normality? Most people would conclude that the interference with our right to privacy had been justified. While the ringing of a precautionary bell may well be expected, it would be wrong to assume that all contact tracing apps are a disproportionate or unjustified interference with our fundamental rights to privacy and data protection.

What is of utmost importance is transparency and accountability: the detailed description of, and information about, the in-built privacy and security safeguards must be made publicly available and analysed. This would support community confidence in the app. The GDPR requires app providers whose processing of personal data poses a high risk to the rights and freedoms of natural persons to undertake a data protection impact assessment. This includes large-scale processing of health data, which would certainly be the case with a contact tracing app.

The source code should be made available, and the deployment of the app should be put on a lawful basis. In this way, the safeguards enacted in law and coded into the contact tracing system would be known to the public, who can then form reasonable expectations about how, and for what purposes, the data processing will be carried out, including its impact on their rights to privacy and data protection.

Governments would do well to heed advice and guidance as they consider the privacy issues through the data protection impact assessment. The Data Protection Regulator should have an enhanced role: watching the implementation closely, auditing the system, and investigating any complaints from the public.

It is also important not to let privacy stand in the way of technology. If we let our concerns stand in the way of effectively controlling a pandemic, we could end up giving privacy a bad name. Yes, crises like the 9/11 terror attacks or the London bombings have sparked surveillance legislation in the past. But the fundamental rights to privacy and data protection are not absolute. We give up some of our privacy to feel safe and secure all the time. This poses challenging questions about trade-offs. States are also tasked with ensuring fundamental rights such as the rights to life and security, while promoting other countervailing public interests like public health.

The uptake of contact tracing apps will require an unprecedented act of trust by users. Protocols should be followed to ensure fundamental rights are protected and, if necessary, law should be enacted to put the app’s use on a lawful basis. But if the app does what it promises, trusting technology will have played a fundamental part in controlling a pandemic. Some of the language coming from the privacy side suggests its proponents have forgotten that the right to data protection does not trump the State’s obligation to protect our lives. Accordingly, some critical perspective is needed. What we should avoid doing is enacting new surveillance laws in a time of crisis. Laws have the annoying habit of outliving the crisis itself.

The author would like to thank Dr Mireille M Caruana of the University of Malta for her contributions to this blog.
