With the announcement that NHSX have begun testing their COVID-19 contact-tracing app ‘in the wild’ on the Isle of Wight, what lessons can be drawn from the way in which this project has been conceived, developed and implemented?
It is undoubtedly impressive that NHSX have managed to develop and deploy an app so quickly, and they have been at pains to reassure the public about the privacy-conscious approach taken to its design. What has become clear, however, is that even in a situation where the stakes are high – the health and wellbeing of loved ones, public services under unprecedented pressure, an economy at significant risk, public borrowing at record levels – privacy concerns remain at the forefront of people’s minds. If privacy is truly valued to this degree, is the health and care sector prepared to adopt the privacy-conscious approach which is clearly needed to reassure and positively engage with the public?
In this blog I will look at some of the key themes which have emerged from the way in which the NHSX contact-tracing app has been developed and what lessons can be drawn from this to help ensure the privacy concerns of the public can be suitably addressed in the future.
Necessity and Proportionality
One of the most surprising aspects of the public’s response to the NHS contact-tracing app is how necessity and proportionality are viewed. If protecting lives and the NHS are not considered sufficiently compelling grounds for infringing on privacy, then what is? If we can take nothing else from this, it is clear that a paternalistic approach, and the assumption that the public are willing to sacrifice their privacy, will likely need to be reconsidered.
If there has been limited engagement with the public about a new initiative or project, it will be difficult to justify any assumptions made about the willingness of individuals to give up their privacy. Often there is a heavy focus on how products, initiatives or projects will benefit professional users. This can lead to the creation of echo chambers where the strong and compelling views of one group are amplified above those which should arguably carry equal weight. In the health and care industry in particular, the focus is often on how health and social care professionals will benefit, on the assumption that if professionals are more efficient and can improve the quality of services and care, the recipients of those services will benefit vicariously. There is often little emphasis on testing those assumptions to ensure that the benefits which service users will receive are sufficiently compelling to the public, and that the means used to achieve those benefits have been suitably balanced against their views or concerns around privacy. Assumptions need to be tested to ensure you are accurately gauging the ‘mood’ of the public. Only then can any accurate assessment of the necessity and proportionality of an approach be made.
If the fight against Covid-19 isn’t considered a convincing-enough argument for the public to relinquish their privacy, can you be confident that your aims and your approach will be well-received? Fail to consult with the public at your peril!
Privacy by Design and by Default and Transparency
The concerns which the public might have in relation to the NHS contact-tracing app were clearly identified by NHSX at an early stage. Anything which could be seen as an attempt by the state to implement real-time location tracking was undoubtedly going to raise numerous questions. Experts advocating the use of such an app were cautious in suggesting that usage should be voluntary, qualifying their stance by stating that this should be the case ‘at least in the first instance’. This was always going to be a difficult position to maintain, as it quickly became clear that large-scale uptake was central to the effectiveness of the app. NHSX were quick to try and dismiss any accusations that the app would be privacy-intrusive, initially stating that alerts would be sent anonymously, and later emphasising that the app had been “designed with privacy and security front of mind”.
This approach has put the government and NHSX in a good position when challenged on this point. Addressing privacy concerns from the outset demonstrated a good understanding of public concern and no doubt encouraged the app developers to adopt a privacy by design and by default approach, helping to reassure the public. Unfortunately, despite promises that key security and privacy designs would be published alongside the source code to support independent scrutiny, at the time of writing this detailed information has yet to materialise. Transparency is key, and committing to the publication of security information or Data Protection Impact Assessments (DPIAs) is an excellent way of building public confidence in your activities and demonstrating a commitment to privacy. It is equally important, however, to follow through on these commitments, as a failure to do so can lead to difficult questions as to why this information is not forthcoming.
The purpose of the contact-tracing app was originally to help notify individuals who may have been exposed to an increased risk of infection as a result of their social interactions. This required only a ‘peer to peer’ arrangement where individuals’ devices communicated with each other in a de-centralised model. The scope was quickly expanded to also support the wider response to the pandemic by providing information centrally about ‘hot spots’, or areas in which a large number of infections were being identified, so that services could be scaled up or down as needed to deal with any ‘flare ups’. This involved sharing information not only between app users but also centrally. Research was then identified as a further way in which data collected through the app could be used. The current position is that the data will only be used for NHS care, management, evaluation and research; however, NHSX have been careful not to exclude the possibility of other changes occurring in the future by saying “If we make any changes to how the app works over time, we will explain in plain English why those changes were made and what they mean for you.” It is clear therefore that further changes could be on the cards, with specific concerns being raised by The House of Commons’ Human Rights Select Committee about plans to extend the app to record location data.
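The ‘peer to peer’ model described above can be illustrated with a short sketch. The actual NHSX protocol had not been published at the time of writing, so the names and structure below are purely illustrative assumptions: each device broadcasts short-lived random identifiers and keeps a local log of the identifiers it hears, with nothing leaving the device unless the user chooses to report.

```python
import secrets

def new_ephemeral_id() -> bytes:
    """Generate a random 16-byte identifier with no link to real-world identity."""
    return secrets.token_bytes(16)

class Device:
    """Illustrative device in a de-centralised contact-tracing exchange."""

    def __init__(self):
        self.my_ids = []     # identifiers this device has broadcast
        self.heard_ids = []  # identifiers heard from nearby devices, held locally

    def broadcast(self) -> bytes:
        # A fresh random identifier is generated for each broadcast,
        # so observers cannot link broadcasts to a persistent identity.
        eid = new_ephemeral_id()
        self.my_ids.append(eid)
        return eid

    def observe(self, eid: bytes):
        # In a de-centralised model this contact log never leaves the
        # device unless the user later chooses to report symptoms.
        self.heard_ids.append(eid)

# Two devices come into proximity and exchange ephemeral identifiers:
a, b = Device(), Device()
b.observe(a.broadcast())
a.observe(b.broadcast())
```

The key design point is that the central service learns nothing from this exchange by default; the later scope expansion to ‘hot spot’ reporting is precisely what moved data from the device to the centre.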
The shifting scope of the contact-tracing app over such a short period of time is an excellent demonstration of how initiatives can morph from something fairly innocuous to something far more difficult to justify and defend. Normally this happens over a much longer period, making it more difficult to observe, but having been compressed into a matter of weeks, the contact-tracing app provides a stark example of how changes in scope can result in significant changes in privacy risk. There clearly needs to be a focus on the potential benefits of incremental developments or larger transformation, and innovation should of course not be unreasonably stifled. It is however equally important to ensure that any privacy implications are considered, both in response to specific changes in scope, and periodically, so there can be confidence that the ends do indeed justify the means. The DPIA process allows privacy implications to be identified at an early stage, helping to prevent scope creep occurring without the necessary focus on its impact on privacy.
The contact-tracing app has sought to minimise the volume of data being processed and to ensure that, wherever possible, the real-world identity of users is not used. Analysis has indicated that despite these privacy-enhancing techniques being employed, the data being collected and processed is still classed as personal data under the General Data Protection Regulation (GDPR) and Data Protection Act (DPA). This shows how broad the definition of personal data is under the law, and how easy it is to unwittingly process personal data under the misapprehension that it is in fact anonymous. This isn’t to say however that the use of personal data is unreasonable or unjustified, or that the use of privacy-enhancing techniques is a waste of time. The situation is in fact quite the reverse – where personal data is needed the law is flexible enough to support this, particularly where steps have been taken to minimise it as far as possible.
The contact-tracing app is a prime example of how significant value can be derived without the need to use vast quantities of personal data. By limiting the data collected to a partial postcode and a unique ID, the contact-tracing app is still able to provide a highly valuable service and functionality. Innovation which derives maximum value from the bare minimum of data is likely to be viewed more positively by the public than alternatives which gather vast quantities of data, most of which may never be needed or used. Careful design which encourages ‘doing more with less’ is essential in building trust amongst those who use your products and services. It also has the effect of reducing the volume of data which needs to be stored and, by association, the volume of data which is then at risk from a data breach or similar incident. The less personal data you process, the less of a target it presents to cyber criminals, the less there is to lose, and the less there is to store and maintain over time.
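The ‘doing more with less’ principle can be sketched in a few lines. The field names and validation below are illustrative assumptions rather than the app’s actual registration schema: the only data captured is the outward (first) part of a UK postcode, which identifies a broad area rather than an address, plus a random install ID with no link to real-world identity.

```python
import re
import secrets

def minimal_registration(full_postcode: str) -> dict:
    """Build a data-minimised registration payload (illustrative sketch).

    Only the outward code (e.g. 'PO30' from 'PO30 1AB') is retained,
    so the payload locates the user to a broad area, not an address.
    """
    outward = full_postcode.strip().split()[0].upper()
    if not re.fullmatch(r"[A-Z]{1,2}[0-9][A-Z0-9]?", outward):
        raise ValueError("not a valid UK outward postcode")
    return {
        "partial_postcode": outward,          # coarse area only
        "install_id": secrets.token_hex(16),  # random, no link to identity
    }

payload = minimal_registration("PO30 1AB")
```

Collecting only these two fields still supports the ‘hot spot’ analysis described earlier, while leaving nothing in the payload that directly identifies an individual – a concrete example of minimisation reducing both the value of the data to an attacker and the cost of storing it.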