In a case with wide implications for the health care sector, the ICO has ruled that the Royal Free NHS Foundation Trust failed to comply with the UK Data Protection Act when it provided personal data of about 1.6 million patients to Google DeepMind. The personal data was provided as part of a trial to test an alert, diagnosis and detection system for acute kidney injury. The ICO did not impose a fine for these failures. Indeed, it is hard to see how the Royal Free’s failure could have been held to be of a kind likely to cause substantial damage or substantial distress to individuals, the test that would have had to be satisfied before the ICO could impose a fine. Instead, the Royal Free was asked to agree to the terms of an Undertaking, allowing the data to continue to be used by the Streams application whilst the required compliance measures are put in place.
Attention has focussed on what has been described as the ‘data sharing’ or ‘information sharing’ arrangement between the Royal Free and DeepMind, but it is important to remember that what we have here is actually a data controller making use of the services of a data processor. Significantly, Elizabeth Denham confirmed, in her letter to the Royal Free, that “The relationship between the Royal Free and DeepMind is one of a data controller to a data processor”. She went on to say that “It is my view that the Royal Free has retained its data controller responsibilities throughout my office’s investigation and continues to do so”. It is therefore clear that the testing and deployment of the Streams application did not involve what might commonly be understood as data sharing: an arrangement whereby two or more organisations share personal data so that each can use the data for its own purposes. DeepMind were not provided with access to patient records for their own purposes, and there has been no suggestion that they did anything other than process the data, on the instructions of the Royal Free, for clinical safety testing and for the live Streams application.
Although the involvement of DeepMind has featured heavily in discussion, it is, in many ways, a red herring. The points the ICO has criticised, and that feature in the Undertaking signed by the Royal Free, would for the most part be equally applicable had the Royal Free developed, tested and deployed the Streams application itself, or had DeepMind simply supplied the application for the hospital to develop and deploy on its own. For example, the key shortcomings, namely that patients were not properly informed that their data would be used in the clinical safety testing of the Streams application, that clinical testing does not come within the ambit of ‘direct care’, and that there is no obvious condition for processing that would remove the need for patient consent, would all have applied equally had the clinical testing been undertaken by the Royal Free itself, without the involvement of DeepMind or, indeed, any other processor.
Elizabeth Denham has taken the opportunity to put forward her views on work of this type more generally, using the case as a chance to educate. She is clear that the ICO supports the appropriate use of personal data for the purposes of research, development and clinical improvement, and that the office is committed to supporting technological advances in a way that “locks in good data protection practice”. The benefits that can be achieved by using patient data for the wider public good are recognised, and the ICO is keen to support this as long as data protection law is complied with. This means ensuring, for example, that one of the grounds for lawful processing exists (such as consent), that the processing is not excessive, and that individuals know enough to be able to exercise their rights. This applies whether or not a third-party processor is involved in the project. She also reminds us that Privacy Impact Assessments should be carried out at the start of this type of innovation. As she stated in her blog, Four Lessons NHS Trusts can learn from the Royal Free case: “The price of innovation didn’t have to be the erosion of legally ensured fundamental privacy rights”.