Data protection and management

Definition of ‘health data’

What constitutes ‘health data’? Is there a definition of ‘anonymised’ health data?

Health data

Under the UK General Data Protection Regulation (UK GDPR), ‘data concerning health’ means personal data related to the physical or mental health of a natural person, including data on the provision of healthcare services, which reveal information about a person’s health status. The Information Commissioner’s Office (ICO) has confirmed that ‘data concerning health’ can also relate to healthy individuals, and includes data from medical devices and fitness trackers (eg, the number of steps taken by the user or athletic performance). Data such as appointment details, reminders and invoices may also constitute health data if it reveals, or could in combination with other data reveal, information about a person’s health through ‘reasonable inference’.

Additionally, the UK GDPR uses the concepts of ‘genetic data’ and ‘biometric data’. ‘Genetic data’ means personal data relating to the inherited or acquired genetic characteristics of a natural person that give unique information about the physiology or the health of that natural person. Such data results, in particular, from an analysis of a biological sample from the natural person in question. ‘Biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person. Biometric data is an open category and can include a broad set of identifiers such as DNA matching, iris and retina recognition, facial recognition, and fingerprint and voice recognition.


Anonymous data

The recitals to the UK GDPR describe ‘anonymous information’ as ‘information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable’. Therefore, genuinely anonymised information does not constitute personal data for the purposes of, and is not regulated by, the UK’s data protection regime.

Companies should bear in mind that identifiability is a continuum and is evaluated taking into account the full commercial context of the processing. Fully identifiable data (eg, data including a person’s name) sits at one end of the continuum, whereas fully anonymised data (ie, data from which it would be impossible to identify an individual) sits at the other. Key-coded (or, in the terminology of the UK GDPR, ‘pseudonymised’) data, as is commonly used in many healthcare and research contexts, sits between the two. Unlike anonymised data, pseudonymised data is considered personal data for data protection law purposes. The same data set may be anonymised in the hands of one party, but identifiable in the hands of another. For example, a key-coded result of a patient’s test may be anonymous in the hands of a data analytics company that has no access to the key, but identifiable in the hands of the patient’s treating physician, who does have access to the key.
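
By way of illustration only, the following Python sketch shows how key-coded (pseudonymised) data is typically structured: the coded records can be shared with an analytics provider, while the key linking each code back to the patient is held separately by the treating organisation. Nothing here is prescribed by the UK GDPR, and the record fields and values are hypothetical.

```python
import secrets

# Hypothetical patient records held by the treating organisation (illustrative values only).
patient_records = [
    {"name": "Patient A", "nhs_number": "000 000 0001", "test_result": 7.2},
    {"name": "Patient B", "nhs_number": "000 000 0002", "test_result": 5.4},
]

def pseudonymise(records):
    """Replace direct identifiers with random codes, returning the coded
    data set and the key that links each code back to the identifiers."""
    key = {}            # retained by the treating organisation only
    coded_records = []  # the set that may be shared with the analytics provider
    for record in records:
        code = secrets.token_hex(8)  # random, meaningless pseudonym
        key[code] = {"name": record["name"], "nhs_number": record["nhs_number"]}
        coded_records.append({"subject_code": code, "test_result": record["test_result"]})
    return coded_records, key

coded_records, key = pseudonymise(patient_records)

# The analytics company receives only coded_records. Without the key it may be
# unable to identify anyone, but the data remains personal data in the hands of
# any party that holds, or can obtain, the key.
print(coded_records)
```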

Data protection law

What legal protection is afforded to health data in your jurisdiction? Is the level of protection greater than that afforded to other personal data?

Data concerning health, genetic data and biometric data are among the ‘special categories of personal data’ under the UK GDPR. Such data can be processed only if one of a limited number of conditions, exhaustively set out in law, is met. The conditions most likely to apply to a digital health company include one or more of the following:

  • The data subject has given their explicit consent.
  • The processing is necessary for the purposes of preventive or occupational medicine, the assessment of the working capacity of an employee, medical diagnosis, the provision of health or social care or treatment, or the management of health and social care systems and services.
  • The processing is necessary for reasons of public interest in the area of public health.
  • The processing is necessary for scientific research purposes in the public interest.


A number of the conditions listed above trigger the application of further requirements under the Data Protection Act 2018, and in many circumstances, an ‘appropriate policy document’ will also be required.

The ICO recognises health data as ‘some of the most sensitive personal data’. This assessment is likely to play a part in the regulator’s analysis of a company’s obligations, such as whether: (1) security measures applied to the data are appropriate in light of the potential risk to the rights and freedoms of natural persons; and (2) security incidents with respect to personal data are notifiable to the ICO and to data subjects.

Anonymised health data

Is anonymised health data subject to specific regulations or guidelines?

Anonymised data falls outside the scope of the UK data protection regime, as it no longer constitutes ‘personal data’. However, controllers of anonymised data should keep in mind that the act of anonymisation could itself constitute processing within the meaning of the UK GDPR.

It should always be borne in mind that anonymisation is typically considered together with subsequent processing purposes, such as machine learning or other forms of data analytics. If the use of patients’ data post-anonymisation is contemplated, patients may be entitled to understand what further uses will be made of their data, whether such data will be commercialised and in what ways. While such post-anonymisation activities are not within the scope of the UK data protection regime, patients or users may legitimately expect to receive at least a high-level information notice explaining what will happen to the data post-anonymisation. Failure to adequately do so may reduce take-up if the organisation in question is seeking the consent of such persons, or may lead to reputational harm if it becomes known that health data was inappropriately or unexpectedly used after being anonymised.

Enforcement

How are the data protection laws in your jurisdiction enforced in relation to health data? Have there been any notable regulatory or private enforcement actions in relation to digital healthcare technologies?

The ICO has broad investigative powers and can issue sanctions, including administrative fines of up to the higher of £17.5 million or 4 per cent of the company’s total worldwide annual turnover. To date, ICO enforcement activities have mainly been triggered by data breaches, and formal enforcement actions specifically targeting digital healthcare technologies have been rare.

Additionally, any person who has suffered ‘material’ or ‘non-material’ (eg, emotional) damage as a result of a data protection violation has the right to compensation.

The ICO’s Assurance team carries out audits across a broad range of health organisations. Breaches found during an audit can lead to ICO investigations, which in turn may lead to the ICO mandating remedial action by the breaching party. One notable ICO intervention involving digital healthcare technologies concerned TPP’s SystmOne, the second-most widely used GP electronic patient record system in England. In 2017, the ICO raised concerns about the software’s ‘enhanced sharing’ function, which allowed authorised users at hospitals and other care organisations to access, and add to, a patient’s records. Following the ICO investigation, new controls were implemented in 2018 giving GP data controllers more control over how they share patient records for the purposes of patient care.

The ICO relaxed its advice and enforcement posture during the covid-19 pandemic in the public interest. For example, the ICO began allowing clinicians to use consumer video conferencing solutions at the same time that NHSX, a government unit responsible for leading IT policy across the NHS, began allowing healthcare professionals to use communication tools such as Skype, WhatsApp and FaceTime in the course of their duties.

Cybersecurity

What cybersecurity laws and best practices are relevant for digital health offerings?

The EU Directive on security of network and information systems (the NIS Directive), which aims to ensure the security of critical IT systems in key sectors of the economy, was implemented in the UK by the Network and Information Systems Regulations 2018 (the NIS Regulations). The NIS Regulations require relevant entities to, among other things:

  • take appropriate technical and organisational measures to ensure the security of their network and information systems;
  • consider the latest developments and potential risks facing their systems;
  • take appropriate measures to prevent or minimise the impact of security incidents; and
  • notify the relevant supervisory authority without undue delay if any security incident occurs that has a significant impact on service continuity.


Within the healthcare sector, the scope of the NIS Regulations is limited to:

  • providers of non-primary NHS healthcare in England;
  • local health boards and NHS trusts in Wales;
  • the 14 territorial health boards and four special NHS boards in Scotland; and
  • health and social care trusts in Northern Ireland (paragraph 8, Schedule 2, NIS Regulations 2018).


However, the corresponding scope at EU level is set to widen with the entry into force of the revised NIS Directive (NIS 2). In its draft proposal released on 16 December 2020, the European Commission recommended expanding the scope of NIS 2 to include entities that manufacture pharmaceutical products (including vaccines) or critical medical devices. The UK is not required to implement NIS 2 post-Brexit, and it is unclear whether steps will be taken to maintain harmonisation. However, if the UK were to enact legislation aligning with NIS 2, more manufacturers and companies operating in the life sciences space in the UK would become subject to the regime’s requirements.

UK entities may also be subject to the NIS Directive extraterritorially. After Brexit, if a digital service provider is no longer established in the EU but offers digital services into the EU, it will be subject to the obligation to designate a representative in an EU member state in accordance with article 18(2) of the NIS Directive.

There is no legal requirement in the UK for companies to obtain cybersecurity insurance. 

Best practices and practical tips

What best practices and practical tips would you recommend to effectively manage the ownership, use and sharing of users’ raw and anonymised data, as well as the output of digital health solutions?

Companies engaged in the digital health space should bear in mind the concepts of ‘privacy by design’ and ‘privacy by default’, which are built into the UK data protection regime, as well as the ICO’s stated priority on records management in the healthcare sector.

In practical terms, this means implementing technical and organisational measures that secure the data and ensure it is processed in a manner commensurate with the purposes of the processing. For example:

  • Companies should collect as little personal data as is necessary for their purpose. For example, if the user’s age suffices, the user’s full date of birth should not be collected.
  • Companies should anonymise and aggregate personal data when possible. For example, if a company is trying to build an analytics model of how many steps users in a particular city take on average, it can aggregate that information rather than retaining the exact count for each user (see the illustrative sketch after this list).
  • Companies should, when possible, obtain access only to pseudonymised data. When accessing data from a third-party source, a digital health company should also put in place organisational and contractual safeguards to ensure that it has no ability to re-identify the pseudonymised data to which it has access.
  • Companies should make sure that any consents obtained from data subjects are freely given, specific, informed and unambiguous. Where possible, separate consents should be obtained for separate processing purposes. While the UK regime allows some level of generality when obtaining consent for future research, companies should explain to data subjects what the company proposes to do with the data in as much detail as possible at the outset.
  • Companies should maintain visibility over the personal data they process across the organisation. One of the easiest ways to achieve this is to maintain a comprehensive ‘record of processing’, as required by article 30 of the UK GDPR (sometimes also referred to as a data inventory or asset register).
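
As a purely illustrative sketch of the aggregation point above, the snippet below computes a per-city average step count and then discards the user-level readings, so that only the statistic needed for the analytics model is retained. The field names and figures are hypothetical and no particular technique is mandated by the UK GDPR.

```python
from collections import defaultdict

# Hypothetical per-user readings collected by a fitness app (illustrative values only).
readings = [
    {"user_id": "u1", "city": "Leeds", "steps": 8200},
    {"user_id": "u2", "city": "Leeds", "steps": 10400},
    {"user_id": "u3", "city": "Bristol", "steps": 6100},
]

def average_steps_by_city(per_user_readings):
    """Aggregate to city level so only the average is kept, not each user's exact count."""
    totals = defaultdict(lambda: [0, 0])  # city -> [sum of steps, number of users]
    for reading in per_user_readings:
        totals[reading["city"]][0] += reading["steps"]
        totals[reading["city"]][1] += 1
    return {city: total / count for city, (total, count) in totals.items()}

city_averages = average_steps_by_city(readings)
readings.clear()  # do not retain the user-level data once the aggregate is built
print(city_averages)  # e.g. {'Leeds': 9300.0, 'Bristol': 6100.0}
```

Whether such aggregates are genuinely anonymous still needs to be assessed in context: very small groups (eg, a city with a single user) can remain identifying.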


In 2020, we saw a continuation of the trend of ransomware attacks increasingly targeting companies with large amounts of electronic health records or profiles. Defending against and responding to a ransomware incident, particularly one with multi-jurisdictional impact, is complex and requires consideration of a number of regulatory areas, including data protection, cybersecurity, law enforcement, industry-specific regulation and sanctions (in relation to ransom payments). The UK National Cyber Security Centre (NCSC) has prepared a guidance note on mitigating such attacks. The NCSC recommends using layers of defence across an organisation in what is known as a ‘defence-in-depth’ approach, which includes:

  • making backup copies of information;
  • implementing technical measures that prevent malware from being delivered to devices in the first place;
  • implementing technical measures that only permit trusted applications to run on devices; and
  • preparing the organisation for a potential attack by having an incident response plan in place.
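
As a minimal sketch of the first of those measures (backup copies), and not something drawn from the NCSC guidance itself, the following Python script records checksums of backup files and later re-checks them so that tampering or corruption, for example by ransomware, can be detected. The paths and file names are hypothetical.

```python
import hashlib
import json
from pathlib import Path

MANIFEST = Path("backup_manifest.json")  # hypothetical manifest location

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def record_manifest(backup_dir: Path) -> None:
    """Store a checksum for every file in the backup directory."""
    manifest = {str(p): sha256_of(p) for p in backup_dir.rglob("*") if p.is_file()}
    MANIFEST.write_text(json.dumps(manifest, indent=2))

def verify_manifest() -> list:
    """Return the files whose current checksum no longer matches the manifest."""
    manifest = json.loads(MANIFEST.read_text())
    return [name for name, expected in manifest.items()
            if not Path(name).is_file() or sha256_of(Path(name)) != expected]

if __name__ == "__main__":
    record_manifest(Path("backups"))  # run after each backup is taken
    print("modified or missing:", verify_manifest())  # run periodically thereafter
```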