What's the issue?
Facial recognition technology (FRT) is developing rapidly, although it remains imperfect. Governments, public authorities and private companies are increasingly looking to use it, particularly for security and law enforcement, but also potentially for research and marketing purposes. While the applications of the technology are many, its use is restricted by a framework of legislation and guidelines, the usefulness of which is beginning to be tested in the courts as the debate around the impact of FRT on privacy continues.
What's the development?
The Divisional Court has dismissed an application for judicial review in R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341 (Admin). The application was brought by Mr Bridges in relation to the use by South Wales Police of 'AFR Locate', a pilot project which used Automated Facial Recognition (AFR) technology to capture digital images of members of the public and compare the biometric data extracted from those images against a database of 'persons of interest'. If there was no match, the data was immediately deleted, with the source CCTV footage retained for 31 days as permitted.
The Court held that while the use of the technology interfered with the Claimant's Article 8(1) right to privacy under the European Convention on Human Rights, that interference was justified, and that SW Police had complied with both then applicable and current data protection law in its use of the technology.
What does this mean for you?
This decision involved a public authority so it does not transfer directly to commercial use of FRT. There are, however, a number of useful points which are relevant to wider uses of FRT:
- Confirmation that FRT involves the processing of biometric data, which is classed as special category data under the GDPR.
- The importance of ensuring you have an appropriate lawful basis on which to base the processing – something which may prove difficult.
- The necessity of carrying out a DPIA prior to using FRT and of not commencing processing before taking steps to mitigate any high risk to the rights of individuals. Remember that if a high risk cannot be mitigated, you will need to consult the ICO.
- The importance of transparency – in this instance, the Court determined that there had been a high level of engagement with the public through social media and other notices.
- The importance of data minimisation – it was significant to the data protection claim (although not the Article 8(1) claim) that the personal data of individuals who did not match those on the list held by SW Police was deleted very quickly.
- Purpose limitation – another factor in favour of SW Police was that AFR Locate was used for very limited and specific periods of time.
Following the judgment, the ICO said it welcomed the Court's conclusion that the use of live facial recognition (LFR) systems involves the processing of sensitive personal data and requires compliance with the Data Protection Act 2018 (DPA18). The ICO has finished its investigation into the first police pilots of LFR and will consider the Court's judgment when finalising its recommendations and guidance. In the meantime: "any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply". The ICO may also take the Court's reasoning into account when conducting its recently announced review into the use of LFR in the King's Cross area.
While this judgment involves the use of LFR by a public authority for law enforcement purposes, it is relevant to the use of FRT more widely in its consideration of what constitutes personal and sensitive personal data, its assessment of the application of the data protection principles when using FRT, the use of DPIAs and the relevance of retention periods. We can expect much more on this issue for businesses as well as in relation to the public sector and law enforcement. Additional policies and guidance from the ICO will be an essential aid to compliance.
Mr Bridges' application, which was supported by Liberty and others, was made on the grounds that the use of the AFR technology:
- Infringed the claimant's right to a private life under Article 8(1) of the European Convention on Human Rights (ECHR).
- Breached s4(4) of the Data Protection Act 1998 (DPA98) by not complying with the data protection principles; and
- Breached ss 35, 42 and 64 of the DPA18 – strictly speaking the DPA18 was not relevant but the parties asked the Court to give its views. Note that this part of the DPA18 deals with the Law Enforcement Directive rather than the GDPR.
There was also a claim under the Equality Act 2010 which is beyond the scope of this article.
Article 8 ECHR
The Court concluded that Article 8(1) ECHR was engaged. AFR is significantly intrusive and involves the processing of biometric data which is "intrinsically private". The use of the Claimant's biometric data went beyond the expected and unsurprising (a test set out in S v United Kingdom). The fact that the biometric data was retained for a very short period unless there was a match was not relevant; even momentary processing of the data would be sufficient. Accordingly, the use of AFR Locate did interfere with the Claimant's Article 8 rights.
The Court then considered whether the use of AFR was in accordance with the law under Article 8(2) ECHR and concluded that it was because:
- The police have sufficient common law powers to take steps to prevent and detect crime and maintain public order.
- The method of obtaining the biometric data was not intrusive.
- There was a clear legal framework governing the use of AFR Locate comprising primary and secondary legislation, codes of practice and police policies, all of which provided legally enforceable standards. The Court did comment that the law might need to change in future to accommodate further technological developments but thought that there was a sufficient legal framework in this instance.
The Court went on to conclude that the use of AFR Locate by the police struck a fair balance and was not disproportionate (using the test in Bank Mellat). In particular, AFR Locate was used:
- In an open and transparent manner with significant public engagement (on social media, posters, on police cars at the scene and various privacy notices).
- For a limited time and with a limited footprint.
- For a specific and limited purpose ie identifying particular individuals.
In addition, any interference with Article 8(1) rights would have been limited owing to the prompt deletion of biometric data and its use in accordance with granular data retention periods set out in the associated DPIA.
Data protection
The Court first considered whether the use of AFR Locate involved processing personal data as defined in the DPA98. It concluded that it was not personal data under the 'indirect identification' test in Breyer, but that it did involve the processing of personal data using the test discussed in Vidal-Hall – in other words, that the person was sufficiently identified by the processing as to be 'individuated'.
Given that personal data was being processed, it had to be processed in accordance with the first data protection principle: that personal data be processed fairly and lawfully. The Court held that the processing complied with that principle and that it was in the legitimate interests of the police to process it, taking into account the common law obligation to prevent and detect crime.
The Court found that the use of AFR involves the processing of biometric data, which means it is sensitive processing within the meaning of s35(8) DPA18. The processing in this case could be justified on the basis that it was strictly necessary for law enforcement purposes and for the common law duty of preventing and detecting crime, and was, therefore, in the public interest. The police also satisfied the s35(5) requirement of having an "appropriate policy" in place which met the requirements of s42(2). While the Court commented that the document was brief and lacking in detail, it did not give a view on the adequacy of the policy.
The Court also concluded that the police had satisfied the requirement under s64 DPA18 to carry out a DPIA. Interestingly, the Court said it was not for it to determine whether or not the assessment met the requirements of s64 where the data controller had exercised reasonable judgment based on reasonable enquiry and consideration: "When conscientious assessment has been brought to bear, any attempt by a court to second-guess that assessment will overstep the mark".