Last month, the UK Information Commissioner's Office (ICO) released a statement announcing an investigation into the use of facial recognition technology (FRT) in public spaces, expressing concern over its "potential threat to privacy" and "people's…most sensitive personal data". The ICO reassured individuals that it "will not hesitate to use our investigative and enforcement powers to protect people's legal rights".

On 4 September 2019, the first court judgment in the world considering the legality of FRT was published (available here). The High Court of England and Wales considered, in a judicial review challenge brought by civil rights group Liberty, whether an FRT pilot scheme run by South Wales Police (SWP) is consistent with the law, including data protection legislation.

Put simply, FRT maps faces by measuring the distance between facial features. That biometric data can then be compared with a database of existing photographs, which themselves have been mapped using FRT, to identify potential matches.
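The comparison step described above can be sketched in a few lines. This is a simplified illustration only: the feature vectors, names, and threshold below are hypothetical stand-ins for the biometric measurements a real FRT system would produce.

```python
import math

def euclidean_distance(a, b):
    """Distance between two biometric feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def find_matches(probe, database, threshold=0.6):
    """Return database entries whose vectors fall within the threshold.

    `probe` is the freshly mapped face; `database` maps identities to
    previously mapped vectors. Values here are illustrative, not real
    biometric data.
    """
    return [name for name, vec in database.items()
            if euclidean_distance(probe, vec) <= threshold]

# Hypothetical database of pre-mapped photographs
database = {
    "subject_a": [0.11, 0.42, 0.37],
    "subject_b": [0.80, 0.15, 0.66],
}
print(find_matches([0.10, 0.40, 0.35], database))  # only subject_a is close
```

In practice the vectors would have hundreds of dimensions and the threshold would be tuned to trade off false matches against missed ones, but the principle is the same: a match is a pair of faces whose measured features sit close together.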

SWP's pilot scheme kicked off when the UEFA Champions League Final was staged at the Principality Stadium in Cardiff in June 2017. The proceedings concerned two later uses of FRT, when the individual represented by Liberty was in the vicinity. On 21 December 2017, FRT was live from 8am – 4pm on Queen Street in Cardiff, with the aim of locating and detaining "Priority and Prolific Offenders". On 27 March 2018, FRT was live from 8.30am – 4pm at the entrance of a Defence Exhibition in Cardiff, due to the event's history of protesters causing criminal damage and disruption.

FRT mounted on a police van compared faces in public crowds with faces in existing SWP watchlists of around 900 and 500 people respectively. Possible matches were flagged to SWP officers for review but, save for a confirmed match, images and data were deleted immediately.
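The screening workflow described above, where possible matches are flagged for human review and everything else is deleted immediately, can be sketched as follows. All names, vectors, and the threshold are hypothetical; the point is the flag-then-discard structure, not the matching maths.

```python
import math

def distance(a, b):
    # Euclidean distance between two (hypothetical) biometric vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def screen_crowd(crowd_faces, watchlist, threshold=0.6):
    """Flag possible watchlist matches for officer review.

    Faces that match no watchlist entry are not retained, mirroring the
    pilot's policy of immediate deletion for non-matches.
    """
    flagged = []
    for face_id, vec in crowd_faces:
        hits = [name for name, wvec in watchlist.items()
                if distance(vec, wvec) <= threshold]
        if hits:
            flagged.append((face_id, hits))  # held for human confirmation
        # otherwise the image data simply goes out of scope (discarded)
    return flagged

watchlist = {"offender_1": [0.2, 0.5], "offender_2": [0.9, 0.1]}
crowd = [("face_A", [0.21, 0.52]), ("face_B", [0.95, 0.95])]
print(screen_crowd(crowd, watchlist))  # only face_A is flagged for review
```

Note that the automated step only narrows the field: the final identification decision rests with the reviewing officer, which was a relevant safeguard in the court's assessment.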

The court found that SWP's use of FRT amounted to the processing of special category data, which is subject to specific conditions under Article 9 of the General Data Protection Regulation. However, SWP had compiled a detailed Data Protection Impact Assessment of the use of FRT, with particular regard to data protection, and had implemented clear strategies to address data retention and notification to the public of data processing. As a result, the court found that SWP was lawfully processing data for legitimate reasons and in compliance with section 35 of the Data Protection Act 2018.

In the context of law enforcement, extreme cases aside, the key question is not whether new technologies can be used at all, but how to strike a sensible balance between private rights and the public interest in harnessing them.

Biometric technology has been harnessed commercially for years and companies have only scratched the surface of its true application. It has been predicted that the global biometrics technology market will be worth $59 billion by 2025. We can therefore expect further judicial commentary on the use of this technology in the near future, particularly on voice recognition technology, which most tech companies have high on their agendas.

While legal and regulatory considerations, such as the ones outlined above, are critical when it comes to biometrics, reputational issues attached to the use of this technology are equally so. Key to navigating that landscape will therefore be a strategy that places transparency at the forefront, if trust and confidence with the public are to be maintained. It is encouraging in this case that SWP's use of FRT was found to be open and transparent and in accordance with the current data protection framework for law enforcement. Separately, it will be interesting to monitor any future challenges to the commercial use of related biometric technology.