R (on the application of Edward Bridges) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)
In Bridges, an application for judicial review, the UK High Court (Lord Justice Haddon-Cave and Mr Justice Swift) considered the lawfulness of policing operations conducted by the South Wales Police force (“SWP”) which utilised Automated Facial Recognition (“AFR”) technology. The Court rejected Mr Bridges’ allegations that the SWP’s conduct was unlawful as contrary to the European Convention on Human Rights (“ECHR”), Article 8, the Data Protection Acts 1998 and 2018 (“DPA 98 and 18”), and the Equality Act 2010. In this blog post we consider several key aspects of the case.
Among other uses, AFR can help to assess whether two facial images depict the same person. A digital photograph of a person’s face is taken and processed to extract measurements of facial features. That data is then compared with similar data from images contained in a database.
AFR has utility for policing and security operations, as it has the potential to increase the capability of the police or security services to locate potential criminals or security threats as they move through the general population. At the same time, AFR can pose human rights and data security concerns, as it may open the door to increased state surveillance of the general public, and could result in false-positive identifications. The Court in the SWP matter recognized the need for this balance, explaining that:
“The central issue is whether the current legal regime in the United Kingdom is adequate to ensure the appropriate and non-arbitrary use of AFR in a free and civilized society […] The raw power of AFR – and the potential baleful uses to which AFR could be put by agents of the state and others – underline the need for careful and on-going consideration of the effectiveness of that framework as and when the uses of AFR develop.” (Paragraphs 1 and 7).
In the UK, the SWP are pioneers in the use of AFR for policing; the SWP has utilised AFR since 2017 as part of a trial of the technology. The Home Office has provided funding to the SWP to develop AFR and has created an Oversight and Advisory Board to co-ordinate consideration of the use of facial images and AFR technology by law enforcement authorities. AFR has increased the SWP’s effectiveness: use of AFR has resulted in arrests or disposals in 37 cases where the individual in question could not be located by more traditional methods, and it has reduced the resources required for searching for individuals. (Paragraph 106).
Bridges arose out of a SWP operation — “AFR Locate” — which involved the deployment of surveillance cameras to capture digital images of members of the public and the processing and comparison of those images with digital images of persons on watchlists compiled by SWP for the purpose of the deployment. In the event of no match, the biometric data about the individual was immediately deleted (with the underlying CCTV footage retained for 31 days as normal). In the event of a match, the SWP decided what action to take.
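The AFR Locate workflow described above can be sketched in a few lines of Python. This is purely illustrative: the function names, the 128-dimension template, and the similarity threshold are assumptions for exposition, not details of SWP's actual system.

```python
import numpy as np

MATCH_THRESHOLD = 0.6  # illustrative similarity cut-off, not SWP's actual setting

def embed(face_image):
    """Stand-in for a facial-feature extractor: maps an image to a
    fixed-length vector of facial measurements (a 'biometric template').
    Here we derive a deterministic pseudo-random unit vector from the input."""
    rng = np.random.default_rng(abs(hash(face_image)) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

def afr_locate(captured_image, watchlist_templates):
    """Compare one captured face against the watchlist templates.
    No match: the biometric data is discarded immediately (only the
    ordinary CCTV footage is retained). Match: the result is flagged,
    and the decision on what action to take is left to an officer."""
    probe = embed(captured_image)
    best = max((float(probe @ t) for t in watchlist_templates), default=0.0)
    if best < MATCH_THRESHOLD:
        del probe    # no match: biometric data deleted at once
        return None
    return best      # match: surfaced for human review
```

The key structural point the Court relied on is visible in the sketch: a non-match leaves no biometric trace, and a match produces only a candidate score for a human officer to act on, not an automated intervention.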
At the time of the hearing, SWP had deployed AFR technology as part of AFR Locate on about 50 occasions at a variety of large public events, including on the day of the 2017 UEFA Champions League Final, at various international rugby matches at the Principality Stadium in Cardiff, at music concerts, and at an Elvis Presley Festival.
Mr Bridges is a former Liberal Democrat local politician. He said that the SWP captured and processed his image on two occasions in the course of AFR Locate: (1) on 21 December 2017, when he was shopping; and (2) on 27 May 2018, when he was attending a peaceful protest against the arms trade. As Mr Bridges was not on a watchlist, his image was deleted shortly after it was taken. Supported by Liberty, an NGO, Mr Bridges brought an application for judicial review, alleging that the SWP’s conduct was unlawful as it involved breaches of the ECHR, the DPA 98 and 18, and the Equality Act 2010.
The Court’s Decision
Mr Bridges said that SWP’s use of AFR breached ECHR Article 8, which provides that everyone “has the right to respect for his private and family life, his home and his correspondence.” Public authorities may not interfere with the exercise of this right, except: (1) where the interference is in accordance with the law; and (2) the interference is “necessary in a democratic society” in the interests of a limited number of objectives, including national security, public safety, or the prevention of disorder or crime.
The issues were: (1) whether ECHR Article 8 was engaged; (2) whether the SWP’s activities were “in accordance with the law”; and (3) whether the SWP’s activities were “necessary in a democratic society” in the interests of one of the objectives stated in Article 8(2), in accordance with the four-part test set out by the UK Supreme Court in Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700.
As to (1), the Court held that Mr Bridges’ Article 8 rights were engaged, even though the surveillance took place in public spaces and Mr Bridges’ image was automatically deleted immediately following the matching exercise. The Court placed weight on the fact that the interveners, the Information Commissioner and the Surveillance Camera Commissioner, supported Mr Bridges’ position. In this respect, the Court quoted from the Information Commissioner: “18. … The automated capture of facial biometrics, and conversion of those images into biometric data, involves large scale and relatively indiscriminate processing of personal data. If such processing is not subject to appropriate safeguards, such data … could be collected … in a manner amounting to a serious interference with privacy rights.”

As to (2), the Court held that the SWP acted lawfully because its common law powers to keep the peace and prevent crime gave it the power to deploy AFR, and because there is legislation (such as the GDPR), practice codes (such as the Surveillance Camera Code of Practice), and policy documents which provide standards against which the lawfulness of SWP’s use of AFR can be assessed.
As to (3), the Court held that no less intrusive measure than AFR was reasonably available to the SWP and that the SWP’s use of AFR struck a fair balance between the rights of the individual and those of the community. The Court’s reasoning is set out at paragraph 101:
“AFR Locate was deployed in an open and transparent way, with significant public engagement. On each occasion, it was used for a limited time, and covered a limited footprint. It was deployed for the specific and limited purpose of seeking to identify particular individuals (not including the Claimant) who may have been in the area and whose presence was of justifiable interest to the police. On the former occasion it led to two arrests. On the latter occasion it identified a person who had made a bomb threat at the very same event the previous year and who had been subject to a (suspended) custodial sentence. On neither occasion did it lead to a disproportionate interference with anybody’s Article 8 rights. Nobody was wrongly arrested. Nobody complained as to their treatment (save for the Claimant on a point of principle). Any interference with the Claimant’s Article 8 rights would have been very limited. The interference would be limited to the near instantaneous algorithmic processing and discarding of the Claimant’s biometric data. No personal information relating to the Claimant would have been available to any police officer, or to any human agent. No data would be retained. There was no attempt to identify the Claimant. He was not spoken to by any police officer.”
There was no issue between the parties as to the other parts of the Bank Mellat test, i.e. whether SWP’s objectives were sufficiently important to justify the limitation on Mr Bridges’ rights and whether AFR was rationally connected to those objectives.
Given the above, the Court rejected the ECHR claim.
The DPA 98
Mr Bridges also claimed that the SWP breached the DPA 98, Section 4(4) because it failed to act in accordance with the principle set out in the DPA, Schedule 1 that personal data may only be processed fairly and lawfully.
The Court held that Mr Bridges’ image, as captured by the SWP’s surveillance cameras, amounted to personal data, notwithstanding the fact that SWP could not identify Mr Bridges by name, because the AFR technology “individuated” Mr Bridges, or singled him out from all other individuals. It took measurements of Mr Bridges’ face for the purpose of comparing him with the individuals on the watchlist, which data was distinct from similar data extracted from the images of other persons captured on camera.
The Court held that the SWP’s processing of Mr Bridges’ image was fair and lawful. For the same reasons which led the Court to conclude that the interference with Mr Bridges’ ECHR, Article 8 rights was justified, the Court was satisfied that the condition set out in Schedule 2, paragraph 6 of the DPA 98, applied, i.e. that the processing was necessary for the legitimate interests of the SWP and was not unwarranted by reason of interference with Mr Bridges’ rights, freedoms, or legitimate interests.
The Court thus rejected the DPA 98 claim.
The DPA 18

Mr Bridges said that the SWP breached Section 34(3), which obliges data controllers to be able to demonstrate compliance with Section 35 as to personal data. He said that the SWP could not demonstrate compliance with Section 35, which provides that the processing of personal data for any of the law enforcement purposes set out in Section 31 must be lawful and fair (the purposes are prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security).
The issues were: (1) whether the SWP’s processing of Mr Bridges’ data was “sensitive processing” under Section 35(8) (in which case the test for lawfulness and fairness is more stringent); (2) whether the processing was “strictly necessary” for one of the law enforcement purposes; (3) whether the processing was (a) necessary for the exercise of the SWP’s common law duty to prevent and detect crime; and (b) necessary for reasons of substantial public interest; and (4) whether the SWP policy document “Policy on Sensitive Processing for Law Enforcement Processes” dated November 2018 was an “appropriate” policy document (Section 35 requires a data controller to have an “appropriate” policy document in place when carrying out sensitive processing; Section 42 sets out the standard as to “appropriate”).
As to (1), the Court held that the processing was sensitive processing, because the SWP’s purpose was to identify members of the public: “Although SWP’s overall purpose is to identify the persons on the watchlist, in order to achieve that overall purpose, the biometric information of members of the public must also be processed so that each is also uniquely identified, i.e. in order to achieve a comparison. This is sufficient to bring processing of their biometric data within the scope of section 35(8)(b) of the DPA 2018” (Paragraph 133). The Court was satisfied as to (2) and (3), for the same reasons which led it to conclude that the interference with Mr Bridges’ Article 8 rights was justified. As to (4), the Court observed that the policy document lacked detail and did not include systematic identification of the relevant policies or a systematic statement of what those policies provide, and did not address at all the position of members of the public. However, the Court nevertheless found that the document was appropriate, because “the development and specific content of that document is, for now, better left for reconsideration by the SWP in the light of further guidance from the Information Commissioner” (Paragraph 141).
The Court rejected the DPA 18, Section 34 claim.
Mr Bridges said that the SWP breached Section 64(1), which requires a data controller to conduct a data protection impact assessment before it embarks on a type of processing which is likely to result in a high risk to the rights and freedoms of individuals. A data protection impact assessment must: (1) describe the processing operations and assess the risks arising from those operations to the rights of data subjects; (2) identify any measures it proposes to take to address those risks; and (3) identify any measures it proposes to put in place as safeguards to help ensure the protection of personal data. The Court found that the SWP’s impact assessment, Version 5.4 dated 11 October 2018, met the requirements of Section 64(1): “There is a clear narrative that explains the proposed processing. This refers to the concerns raised in respect of intrusions into privacy of members of the public when AFR Locate is used. Although it is no part of the requirements of section 64 that an impact assessment identifies the legal risks arising from the proposed processing, the SWP’s assessment specifically considers the potential for breach of Article 8 rights.” (Paragraph 148).
The Court rejected the Section 64 Claim.
Equality Act 2010
Mr Bridges said that SWP breached its duty under Section 149(1) of the Equality Act 2010. Under Section 149(1), public authorities must, in the exercise of their functions, have due regard to, inter alia, the need to eliminate discrimination and the need to foster good relations between different people. Mr Bridges said that the SWP did not have due regard to these objectives, because it did not consider the possibility that AFR might produce results that were indirectly discriminatory on grounds of sex and/or race because it produces a higher rate of false positive matches for female faces and/or for black and minority ethnic faces.
The Court held that there was no evidence that AFR did in fact produce results which were discriminatory in the way alluded to by Mr Bridges. It dismissed the Equality Act 2010 claim.
Further Guidance from the Court
The Court gave some additional guidance as to the lawfulness of the use of AFR technology:
- a factor weighing in favour of the Court’s conclusion that SWP’s use of AFR was lawful was that the software’s decisions as to identification were always reviewed by a human police officer (“In our view, the fact that the human eye is used to ensure that an intervention is justified, is an important safeguard”) (Paragraph 33).
- the inclusion of a person’s image on a watchlist without sufficient reason (sufficient reason could include, inter alia, that the person has an outstanding arrest warrant or there is evidence that they have committed a crime) would likely amount to a breach of that person’s ECHR, Article 8 rights (Paragraph 105).
- the Court may give public authorities a degree of deference as to a data protection assessment under DPA 18, Section 64: “when determining whether the steps taken by the data controller meet the requirements of section 64, the Court will not necessarily substitute its own view for that of the data controller on all matters. The notion of an assessment brings with it a requirement to exercise reasonable judgement based on reasonable enquiry and consideration. If it is apparent that a data controller has approached its task on a footing that is demonstrably false, or in a manner that is clearly lacking, then the conclusion should be that there has been a failure to meet section 64 obligation. However, when conscientious assessment has been brought to bear, any attempt by a court to second-guess that assessment will overstep the mark.” (Paragraph 146).