On 4 September 2019 the Administrative Court (Haddon-Cave LJ and Swift J) handed down judgment in R (Bridges) v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin), holding that it was lawful for the police to use automated facial recognition software (“AFR”).

Background

The case concerned the South Wales Police’s (“SWP”) use of AFR on two occasions when they allegedly recorded the image of the Claimant: once on 21 December 2017 at Queen Street, Cardiff, and again at the Defence Procurement, Research, Technology and Exportability Exhibition (“the Defence Exhibition”) on 27 March 2018.

The Claimant contended that the use of AFR was unlawful for three reasons:

  1. That the use of AFR interfered with the Claimant’s rights under Article 8(1) of the European Convention on Human Rights and was not “in accordance with the law”, “necessary” or “proportionate” as required by Article 8(2).
  2. That the use of AFR was contrary to s4(4) of the Data Protection Act (“DPA”) 1998 and s35 DPA 2018. Further, that the use of AFR falls within s64(1) DPA 2018, so that a data protection impact assessment had to be carried out.
  3. That, contrary to s149(1) of the Equality Act 2010, the SWP failed to take into account the fact that the use of AFR would produce a disproportionately higher rate of false-positive matches for women and minority ethnic groups, and that its use would therefore indirectly discriminate. Accordingly, the SWP had failed to have regard to the relevant considerations in s149(1)(a)-(c) of the Act.

A specific type of software, AFR Locate, was used on both occasions. AFR Locate takes digital images of the faces of members of the public from live CCTV feeds and processes them in real time to extract the necessary biometric contours. The system then compares those contours against a watchlist compiled for the specific deployment.

Watchlists are compiled from images retained on the SWP’s database and range from people wanted on warrants or suspected of committing an offence to vulnerable persons. When the AFR system highlights a possible match, an officer reviews it to ensure any intervention is justified. The match is then communicated to intervention officers, who use a traffic light system to address it.
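The judgment does not disclose the matching algorithm itself (SWP used a proprietary commercial system), but the pipeline described above can be modelled in a short, purely illustrative Python sketch. Every name, the template format and the similarity threshold below are invented for exposition.

```python
# Purely illustrative sketch of the AFR Locate pipeline described above.
# The real matching algorithm is proprietary and not disclosed in the
# judgment; all names, the template format and the threshold are invented.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class WatchlistEntry:
    name: str
    template: List[float]  # biometric template from a stored custody image


def extract_template(face_image: bytes) -> List[float]:
    """Stand-in for the proprietary step that converts a face captured on
    the live CCTV feed into a numeric biometric template ("contours")."""
    padded = face_image[:8].ljust(8, b"\x00")
    return [b / 255 for b in padded]  # toy fixed-length vector


def similarity(a: List[float], b: List[float]) -> float:
    """Toy similarity score in [0, 1]; real systems use trained models."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)


def possible_match(face_image: bytes,
                   watchlist: List[WatchlistEntry],
                   threshold: float = 0.9) -> Optional[WatchlistEntry]:
    """Compare one face against the deployment-specific watchlist.

    A hit is only a *possible* match: as the judgment describes, it goes
    to an officer for human review before any intervention."""
    if not watchlist:
        return None
    template = extract_template(face_image)
    best = max(watchlist, key=lambda e: similarity(template, e.template))
    return best if similarity(template, best.template) >= threshold else None
```

Even in this toy form, the key design point is visible: the face of every passer-by is templated and compared against the watchlist, which is why, as discussed below, the Court held that all members of the public scanned have their biometric data processed, not only those on a watchlist.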

AFR Locate processes large amounts of data:

  1. facial images;
  2. facial features (i.e. biometric data);
  3. metadata, including time and location; and
  4. information as to matches with persons on a watchlist.

AFR Locate retains match alerts for 24 hours, match reports and the live CCTV feed for 31 days, and immediately deletes all other data.
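That retention schedule is simple enough to state as data. The sketch below does so purely for illustration: the category names and structure are invented, and only the periods come from the summary above.

```python
# Illustrative expression of the retention schedule described above.
# Only the durations come from the case report; the names are invented.
from datetime import timedelta

RETENTION_PERIODS = {
    "match_alert": timedelta(hours=24),
    "match_report": timedelta(days=31),
    "cctv_feed": timedelta(days=31),
    "biometric_template": timedelta(0),  # non-matches: deleted immediately
}


def must_delete(category: str, age: timedelta) -> bool:
    """True once data of this category has been held for its full period."""
    return age >= RETENTION_PERIODS.get(category, timedelta(0))
```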

Judgment

Article 8 claim

In relation to the claim under Article 8, the Court held that AFR infringed privacy rights because it automatically, and without consent, collected and then processed the biometric data of members of the public. This brought the functions of AFR within the broad reach of Article 8(1), which encompasses a right to a person’s image (S v United Kingdom (2009) 48 EHRR 50, at [66]; Von Hannover v Germany (2005) 40 EHRR 1, at [50]).

Even though the simple taking of an image has been held not to interfere with Article 8(1), the storing of data relating to the private life of an individual can do so.

The Court noted the similarities between the approach of the European Court of Human Rights in S v United Kingdom and the present case; S concerned the lawfulness of the police retaining biometric data in the form of fingerprints and DNA samples.

Further, the near-instantaneous nature of the processing and the short period for which the data was stored had no bearing on this: the processing of biometric data was, in and of itself, sufficient to interfere with Article 8(1) rights. The Surveillance Camera Commissioner’s AFR guidance was cited as persuasive: the “potential for intrusion arising from AFR is arguably consistent with that arising from some forms of covert surveillance tactics and capabilities”. Those whose images were stored on a watchlist had their Article 8(1) rights infringed.

Was use of AFR in accordance with the law?

In considering whether the SWP’s use of AFR was in accordance with the law, the Court first highlighted that there is no explicit legislative basis for its use. Instead, the use of AFR rests on well-established common law principles.

In particular, the Court considered the taking of photographs by police officers, deemed lawful in R (Wood) v Commissioner of Police of the Metropolis [2010] 1 WLR 123 and R (Catt) v Association of Chief Police Officers [2015] AC 1065. The central justification for taking such photographs is the maintenance of public order and the prevention and detection of crime.

This central purpose provided the justification for the lawfulness of AFR. Applying Hellewell v Chief Constable of Derbyshire [1995] 1 WLR 804 at 810F, the Court considered that such use was reasonable.

The Court noted that legislative mandates are required for interferences with individuals’ rights, such as obtaining DNA swabs and taking fingerprints, which would otherwise amount to assault. In the Court’s opinion the use of AFR is less invasive in this sense, so that common law powers could be relied on. The Court therefore equated the use of AFR to CCTV.

As a result, the Court placed the use of AFR within the existing legal framework:

 “The fact that a technology is new does not mean that it is outside the scope of existing regulation, or that it is always necessary to create a bespoke legal framework for it. The legal framework within which AFR Locate operates comprises three elements or layers (in addition to the common law), namely: (a) primary legislation; (b) secondary legislative instruments in the form of codes of practice issued under primary legislation; and (c) SWP’s own local policies.” – [84]

The Court considered ss35-42 DPA 2018, the Surveillance Camera Code of Practice and the SWP’s Standard Operating Procedure, Deployment Reports and Policy on Sensitive Processing. Cumulatively, these provided the legal framework underpinning the use of AFR, ensuring that the interference with Article 8 rights was in accordance with the law.

Could the interference with Article 8(1) rights be justified?

The Court applied the well-established four-stage proportionality test (see Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700):

  1. whether the objective of the measure pursued is sufficiently important to justify the limitation of a fundamental right;
  2. whether it is rationally connected to the objective;
  3. whether a less intrusive measure could have been used without unacceptably compromising the objective; and
  4. whether, having regard to these matters and to the severity of the consequences, a fair balance has been struck between the rights of the individual and the interests of the community.

The police’s use of technology in the prevention and detection of crime is well documented as satisfying the first two criteria. The Court applied strict scrutiny to the third and fourth criteria.

It was held that, on the facts, these criteria were satisfied: AFR was used for a limited time and a specific purpose, covered a limited area and led to the detection of criminality. In the case of the Defence Exhibition, disruption had occurred at the previous year’s event and an individual who had made a bomb threat was detected by the system. The Court also considered the safety of the public, the lack of impact on the Claimant (who was not on a watchlist), the targeted nature of the watchlists and the success of AFR (the use of the software in 50 instances had resulted in some 37 arrests or disposals).

The Data Protection Claims

Although the events themselves occurred before the DPA 2018 came into force, the Court considered the use of AFR as if it had. The Court considered three grounds:

  1. the claim under the DPA 1998;
  2. the claim under section 34 of the DPA 2018; and
  3. the claim under section 64 of the DPA 2018.

The DPA 1998

At issue was the extent to which personal data was processed. The Court addressed whether AFR indirectly identifies individuals, or merely individuates them. It adopted the wide definition of indirect identification advocated in Breyer v Bundesrepublik Deutschland (Case C-582/14). As to individuation, the Court applied the approach taken in Vidal-Hall v Google Inc [2016] QB 1003 in relation to browser-generated data: that the processing singles out and distinguishes an individual from others.

The Court, unsurprisingly, determined that AFR individuates people: the creation of facial contours identifies an individual, distinguishing them from others and enabling almost immediate identification.

Although AFR did involve the processing of personal data, that processing was lawful. AFR was used for the purpose of preventing and detecting crime and for specified, limited reasons, so its processing adhered to the first data protection principle, which requires fair and lawful processing.

Section 34 DPA 2018

The Claimant contended that AFR involves the sensitive processing of individuals’ biometric data. The Defendant argued that the individuals “identified” should be limited to those on the watchlist.

The Court accepted the Claimant’s submissions on this point. Each individual had their biometric data processed, and each was identified in order to determine whether they were on the watchlist: AFR takes a digital image and applies a mathematical algorithm to it to produce a comparable biometric template. This brought the processing within s35(8)(b) of the DPA 2018 as processing biometric data for the purpose of uniquely identifying an individual.

As a result, the use of AFR had to comply with the three requirements in s35(5). It had to be shown that:

(a)   the processing is strictly necessary for the law enforcement purpose,

(b)   the processing meets at least one of the conditions in Schedule 8 (necessity), and

(c)   at the time when the processing is carried out, the controller has an appropriate policy document in place (see section 42).

Conditions (a) and (b) had already been made out in relation to the Article 8 claim. In relation to condition (c) the SWP had formulated a Policy on Sensitive Processing for Law Enforcement Purposes. The Court was sceptical as to the adequacy of this document but held that it was for the Information Commissioner to provide guidance as to the content of such documents.

Section 64 DPA 2018

Section 64 requires that a data protection impact assessment be undertaken. The SWP had provided an assessment, and the Court considered that it set out a clear narrative that specifically considered Article 8 rights whilst establishing safeguards.

The public-sector equality duty claim

This claim relies upon s149(1) of the Equality Act 2010, which requires public authorities to have due regard to the need to eliminate discrimination, advance equality of opportunity and foster good relations between those who have protected characteristics and those who do not.

In adherence to these requirements, the SWP had prepared an Equality Impact Assessment – Initial Assessment, showing that it had considered its obligations at an early stage. The Claimant criticised this on the basis that it failed to consider evidence that AFR software is more likely to produce false matches for female and minority ethnic faces, and that its use may therefore be indirectly discriminatory.

The Court noted that there was no firm evidence that AFR produces indirectly discriminatory results, and placed reliance on the safeguard of an officer making their own determination of any match the system provides. Accordingly, the Claimant’s arguments on the equality ground failed.

Therefore, the Claimant’s claim for judicial review was dismissed on all grounds.

Comment

Given previous case law on the use of police powers, this judgment is perhaps unsurprising. The oft-cited S case provides a framework for the decision.

What is perhaps surprising is that AFR can be integrated into law enforcement activities without the establishment of separate regulatory guidelines. The Court regarded the DPA 2018 as providing key safeguards, and that legislation was intended to provide rules relating to new developments. Nevertheless, it is striking that such an important new technology can be justified by pre-existing frameworks and guidelines.

The Court acknowledged the advanced and invasive nature of AFR in harvesting and generating biometric data, as well as its technological complexity. Many different angles remain to be explored, however, and a nuanced consideration weighing the prevention and detection of crime against Article 8 rights would be welcome.

Suneet Sharma is a junior legal professional with a particular interest and experience in media, information and privacy law.