In the case of R (on the application of Bridges) v Chief Constable of South Wales Police ([2020] EWCA Civ 1058) the Court of Appeal held that the live automated facial recognition technology (“AFR”) used by the South Wales Police Force (“SWP”) was unlawful as it was not “in accordance with law” for the purposes of Article 8 of the ECHR.

In addition, the SWP had failed to carry out a proper Data Protection Impact Assessment (“DPIA”) and had not complied with the public sector equality duty (“PSED”).  This is the first successful legal challenge to AFR technology and an important decision in relation to the regulation of state surveillance.

The issues raised by the appeal were of particular importance because if AFR had been used nationally – without any limits on the discretion as to who to target and where to use it – it would have radically changed the nature of policing.  It could have been used to identify very large numbers of people and to track their movements around the country.  Linked to CCTV cameras it could have provided almost complete surveillance of the UK population.  This decision recognises that clear legal limits must be placed on the use of such technology.  How those limits will be applied in the future remains to be seen.

Background

The case concerned a type of AFR technology known as AFR Locate.  This works by extracting facial biometrics captured in a live feed from a camera and automatically comparing them to faces on a police watchlist – a database of individuals of interest for various reasons. If no match is detected, the software automatically deletes the facial image captured from the live feed. If a match is detected, the technology produces an alert and the person responsible for the technology, usually a police officer, will review the images to determine whether to make an intervention.

The SWP deployed AFR Locate on about 50 occasions between May 2017 and April 2019 at a variety of public events. These deployments were overt, rather than secret. The watchlists used in deployments included persons wanted on warrants, persons who had escaped from custody, persons suspected of having committed crimes, persons who may be in need of protection, vulnerable persons, persons of possible interest to SWP for intelligence gathering purposes, and persons whose presence at a particular event causes particular concern.

AFR Locate is capable of scanning 50 faces per second. Over the 50 deployments undertaken in 2017 and 2018, it is estimated that around 500,000 faces may have been scanned. The overwhelming majority of faces scanned will be of persons not on a watchlist, and therefore will be automatically deleted.

The Appellant, Edward Bridges, is a civil liberties campaigner who has been supported by Liberty, the civil liberties membership organisation. Mr Bridges was in the vicinity of two deployments of AFR Locate by SWP in Cardiff, one of which was at a protest at an arms fair.  Mr Bridges brought a claim for judicial review on the basis that AFR was not compatible with the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.  His Skeleton Argument can be found here [pdf].

On 4 September 2019 the Divisional Court dismissed Mr Bridges’s claim for judicial review on all grounds. The Court found that the right to privacy under Article 8 of the Convention was engaged in respect of not only people who were the subject of a match but anyone whose face was scanned by AFR-equipped cameras. This established that the use of AFR entails the processing of personal data, and sensitive processing, in respect of anyone whose face is scanned.  However, the Court held that the interference with those rights was in accordance with law and proportionate.

The Divisional Court dismissed both data protection claims, brought under the Data Protection Act 1998 and Data Protection Act 2018 (“DPA 2018”). The Court held that the PSED was not breached because there was no suggestion in April 2017 when the AFR Locate trial commenced that the software might operate in a way that was indirectly discriminatory.

Mr. Bridges challenged AFR Locate on the basis that it was unlawfully intrusive, including under Article 8 of the European Convention on Human Rights (“ECHR”) (right to respect for private and family life) and data protection law in the UK. His appeal was based on the following five grounds:

  1. The High Court had erred in its conclusion that South Wales Police’s use of AFR and interference with Mr. Bridges’ rights was in accordance with the law under Article 8(2) of the ECHR.
  2. The High Court had incorrectly concluded that the use of AFR and interference with Mr. Bridges’ rights was proportionate under Article 8(2) of the ECHR.
  3. The High Court was wrong to consider the DPIA carried out in relation to the processing sufficient for the purposes of Section 64 of the DPA 2018.
  4. The High Court should not have declined to reach a conclusion as to whether South Wales Police had an “appropriate policy document” in place regarding the use of AFR Locate that was within the meaning of Section 42 of the DPA 2018 for carrying out sensitive data processing.
  5. The High Court was wrong to hold that South Wales Police had complied with the Public Sector Equality Duty (“PSED”) under Section 149 of the Equality Act 2010, on the grounds that the Equality Impact Assessment carried out was “obviously inadequate” and failed to recognize the risk of indirect discrimination on the basis of sex or race.

The Appellant’s Skeleton Argument in the Court of Appeal can be found here [pdf].

Judgment

In relation to Ground 1, the Court held that SWP’s interference with Mr Bridges’s Article 8(1) rights was not “in accordance with the law” for the purposes of Article 8(2). The Court held that although the legal framework comprised primary legislation (DPA 2018), secondary legislation (The Surveillance Camera Code of Practice [pdf]) and local policies promulgated by SWP, there was no clear guidance on where AFR Locate could be used and who could be put on a watchlist.  The Court held that this was too broad a discretion to afford to police officers to meet the standard required by Article 8(2). The Court said:

“The fundamental deficiencies, as we see it, in the legal framework currently in place relate to two areas of concern. The first is what was called the “who question” at the hearing before us. The second is the “where question”. In relation to both of those questions too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed”. [91]

As a result AFR Locate failed to satisfy the “in accordance with law” requirements of Article 8(2) because it involves two impermissibly wide areas of discretion: the selection of those on watchlists, especially the “persons where intelligence is required” category, and the locations where AFR may be deployed (see [152]).

The appeal failed on Ground 2: the Court agreed with the Divisional Court that SWP’s use of AFR was a proportionate interference with Article 8 rights under Article 8(2). The Court held that the Divisional Court had correctly conducted a weighing exercise, with the actual and anticipated benefits of AFR Locate on one side and the impact of AFR deployment on Mr Bridges on the other. The benefits were potentially great, and the impact on Mr Bridges was minor, and so the use of AFR was proportionate under Article 8(2) ([143]).

The appeal succeeded on Ground 3, that the Divisional Court was wrong to hold that SWP provided an adequate “data protection impact assessment” (“DPIA”) as required by section 64 of the DPA 2018. The Court found that, as the DPIA was written on the basis that Article 8 was not infringed, the DPIA was deficient ([153]).

The appeal failed on Ground 4, that the Divisional Court was wrong not to reach a conclusion as to whether SWP had in place an “appropriate policy document” within the meaning of section 42 DPA 2018. The Court held that the Divisional Court was right not to reach a conclusion on this point because it did not need to be decided. The two specific deployments of AFR Locate which were the basis of Mr Bridges’s claim occurred before the DPA 2018 came into force.

The appeal succeeded on Ground 5, that the Divisional Court was wrong to hold that SWP complied with the PSED.  The two protected characteristics that were relevant in the present case were race and sex. The Court held that the purpose of the PSED was to ensure that public authorities give thought to whether a policy will have a potentially discriminatory impact. SWP erred by not taking reasonable steps to make enquiries about whether the AFR Locate software was biased on racial or sex grounds. The Court did note, however, that there was no clear evidence that AFR Locate software was in fact biased on the grounds of race and/or sex.

The SWP confirmed that they were not seeking to appeal against the Court of Appeal’s judgment.

Comment

This is the first case to consider the use of AFR technology.  Its use has profound consequences for privacy and data protection rights and, as this case demonstrates, there is no clear legal framework governing its use.

Civil liberties and human rights NGOs were concerned that AFR would be used for nationwide intelligence gathering, particularly around protests.  The ICO expressed concerns in an Opinion [pdf] issued in October 2019.

The use of new surveillance technologies in the United Kingdom has often not been accompanied by proper legal safeguards.  Over nearly four decades the European Court of Human Rights has repeatedly found that English law provided inadequate safeguards for the privacy rights of individuals.  Examples include the targeted interception of communications and the obtaining of call logs (Malone v United Kingdom (1985) 7 EHRR 14), the placing of listening devices in domestic premises (Khan v United Kingdom (2001) 31 EHRR 45) and the bulk interception of communications (Liberty v United Kingdom (2009) 48 EHRR 1 and Big Brother Watch and others v United Kingdom (2018) app nos. 58170/13, 62322/14 and 24960/15).

This lack of sensitivity to privacy issues may have a historical explanation.  As Lord Reed said in R (T) v Chief Constable of Greater Manchester ([2015] AC 49):

“The United Kingdom has never had a secret police or internal intelligence agency comparable to those that have existed in some other European countries … There has however been growing concern in recent times about surveillance and the collection and use of personal data by the state. … But such concern on this side of the Channel might be said to have arisen later, and to be less acutely felt, than in many other European countries, where for reasons of history there has been a more vigilant attitude towards state surveillance. That concern and vigilance are reflected in the jurisprudence of the European Court of Human Rights in relation to the collection, storage and use by the state of personal data. The protection offered by the common law in this area has, by comparison, been of a limited nature”  [88].

In contrast to the earlier cases, the Court of Appeal in Bridges has recognised the need for a proper legal framework to be in place to regulate the use of a new surveillance technology.  The “discretions” of individual police officers as to who to target and where were essentially unfettered and, as a result, the deployment of AFR was not “in accordance with law”.

In relation to the PSED, the Court noted that the SWP had never sought to satisfy themselves that the software program did not have an unacceptable bias on the grounds of race or sex ([199]).  They noted evidence from computer expert Dr Anil Jain (see here and here) that AFR can sometimes have such bias.

One important legal point arising from this decision should be mentioned.  The Claimant argued that in assessing the proportionality of an interference with Article 8 rights the Court should look not just at the actual use of the technology but also at potential uses.  This argument was rejected by the Court of Appeal, which refused to follow a dictum of Lord Kerr in Beghal v Director of Public Prosecutions [2015] UKSC 49 ([102]) and said that it would not look at “hypothetical cases” ([60]).  The difficulty with that approach in relation to new surveillance technologies is obvious: the interference with the privacy rights of one individual may be relatively minor, but the real issue is the general interference with the privacy rights of large groups or even the whole population.   This is an issue which is likely to arise for reconsideration in future cases.

The decision was rightly described by Liberty as “a major victory in the fight against discriminatory and oppressive facial recognition”.  It remains to be seen how the SWP and other police forces will react to the decision and what protections will now be put in place.  It seems likely that this will only be the first round in an extended struggle to restrict the use of intrusive surveillance technologies by law enforcement bodies.

Hugh Tomlinson QC is a member of the Matrix Chambers media and information practice group and an editor of Inforrm.