The use of facial recognition software (“FRS”) in security and monitoring was thrust into the spotlight when the Mayor of London, Sadiq Khan, took issue with a London developer over its installation at a King’s Cross site.
In this post we consider the privacy and data protection issues raised by integrating FRS into security systems, an issue currently before the courts.
Human rights NGO Liberty has commented on the recent dispute, describing it as “a disturbing expansion of mass surveillance”. The use of FRS by South Wales Police to scan large crowds has been legally challenged by the group. The software used by the force works by mapping the facial contours of subjects and matching them against those retained on a “watch list”. This system brings matters of accuracy, discriminatory algorithms and personal data acquisition to the fore.
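The precise algorithm used by the force is not public, but the general technique it describes — comparing a facial map of a scanned subject against stored watch-list entries and flagging matches above a similarity threshold — can be sketched as follows. The embedding vectors, the cosine-similarity measure and the 0.6 threshold here are illustrative assumptions, not details of the South Wales Police system.

```python
import numpy as np

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return the index of the best watch-list match, or None.

    `probe` is a face-embedding vector for the scanned subject;
    `watchlist` is a matrix of stored embeddings, one row per person.
    All names, shapes and the threshold are illustrative assumptions.
    """
    # Cosine similarity between the probe and every stored embedding
    norms = np.linalg.norm(watchlist, axis=1) * np.linalg.norm(probe)
    sims = watchlist @ probe / norms
    best = int(np.argmax(sims))
    # Only report a match if similarity clears the threshold;
    # where that threshold is set drives the accuracy concerns discussed above
    return best if sims[best] >= threshold else None
```

Note that the choice of threshold embodies the accuracy trade-off at issue in the litigation: a lower threshold flags more people, increasing false matches.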
The Information Commissioner’s Office issued a statement on the police force’s use of FRS:
“Legitimate aims have been identified for the use of live facial recognition. But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the rollout of this technology.”
Accordingly, it is recommended that a data protection impact assessment be undertaken before any software is used, that a bespoke policy document be drafted, and that algorithms be checked for bias.
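One common way to check an algorithm for bias of the kind alleged here is to compare false-match rates across demographic groups on a test run. The tuple format and group labels below are illustrative assumptions, not a standard audit format.

```python
from collections import defaultdict

def false_match_rates(results):
    """Compute the false-match rate per demographic group.

    `results` is a list of (group, was_flagged, was_actual_match)
    tuples from a test run -- an illustrative format. A large gap
    between groups' rates is one signal of algorithmic bias.
    """
    flagged = defaultdict(int)   # non-matches wrongly flagged, per group
    total = defaultdict(int)     # non-matches scanned, per group
    for group, was_flagged, was_actual_match in results:
        if not was_actual_match:  # only non-matches can be false matches
            total[group] += 1
            if was_flagged:
                flagged[group] += 1
    return {g: flagged[g] / total[g] for g in total}
```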
The R (Bridges) v South Wales Police case
A key test case on the use of FRS has been heard by the Administrative Court. The case concerns South Wales Police’s use of FRS, which the claimant, Mr. Ed Bridges, has challenged as unlawful. The case has been heard and is currently awaiting judgment.
Mr. Bridges has challenged the practice on three grounds:
- Privacy rights: FRS is criticized as overly invasive, monitoring people indiscriminately as they go about their daily lives in a manner more intrusive than other forms of surveillance.
- Equality laws: The algorithms used by FRS and its processes have been criticized for disproportionately misidentifying BAME individuals and women.
- Data protection laws: FRS collects biometric data without the consent of the data subjects scanned. Under the General Data Protection Regulation, this could expose operators to significant liability.
Specific attention has been given to the watch-list images against which FRS compares the facial maps it captures. In particular, there are no guidelines on how those images are sourced and stored. The Information Commissioner’s Office has intervened in the case and highlighted this concern.
In spite of the pending legal case and concern from regulators, South Wales Police has moved forward with plans to provide officers with FRS software on mobile devices.
A series of significant questions arises in relation to the FRS program which may have a bearing on whether it is lawful:
- How is the data used by FRS protected, retained, removed and managed?
- What safeguards govern who can access and use the system?
- What is the accuracy of the software used? Are there alternative, more accurate mechanisms, which can be used?
- Are there any alternative, more proportionate means to achieve the prevention and detection of crime (for example increasing the number of CCTV cameras)?
- How long are facial scans retained, whether the match is successful or unsuccessful?
- How many officers are issued with the FRS technology and where/for what purpose is it used?
- Can the use of the system be, and is it in practice, limited to instances where it is vital for the maintenance of public order?
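The retention question above admits a range of answers. One proportionality-minded policy would be to discard unmatched scans immediately and keep matched scans only for a fixed window. The sketch below illustrates that policy; the 24-hour window and the data shapes are assumptions for illustration, not any force's actual practice.

```python
import time

RETENTION_SECONDS = 24 * 3600  # assumed retention window for matched scans

def process_scan(scan_id, facial_map, matched, store, now=None):
    """Retain a scan only if it matched, stamped with an expiry time.

    Unmatched scans are discarded immediately; matched scans expire
    after a fixed window. All names and the window are assumptions.
    """
    now = time.time() if now is None else now
    if matched:
        store[scan_id] = {"facial_map": facial_map,
                          "expires_at": now + RETENTION_SECONDS}
    return matched

def purge_expired(store, now=None):
    """Delete retained scans whose expiry time has passed."""
    now = time.time() if now is None else now
    for scan_id in [k for k, v in store.items() if v["expires_at"] <= now]:
        del store[scan_id]
```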
All of these factors will have an impact on whether the use of FRS can be seen as a proportionate means of achieving the aim of the prevention and detection of crime. They will also be significant in determining the risk posed to the liberty of citizens by the potential invasiveness of FRS. The determination of the Bridges case is therefore the most significant awaited ruling on the use of FRS to date. It is likely to determine the scope of use of FRS and provide guidelines for the integration of further developments in the technology.
This post originally appeared on The Privacy Perspective blog and is reproduced with permission and thanks.