Now, private security companies are also increasingly using these technologies to secure privately owned areas. While these technologies are promising for security purposes, they also pose significant human rights and public law challenges when installed in quasi-public spaces such as museums, shopping malls, and public parks. This article discusses the data protection and privacy issues raised by security systems that use two such technologies – facial recognition (‘FRT’) and gait analysis (‘GA’).
Facial recognition systems operate by comparing select facial features from the images they capture with facial images on their database. Gait analysis extracts a person’s silhouette from video and analyses the silhouette’s movement to create a model of the way the person walks. It then compares this model with the other models on its database. Private security companies claim that, through gait analysis, they can identify troublemakers even when their backs are to the camera or their faces are covered.
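To make the matching step concrete, below is a minimal, illustrative sketch of how such a comparison against an enrolled database might look once a face or gait sequence has already been reduced to a fixed-length embedding. The database contents, function names, and similarity threshold are assumptions for illustration only and do not describe any particular vendor’s system.

```python
import numpy as np
from typing import Optional

# Hypothetical enrolled database: identity label -> embedding vector
# (a face template or a gait model reduced to a fixed-length vector).
DATABASE = {
    "subject_a": np.array([0.12, 0.80, 0.31, 0.45]),
    "subject_b": np.array([0.90, 0.05, 0.22, 0.67]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match(probe: np.ndarray, threshold: float = 0.9) -> Optional[str]:
    """Return the best-matching enrolled identity, or None if no
    similarity score clears the (illustrative) decision threshold."""
    best_id, best_score = None, -1.0
    for identity, template in DATABASE.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

# A probe embedding extracted from a captured face image or gait
# silhouette sequence (the extraction step itself is not shown).
probe_embedding = np.array([0.11, 0.82, 0.30, 0.44])
print(match(probe_embedding))  # -> "subject_a" if above the threshold
```

The decision threshold is where false positives arise: set too low, the system will flag visitors who merely resemble an enrolled individual, which is the discrimination risk returned to later in this article.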
In this article, I analyse the legality of FRT and GA based security systems under the General Data Protection Regulation (‘GDPR’). I study the grounds on which their use may be challenged, and lastly, highlight safeguards that must be adhered to if individual EU Member States specifically allow the use of such systems.
Governing Legal Framework
Security systems installed at quasi-public spaces capture the images and personal data of hundreds of persons a day. Though these may be privately owned premises, they welcome public access and thus affect the rights of hundreds (if not thousands) of data subjects. The GDPR applies to the images captured by such systems because, under Article 2(1), the Regulation applies to the processing of ‘personal data’ wholly or partly by automated means. Images collected through FRT and behavioural information collected through GA are ‘personal data’ as they contain information by which natural persons are identified or identifiable. Facial images and behavioural characteristics constitute ‘biometric data’ as defined in Article 4(14), a category entitled to additional protection under Article 9.
To govern the processing of personal data for law enforcement/security purposes, the European Parliament and the Council have adopted the Law Enforcement Directive (‘LED’). However, the LED is not applicable where private security companies act in their private capacity and not under government authorisation. This is because the LED only applies to the processing of personal data by ‘competent authorities’, that is, public authorities engaged in crime prevention/investigation or any other body entrusted by a Member State to exercise public authority for these purposes. Unless the State assigns the crime prevention responsibility for a quasi-public space to a private security firm, the LED will not apply.
Private companies may cite one of three bases under the GDPR for justifying the use of FRT and GA based security systems:
- That they are entitled to the household exemption under Article 2(2)(c) since the security system is installed on private property
- That, under Article 6(1)(e), processing is necessary for the general public’s interest in security and in preventing unlawful activities on properties which the public accesses.
- That the owners of quasi-public spaces are pursuing their legitimate interest in protecting their premises under Article 6(1)(f).
The feasibility of these arguments is analysed below.
1. Household Exemption
As per Article 2(2)(c) and Recital 18, the GDPR’s ambit excludes the processing of personal data in the course of a purely personal or household activity. Owners of quasi-public spaces may argue that the GDPR does not apply as the security tapes are strictly meant for protecting their private property. The CJEU discussed the data protection implications of CCTV cameras installed on private property, and the applicability of the household exemption, in the Rynes case. There, an individual house owner had installed CCTV cameras pointed at his house, one of which partially recorded the public’s movement on the footpath in front of the house. The question was whether the household exemption applied (the case was decided under the GDPR’s predecessor, the Data Protection Directive, which contained a materially identical exemption). The CJEU held that the scope of the household exemption is narrow and that, if a video surveillance system covers a public space even partially, it cannot be regarded as a purely personal activity. On this rationale, owners of quasi-public spaces and the private security companies acting for them would not be able to avail themselves of the household exemption.
2. Public’s Security Interest
The GDPR generally prohibits the processing of biometric data under Article 9. However, Article 9(2)(g) allows the processing of special category data (such as biometric data) where it is necessary for reasons of substantial public interest, on the basis of Union or Member State law. Article 9(2)(g) specifies that such a law must be proportionate and provide for suitable measures to safeguard the fundamental rights of data subjects. It is unlikely that private security firms could resort to this exception, for the following reasons. First, the interest here is not the public’s interest in security but the interest in protecting private property. Second, even assuming that the general public has a security interest and that a Member State’s law permits such security activities, FRT or GA is not a proportionate measure to secure such properties. The owner’s security interest can be fulfilled without such intrusive techniques, through traditional security measures such as ordinary CCTV devices, security officers, barbed wire, etc.
Thus, private security companies cannot claim this as a basis for using FRT and GA security systems unless a Member State explicitly permits the processing of special category data for security/prevention of unlawful activities and lays down a law with appropriate safeguards to do so.
3. Legitimate Interest
Article 6(1)(f) and Recital 47 provide that a data controller may process personal data where it has a legitimate interest in doing so. Pertinently, Article 6(1)(f) specifically states that personal data may be processed only where the interests or fundamental rights and freedoms of the data subject do not override that legitimate interest. FRT and GA potentially intrude not just upon the data subject’s right to privacy, but also upon their right against discrimination. These technologies have been shown to be significantly less accurate for women and persons of colour. The false positives these systems generate for persons of colour are discriminatory and affect the dignity of the persons who are incorrectly flagged by them. Thus, unless these technologies significantly reduce the risk of discrimination against non-white persons, a strong case can be made that data subjects’ fundamental rights override any legitimate interest in using these disproportionate technologies.
Further grounds for challenge
In addition to challenging the legal bases for the use of these technologies on the arguments discussed above, data subjects could also challenge them on the following additional grounds: first, for violating the consent requirement under the GDPR; second, for subjecting persons to automated decision-making.
1. Lack of Consent
Article 9(2)(a) provides that sensitive personal data may be processed if the data subject has given explicit consent to such processing. If the owners of these quasi-public spaces were to claim that, by entering their premises, persons are implicitly consenting to the processing of their biometric data, such an argument would be faulty. Under Article 9(2)(a), the standard of consent for processing biometric data is ‘explicit consent’, and under Article 7(1) the onus to demonstrate consent rests on the controller. Unless data subjects have granted explicit consent, perhaps by signing a declaration at the entrance of such a premise, such security systems violate Article 9’s prohibition on processing sensitive personal data.
2. Automated individual decision-making
Article 22(1) states that data subjects shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. Recital 71 clarifies that this right extends to measures that evaluate personal aspects relating to a person. Visitors to these quasi-public spaces may be stopped, searched, or denied entry on the basis of alerts generated by FRT and GA security systems. As these alerts are measures which tangibly produce legal or similarly significant effects and are generated from personal and behavioural characteristics (faces and gait), they violate the protection afforded to data subjects under Article 22 GDPR.
Safeguards and Conclusion
Despite the data protection and discrimination concerns raised above, EU Member States may allow the use of such systems by private security companies under Article 6(1)(e). If they were to do so, private security companies would still be obligated to implement certain safeguards under the GDPR. These include conducting a Data Protection Impact Assessment [Article 35(3)(c)] before installing a new system and appointing a data protection officer [Article 37(1)(b)], among others. Further, in the interest of transparency and full disclosure, private security firms should erect warning signs on and around their premises to inform visitors and passers-by that such intrusive technologies are operational.
To conclude, it is indisputable that members of the public stand to benefit from more sophisticated security systems. However, the benefit of increased security cannot come at the cost of violating the right to privacy and data protection principles.
Raghav Mendiratta is a qualified lawyer. He is a Member of the Teaching Committee at LSE and is a Researcher for Stanford University’s Centre for Internet and Society. He Tweets at @raghav_mendirat.