Retailers are increasingly using facial recognition technologies to track customers in-store. This innovation can benefit both retailers and customers, by targeting loyal clients and higher spenders and by improving the in-store buying experience. However, special emphasis should be placed on privacy issues, so as not to compromise data subjects’ fundamental right to data protection. The Spanish Data Protection Agency (AEPD) has issued guidance on this hot topic (0328/2012 and 0392/2011). In this post, we look at the issues retailers need to factor in to stay on the right side of data protection law.
Facial recognition systems may be considered highly invasive, since images can be captured and processed from a range of viewpoints without the data subject’s knowledge. As the Article 29 Working Party has pointed out, even when a data subject is aware that a camera is operating, there may be no clues to differentiate between a CCTV system and a lens capturing images for a facial recognition sensor. Moreover, obtaining informed consent from data subjects and enabling the exercise of data subjects’ rights are particularly challenging when dealing with facial recognition technologies.
Application of data protection regulations
Biometric data are in most cases personal data. The legal framework for the processing of such data is grounded in Directive 95/46/EC. Consequently, the Spanish data protection regulations apply, and the processing of biometric personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed.
According to the Spanish Data Protection Agency’s (“AEPD”) criteria, two stages can be identified for data protection purposes: (1) image capture and processing, which involves face detection and classification; and (2) non-graphic information processing, once the facial images have been deleted. Different conclusions may emerge from the analysis of each stage.
- Stage I: Image capture and processing.
At this first stage, the image is captured and processed by a device or sensor. This is, undoubtedly, processing of personal data, since under Spanish law any alphanumeric, graphic, photographic, acoustic or other information concerning an identified or identifiable person is considered personal data (Art. 5.1.(f) RDLOPD). It is therefore necessary to analyse whether the (a) purpose, (b) proportionality and (c) legal grounds for which the biometric data are collected and processed are compatible with the protection of individuals’ fundamental rights and freedoms:
1.a. Purpose limitation and data retention: special risks regarding the device’s recording capability
- Biometric personal data shall be collected for a specific purpose. Such purpose must be explicit and determined at the time the data are collected. Purposes of processing subsequent to collection shall not be incompatible with the purposes as originally specified. In the case at hand, for instance, when images are collected for audience measurement purposes only, further categorisation or segmentation for targeted advertising purposes would be incompatible with the original purposes. In addition, according to the principle of data minimisation, only the required information, and not all available information, should be processed, transmitted or stored.
- The retention period for biometric data should be no longer than necessary for the purposes for which the data were collected, and profiles derived from such data must be permanently deleted after that justified period of time. In the AEPD’s opinion, this means that face recognition systems should not have recording capabilities. A recording capability, even when deactivated, directly conflicts with the principle of proportionality, given the potentially harmful consequences for the persons concerned.
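The AEPD’s position at this stage — analyse the image in memory, derive only non-graphic attributes, and never record the frame — can be sketched as follows. This is a minimal illustration under stated assumptions, not a reference implementation: `detect_face` and `estimate_attributes` are toy stand-ins for a real face-analysis library.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AudienceRecord:
    """Non-graphic attributes only: no image and no biometric template."""
    gender: str
    age_band: str
    viewing_seconds: float

# Toy stand-ins (hypothetical) for a real face-analysis library.
def detect_face(frame: bytes) -> Optional[bytes]:
    return frame or None

def estimate_attributes(face: bytes) -> Tuple[str, str, float]:
    return ("unknown", "25-34", 4.2)

def process_frame(frame: bytes) -> Optional[AudienceRecord]:
    """Process a camera frame entirely in memory.

    The frame is discarded as soon as the non-graphic attributes are
    derived; nothing graphic is ever written to disk, consistent with
    the view that the system should have no recording capability.
    """
    face = detect_face(frame)
    if face is None:
        return None
    gender, age_band, seconds = estimate_attributes(face)
    del frame, face  # biometric data dropped immediately
    return AudienceRecord(gender, age_band, seconds)
```

The design point is that only the `AudienceRecord` ever leaves the function; the graphic data exists solely for the duration of the call.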
1.b. Proportionality: the importance of the sensor’s location.
The use of biometrics raises the issue of the proportionality of each category of processed data in the light of the purpose for which the data are processed. According to the Spanish Constitutional Court, this calls for a strict assessment of the adequacy, necessity and proportionality of the processed data.
- The capture of images properly meets the audience measurement objective (adequacy) and is also favourable from the viewpoint of convenience and cost effectiveness (necessity).
- Regarding proportionality, however, location is crucial for facial recognition systems. In public indoor spaces (malls or leisure centres), where the commercial activity is closely tied to advertising or promotional activity, potential buyers can reasonably expect to receive advertising, with the associated audience measurement. Thus, in the AEPD’s opinion, when balancing the advantages deriving from the system against the damage caused to individuals, the former prevails. In contrast, the use of facial recognition devices in outdoor spaces would cause greater damage, and their use would therefore not be proportionate.
1.c. Legitimate ground: legitimate interests of the data controller.
Article 7(f) of Directive 95/46/EC, which has direct effect, allows the processing of personal data when two cumulative conditions are met: (i) the processing is necessary for the purposes of the legitimate interests pursued by the controller (or by the third party or parties to whom the data are disclosed); and (ii) such interests are not overridden by the interests or fundamental rights and freedoms of the data subjects.
As the EU Court of Justice stated in case C-468/10, this balance “depends, in principle, on the individual circumstances of the particular case in question and in the context of which the person or the institution which carries out the balancing must take account of the significance of the data subject’s rights arising from Articles 7 and 8 of the Charter of Fundamental Rights of the European Union”. Thus, there is no one-size-fits-all solution, but legitimate interests should be analysed on a case-by-case basis.
- Stage II: Non-graphic information processing
At the second stage, once the biometric personal data (facial images) have been completely destroyed, the non-graphic information obtained (gender, age, viewing time) cannot be regarded as personal data, since in itself it cannot identify data subjects. Thus, the analysis of such information for audience measurement purposes does not fall within the scope of the data protection regulations. However, the biometric personal data must be rendered absolutely and irreversibly anonymous. If, for instance, the non-graphic information is linked to a unique identifier that allows the data subject to be identified in the future, then the information concerns an identified or identifiable person, and the data protection regulations will also apply at the second stage.
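The boundary drawn here can be illustrated with a short sketch (the identifier and the values below are invented for illustration). Aggregate counts with no identifier cannot single anyone out; attaching any stable identifier, however pseudonymous, brings the record back within the scope of the regulations.

```python
from collections import Counter

# Stage II as intended: aggregate, identifier-free audience figures.
observations = [("female", "25-34"), ("male", "35-44"), ("female", "25-34")]
audience = Counter(observations)  # counts only; no individual is singled out

# By contrast, keying the same attributes to a persistent token (a
# device-derived hash, a loyalty-card number, etc.) makes the record
# relate to an identifiable person, so data protection law applies again.
linked = {"visitor-8f3a": ("female", "25-34", 4.2)}  # back in scope
```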
- Legal requirements
In addition to the principles laid down in the previous sections, once the facial recognition system has been considered proportionate and there are legitimate grounds to implement it, all obligations set out in the data protection regulations shall apply, in particular the following:
- Information duties. In compliance with Article 5 LOPD (Organic Law 15/1999, of 13 December, on the protection of personal data), the controller must provide data subjects with express, precise and unambiguous information on (i) the processing of their personal data, the purposes of the collection and the recipients of the information; (ii) the identity of the controller; and (iii) the existence of the rights of access to and rectification of the data concerning them. In practice, however, the exercise of these rights may be difficult to accommodate, given that face recognition systems should not have recording capabilities.
- Sensitive data. The processing of biometric data could reveal sensitive data, in particular categories with visual cues such as race or ethnic origin. This is particularly important when implementing security measures, since the controller must implement the technical and organisational measures set forth in the RDLOPD (Royal Decree 1720/2007, of 21 December, implementing Organic Law 15/1999, of 13 December, on the protection of personal data) to ensure the security of personal data and prevent its alteration, loss, or unauthorised access or processing.
- Impact assessment. Before implementing these systems, it will be necessary to carry out a Privacy Impact Assessment (PIA), that is, a process in which the entity evaluates the risks associated with the processing of personal data and defines additional measures designed to mitigate those risks.
Although the Article 29 Working Party already pointed out the importance of impact assessments in its Opinion 3/2012 on developments in biometric technologies, PIAs are especially relevant in the new data protection framework designed by the EU General Data Protection Regulation (GDPR). Indeed, once the GDPR enters into force, the use and development of such technology will also have to meet other conditions imposed by the new Regulation, such as the principles of transparency and data minimisation, and data protection by design and by default.