Generally, facial recognition involves identification based on the comparison of newly captured images with images stored in a database, which may also be linked to information that identifies the individual. Facial recognition differs from traditional camera surveillance in that it is not mere passive recording: the biometric data obtained is compared against data stored in databases, and that data can also be efficiently updated.
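To make the identification step concrete, the comparison described above can be sketched in a few lines of code. This is a purely illustrative, simplified sketch (not any real system): the database of templates, the names, and the similarity threshold are all hypothetical, and a real system would use face-embedding models rather than hand-written vectors.

```python
# Illustrative sketch only: facial recognition as the comparison of a
# newly captured face "template" against stored templates, each linked
# to information identifying the individual.
from math import sqrt

# Hypothetical database: stored templates linked to identities.
DATABASE = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}

def cosine_similarity(a, b):
    """Similarity between two template vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

def identify(captured, threshold=0.95):
    """Return the identity linked to the closest stored template,
    or None if no stored template clears the similarity threshold."""
    best_name, best_score = None, threshold
    for name, stored in DATABASE.items():
        score = cosine_similarity(captured, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

print(identify([0.88, 0.12, 0.31]))  # prints "alice" (close to her template)
```

The key point for data protection purposes is visible even in this toy version: the captured data is not merely recorded but actively matched against a database and linked back to a named individual.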

From the perspective of data protection, data generated by facial recognition is classified as biometric data.

The General Data Protection Regulation (“GDPR”) classifies biometric data as a special category of personal data because it makes it possible to uniquely identify a person. Biometric data is particularly significant in connection with the protection of individual privacy because (1) it is impossible to erase or change the data and (2) it is strongly identifying. GDPR defines biometric data as personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allows or confirms the unique identification of that natural person, such as facial images or fingerprint data.

Generally, the processing of biometric data, which is classified as a special category of personal data, is prohibited under GDPR in the absence of consent or direct legal grounds based on GDPR or other legislation. The implementation of certain measures or contractual procedures also may be required. This means that the use of facial recognition technology requires legal grounds in accordance with GDPR, such as explicit consent, a legal obligation or the public interest.

A Cautionary Tale: Swedish school fined for facial recognition pilot program even after obtaining “consent”

In a decision handed down in August 2019, the Swedish data protection authority, Datainspektionen, found that a facial recognition pilot program carried out by a school violated GDPR and imposed a fine of around €20,000 on the school. In the pilot program, the school tracked the attendance of 22 students for about three weeks by identifying each student’s face when they entered the classroom, comparing the captured image with a previously uploaded photo of the student and linking the image with the student’s full name.

The students’ guardians were asked for, and gave, explicit consent, and they also had the option of excluding their child from the program. The fact that facial recognition was only in a pilot phase at the school did not affect the outcome: the authorities took immediate action and handed down a swift decision.

What can we learn from the Swedish school decision?

There are at least three key takeaways from the Swedish school decision and associated fine.

1) Remember to adhere to the principles of data protection. In this case, the principles governing the processing of personal data were violated: personal data was processed more extensively than was necessary for the purpose of the processing. Facial recognition was considered disproportionate for tracking attendance, which meant that the data processing violated the principle of proportionality. Less intrusive options were available.

2) Assess the grounds for processing and the nature of consent. According to the Swedish data protection authority, the collected data was classified as a special category of personal data pursuant to Article 9 of GDPR, as it makes it possible to uniquely identify a person. The processing of such data is prohibited as a rule. However, processing is possible based on explicit consent, which had indeed been obtained in this case.

The data protection authority nevertheless found that data processing based on explicit consent was not possible in this case due to the imbalance of power dynamics between the school and its students (the controller and the data subject) and the one-sided nature of the tracking of attendance data.

Consequently, consent could not be considered freely given in the manner intended by GDPR and thus it did not constitute a valid exception to the prohibition of processing, which is addressed in Recital 43 of GDPR.

3) Conduct the data protection impact assessment correctly and request a prior consultation if necessary. In the case of the Swedish school’s facial recognition pilot program, the data protection impact assessment (Article 35) and the prior consultation (Article 36) were essentially not conducted at all. The school reported having conducted a general risk analysis that it considered sufficient and, on that basis, did not deem a separate analysis of the risks to personal data necessary.

The data protection authority stated that an actual data protection analysis had not been carried out and concluded that the assessment conducted by the school was insufficient. The assessment did not give due consideration to the risks to the data subject’s rights and freedoms, nor did it include an assessment of proportionality between the data collected and the purpose of its use.

The data protection authority further concluded that an impact assessment pursuant to Article 35 had not been carried out and a prior consultation pursuant to Article 36 had not been requested. It should be remembered that a prior consultation can only be requested after an impact assessment has been carried out.

Remember these key takeaways to avoid GDPR problems associated with facial recognition technology such as those that befell the Swedish school.