The Article 29 (A29) Working Party has recently published its opinion paper on the rise of facial recognition technology and the concerns this raises for the protection of personal data online. This note looks at those online privacy and data protection issues as facial recognition software becomes more widely available.

The A29 Working Party is the European body which comprises leading representatives from each data protection supervisory authority in the EU (in the UK, this is the Information Commissioner’s Office); its opinions are therefore particularly influential, if not binding.

Last year Pitmans published a briefing explaining the privacy issues raised when Facebook changed its ‘tagging’ service for photographs to incorporate facial recognition technology.

Since then, the availability and application of the technology have grown exponentially; as its accuracy improves and its deployment expands, the technology could be used for the most routine events in everyday life – but also by advertising companies collecting market information through attendance monitoring and profiling in order to tailor targeted advertising messages.

The A29 Working Party notes that facial recognition technology is used for authentication or verification of devices and online services. However, the application of this technology may naturally extend from the online to the offline world. From a defence and security perspective, retinal scans and other biometric access controls are already in use at a number of airports and conditional access facilities; in addition, full facial recognition systems are reportedly already used by security agencies to identify particular faces amongst the crowd at sporting and live events (e.g. known hooligans at a football match or members of the public at the London Olympics).

Similarly, access to live events, venues and concerts has become more sophisticated than mere paper tickets – organisers continue to explore ways to combat the growing grey market in second-hand ticket sales, which diverts income, and brand value, away from events and the artists. Methods include tickets containing photographs or bar codes, or employing near field communication (NFC) technology. Fully automated facial recognition is a natural technological progression for those industries where secure access is an essential requirement.

But such applications raise data privacy concerns, and companies controlling or processing the data may consequently be in breach of data privacy laws unless such measures and new technologies are balanced against an individual’s right to privacy. While the A29 Working Party’s opinion on facial recognition focuses on online and mobile services, the principles apply equally to anyone collecting and using data for facial recognition services.

The A29 Working Party considers that where a digital image contains an individual’s face which is clearly visible and allows identification of the individual, such an image will be considered personal data. Therefore, where a reference template is created from an individual’s image, this template will also be personal data if it contains a set of distinctive features of an individual’s face which can be linked to the specific individual and stored for later use. The only instance in which a template is likely not to be considered personal data is where it is not associated with an individual’s record, profile or original image – but clearly this would limit the application of the technology. Importantly, the template and corresponding profile (or personal details) of the data subject do not need to be held by the same entity – the template may still constitute personal data where a data controller has the means to access the corresponding information needed to identify the individual (even where that information is held by a third party supplier).
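To illustrate why a linked template is personal data, the following is a highly simplified sketch: the template here is an invented feature vector, and the matching threshold, names and values are assumptions for illustration only (real systems derive templates through machine-learned feature extraction).

```python
import math

# A reference template: a set of distinctive facial measurements
# (illustrative values only -- real templates come from feature extraction).
alice_template = [0.42, 0.17, 0.88, 0.33]

# It is the link to a profile record that makes the template personal data:
# the controller can identify the individual from it.
profiles = {"alice": {"template": alice_template, "name": "Alice Example"}}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(live_sample, threshold=0.1):
    """Return the name on the linked profile whose template matches."""
    for record in profiles.values():
        if distance(live_sample, record["template"]) < threshold:
            return record["name"]
    return None
```

A template stored without the `profiles` link would be the unlinked case the Working Party describes: the same vector, but no means for the holder to identify the individual from it.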

Directive 95/46/EC sets out the conditions with which the processing of personal data must comply. Article 6 requires that images and templates be relevant, and not excessive, for the purposes of facial recognition processing. As the images constitute biometric data, the personal data may only be processed if the informed consent of the individual is obtained before processing commences, or if another exception under the Directive is satisfied (e.g. processing for legitimate purposes pursued by the data controller – such as security for the venue in the light of perceived terrorist threats – provided it does not prejudice the rights of the individual concerned). The A29 Working Party notes that some processing may be necessary before consent is obtained, e.g. to verify existing records, but this should be for that strictly limited purpose only, and the information deleted immediately afterwards.

The digital images or templates stored must be used only for the specified purpose for which they have been provided – and for which consent has been sought, or where another relevant exemption applies (as, for instance, in the case of the legitimate use exemption described above). The greater the sensitivity of the personal data concerned, the more likely it is that explicit consent will be required.

The A29 Working Party considers that technical controls should be implemented to ensure that third parties do not gain access to the data and use it in an unauthorised manner. As trials of cashless technology grow for events, this technology may come to be used by individuals to purchase items, such as drinks or merchandise, using credit stored against their profile. Controllers should be aware of the parameters of consent, and that data stored against a user’s profile, including data used for, or derived from, facial recognition, can be valuable information for advertising or marketing agencies profiling consumers.

Similarly, controllers and processors will need to guard against security breaches which may result in unauthorised access to the data. The A29 Working Party advises that technical measures such as encryption will need to be used for data storage and data transit. One method suggested by the A29 Working Party is for biometric encryption techniques themselves to be used, so that the cryptographic key is directly bound to the biometric data and is only re-created where the correct live biometric sample is presented on verification.
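The key-binding idea can be sketched as follows. This is a deliberately simplified toy, not the Working Party’s recommended scheme: it assumes an exact-match template, whereas real biometric encryption uses fuzzy extractors to tolerate noisy samples, and a vetted cipher (such as AES-GCM) rather than the illustrative XOR keystream below.

```python
import hashlib

def derive_key(template: bytes, salt: bytes) -> bytes:
    """Derive a cryptographic key bound directly to the biometric template."""
    return hashlib.pbkdf2_hmac("sha256", template, salt, 100_000, dklen=32)

def xor_stream(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher: XOR the data with a SHA-256-based keystream.
    Illustrative only -- a production system would use a vetted cipher."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

salt = b"per-record-salt"
enrolled = b"stable-biometric-template"   # template captured at enrolment
secret = b"profile data at rest"

ciphertext = xor_stream(secret, derive_key(enrolled, salt))
# The key itself is never stored: it is re-created only when the correct
# live sample reproduces the enrolled template on verification.
recovered = xor_stream(ciphertext, derive_key(enrolled, salt))
```

Presenting a different sample derives a different key, so the stored profile data stays unreadable without the matching biometric.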

To reduce such concerns the Working Party recommends minimising the data so that the images or templates stored do not contain more data than necessary to perform the specified purpose. Similarly, templates should not be transferable between facial recognition systems. Organisations developing or deploying such technology should also carry out Privacy Impact Assessments (PIA) and follow development methodologies based on Privacy by Design (PbD).
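Both recommendations can be shown in one short sketch, on assumed names and values: keep only the features the purpose requires, and bind the stored template to a single system with a per-system salt so it cannot be reused elsewhere. (Salted hashing is used here purely to illustrate non-transferability; it sacrifices fuzzy matching, and deployed systems use cancellable-biometrics transforms instead.)

```python
import hashlib

def system_template(features, system_salt: bytes, keep: int = 4) -> bytes:
    """Minimise the data (keep only the features needed for the purpose),
    then bind the template to one system with a per-system salt so it is
    not transferable between facial recognition systems."""
    minimised = features[:keep]                    # drop excessive data
    encoded = ",".join(f"{x:.3f}" for x in minimised).encode()
    return hashlib.sha256(system_salt + encoded).digest()

raw_features = [0.42, 0.17, 0.88, 0.33, 0.91, 0.05]  # full extracted set

venue_a = system_template(raw_features, b"salt-venue-A")
venue_b = system_template(raw_features, b"salt-venue-B")
# Same face, different systems: the stored templates do not match,
# so a template leaked from one venue is useless to another.
```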

The everyday use of facial recognition software in society to improve security checks for employees, visitors or customers may soon become commonplace, even in the simplest of access control systems.

Data controllers and data processors should be aware of the law in this area as the technology becomes more prevalent. Equally, the law will need to keep abreast of the various ways in which the software can be exploited to monitor and profile individuals across a range of services, and to ensure adequate protection for data subjects as the technology advances.