In October 2012, the U.S. Federal Trade Commission (FTC) issued a Staff Report entitled “Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies”. Organizations operating in Canada and the U.S. should carefully consider the guidance in the FTC Staff Report. They should also have regard to earlier guidance on the collection of biometric information, including facial information, issued by the Office of the Privacy Commissioner of Canada (OPC).

In this post, I examine some of the privacy issues that facial recognition technologies present and compare and contrast the U.S. and Canadian guidelines on the use of facial recognition technologies.

A question of liberty and control

The Supreme Court of Canada has said that privacy is at the heart of liberty. “[R]estraints imposed on government to pry into the lives of the citizen go to the essence of a democratic state” (R. v. Dyment, 1988 CanLII 10 (SCC) at para. 17). Very recently, the Supreme Court of Canada reiterated that the underlying values of dignity, integrity and autonomy are fostered by protecting a biographical core of personal information from the state (R. v. Cole, 2012 SCC 53 at para 45, quoting R. v. Plant, 1993 CanLII 70 (SCC)).

Private sector privacy advocates may argue that those same values require that individuals also have the right to protect (and control) a biographical core of personal information from private sector organizations, should they choose to do so.

Facial recognition technologies create new challenges for privacy protection.  In public spaces, there is, of course, the possibility that people might recognize you.  However, one of the features of urban spaces is that an individual can often move around in a way that is relatively anonymous.

Advanced facial recognition technologies have the potential to match images across platforms. Pervasive private-sector passive video surveillance, facial recognition in digital signage, and photos and videos uploaded to social media could, in theory, be combined and cross-matched. The ability to move around in relative anonymity could then be lost, along with the ability to control the use of one’s own image. This information could also be combined with public-sector data from government-issued identification and licensing activities, leading to concerns of mass surveillance.

In Canada, we have already had some experience with the potential for combining private sector data with public sector databases for law enforcement purposes. Following a riot in Vancouver, the Insurance Corporation of British Columbia (ICBC), a Crown corporation subject to British Columbia’s private sector privacy legislation and the provincial insurer for drivers, offered its facial recognition technology to assist police in comparing images of individuals alleged to have participated in the riot with images in its database of driver photos. The plan was to take images from surveillance video and images uploaded to social media and compare them, using facial recognition technology, with those in ICBC’s database. The Office of the Information and Privacy Commissioner of British Columbia (IPC) responded with an investigation that concluded that ICBC had not provided adequate notice of this potential use to citizens and that it must receive a warrant, subpoena or court order before using facial recognition software to assist law enforcement.

Notwithstanding the concerns raised by the IPC in British Columbia, it is easy to be drawn into being overly critical of the use of facial recognition. As the dissenting Commissioner, J. Thomas Rosch, stated in an appendix to the FTC Staff Report, there is, as yet, little evidence that facial recognition technologies are being systematically “misused”. In Commissioner Rosch’s view, the Staff Report was, among other things, premature.

It is also important to acknowledge that reasonable people may disagree on a number of the values underlying suspicion of facial recognition technology. Some may be sceptical as to whether facial recognition technologies present any material threat to liberty. Others may be sceptical that the relative anonymity urban life affords has anything to do with liberty. Reasonable people may also differ in the extent to which they are prepared to submit to surveillance for the purposes of public safety.

Moreover, when critiquing facial recognition technologies, it is important to acknowledge that not all facial recognition technologies are the same and not all uses intrude to the same degree on an individual’s ability to be “left alone” in relative anonymity. As the FTC Staff Report notes, there is a spectrum of technological sophistication and a spectrum of uses. Some facial recognition technologies simply detect and locate a face in an image. Others identify demographic characteristics, or the moods or emotions of a person, in order to deliver targeted advertising.

FTC: technological neutrality but greater transparency and choice

For the most part, the FTC Staff Report is neutral with respect to the use of facial recognition technologies in consumer settings. The FTC acknowledges that facial recognition can be used “in ways that benefit consumers by providing them innovative products and services, such as the ability to try beauty products by uploading their faces to the Web, the ability to target search results, and the ability to organize and manage photos.” Facial recognition technology can also be used to enhance privacy protections. The technology can be used for authentication of mobile devices and to blur images of individuals captured in video.

However, the FTC is also concerned about potential erosions of privacy in ways that are unfair to consumers.  In providing guidance, the FTC has organized its analysis around three core principles:

  1. “Privacy by Design: Companies should build in privacy at every stage of product development.”

The FTC Staff Report states that the transmission of facial information should be encrypted or secured to protect against intrusion from a hacker who could view the images in real time. Organizations should also attempt to prevent unauthorized scraping of images. If images will be retained, there must be reasonable data security protections in place and the images should be subject to destruction once they are no longer necessary for the purpose for which they are collected.
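The retention obligation above can be expressed concretely. The following is a minimal sketch of purging stored images once their retention window has lapsed; all names and the 30-day window are illustrative assumptions, not anything the FTC Staff Report prescribes:

```python
import time
from dataclasses import dataclass

RETENTION_SECONDS = 30 * 24 * 3600  # hypothetical 30-day retention policy


@dataclass
class StoredImage:
    image_id: str
    collected_at: float  # Unix timestamp of collection
    purpose: str         # purpose disclosed at collection


def purge_expired(images, now=None):
    """Keep only images still within the retention window; images no
    longer necessary for their collection purpose are dropped."""
    now = time.time() if now is None else now
    return [img for img in images if now - img.collected_at < RETENTION_SECONDS]
```

In a real deployment, such a purge routine would run on a schedule, and the deletion would extend to backups and any derived biometric templates.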

  2. “Simplified Consumer Choice: For practices that are not consistent with the context of a transaction or a consumer’s relationship with a business, companies should provide consumers with choices at a relevant time and context.”

The FTC considers a consumer’s face to be a persistent identifier in the sense that it cannot simply be changed in the way that other identifiers, such as a credit card number or a tracking cookie, can be. Accordingly, it is critical that there be meaningful and informed choice.

The FTC Staff Report suggests that “walk-away choice” is sufficient if (a) the technology is being used to gather demographic information (age and gender), (b) images are not stored, and (c) the organization has been sufficiently transparent about its activities.

By contrast, using facial recognition technologies for identification purposes requires affirmative express consent. Similarly, using an image in a materially different way (for example, a new use) would require affirmative express consent.

  3. “Transparency: Companies should make information collection and use practices transparent.”

The FTC is concerned that the public is not well educated about the uses of facial recognition technology. For example, the FTC is of the view that facial recognition in digital signage would not be consistent with reasonable consumer expectations. It is therefore important to provide prominent notice so that consumers have a meaningful choice as to whether they want to come into contact with these types of technologies.

The FTC Staff Report states that a notice should be prominently placed at the entrance to the store or at the entrance to the area of the store in which the technology is being used. When used with digital signage or other novel applications, a notice should be placed near the digital signage or area of novel use. The notice should state the purpose of the technology and how consumers can find out more information about the technology and the practices of the company operating the signs in that venue.

If facial recognition is used on images submitted to social media, the operators of those social networks should provide consumers with an easy-to-find, meaningful choice, including the ability to turn off the feature and delete biometric data.

Canada’s focus on proportionality

The Canadian guidance from the OPC contains similar themes. Individuals should be informed that facial information is being collected. If facial information will be used for purposes other than those disclosed at collection, additional consent will be required.

However, unlike the U.S. approach, the Canadian approach by the OPC requires that organizations be prepared to justify the use of facial recognition. In part, this is probably because subsection 5(3) of the Personal Information Protection and Electronic Documents Act (PIPEDA) provides that “[a]n organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances” (emphasis added).

In determining what is reasonable, the OPC encourages organizations to apply a four-part test.

  1. Is the use of the technology demonstrably necessary to meet a specific need?
  2. Is the use of the technology likely to be effective in meeting that need?
  3. Would the loss of privacy be proportionate to the benefit gained?
  4. Is there a less privacy-invasive way of achieving the same end?

The application of this test means that technologies such as facial recognition are not to be employed simply because they are efficient, convenient or cost-effective. Instead, the OPC suggests that facial recognition should be “essential for satisfying a particular need”. Any loss of privacy must be proportional to the benefit obtained from the technology. If the benefit to the organization of using facial recognition is minor, then it will be difficult to justify the loss of privacy from technologies that may be used to identify individuals. By contrast, technologies that are deployed for privacy enhancing purposes (such as blurring faces in photos) or that simply sense that a person is facing a digital sign may be much easier to justify in the cost to privacy / benefit to the organization calculus.
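The four-part test reads naturally as a conjunctive screen: if any limb fails, the deployment is not justified. A minimal sketch, where the function name and the boolean framing are my own simplification of the OPC’s guidance rather than anything the OPC publishes:

```python
def opc_four_part_test(demonstrably_necessary: bool,
                       likely_effective: bool,
                       loss_proportionate_to_benefit: bool,
                       less_invasive_alternative_exists: bool) -> bool:
    """Return True only if all four limbs of the OPC's reasonableness
    test favour deploying the facial recognition technology."""
    return (demonstrably_necessary
            and likely_effective
            and loss_proportionate_to_benefit
            and not less_invasive_alternative_exists)


# Efficiency or convenience alone does not satisfy the test: a deployment
# that is effective but not demonstrably necessary still fails.
```

In practice, of course, each limb calls for a contextual judgment rather than a yes/no answer; the point of the sketch is only that the limbs are cumulative.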

Implications of the Philosophical Difference

The Canadian focus on the contextual reasonableness of facial recognition technologies is an important philosophical difference in approach, with practical implications. In particular, it may be necessary in Canada to more carefully calibrate the use of facial recognition technologies in consumer settings to a clearly defined need.

Although the use of facial recognition technologies may be more restricted in Canada, they can be used in privacy enhancing ways, as demonstrated by the experience in Ontario casinos.

The Ontario Lottery and Gaming Corporation (OLG) facial recognition program is instructive. OLG maintains a voluntary self-exclusion program for persons who do not want to be admitted to gaming sites. In collaboration with the Information and Privacy Commissioner and the University of Toronto, the OLG developed a facial recognition program that uses biometric encryption. A biometric pointer key is created from a sample image, and the sample is then discarded. The identity of the person can only be unlocked by the biometrically encrypted pointer key derived from a person’s live image. Images that do not unlock a self-excluded gambler’s photograph are discarded, thereby protecting the privacy of the general public visiting the casino. If a likely match is identified, staff will check identification, which eliminates false positives. The Ontario Information and Privacy Commissioner has authored a paper describing the project and has presented on the topic recently.
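The flow described above can be illustrated with a deliberately simplified sketch. Real biometric encryption relies on fuzzy matching that tolerates variation between images of the same face; here an exact quantized feature vector stands in for that step, and every name is hypothetical:

```python
import hashlib


def pointer_key(features: tuple) -> bytes:
    """Derive a one-way pointer key from quantized facial features.
    Only the key is kept; the image it came from is discarded."""
    return hashlib.sha256(repr(features).encode("utf-8")).digest()


# Enrolment: a self-excluded person's sample image yields a key that
# indexes their record; the sample image itself is then discarded.
registry: dict = {}


def enrol(sample_features: tuple, record: str) -> None:
    registry[pointer_key(sample_features)] = record


def check_visitor(live_features: tuple):
    """A live image either unlocks a matching record (to be confirmed by
    a staff ID check) or returns None; keys derived for non-matching
    visitors are simply discarded, so the general public leaves no trace."""
    return registry.get(pointer_key(live_features))
```

The design choice worth noting is that the lookup is one-way: the casino never holds a gallery of raw face images of the public, only keys that either unlock a self-excluded record or unlock nothing.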

Facial recognition technologies won’t be going away. They are novel, useful, and fun for consumers. However, developers should consider conducting a privacy impact assessment for any deployment of these technologies in new uses and applications.