In a major development for companies that collect, use, and store biometric data, the US Federal Trade Commission (FTC) reached a proposed settlement of a complaint against a company that allegedly deceived consumers about its use of facial recognition technology and its retention of consumers’ biometric data. In its January 11, 2021, announcement of a settlement with the parent company of a now-defunct photo storage app, the FTC signaled that it is increasing its focus on commercial practices relating to consumer biometric information. As part of the settlement, the defendant was required to delete facial recognition algorithms it had created, without consumers’ consent, from photographs those consumers provided. The settlement is also noteworthy in light of the enormous recent and projected growth of the biometric industry, the small minority of states that directly regulate private entities’ collection and use of this data, and one Commissioner’s recent public comments about the FTC’s interest in policing this area.
The FTC complaint
The FTC complaint arose from allegations about how the defendant stored and used consumer data after it was collected through a free app that allowed consumers to upload photos and videos to the company’s servers for storage and organization. The app later launched a facial recognition feature that enabled users to tag people and group together uploaded content accordingly.
The FTC alleged that the defendant violated Section 5 of the FTC Act by misrepresenting what happened to users’ content after it had been uploaded. According to the FTC, the company gave the impression that it was not running facial recognition algorithms unless the customer opted in to the feature, but in fact enabled the feature for most users by default and provided no option to disable it. The company also allegedly used data gathered by the artificial-intelligence-powered technology to develop a standalone facial recognition service for sale to its enterprise customers. Significantly, the only users exempt from this challenged practice were residents of Illinois, Washington, and Texas – the small minority of states with laws governing the collection, use, and storage of biometric identifiers – and of the European Union, which is governed by the General Data Protection Regulation (GDPR).
The FTC further alleged that the company falsely stated that it deleted users’ content when they deactivated their accounts. On the contrary, according to the FTC, the company’s practice was to retain the content indefinitely.
As stated in the proposed settlement agreement, the company neither admitted nor denied any of the allegations in the complaint, except as specifically stated in the consent order.
The consent order
The five FTC Commissioners unanimously voted to enter the consent order, which adopts a definition of biometric information broader than those in most US biometrics laws. For example, it defines biometric information to include images, recordings, and “characteristic movements” such as gait or typing pattern, none of which is included in the definition of biometric identifier in the Illinois Biometric Information Privacy Act (BIPA), the most restrictive of the biometrics laws in the country. The FTC’s definition of biometric information in the consent order aligns more closely with that of the California Consumer Privacy Act (CCPA).
The order also requires the company to provide notice to, and obtain affirmative express consent from, consumers whose biometric information was used to create a “Face Embedding” or train an artificial intelligence (AI) algorithm. Face Embedding is defined as “data, such as a numeric vector, derived in whole or in part from an image of an individual’s face.” In this respect, the order goes beyond existing biometrics laws (including BIPA), which generally focus on biometric data that is used to identify or verify the individual from whom the information is collected.
The order also requires the deletion of content from old user accounts, including biometric identifiers collected or derived from that content without the consumers’ consent. Significantly, the company must also destroy any facial recognition tools it created with data collected from unsuspecting consumers. The consent order prohibits the company from misrepresenting how it collects, uses, discloses, maintains, or deletes the personal information of its consumers, including face embeddings. The order further requires the company to obtain express consent from any consumer who provides biometric information to the company in the future before using that information to develop other technologies or digital identifiers.
The FTC will publish notice of the consent agreement package in the Federal Register for public comment, after which it will decide whether to make the proposed consent order final.
Statements of Commissioner Chopra
In a statement published in conjunction with the consent order, FTC Commissioner Rohit Chopra expressed concerns about facial recognition technology generally, and argued that the FTC’s failure to impose financial penalties resulted in a failure to deter companies from violating Section 5 of the FTC Act. Chopra, who will depart the FTC and lead the Consumer Financial Protection Bureau in the Biden Administration, also characterized the FTC’s order as requiring the company to “forfeit the fruits of its deception” by deleting the facial recognition technologies developed with improperly obtained photos. He noted that this was not the FTC’s prior practice. Commissioner Chopra also stated that “the FTC needs to take further steps to trigger penalties, damages, and other relief for facial recognition and data protection abuses” in the future. These statements are consistent with the Commissioner’s prior public comment that “the current state of facial recognition is flawed and dangerous.”
Notwithstanding these statements (or the FTC’s actions more generally), Commissioner Chopra stressed the effectiveness of state and local government efforts to regulate the use of facial recognition technology and other biometrics. Notably, Commissioner Chopra pointed to the company’s more favorable treatment of residents in Illinois, Washington, and Texas, which have “passed laws related to facial recognition and biometric identifiers.” Chopra also cautioned against federal legislation that would preempt state activity in this arena, stating, “we need all hands on deck to keep these companies in check.” He concluded his statement with something of a warning, noting that “[i]t will be critical for the Commission, the states, and regulators around the globe to pursue additional enforcement actions to hold accountable providers of facial recognition technology who make false accuracy claims and engage in unfair, discriminatory conduct.”
The past few years have seen dramatic innovations in biometric technology, and an increasing number of companies employ consumer- or employee-facing biometrics in their day-to-day operations, often with little regulatory oversight. Until more states and/or the federal government enact legislation governing the collection, use, and storage of biometric data, and/or the use of AI algorithms, the FTC is well-positioned to fill the existing void. This case demonstrates that the FTC is willing to take a broad view of what constitutes biometric information, and to impose strict consent requirements on companies that use such data for purposes other than identifying or verifying individuals. The FTC also made clear that it will require the destruction of AI algorithms in their entirety if any underlying data was collected or used in a way the FTC deems to be unlawful.