Law360
When it comes to security, identity verification and personalized service, biometric information is invaluable. People’s one-of-a-kind biometric information — such as a fingerprint, voiceprint, or retina, iris, hand or face scan — allows technology to verify their identities in previously impossible ways.
The fact that biometric information is difficult to replicate or forge, however, also creates privacy and security concerns, as a person cannot replace his or her biometric information if it is ever hacked. For that reason, several states have passed laws that restrict companies’ use of biometric information, and other states are currently considering such laws. And as more companies take steps to comply with Europe’s General Data Protection Regulation, which, starting May 25, 2018, will impose heightened privacy restrictions on businesses that offer goods or services to, or monitor the behavior of, individuals in the European Union, more laws (both at the state and federal levels) will likely emerge.
These laws do not only impact high-tech companies. To the contrary, companies across industries use biometric data such as fingerprints, voice recognition or facial recognition for a wide range of purposes:
- Employers use biometric data for timekeeping purposes (e.g., fingerprint technology used by employers in place of the traditional punch-card system);
- Companies use biometric data (such as fingerprints or retina scans) for security purposes, such as to provide restricted access to buildings, servers, or documents;
- Numerous apps and websites use, recognize, and/or analyze photos of users’ faces (in two cases, plaintiffs are currently arguing that the act of storing photos can implicate these laws);
- Some companies use hidden cameras to analyze consumer demographics and their responses to products;
- Retailers may also use facial recognition to track consumers between stores; and
- Companies provide products in the internet of things that use biometric cues or indicators, such as voice recognition (including toys, which present a heightened risk given the strictness of children’s privacy laws).
Over the last year or so, what began as a handful of cases under a little-known Illinois statute has evolved into a wave of well over 30 cases against some of the world’s largest companies.
The Statutory Landscape
Only three states — Illinois, Texas and Washington — currently have laws in place that prohibit businesses from collecting individuals’ biometric information without their prior consent. Neither the Texas nor Washington statute contains a private right of action, though Texas’ law carries hefty civil penalties of up to $25,000 per violation. Illinois’ Biometric Information Privacy Act, by contrast, provides a private right of action to “aggrieved” consumers, and as such, has been the focus of the majority of litigation in this area.
Enacted in 2008, BIPA regulates the collection and storage of (1) “biometric identifiers,” which include a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry, but exclude photographs, and (2) “biometric information,” which includes “any information ... based on an individual’s biometric identifier used to identify an individual,” and “does not include information derived from items or procedures excluded under the definition of biometric identifiers,” such as photographs.
BIPA prohibits companies from selling, leasing, trading or profiting from a person’s biometric information, and from disclosing or disseminating a person’s biometric data without his or her informed consent. Entities that “collect, capture, purchase ... or otherwise obtain a person’s or a customer’s biometric identifier or biometric information” must first (1) inform the person in writing that his or her data is being collected, along with the purpose of the collection and the length of time the information will be retained; and (2) obtain a “written release.” The law further requires such entities to (3) develop and publish a policy governing the retention and destruction of biometric data.
BIPA provides for statutory damages in the amount of $1,000 per negligent violation and $5,000 per intentional or reckless violation, or actual damages, whichever is greater.
Litigation Under BIPA
As mentioned above, more than 30 class actions have been filed under BIPA in the last couple of years. This litigation began in 2015, with early suits filed against Facebook, Google, Snapchat and video game developer Take-Two Interactive Software concerning their use of facial recognition and face scanning technology. However, the vast majority of BIPA cases — most of which have been filed in the last six months — concern employers’ use of fingerprint timekeeping technology to monitor employees’ work hours.
At the motion to dismiss stage, defendants have generally argued that the plaintiff lacked constitutional or statutory standing because he or she was not actually injured by the defendant’s purported violation of BIPA; that BIPA could not be applied extraterritorially (several BIPA cases have been filed in California); or that the types of data being collected were not subject to the statute. Many such motions to dismiss have been denied.
However, a recent decision from the Illinois Appellate Court will likely make it much harder for plaintiffs to survive the pleadings stage. In Rosenbach v. Six Flags Entm’t Corp., __ N.E.3d __, 2017 IL App (2d) 170317, ¶¶ 15, 23 (Dec. 21, 2017), the Illinois Appellate Court explained that to be “aggrieved” (and therefore, to have a private right of action under BIPA), a plaintiff must establish that he or she “suffered an actual injury” over and above the alleged collection of biometric data without notice and consent — a violation of the statute, in itself, is not enough.
The plaintiff in Rosenbach alleged that Six Flags violated BIPA by taking her son’s thumbprint after he purchased a season pass for one of its theme parks, without first obtaining his written consent. She claimed that she would not have allowed her son to purchase the pass if she had known his fingerprint would be collected. The trial court denied Six Flags’ motion to dismiss but certified for interlocutory appeal the question of “whether an individual is an aggrieved person ... when the only injury he or she alleges is a violation of [BIPA] by a private entity that collected his or her biometric identifiers and/or biometric information without providing him or her the disclosures and obtaining the written consent required by [the statute].”[1] The Appellate Court answered unanimously “in the negative,” holding that “[i]f a person alleges only a technical violation of the Act without alleging any injury or adverse effect, then he or she is not aggrieved and may not recover.”
This significant decision could potentially stifle the momentum growing in these cases and give current defendants a strong argument for dismissal.
Facebook recently relied heavily on Rosenbach in its opposition to plaintiffs’ motion for class certification in a consolidated BIPA action pending against the company in California, which originated as three cases filed in Illinois. In re: Facebook Biometric Information Privacy Litigation, Master Docket No.: 3:15-CV-03747-JD (N.D. Cal). The plaintiffs in these cases claim that Facebook violated BIPA by using facial-recognition technology to analyze photos of them (through its “Tag Suggestion” feature, which identifies people in posted photos) without giving them adequate notice or obtaining their consent. Citing Rosenbach, Facebook argued in its opposition that BIPA’s “aggrieved consumer” requirement is even more stringent than the Article III standing requirement that a consumer has suffered a concrete injury. Oral argument on plaintiffs’ motion for class certification, in addition to Facebook’s pending motion for summary judgment (which argues that BIPA does not apply to Facebook, because the data captured by Facebook is stored in California, not Illinois), is scheduled for March 29, 2018. If certified, the class, consisting of Illinois Facebook users, could be as many as 6 million people.
In the same case, Judge James Donato on Feb. 26, 2018, denied Facebook’s motion to dismiss for lack of subject matter jurisdiction, which argued that the plaintiffs lacked Article III standing under the U.S. Supreme Court’s decision in Spokeo Inc. v. Robins because they had not been injured by Facebook’s alleged failure to comply with BIPA’s notice and consent requirements. The court disagreed, explaining that the “abrogation of the procedural rights mandated by BIPA” is “quintessentially an intangible harm that constitutes a concrete injury in fact”:
BIPA vested in Illinois residents the right to control their biometric information by requiring notice before collection and giving residents the power to say no by withholding consent. As the Illinois legislature found, these procedural protections are particularly crucial in our digital world because technology now permits the wholesale collection and storage of an individual’s unique biometric identifiers — identifiers that cannot be changed if compromised or misused. When an online service simply disregards the Illinois procedures, as Facebook is alleged to have done, the right of the individual to maintain her biometric privacy vanishes into thin air. …[¶] This injury is worlds away from the trivial harm of a mishandled zip code or credit card receipt.
Significantly, the decision did not mention the statute’s “aggrieved” requirement or the Illinois Appellate Court’s Rosenbach decision.
Another case where questions concerning the plaintiff’s injury have recently come into play is Smith v. Pineapple Hospitality Co., et al., Case No. 1:17-cv-08106 (N.D. Ill.). There, the defendant, soon after removing the case from state court, argued that the plaintiff lacked statutory standing under Rosenbach because she had not suffered an actual injury. The plaintiff then filed a motion to remand, arguing that neither she nor the defendant could establish the kind of concrete injury required for Article III standing, and that by filing a motion to dismiss, the defendant had undermined the argument in its removal papers that the federal court had jurisdiction. The defendant was quick to argue in response to the motion to remand that the plaintiff’s claim to have not suffered a concrete injury as required by Article III precluded her from establishing statutory standing under Rosenbach. Oral argument has not been scheduled for either the defendant’s motion to dismiss or the plaintiff’s motion to remand.
Other Current and Proposed Laws Concerning Biometric Data
Lawmakers in Alaska, California, Connecticut, Idaho, Montana and New Hampshire, among others, have introduced bills that would regulate the collection, use and retention of biometric data. The Alaska, Montana and New Hampshire bills, if passed, would require notice and consent before biometric data is collected, used or retained, and would also impose requirements for the retention, disposal and/or security of this data. The Alaska and New Hampshire bills, like BIPA, provide for a private right of action.
Additionally, Section 201-a of New York’s Labor Law prohibits employers from requiring the fingerprinting of employees “as a condition of securing employment or of continuing employment.” In April 2010, the New York Department of Labor explained that requiring employees to use their fingerprints to clock in likely violates Section 201-a, even if the device does not store employees’ fingerprints. In the same statement, the department made clear that the statute permits employers to use employees’ fingerprints as long as the employee voluntarily agrees to share them, and that the statute does not cover the use of instruments that measure the geometry of a hand (rather than the hand’s surface).
Biometric information also comes into play with respect to data breach laws. Connecticut, Delaware, Iowa, Nebraska, New Mexico, North Carolina, Oregon, Wisconsin and Wyoming include biometric information as a category of information covered by their laws against data and identity theft, and several other states may soon join the list. On Jan. 22, 2018, Colorado lawmakers introduced a bill that would expand notification requirements and require companies to implement “reasonable security procedures” to protect consumers’ “personal identifying information” including “biometric data.” On Jan. 23, 2018, the State Senate Judiciary Committee in South Dakota (one of only two states in the country to not have a data breach law) passed a bill that would require companies to inform consumers of any “unauthorized acquisition” of personal data, including “biometric data,” which the bill defines as that “generated from measurements or analysis of human body characteristics for authentication purposes.”
The federal government is also considering similar legislation. In November 2017, Sen. Patrick Leahy, D-Vt., with six cosponsors, introduced the Consumer Privacy Protection Act, which would “require[] that corporations meet certain baseline privacy and data security standards to keep information they store about consumers safe, and … that these firms provide notice and protection to consumers in the event of a breach.” Biometric data is specifically identified in the bill as a category of protected information. The proposed legislation does not include a private right of action, but it expressly authorizes the Federal Trade Commission, the federal attorney general and state AGs to bring enforcement actions.
How to Protect Your Company
As the use of biometric data increases, so, too, will its regulation, and litigation will likely follow. Businesses using biometric information should therefore educate themselves on the applicable laws to ensure that they are compliant.
At a minimum, this includes obtaining informed consent from any person whose biometric data is being collected, preparing a written policy on the retention and destruction of biometric data, and ensuring that any biometric data is stored in a secure way. Retailers might also consider using more cautious practices in Illinois, Texas and Washington than in the rest of the country (for example, Google recently garnered attention for disabling its art-selfie app in Illinois and Texas). Other protective measures include having users sign arbitration agreements that contain class action waivers, and securing an insurance policy that covers the hacking, theft or mishandling of biometric information.