With the growth of digital avatars and face-scanning technology, many tech companies have been hit with suits alleging violations of biometric privacy laws. A new decision from a New York federal court may ease their concerns.

Siblings Ricardo and Vanessa Vigil purchased NBA 2K15, in part due to the “MyPlayer” feature that allowed users to undergo a facial scan and create a personalized avatar in the game. Although the Vigils signed a release permitting Take-Two Interactive Software Inc. to take the scan and use the data collected to create an avatar, they alleged the company violated the Illinois Biometric Information Privacy Act (BIPA).

The BIPA sets forth disclosure, consent, and retention requirements for private entities that collect, store, and disseminate biometric data. The Vigils claimed that Take-Two failed to provide adequate disclosures (in particular, by failing to inform them that their likenesses would be visible to other players online and by not providing a retention schedule or guidelines for destroying biometric identifiers) and that their consent was therefore invalid.

The software company moved to dismiss, arguing that the plaintiffs failed to adequately allege concrete harm under the statute as required by Spokeo, Inc. v. Robins. U.S. District Court Judge John G. Koeltl agreed.

Under the BIPA, the collection and storage of biometrics to facilitate financial transactions is not in and of itself undesirable or impermissible, the court said. Instead, the purpose of the statute “is to ensure that, when an individual engages in a biometric-facilitated transaction, the private entity protects the individual’s biometric data, and does not use that data for an improper purpose, especially a purpose not contemplated by the underlying transaction.”

Take-Two’s personalized basketball avatar feature complied with the statutory requirements, the court said.

“The plaintiffs … allege that the MyPlayer feature functioned exactly as anticipated,” the judge wrote. “There is no allegation that Take-Two has disseminated or sold the plaintiffs’ biometric data to [third parties] or that Take-Two has used the plaintiffs’ biometric information in any way not contemplated by the only possible use of the MyPlayer feature: the creation of personalized basketball avatars for in-game play.”

The Vigils also failed to establish an imminent risk that their biometric data would actually be misused, and no event (such as data theft) had occurred to show that the risk had materialized, the court said. While the plaintiffs argued that the potential harm associated with their face scans could be great, “the hypothetical magnitude of a highly speculative and abstract injury that is certainly not impending does not make the injury any less speculative and abstract,” the court said.

None of the purported statutory violations was sufficient to establish standing or actual harm, Judge Koeltl concluded, rejecting the plaintiffs’ other theories of harm, including invasion of privacy and reluctance to enter into future biometric-facilitated transactions. The Vigils agreed to have their faces scanned and displayed on personalized basketball avatars, the court said, and could not manufacture standing based on fears of hypothetical future harm.

“The plaintiffs cannot aggregate multiple bare procedural violations to create standing where no injury-in-fact otherwise exists,” the judge wrote, dismissing the complaint with prejudice. “Accordingly, the plaintiffs do not have Article III standing to pursue their claims against Take-Two.”

To read the opinion and order in Vigil v. Take-Two Interactive Software, Inc., click here.

Why it matters: As the use of biometric scanning continues to increase, so will the number of consumer challenges to companies making use of the technology. In addition to Take-Two (whose legal woes are not over, as the plaintiffs have already filed an appeal), both Facebook and Google have been hit with putative class actions asserting violations of the Illinois BIPA.