The Information Commissioner's Office (ICO) has fined Clearview AI Inc £7,552,800 for using images of people in the United Kingdom and elsewhere that were collected from the Internet and social media platforms to create a global online database that could be used for facial recognition. The ICO has also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the Internet, and to delete the data of UK residents from its systems.
The ICO enforcement action comes after a joint investigation with the Office of the Australian Information Commissioner into Clearview AI Inc's use of people's images, its scraping of data from the Internet and its use of biometric data for facial recognition.
The ICO explains that Clearview AI Inc had collected more than 20 billion images of people's faces and data from publicly available information on the Internet and social media platforms all over the world to create an online database. People were not informed that their images were being collected or used in this way.
The company provides a service that allows customers, including the police, to upload an image of a person to the company's app, which is then checked for a match against all the images in the database. The app then provides a list of images that have similar characteristics to the photo provided by the customer, with links to the websites from which those images came.
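In broad terms, services of this kind reduce each stored image to a numeric "embedding" vector and compare a query image's vector against every stored vector, returning the closest matches with their source links. The following sketch illustrates that general pattern only; the function names, the toy database, the example URLs and the similarity threshold are all illustrative assumptions, not Clearview AI Inc's actual implementation.

```python
# Illustrative sketch of embedding-based image matching (assumed design,
# not Clearview AI Inc's actual system). Each stored image is represented
# by a precomputed embedding vector; a query is compared against every
# stored vector using cosine similarity.
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matches(query_vec, database, threshold=0.9):
    # Return (source_url, score) pairs for stored images whose embedding
    # is sufficiently similar to the query embedding, best match first.
    results = [
        (url, cosine_similarity(query_vec, vec))
        for url, vec in database.items()
    ]
    return sorted(
        [(url, score) for url, score in results if score >= threshold],
        key=lambda r: r[1],
        reverse=True,
    )

# Toy database mapping a source URL to a (made-up) face embedding.
db = {
    "https://example.com/a.jpg": [0.9, 0.1, 0.2],
    "https://example.com/b.jpg": [0.1, 0.9, 0.3],
}
matches = find_matches([0.88, 0.12, 0.21], db)
# matches contains only the close match, with its source link.
```

Real systems derive the embeddings from a trained face-recognition model and use approximate nearest-neighbour indexes rather than a linear scan, but the customer-facing behaviour described above (query image in, list of similar images with source links out) is the same.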
The ICO says that given the high number of UK Internet and social media users, Clearview AI Inc's database is likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge. Although Clearview AI Inc no longer offers its services to UK organisations, the company has customers in other countries, so the company is still using personal data of UK residents.
The ICO found that Clearview AI Inc breached UK data protection laws by:
- failing to use the information of people in the United Kingdom in a way that was fair and transparent, given that individuals were not made aware or would not reasonably expect their personal data to be used in this way;
- failing to have a lawful reason for collecting people's information;
- failing to have a process in place to stop the data being retained indefinitely;
- failing to meet the higher data protection standards required for biometric data (ie, "special category data" under the EU General Data Protection Regulation (GDPR) and the UK GDPR); and
- asking for additional personal information, including photos, when members of the public asked whether they were in its database, which may have deterred individuals who wished to object to their data being collected and used.
For further information on this topic please contact Siobhan Lewis at Wiggin by telephone (+44 20 7612 9612) or email ([email protected]). The Wiggin website can be accessed at www.wiggin.co.uk.