ICO warns organisations to assess public risks of using emotion analysis technologies before implementing them

Wiggin LLP | Tech, Data, Telecoms & Media - United Kingdom

Patrick Rennie | November 4 2022

The Information Commissioner's Office (ICO) has explained that emotion analysis technologies process data such as gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture. Examples include monitoring the physical health of workers through wearable screening tools, or using visual and behavioural methods, including body position, speech, and eye and head movements, to register students for exams.

Emotion analysis relies on collecting, storing and processing a range of personal data, including subconscious behavioural or emotional responses and, in some cases, special category data. This kind of data use is far riskier than that of traditional biometric technologies, which are used to verify or identify a person.

The ICO says that the inability of insufficiently developed algorithms to detect emotional cues creates a risk of systemic bias, inaccuracy and even discrimination.

To enable a fairer playing field, the ICO says that it will act positively towards organisations demonstrating good practice, while investigating and taking action against those that try to gain an unfair advantage through unlawful or irresponsible data collection technologies.

The ICO is also developing guidance on the wider use of biometric technologies. These may include facial, fingerprint and voice recognition, which are already used successfully in industry.

The ICO says that its biometric guidance, which is due to be published in spring 2023, will aim to further empower and help businesses, as well as highlight the importance of data security.
Biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used, the regulator says.

In developing the guidance, the ICO will hold public dialogues with the Ada Lovelace Institute and the British Youth Council, which will explore public perceptions of biometric technologies and gather opinions on how biometric data is used.

The ICO says that supporting businesses and organisations at the development stage of biometric products and services embeds a "privacy by design" approach, reducing risk factors and ensuring that organisations operate safely and lawfully. Accordingly, it has published two new reports to support businesses navigating the use of emerging biometric technologies.(1)

For further information on this topic please contact Patrick Rennie at Wiggin by telephone (+44 20 7612 9612) or email ([email protected]). The Wiggin website can be accessed at www.wiggin.co.uk.

Endnotes

(1) To read the ICO's news release in full, which includes examples of where biometric technologies are currently being used and the sectors involved, and for links to the new reports, click here.