Hungary’s data protection authority (NAIH) recently fined a bank HUF 250 million (EUR 675,675) over shortcomings in its automated AI analysis of recorded customer service calls, which included assessing the speaker’s emotional state and other characteristics. The bank used the results to monitor call quality, prevent complaints, rate the work of its call-handling staff and increase their efficiency.
Specifically, the AI-based speech-signal processing technology automatically analysed the calls for a list of keywords and the emotional state of the speaker. The detected keywords and emotions were stored along with the call, and the calls could be replayed within the voice analytics software for up to 45 days. The software ranked the calls and recommended which customers should be contacted as a priority.
While the NAIH did not rule the AI analysis of recorded customer service calls unlawful, it found the following shortcomings in this particular solution:
The bank’s customer service privacy notice did not contain any substantive information on voice analysis. The privacy notice only mentioned quality assurance and complaint prevention as data processing purposes.
The bank based the data processing on its legitimate interest in retaining customers and improving the efficiency of its internal operations. However, the data processing operations serving these different interests were not separated either in the privacy notice or in the legitimate interest assessment (the balancing test, or LIA).
The bank's data protection impact assessment (DPIA) concluded that the processing was high-risk for a number of reasons. However, the DPIA proposed no substantive measures to address those risks.
The bank did not genuinely examine the proportionality of the data processing or its effects on data subjects, and it trivialised the significant risks to fundamental rights. In particular, it failed to take into account data subjects' right to adequate information and their right to object.
The NAIH hinted that only freely given, active and informed consent could serve as a legal basis for similar data processing operations. The NAIH’s decision suggests that, in all cases, companies using AI solutions must ensure that data subjects' rights in relation to the processing are adequately protected. This applies in particular to the right to be informed through a properly drafted privacy notice. If a company nevertheless relies on legitimate interest to process personal data, it must ensure that data subjects know about the processing in advance and are able to object to it, since once the processing has taken place, particularly in short-term or one-off operations, the right to object becomes meaningless.