The beginning of 2017 saw a flurry of comment regarding automated processing of personal data in the workplace. The press reported that a company called "Humanyze" was supplying at least four UK employers with wearable badges that tracked workplace interactions at a granular level (including monitoring employees' movements and tone of voice). The badges can be used in combination with other digital tools to identify positive team dynamics and improve productivity.
Even where employers have not adopted wearable technologies, they may deploy a variety of performance management tools to assess their employees. Professional services firms have long used time-recording software to track activity levels. Sales managers may rank their teams according to conversion rates or income and circulate the data to encourage competition. Sickness absence management tools notify human resources professionals or line managers when absence thresholds are triggered.
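The sales-ranking tools described above are straightforward to picture in code. The sketch below is purely illustrative (the names and figures are invented, not drawn from any actual product): it orders a team by conversion rate, the kind of output a manager might circulate.

```python
# Hypothetical sketch of a conversion-rate ranking tool of the kind
# described. All names and figures are illustrative.

team = [
    {"name": "A", "calls": 200, "sales": 30},
    {"name": "B", "calls": 150, "sales": 27},
    {"name": "C", "calls": 180, "sales": 18},
]

def conversion_rate(rep: dict) -> float:
    """Fraction of calls that resulted in a sale."""
    return rep["sales"] / rep["calls"]

# Rank the team from highest to lowest conversion rate.
ranked = sorted(team, key=conversion_rate, reverse=True)
print([rep["name"] for rep in ranked])  # ['B', 'A', 'C']
```

Note that the output of such a tool is simply a ranking; as discussed below, it is what the employer then does with that ranking, and whether a human meaningfully reviews it, that engages the GDPR.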
The application of these performance and absence management tools can have significant consequences for employees (triggering warnings under performance improvement plans or capability procedures). Ostensibly benign initiatives like wellness programmes, which provide employees with fitness trackers, also place a significant volume of potentially sensitive data in the hands of an employer.
The GDPR, which comes into force in May 2018, will impose limits on the way in which such data can be used and greater sanctions for non-compliance than under the current regime. Employers will need to audit their data-gathering tools to ensure that they do not infringe the more stringent obligations under the GDPR, including the prohibition on solely automated processing, or risk turnover-linked fines, negative publicity and claims by data subjects.
Prohibition on fully automated decision-making which has a legal or similarly significant effect
Under Article 22 of the GDPR, data subjects have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. Recital 71 explains that this prohibition could capture automated processing of personal data which analyses or predicts aspects of the data subject's performance at work, including their "reliability or behaviour, location or movements."
When the Information Commissioner's Office sought feedback on profiling and automated decision-making, respondents sought clarity on how much human involvement, and at what point in the process, was required to take the processing out of the "solely automated" category. The October 2017 Article 29 Data Protection Working Party guidelines state that the prohibition applies where there is no human involvement in the decision-making process.
The use of a software programme to analyse individual performance and rank individuals (with such ranking impacting the employee's pay, promotion and/or assignment opportunities) would be caught if there was nobody reviewing the findings or considering other factors before a decision was taken. The same prohibition would apply if an absence management programme automatically resulted in warnings being imposed, without a line manager or HR professional considering the wider context.
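The distinction drawn above can be made concrete. The following sketch (entirely hypothetical; the threshold, names and record structure are invented for illustration) shows an absence-management check designed so that exceeding a threshold refers the case to a manager rather than issuing a warning automatically:

```python
from dataclasses import dataclass

# Illustrative trigger point; any real threshold would be policy-specific.
ABSENCE_THRESHOLD_DAYS = 10

@dataclass
class AbsenceRecord:
    employee_id: str
    days_absent: int

def flag_for_review(record: AbsenceRecord) -> dict:
    """Return a case for human review, never a final decision.

    A variant that issued a warning directly here, with nobody
    considering the wider context, would be a decision "based solely
    on automated processing" within the scope of Article 22.
    """
    exceeds = record.days_absent >= ABSENCE_THRESHOLD_DAYS
    return {
        "employee_id": record.employee_id,
        "threshold_exceeded": exceeds,
        # The outcome is deliberately left open: a manager with the
        # authority to change the decision must consider the context
        # (e.g. disability-related absence) before any warning is imposed.
        "recommended_action": "refer_to_manager" if exceeds else "no_action",
    }

case = flag_for_review(AbsenceRecord("E123", days_absent=12))
print(case["recommended_action"])  # refer_to_manager
```

The design choice is the point: the software surfaces the trigger, but the decision itself is reserved for a person, which is precisely the kind of meaningful human involvement discussed below.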
The Working Party guidelines make it clear that human intervention must be meaningful and carried out by someone with the authority and competence to change the decision, having regard to all the available data. This reflects good practice in any event. For example, absence management programmes which do not take into account an employer's obligations under disability discrimination legislation create significant risk for an employer. The same is true of performance management exercises which focus exclusively on an employee's output and do not assess the reasons for under-performance.
Exceptions to the prohibition on fully automated processing
The Working Party guidelines acknowledge that automated decision-making could potentially allow for "greater consistency or fairness in the decision making process" and "deliver decisions within a shorter time frame" to improve efficiency. As such, employers keen to maximise productivity using data gathered via these software programmes and wearable technologies should familiarise themselves with the exceptions to the prohibition. These exceptions include automated processing which is (i) necessary for the entering into or performance of a contract; (ii) authorised by law (which also provides safeguards for the data subject); or (iii) the subject of explicit consent. In an employment context, the contractual necessity exception is most likely to be relevant.
Necessity is interpreted strictly, and an employer will have to show that a less intrusive means of achieving their objective would not suffice (for example, why GPS trackers are the most reliable means of monitoring road safety compliance by field sales staff). Employers will strengthen their position by carrying out a privacy impact assessment to show how any detrimental impact on the data subject has been mitigated. For personal data that does not fall within the "special categories of personal data," it is easier for an employer to rely on the necessity exception than on explicit (affirmative) consent, because consent must be freely given, which is difficult to establish given the imbalance of power between employer and employee.
The necessity exception is not available for "special categories of personal data" (broadly, what we currently know as "sensitive personal data"); automated decision-making based on such data requires the "explicit" consent of the data subject and suitable measures to safeguard their rights, freedoms and legitimate interests. Given the difficulties surrounding consent in an employment context, employers must ensure they have obtained informed, affirmative, specific and freely given consent. Employers should be transparent about how special categories of personal data are being processed (such as data obtained via fitness trackers) and inform employees that their consent to automated decision-making can be withdrawn at any time without adverse consequences.
Having established an exception of necessity or explicit consent, data controllers using an automated decision-making process are still required to observe minimum safeguards for data subjects. Employees must be informed that automated decision-making, including profiling, is taking place, and provided with meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for them.
However, the most significant safeguard against the potential inaccuracies and generalisations associated with automated processing is the "human factor" enshrined in GDPR. Under Article 22(3), the employee must be informed of the right to obtain human intervention, to express his or her point of view and/or to contest the decision. As the Working Party guidelines indicate: "Human intervention is a key element. Any review must be carried out by someone who has the appropriate authority and capability to change the decision. The reviewer should undertake a thorough assessment of all the relevant data, including any additional information provided by the data subject."
This fundamental safeguard is consistent with best employment practice. It is right that employees should be managed by people, rather than a software programme, and that there should be face-to-face discussion about decisions of significance in the workplace. Performance or absence warnings should not be imposed without understanding the reasons behind the poor performance or sickness absence and that requires a meeting where each party has the opportunity to make representations. If employees are not eligible for promotion, pay rises, transfers or international assignments based on performance, the manager should own that decision and be prepared to discuss it directly.
It is worth remembering that where employees do not understand the reasons behind workplace decision-making, or its consequences, they look for explanations elsewhere (outside their own performance), and that is where significant claims and internal costs arise. The GDPR adds another layer of risk by introducing the prospect of fines based on turnover and compensation claims by any person who has suffered material or non-material damage as a result of an infringement.