On 17 October, the Article 29 Working Party (A29WP) published its much-awaited draft guidance on automated decision making and profiling under GDPR. The A29WP are accepting comments on the guidance until 28 November.
The A29WP recognise the tension between the benefits of profiling, such as increased efficiency and resource savings, and the risks it poses, for example in entrenching stereotypes and enabling black-box decision making.
What is profiling under GDPR?
The GDPR defines profiling as 'any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements'.
The A29WP note that simply classifying individuals based on age, sex and height could be considered profiling.
Profiling can broadly be broken down into two frameworks under GDPR:
(i) The taking of significant, solely automated decisions (See Section 1 below)
(ii) General rules on profiling (See Section 2 below). These general rules also apply to significant automated decisions.
1. Significant Solely Automated Decision Making
Under GDPR individuals have the right not to be subject to a decision based solely on automated processing (including profiling) if the decision produces:
(i) legal effects, or
(ii) similarly significant effects on the individual.
The A29WP interpret this rule as a prohibition on such automated decision making unless one of the conditions at (i)-(iii) below applies (or, in relation to sensitive data, a condition under Article 22(4) GDPR), rather than as a mere right for individuals to object to such decisions.
Such significant automated decisions can be taken if they are:
(i) necessary to enter into, or to perform, a contract between the individual and the Controller (according to the A29WP, ‘necessity’ should be interpreted narrowly);
(ii) authorised by EU or Member State law; or
(iii) based on the individual’s explicit consent.
The requirement that the decision be based 'solely' on automated means is significant: the restriction applies only where there is no human involvement in the decision-making process. To fall outside the rule, the A29WP state that human involvement must be meaningful, not a token review.
The A29WP also provide guidance on the meaning of a decision having a ‘legal’ or ‘similarly significant’ effect.
'Legal effects' are those that have an impact on an individual's legal rights, such as statutory or contractual rights (for example, an individual being refused entry at a border, being denied a social benefit granted at law, or having their mobile phone service terminated for failure to pay the bill).
'Similarly significant effects' are those broadly equivalent in their impact to legal effects. The effect must be more than trivial and must have the potential to significantly influence the circumstances, behaviour or choices of the individuals concerned (examples could include the automatic refusal of an online credit application or e-recruiting practices without human intervention). Much depends on context, and it is difficult to provide a fixed list of what counts as 'significant'.
Can targeted advertising be a significant automated decision?
In a move that will be welcomed by those engaged in online advertising, the A29WP note that typical cases of targeted advertising would not have a significant effect on individuals. However, the A29WP recognise that in certain contexts targeted advertising could lead to a legal or significant effect.
Relevant considerations in making that assessment include: (a) the intrusiveness of the profiling process; (b) the expectations and wishes of the individual; (c) the way the advert is delivered; and (d) the particular vulnerabilities of the individuals targeted. Examples of online advertising that could amount to significant automated decisions include differential pricing that prevents an individual from obtaining certain goods, or an individual in financial difficulty being regularly targeted with ads for gambling sites.
Even if the advertising activity is not a significant automated decision, Controllers must still comply with the general rules on profiling under GDPR (See 2 below). The proposed e-Privacy Regulation will also be relevant here, and organisations should keep this area under review.
Safeguards for automated decision makers
If the Controller relies on the contract performance or explicit consent conditions for the automated decision, it must implement suitable measures to safeguard the individual. At a minimum, this must include the right to obtain human review of the decision. The A29WP note that the review must be carried out by someone with the appropriate authority and capability to change the decision.
The A29WP note that organisations must find simple ways to tell individuals about the rationale behind, or the criteria relied on in reaching, the decision. The information should be meaningful to the individual, but this will not necessarily require a complex explanation of the algorithms involved.
According to the A29WP, Controllers should carry out regular testing on the data sets they process to check for any bias and measures should be taken to prevent errors, inaccuracies or discrimination on the basis of special categories of data. Audits of algorithms are also advised.
2. General Provisions on profiling and automated decision-making
As with other processing of personal data under GDPR, the Data Protection Principles will apply to both profiling and automated decision making. Much of this is not new, but the A29WP provides useful guidance on the application of the principles to profiling:
1. Transparency

Profiling is often invisible to individuals. Controllers must provide concise, transparent, intelligible and easily accessible information about the processing, in line with the transparency obligations under GDPR.

2. Lawful Basis

Controllers will need a lawful basis for profiling/automated decision making. These may include (among others):

Consent - the Controller must show that individuals understand exactly what they are consenting to. Consent will not work where there is an imbalance of power (e.g. in employer-employee relationships). The A29WP are to publish separate guidance on consent under GDPR. (Note that explicit consent will generally be required for significant, solely automated decisions.)

Legitimate Interests - the Controller having a legitimate interest in profiling is not enough. The Controller must carry out the usual balancing exercise to ensure its interests are not outweighed by those of the individuals concerned. The following factors are relevant in striking that balance: (a) the level of detail of the profile (e.g. whether the individual is placed within a broadly defined cohort or targeted on a granular level); (b) the comprehensiveness of the profile (whether it describes only a small aspect of the individual or paints a more comprehensive picture); (c) the impact of the profile; and (d) the safeguards aimed at ensuring fairness, non-discrimination and accuracy of the profile.

Necessary for compliance with a legal obligation - for example in connection with fraud prevention or anti-money laundering.

Necessary for performance of a contract - this condition is narrowly construed. Even if profiling is specifically mentioned in the small print of the contract, that alone is not enough to make it 'necessary' for the performance of the contract.

3. Sensitive Personal Data

The usual rules apply: the Controller must satisfy one of the conditions under Article 9(2) GDPR. Profiling may identify special categories of data by inference (for example, inferring an individual's state of health from records of their food shopping). The A29WP note that informing individuals is particularly important where inferences about sensitive data can be drawn from other data. The Controller must also satisfy the conditions in Article 9(2) for special categories of data inferred from profiling.

4. Purpose Limitation

Profiling often involves the use of personal data that was collected for other purposes. The further use of data for profiling must be compatible with the purposes for which it was collected; otherwise the consent of the individual may be required. The A29WP give the example of a mobile application providing location services to find restaurants offering discounts, which then uses that location data to target ads for restaurants. Controllers should have regard to the notice they gave to individuals and to the factors for establishing compatibility of processing set out in Article 6(4) GDPR.

5. Data Minimisation / Storage Limitation

The business opportunities created by profiling mean organisations may be inclined to collect more data and keep it for longer. Controllers should be able to clearly explain and justify the need for collecting and holding data, and should consider whether aggregated or anonymised data could be used for profiling. Storing data for too long may conflict with the proportionality principle, so Controllers should implement clear retention periods for profiles.

6. Accuracy

If the data used in automated decision making/profiling is inaccurate, any resulting decision will be flawed ('garbage in, garbage out'). Controllers need robust measures to verify on an ongoing basis that data reused or obtained indirectly is up to date.

7. Rights of Individuals

The GDPR introduces stronger rights for individuals, and these rights also apply in the context of profiling and automated decision making. For example, access rights will cover profiling data. On access rights, the A29WP note that only in rare cases will the Controller's intellectual property rights outweigh the individual's right of access; Controllers should not use such rights as an excuse to deny access, and they should be balanced against the individual's right to information.
Children and Profiling
Recital 71 GDPR provides that significant automated decision making (including profiling) should not concern children. The A29WP do not consider this an absolute prohibition, as the restriction appears in a Recital rather than in the main text of the GDPR. (It will be interesting to see whether this reasoning carries over to other A29WP guidance; for example, some of the more difficult requirements for obtaining valid consent under GDPR also appear only in the Recitals.)
The A29WP note that there may be circumstances where Controllers need to take significant automated decisions in relation to children, although the examples the A29WP give look narrow (for example, protecting the child's welfare). In those cases, Article 22 GDPR must be followed, and the safeguards implemented should be appropriate to children. This flexibility is helpful, as otherwise the rule could have prevented activities that are clearly in the child's interest (for example, health care devices that trigger changes in medication or alerts). More generally, organisations should refrain from profiling children for marketing purposes, as children can be particularly susceptible in the online environment and more easily influenced by ads.
Data Protection Impact Assessments ('DPIA')
A DPIA will enable Controllers to assess the risks involved in automated decision-making, including profiling.
The GDPR calls out significant automated decisions as an example of high risk processing requiring a DPIA. The A29WP interpret this broadly to also capture decisions not wholly taken by automated means which have legal or significant effects. Other profiling activities may also warrant a DPIA. See A29WP Guidance on DPIAs here.
A copy of the draft guidelines can be found here.