The 'right to be forgotten' in EU data protection law is something of a misnomer; it is, in fact, a qualified right to the erasure of personal data. While it does not afford individuals a blanket right to have their personal data erased or forgotten (except in relation to direct marketing), it is an essential weapon for individuals in the wider privacy arsenal.
The 'right to be forgotten' hit the headlines following the shock judgment in the Google Spain case. The judgment pre-empted both the extended territorial scope and the right to erasure in the EU General Data Protection Regulation 2016 (GDPR), and highlighted the frequent tension between the rights to privacy and data protection on the one hand, and to freedom of expression and access to information on the other. Data subjects seeking to exercise the right effectively, and data controllers in receipt of erasure requests, need a thorough understanding of its extent and limitations.
Data Protection Directive 1995
The right to erasure is not a new concept in EU data protection law. Under the Data Protection Directive 1995, individuals were given a right to ask controllers to erase or block their personal data where its processing did not comply with the provisions of the Directive, in particular where it was incomplete or inaccurate.
Under the UK's implementation of the Directive in the Data Protection Act 1998 (DPA98), the explicit right to erasure was linked to inaccuracy in s12, but s10 also gave individuals the right to prevent processing for a purpose or in a manner "causing or likely to cause substantial damage or substantial distress" where such damage or distress was "unwarranted". In addition, the data protection principles which underpinned the DPA98 required personal data to be accurate, up to date, kept for no longer than necessary for the purpose for which it was collected, processed only in relation to a specified lawful purpose and not excessive in relation to that purpose. While this did not amount to a clearly defined right to be forgotten, it did give individuals reasonable scope to have their personal data erased.
The other side of the privacy coin is, of course, freedom of expression and the right to access information. To protect those rights, the Directive required Member States to provide an exemption from some of its obligations for personal data processed only for the special purposes of journalism, art and literature. In the UK, where the exemption applied, it covered provisions including those under ss10 and 12, and all but one of the data protection principles, where:
- The processing was undertaken with a view to the publication of any journalistic, artistic or literary material.
- The data controller reasonably believed that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest.
- The data controller reasonably believed that in all circumstances, compliance with any of the potentially exempted provisions would be incompatible with the special purposes.
As the internet went mainstream, it became increasingly clear that the Data Protection Directive was not designed to deal with a world in which information about individuals can be made publicly available on a global basis, potentially in perpetuity, possibly from another jurisdiction and accessible at the press of (a few) buttons. Data privacy is by no means the only area of law which has struggled to be effective in the virtual world, but for a while the problem of enforcing data protection rights against businesses like search engine operators based in a different country, and often outside the EU, appeared intractable.
Google Spain – the judgment
In May 2014, the CJEU's judgment in the case of Google Spain v AEPD and Mario Costeja Gonzalez appeared to swing the privacy pendulum back towards individuals. The CJEU ruled on questions referred by a Spanish court relating to interpretation of the Data Protection Directive and its application to search engine activities. The CJEU found that search engines were data controllers in respect of their search results; that European data protection law applied to their processing of the data of EU citizens, even where the relevant data was processed outside the EU; and that a 'right to be forgotten' online applied to outdated and irrelevant data in search results unless there was a public interest in the data remaining available and even where the search results linked to lawfully published content.
The decision (which went against the advice of the Advocate General) caused shockwaves. Hailed by privacy campaigners, it was nonetheless widely perceived at the time to be a blow to freedom of speech. This was partly because of the emphasis the judgment placed on the individual's rights to privacy and data protection, suggesting that they take precedence over the rights of internet users and search engines to access and display information in search results unless there was a public interest in the information being available.
While Google Spain did not actually create a new right for data subjects but rather crystallised a number of rights in the then-current Directive, it certainly expanded the reach of EU law in this area by holding that Google Spain (a branch of Google Inc. responsible for sales and marketing in Spain) was processing personal data in the context of the activities of Google Inc., even though the processing at issue in the proceedings was entirely separate from the activities of Google Spain.
Google Spain – the aftermath
Search engines scrambled to understand and comply with the judgment as they were inundated with deletion requests. By October 2014, Google had evaluated 498,737 URLs and removed 41.8% of those. Advice from regulators followed, most notably from the Article 29 Working Party (WP29). Among other things, the WP29 suggested that limiting de-listing to EU domains is insufficient to guarantee the rights of the data subject and said any link removed on the basis of the Google Spain ruling should be removed globally.
This advice was followed by the French data protection authority, the CNIL, in 2015 when it served formal notice on Google that, when acceding to a request from a natural person for the removal of links to web pages from the list of results displayed following a search performed on the basis of that person's name, it must apply that removal to all of its search engine's domain name extensions. Google did not comply with the CNIL's order but de-listed searches on its EU domains and proposed using geo-blocking to prevent access to results appearing on other domains. The CNIL fined Google EUR100,000 for failing to comply in full. Google appealed to the Conseil d'État which referred questions to the CJEU for a preliminary ruling.
The Advocate General published his Opinion in January 2019 (yes, the pace is slow), supporting Google's approach notwithstanding the non-binding WP29 guidance, and concluding that search requests made outside the EU should not be impacted by a successful request to de-list in the EU. While understanding the CNIL's reasons for preferring a holistic, simple approach, the AG said that approach would fail to take account of all the competing rights and that the Data Protection Directive could not be extended to create rights beyond the EU. The AG said the most significant argument against global de-listing was that a fundamental right to be forgotten must be balanced against other rights. If worldwide de-listing were permitted, the EU authorities would not be able to define and determine a right to receive information, let alone carry out a proper balancing exercise. In addition, public interest in accessing information would necessarily vary from third country to third country; a worldwide approach to de-listing would prevent people in third countries accessing information and could also result in third countries restricting access to information by people in the EU.
The AG recommended that the CJEU hold that a search engine operator is not required to carry out de-referencing on all domain names of its search engine in such a way that the links no longer appear, regardless of the location from which the search is performed. There might be occasions when worldwide de-referencing would be appropriate but this would need to be decided on a case by case basis. Once a right to have search results de-listed within the EU is established, the AG emphasised that the search engine operator must take all available measures to carry out full and effective de-referencing, including by using geo-blocking in respect of IP addresses deemed to be located in a Member State, whatever the domain name used by the searcher might be.
Search engines and freedom of speech campaigners are no doubt delighted by the AG's Opinion and will be hoping that the CJEU follows it (although it has form in this area for going against the AG's advice), as this would potentially rein back the territorial application of the EU's right to be forgotten online.
As the consequences of Google Spain played out, the European Commission continued to work towards a new General Data Protection Regulation (GDPR), finally passed in 2016 and applied across the EEA from 25 May 2018. The Google Spain decision was very much in line with the direction of travel of the GDPR as it went through the legislative process. The GDPR expanded the territorial application of EU data protection law, and introduced a consolidated right to erasure while retaining principles requiring that personal data be accurate and up to date. The following provisions are all relevant to the right to be forgotten.
Article 3 – territorial scope
The GDPR applies to processing of personal data in the context of the activities of an establishment of a controller or processor in the EU, regardless of where the processing takes place. It also applies to the processing of personal data of data subjects who are in the EU by a controller or processor outside the EU where the processing activities relate to the offering of goods or services to EU data subjects or to the monitoring of their behaviour in the EU.
Article 5 – principles relating to processing of personal data
The GDPR contains a requirement that personal data be accurate and, where necessary, kept up to date in Article 5(1)(d), which also states that "every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay".
Article 17 – the right to erasure
Under Article 17, data subjects have the right to obtain from the relevant data controller the erasure of their personal data without undue delay, and the controller has an obligation to erase the personal data without undue delay where:
- The data is no longer necessary for the purpose(s) for which it was collected or processed;
- The data subject withdraws consent on which the processing is based and there is no other legal ground for the processing;
- The data subject objects to the processing under Article 21(1) and there are no overriding legitimate grounds for the processing, or the data subject objects under Article 21(2) (direct marketing);
- The personal data has been unlawfully processed;
- The personal data has to be erased for compliance with a legal obligation under EU or Member State law to which the controller is subject; or
- The personal data has been collected in relation to the offer of information society services from a child who is below the digital age of consent (which varies across Member States from 13-16).
Where the controller has made data which is the subject of a successful erasure request public, it must take reasonable steps (taking into account available technology and cost) to inform controllers processing the personal data that the data subject has requested erasure of any links to or copy or replication of the data.
Article 17(3) provides for exceptions to the right to erasure which cover:
- Exercising the right of freedom of expression and information;
- Compliance with a legal obligation which requires processing by EU or Member State law to which the controller is subject, or the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (Member States have discretion as to how they define this – in the UK, the Data Protection Act 2018 sets out a list in s8, Part 2, Chapter 2);
- For reasons of public interest in the area of public health (certain sensitive data only);
- For archiving purposes in the public interest, scientific or historical research or statistical purposes in accordance with Article 89(1) where erasure would seriously impair the objectives of such processing; or
- For the establishment, exercise or defence of legal claims.
Article 19 – notification obligation regarding rectification or erasure of personal data or restriction of processing
Under Article 19, a controller who carries out erasure of personal data following an Article 17 request, must communicate the erasure to each recipient of that data unless that proves impossible or involves disproportionate effort. The controller is required to provide the data subject with a list of these recipients if requested by the data subject to do so.
Article 85 – processing and freedom of expression and information
Member States are required under Article 85, to "reconcile the right to the protection of personal data pursuant to the Regulation with the right to freedom of expression and information, including processing for journalistic purposes and the purposes of academic, artistic or literary expression". They must provide for exemptions and derogations from many of the requirements of the GDPR, including the right to erasure, in these areas (see below).
Sanctions for non-compliance
Under the GDPR, data subjects may complain to the regulator where they believe a breach of the GDPR has taken place, and may also issue legal proceedings against controllers and processors as relevant. The regulators have a range of available powers, the most draconian of which is the ability to issue large fines. Breach of the Article 17 provisions can incur fines of up to the higher of 4% of annual global turnover or EUR20m.
Data Protection Act 2018
The Data Protection Act 2018 (DPA18) supplements the GDPR in the UK and covers permitted exemptions and derogations.
Exemptions for reasons of freedom of expression and information – Schedule 2, Part 5
In the UK, the Article 85 exemptions and derogations are dealt with in Part 5 of Schedule 2 DPA18, which states that processing for journalistic, academic, artistic and literary purposes constitutes processing for the "special purposes". If processing is carried out for the special purposes with a view to the publication by a person of journalistic, academic, artistic or literary material, and the controller reasonably believes that the publication of the material would be in the public interest, then a defined set of GDPR provisions, including the Article 17 right to erasure, will not apply to the extent that their application would be incompatible with the special purposes.
When assessing whether publication would be in the public interest, the data controller is required to take into account "the special importance of the public interest in the freedom of expression and information" as well as specific codes of practice and guidelines.
The DPA18 journalistic exemption is very similar to the one in the DPA98 but goes slightly further in that the personal data no longer has to be processed solely for a single special purpose in order to benefit from the exemption, and the list of GDPR clauses which the exemption covers is wider than under the DPA98.
The DPA18 also contains a range of other exemptions from Article 17, for example, in relation to health, social work, education and child abuse, which are beyond the scope of this article but can be found in Schedule 2.
The ongoing tension between competing rights
What all this means is that the rights of individuals in the GDPR can come into conflict with the rights to freedom of expression and access to information. Beyond the GDPR, these rights are enshrined in the European Convention on Human Rights and the EU Charter of Fundamental Rights. Both the Google Spain judgment and other judgments relating to the right to be forgotten point to the balancing exercise which needs to be carried out on a case by case basis between the right to data protection and privacy on the one hand, and the rights to freedom of expression and access to information, particularly where the information is published for the purposes of journalism or constitutes artistic or literary expression.
The more judgments we see around this issue, the more guidance we have as to where the balance between these rights lies.
NT1 and NT2 v Google
The English High Court considered the Data Protection Directive right to erasure and the Google Spain judgment in NT1 and NT2 v Google when deciding on applications by two individuals to have search results which linked to criminal convictions de-listed. NT1 was convicted of a criminal conspiracy to account falsely in relation to the activities of a property business which dealt with members of the public. NT1 was sentenced to four years' imprisonment. His claim related to three links appearing in Google search results giving information about the conviction. NT2 was sentenced to six months following a conviction for conspiring to intercept communications. His claim related to 11 source publications. The Court applied the same balancing exercise in both cases but reached different conclusions. NT1's request was denied and NT2's was upheld.
NT1 failed to make out his claim for de-listing pursuant to Google Spain. The main factors in holding that the balance of interests did not fall in NT1's favour were:
- NT1 continues to have a role in public life as a businessman.
- The information about the crime relates to his business activities and has never attracted an expectation of privacy.
- The original reporting appeared in the context of the crime committed by NT1.
- The sentence was of such a length that at the time it was handed down, NT1 could not have expected the conviction would ever be spent (the law changed subsequently).
- NT1's business career since leaving prison means the information remains relevant to the public's assessment of his honesty, both past and present. This is particularly true as NT1 has not acknowledged his guilt, has misled the public and the Court, and has shown no remorse. Keeping the information in the public domain by maintaining links in search results minimises the risk that he will continue to mislead.
NT2's Google Spain de-listing claim was upheld because:
- The crime and punishment information had become out of date and irrelevant, and was of insufficient public interest.
- NT2 acknowledged guilt and expressed remorse.
- The sentence was always going to be spent.
- The past offence had little if any relevance to NT2's current or future business activity.
Interestingly, in both cases the Court held that Google could not rely on the journalistic exemption because the personal data in question had not been processed solely for the special purpose of journalism. This contrasts with the view taken in the recent Advocate General's Opinion in G.C. and Others v CNIL. This reference involved four joined cases, also relating to the de-referencing of Google search results linking to sensitive personal data. The AG opined that Google could rely on the journalistic exemption where the underlying material warranted it, if it was required to conduct a Google Spain balancing exercise. The High Court's assessment of this issue in NT1 and NT2 will be less relevant in GDPR cases, in part because processing no longer has to be carried out solely for a special purpose to benefit from the special purposes exemption.
Applicant v Google
A recently reported decision from the Netherlands overruled the Dutch data protection regulator and Google, and ruled in favour of a surgeon applying to have links to information about a medical negligence investigation removed. The doctor in question had her registration on the register of healthcare professionals suspended by a disciplinary panel. On appeal, the registration was suspended on a temporary basis, after which the doctor continued to practise. The doctor's name was placed on a website containing an unofficial 'blacklist' of medical practitioners, which appeared as the first set of search results against the doctor's name. Google and the Dutch regulator resisted the doctor's application to have the links to the website de-listed on the basis that the doctor was still on probation, so the information was still relevant. It was also accurate.
The District Court of Amsterdam ruled in favour of the doctor, saying that she "had an interest in not indicating that every time someone enters their full name in Google's search engine, (almost) immediately, the mention of her name appears on the 'blacklist of doctors' and this importance adds more weight than the public's interest in finding this information in this way". The judge said the information was publicly available elsewhere with less pejorative overtones than the unofficial blacklist which amounted to "digital pillory". The fact that it is hard for doctors to defend themselves publicly owing to obligations of confidentiality was also relevant.
Google is set to appeal the decision but, as it stands, it appears to favour the rights of the individual, suggesting that if links are to information which is presented in an unnecessarily damaging way but is available in a more neutral context elsewhere, the links should be de-listed even though the information is accurate and up to date. It will be interesting to see how this progresses.
To date, case law around the right to be forgotten has considered the right to erasure in the context of the Data Protection Directive rather than the GDPR, but substantial sections of those judgments will remain relevant to future GDPR cases because the balancing exercise between competing rights has not changed in essentials, and the rights to have data kept accurate and up to date, and to have it erased in certain circumstances, have been preserved.
The cases to date have also mainly involved search engine results, but it is important to remember that the GDPR right to erasure is not limited to online information, nor to search engine results. It is also worth noting that data protection law is unlikely to be the only issue in right to be forgotten cases. In NT1 and NT2, for example, claims were made under UK data protection law and the tort of misuse of private information, and the Court recognised ten relevant elements of law. We consider how the right to be forgotten may be wielded and defended against in a media context.