Search engines are no longer in the dark about how to interpret the now infamous “right to be forgotten” ruling of the European Court of Justice (“ECJ”), handed down in May this year (read our post on the decision here).
On Wednesday 26 November, the European data protection authorities assembled in the “Article 29 Working Party” (“Working Party”) to adopt guidelines on the implementation of the ECJ ruling (“Guidelines”). The Guidelines contain an interpretation of the ruling together with a list of criteria to be used by data protection authorities when assessing complaints.
These criteria provide some much-needed guidance on the correct approach to considering requests for the removal of “outdated, wrong or irrelevant” personal data. The sheer volume of removal requests received by Google since the ruling was handed down in May this year is indicative of its impact. According to its Transparency Report:
- Google has received over 185,000 requests from individuals to remove URLs from its search results since implementing its official request process in response to the ECJ ruling on 29 May 2014.
- Of these, it has removed 40.4% of the requested URLs and refused to remove the remaining 59.6%.
(Image taken from Google’s online Transparency Report, “European privacy requests for search removals”, as at 16 December 2014)
Whether or not the Guidelines have an impact on these numbers will become clear over time. Nonetheless, they provide a helpful policy framework for assessing take-down requests, to search engines and data protection authorities alike.
Some clarity on the scope of the ruling
The Guidelines confirm that the ruling applies to search engines to the extent that they process personal data within the jurisdiction of an EU Member State. Importantly, they also emphasise that the right to have personal data de-indexed from search results in certain circumstances applies only to results obtained from searches made on the basis of a person’s name. It does not require deletion of a link from the search engine’s index altogether. This means that the original information will remain accessible using other search terms, or via direct access to the publisher’s original source.
What searches will be affected?
The question of the territorial effect of the decision – namely, how it will practically affect searches conducted outside the European search domains – is a vexed one. Some experts and privacy regulators have argued that limiting the effect of the decision to searches conducted via European search domains renders the ruling less effective, given that users are free to conduct searches through different national domains.
The Working Party took the view that, in order to give full effect to the decision and adequate protection to the rights granted, de-listing decisions limited to search results obtained through a European domain cannot be considered sufficient, as long as individuals can still access the information via other national search engine domains (for instance, google.com or google.com.au). The Working Party stated that “in practice, this means that in any case de-listing should also be effective on all relevant domains, including .com.”
This could have significant implications. Up until now, Google has only removed results from searches conducted via its European domains, meaning the same results will still come up through a google.com search. It is unclear whether the Guidelines now require search engines to de-list certain links from search results obtained through domains outside the EU. If that is the case, it would seem, in practical terms, to significantly expand the territorial reach of the original ECJ decision.
On what grounds will results be removed?
Under the ECJ decision, if a search engine refuses to de-list links to certain personal information from search results made against an individual’s name, the individual may bring the matter before the relevant data protection or judicial authority for assessment. The Guidelines now provide a flexible list of criteria which authorities should apply when handling such complaints, including (amongst others):
- does the search result relate to a natural person (an individual)? And does the search result come up against a search on the data subject’s name?
- does the data subject play a role in public life – are they a public figure?
- is the data subject a minor?
- is the data up to date, accurate, relevant and not excessive?
- is the data processing causing prejudice to the data subject or putting the subject at risk?
- does the data relate to a criminal offence?
To an extent, these criteria generate more questions than answers, and the way in which they are interpreted will be decisive. How can a “public figure” be defined in an age where online public profiles have become ubiquitous? What kind of data might be said to cause “prejudice” to the data subject? What level of risk to the data subject is required before the content will be removed?