On 21 January 2019, the CNIL, the French data protection authority, fined Google LLC (“Google”) € 50,000,000 for two types of infringements: (i) violations of the transparency and information obligations under Articles 12 and 13 of the General Data Protection Regulation (“GDPR”) in relation to Google’s system of providing information to Android users; and (ii) a violation of the obligation to have a legal basis under Article 6 GDPR in relation to its advert personalisation services.
Although the size of the fine has received a great deal of attention, in our view four other aspects of this decision are likely to have more important implications for Japanese companies operating in Europe. We therefore address those four issues first, before turning to the fine.
1. Procedure
There are four procedural particularities in this case.
First, the investigation was initiated following complaints lodged by two non-profit organizations, Max Schrems’ None of Your Business and the French advocacy group La Quadrature du Net, in accordance with Article 80 GDPR. This indicates that a basic but sufficiently argued complaint lodged by a well-known not-for-profit organization may be enough for a data protection authority to launch an investigation. The success of the collective complaint in this case may also encourage data subjects to use the collective complaint and/or redress mechanisms provided for under Article 80 GDPR.
Second, the investigation was carried out entirely online; no on-site inspection was conducted. A data protection authority can therefore now readily investigate digital businesses even if their offices are located outside its geographical jurisdiction, which raises the risk of investigations for global digital businesses.
Third, the CNIL’s investigation focused on the privacy policy and the terms of service; however, it also extended to an assessment of the use of digital tools, namely Android and the Google Account. Once an investigation starts, its scope can thus be expanded to areas only indirectly related to its original subject-matter.
Lastly, the CNIL dismissed Google’s claim that its right to a fair trial under Article 6 of the European Convention on Human Rights had been infringed because the fining decision and the rapporteur’s opinion were not translated into English and because the time limits to submit observations were insufficient. In our view, where the CNIL addresses an infringement decision to a non-French entity, as it did in this case (the addressee of the decision was Google LLC), it should be more lenient with the time limits for submitting observations, given that it provides all documents in French only.
2. Main Establishment
Google claimed that its “main establishment” was Google Ireland Limited, that the Irish data protection authority was therefore the lead supervisory authority, and that the CNIL consequently lacked jurisdiction. Under the One-Stop Shop mechanism introduced by the GDPR, the supervisory authority of the main establishment is competent to act as lead supervisory authority for cross-border data processing. This mechanism was created with a view to reducing the administrative burden on entities carrying out cross-border data processing and to avoiding inconsistent decisions from multiple data protection authorities.
The CNIL rejected Google’s claim and concluded that it was competent to handle the matter in the absence of a Google main establishment in the EU. Relying on the definition of “main establishment” under Article 4(16) GDPR, the CNIL found that Google Ireland Limited had no decision-making power over the purposes and means of the data processing carried out in the context of the creation of a Google Account during the configuration of a mobile phone running Android. Google Ireland Limited’s role was more or less limited to concluding contracts with European clients.
This is the first time that an EU Member State data protection authority has ruled on the concept of “main establishment”. Some may therefore argue that this is an isolated case and that other EU data protection authorities will not necessarily follow the CNIL’s approach. However, they most likely will, given the apparent consensus on the CNIL’s conclusion: when the CNIL informed the other EU data protection authorities of its position on the main establishment issue in order to launch the cooperation mechanism under the GDPR, none of them objected.
Under the CNIL’s approach, for an EU entity to qualify as the main establishment in a matter concerning a digital tool or IT system, it must play an active role in the choice and operation of that tool or system. Japanese companies, however, often centralize IT systems at their headquarters in Japan. As a result, it will be difficult for them to claim that one of their EU subsidiaries is the EU main establishment when an IT system is under investigation.
Without a main establishment in the EU, multinational companies risk being fined by several different data protection authorities if the cooperation mechanism does not function properly. The Uber cases are a good illustration of this risk, although they pre-date the application of the GDPR. Over the same data breach incident, the American parent company (not its European subsidiaries) was fined € 435,000 by the British authority, € 600,000 by the Dutch authority and € 400,000 by the French authority, for a total of € 1,435,000.
3. Transparency Principle
As shown by the fact that the majority of sanctions issued in recent months concerned security failures, data security has come to be seen as the primary risk associated with data processing. In this context, the CNIL’s decision is a reminder that the other data protection principles are just as essential as data security and that breaching them can entail significant consequences.
In particular, the CNIL made it clear that no data controller may neglect transparency, even when multiple, complex processing operations are involved. In the Google case, the CNIL found that Google failed to achieve the level of transparency required under Articles 12 and 13 GDPR. Under Article 12 GDPR, when a data controller provides a privacy notice or communicates with data subjects, it must do so “in a concise, transparent, intelligible and easily accessible form, using clear and plain language”.
However, in the CNIL’s view, the privacy notice Google provided to data subjects to fulfil its obligation under Article 13 GDPR was excessively scattered across several documents, each of which contained buttons and links to complementary information. Users had to sift through a great deal of material to identify the information relevant to them, and because of this complex structure certain information was not easily accessible.
The CNIL also found that Google did not provide information in an intelligible and clear manner. Google can build an in-depth knowledge of users by combining different types of data from different sources, such as geolocation data and viewed content. Such combination is highly intrusive, which calls for a correspondingly higher standard of intelligibility and clarity in the information provided. However, the information Google provided did not allow users to sufficiently understand the consequences of the processing of their data. In particular, the description of the purposes did not enable users to gauge the extent of the processing or of the potential intrusion into their private life, and the description of the data collected was particularly imprecise and incomplete. The CNIL accordingly concluded that the information Google provided was neither intelligible nor clear.
4. Consent
In line with Article 6 GDPR, Google relied on the users’ “consent” to process their personal data for the purposes of personalized advertising. In this regard, according to Article 4(11) GDPR, “consent” means “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.”
However, for the same reasons as those underlying the lack of transparency (excessively scattered information), the CNIL found that users were not sufficiently informed. The CNIL further found that consent was not unambiguous, since the account preference settings were pre-ticked by default and there was essentially no affirmative action by the users to consent to ad personalisation. Nor was consent specific, because users were obliged to accept all data processing operations for different purposes as a whole. The CNIL accordingly concluded that the consent on which Google relied to process personal data was not valid.
5. Fine
With the Google decision, the CNIL has turned the best-known risk associated with the GDPR, namely a heavy financial penalty, into a reality. Until then, the risk had remained theoretical: the fines imposed after the GDPR became applicable on 25 May 2018 had remained modest. For instance, the fines imposed by the Baden-Württemberg Land authority and the Portuguese authority for GDPR infringements in 2018 were € 20,000 and € 400,000 respectively.
The CNIL’s Google decision has crossed a threshold. The fine is substantial and represents a significant increase over the CNIL’s previous fines. In July 2017, the CNIL imposed a fine of € 100,000 on Darty, a major retailer of household electrical appliances, and in June 2018, two weeks after the GDPR became applicable, it fined Optical Center € 250,000.
In the Google decision, to justify such an increase, the CNIL based its reasoning primarily on the extent of the infringements. It rejected Google’s argument that the violations concerned only 7% of Android users, i.e. those creating a new Google account while configuring a new device running the Android OS. The CNIL considered that the violations affected all Android users on the French market, as they were in a similar situation.
6. Comment
Companies in most industries gather data about their users (or customers) from multiple sources, such as contracts, user accounts, web browsing, and geolocation. The data thus accumulated gives those companies an in-depth knowledge of users’ behavior, including their habits, opinions, and social interactions. For the companies, such knowledge is valuable for offering more targeted advertisements or services. From the users’ point of view, however, the significant amount of data collected and the complexity of its processing represent a high risk of intrusion into their private life. It is therefore essential that companies enable users to understand, in a transparent manner, how their personal data is processed.
The question is how to achieve the requisite level of transparency when providing a privacy notice covering complex and often multiple data processing operations. As the CNIL acknowledges, presenting all information concerning complex data processing in a single document may be counterproductive. In such a case a layered approach is encouraged, but the layers should not require five or six actions on the part of the user to obtain a comprehensive view of the information on the data processing. In particular, the user should be able to reach the account preference settings quickly, at the latest in the second document. As to content, the information must enable users to understand how their personal data is used and to control, in a simple manner, the scope and manner of such use.
Without a transparent privacy notice, consent also becomes invalid because it is not “informed”. Similarly, consent should not be obtained collectively, in a single action, for several processing purposes. For each purpose there should be a blank (as opposed to pre-ticked) box, so that the user can decide whether to accept or refuse processing purpose by purpose rather than for all purposes or a combination of purposes.
Bearing these points in mind, companies will need to verify and, where necessary, reconsider the way they present privacy notices and obtain consent, taking the users’ perspective into account. Failing that, and in the absence of an EU main establishment, they may face fines in several jurisdictions that they may never have regarded as commercially important.