The French data protection authority (the CNIL) recently imposed a €50 million penalty on Google, which can be considered the first major enforcement action under the GDPR. Shortly after, Google indicated that it would appeal the fine. In this article, we provide further analysis of the decision and some thoughts on the appeal. Finally, we consider the potential implications for the application of the GDPR.

Grounds for the penalty

As stated in our previous article, the penalty was imposed on Google for failing to provide sufficient information and transparency as required under articles 5, 12 and 13 of the GDPR, and for processing personal data without a valid legal basis, constituting a violation of article 6 GDPR, in the context of creating a Google account when setting up an Android phone. The CNIL’s decision was based on the following reasons.

Too many steps required

The first issue addressed in the CNIL’s decision was that information about Google’s personal data processing was not easily accessible for users. The CNIL found that information was spread across too many different documents, and that too many steps were required to find the relevant information — in particular, where it concerned personalisation of advertisements (5 steps) and geo-tracking (6 steps). Another problem noted by the CNIL was that information about data retention periods was placed under the title “Exporting & deleting your information”, which took 4 steps to reach and did not make it sufficiently clear to users that information on standard retention periods was to be found there as well. These issues combined led the CNIL to conclude that there was a general violation of the requirement that the relevant information should be “easily accessible” to users.

Information was not sufficiently specific, considering the very high privacy impact

The CNIL also considered that the nature and scope of the personal data and processing activities at hand were particularly massive and intrusive, as information is continuously collected and processed about users’ habits, tastes, contacts, opinions, and whereabouts. In this situation in particular, the CNIL demanded more clarity from Google about the specific ways in which personal data would be processed, rather than broad and generic descriptions of purposes such as ‘offering personalized services in terms of content and advertising’, ‘safeguarding the security of our products and services’, and to ‘provide and develop our services’. The CNIL concluded that information sufficiently clear and detailed to enable users to substantially assess the impact on their privacy was not available in any of the several documents and layers of information provided by Google.

Essential information and settings were available only after creating an account

Another problem noted by the CNIL was that the “privacy check-up” and “dashboard” tools were only provided after the creation of a Google account, and not before. The CNIL considered the information provided there to be essential to properly inform users at the outset. Furthermore, an active step was required for users to access this information. The CNIL deemed this a violation of the requirements of article 13, which states that information should be provided to the data subject at the time when personal data are obtained. Thus, according to the CNIL’s findings, the privacy check-up and dashboard information and settings should have been actively provided as part of the setup process, rather than made available only after its completion.

Consent was not sufficiently informed, specific, and unambiguous

The CNIL concluded that the user consent obtained by Google did not meet the requisite standard of “informed” per the GDPR, for the same reasons that it found Google to be in violation of the transparency requirements. For one, the CNIL noted that the information provided about ad personalisation did not explain the extent to which data would necessarily be combined across different services provided by Google (such as Google Search, YouTube, Google Home, Google Maps, Play Store, and Google Images). The fact that the box for ad personalisation was pre-ticked by default was also problematic, as it meant that consent was not given by an affirmative action from the data subject. Finally, the CNIL found that the user was asked to provide consent for all processing activities at once, whereas it stated that “consent is ‘specific’ only if it is given distinctly for each purpose”.

Google’s decision to appeal

As mentioned, Google has already indicated that it will appeal the decision, giving the following statement:

"We’ve worked hard to create a GDPR consent process for personalized ads that is as transparent and straightforward as possible, based on regulatory guidance and user experience testing. We’re also concerned about the impact of this ruling on publishers, original content creators and tech companies in Europe and beyond. For all these reasons, we've now decided to appeal."

Such a response was only to be expected. There is plenty at stake here: not only the €50 million fine (which amounts to only about 4.5 hours of turnover for Google), but more importantly, the scope of transparency and consent requirements under the GDPR. The CNIL’s decision reminds us that transparency and consent are foundational elements of the GDPR, and the way these concepts are to be applied will indeed have major consequences for all or most digital services provided in Europe. Moreover, it appears that the decision was intended to signal to the market that data protection authorities in the EU are ready and not afraid to use their substantial enforcement powers. This is likely to provide further incentive for Google to push back.

The first of many privacy fines?

Considering what’s at stake, it is understandable that Google and others will keep trying to minimise what is required of them. While it may seem trivial to some, an issue such as whether pre-ticking a checkbox for ad personalisation is permissible or not can have significant consequences. This is because users tend not to change default settings, and in general, personalised ads attract considerably greater revenue than non-personalised ads. Furthermore, it will require more resources to provide more complete and accurate information, which is continuously kept up-to-date (particularly challenging in the fast-changing world of technology) and provided at the right time and in the right way. It seems likely, however, that tech companies across the board will have to step up their privacy game substantially or risk serious fines. At a maximum of 4% of global turnover, these could be far higher than the CNIL’s €50 million opening salvo.
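The relative scale of the fine can be checked with a rough back-of-the-envelope calculation. Note that the turnover figure below is an assumed approximation of Google’s annual global turnover in euros at the time, not a figure taken from the decision:

```python
# Rough scale check of the CNIL fine against Google's turnover.
# The turnover figure is an assumed approximation, not taken from the decision.
fine = 50_000_000                 # CNIL penalty, in euros
annual_turnover = 96_000_000_000  # assumed global annual turnover, in euros

hours_per_year = 365 * 24
turnover_per_hour = annual_turnover / hours_per_year
hours_of_turnover = fine / turnover_per_hour
print(f"The fine equals roughly {hours_of_turnover:.1f} hours of turnover")

# Article 83(5) GDPR caps fines at 4% of total worldwide annual turnover:
max_fine = 0.04 * annual_turnover
print(f"Theoretical ceiling is about {max_fine / fine:.0f} times the actual fine")
```

Under this assumption, the €50 million penalty indeed corresponds to only a few hours of turnover, while the GDPR’s 4% ceiling would allow a fine dozens of times higher.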

Google probably wouldn’t be wrong if it claimed on appeal that it already provides data subjects with more information and choices about how their personal data will be used than some others. However, it could prove difficult to convince a judge that the CNIL was wrong to demand a particularly high level of transparency from Google, given how much information Google processes about almost everybody. Your ongoing smartphone usage enables Google to continuously process an enormous amount of data about you: your exact whereabouts over time; every search you type into your browser; all the names, numbers and other contact details of (nearly) everybody you know; and all your calendar appointments, including the details as to where and with whom. Do you know what you were doing and where you were at 16:15 on September 4th, 2018? You may not remember, but chances are that Google has the exact data. It doesn’t seem like an exaggeration to say that these services and apps probably know you better than you know yourself.

Weighing all factors, it doesn’t seem difficult to argue that a company which may know you better than you know yourself, and which is making (nearly all of) its billions of euros of annual turnover by helping other companies to sell you things, can and should be held to a high standard of transparency about how its vast data troves about you may be used. It remains unclear, for instance, if switching off Google’s ad personalisation setting will delete your existing ad profile completely. Another possibility could be that the profile is retained, and perhaps even fed with new data, but this data is merely no longer used to show personalised ads. This point was not mentioned by the CNIL, but it appears that it potentially could (or even should) have been.

If Google has indeed done as much or even more than others to comply with the GDPR, this may be further indication that more fines can be expected soon. It is worth noting that NOYB — one of the associations that filed the collective complaint against Google that ultimately resulted in the penalty — also filed complaints against Facebook, Amazon, LinkedIn, and several other companies.

Another interesting development in this sphere is that Germany’s competition authority recently decided (on February 7th) that Facebook abused its market power by combining user data collected from the Facebook website and app with user data collected through “like” and “share” buttons found on many external websites. A novelty in this case is that privacy considerations have helped shape the assessment under competition law about which forms of behaviour should be considered abusive (in this case: combining data from several sources).

Such interplay between the competition and privacy spheres may also be possible in the opposite direction, as competition considerations could help to shape assessment under privacy law as well. For example, the fact that a given provider is dominant within its market may imply the risk that consent won’t be deemed freely given, because the alternative (i.e. not using its services) is arguably too detrimental. So-called “network effects” can also play a role here — for instance, it is difficult to avoid Facebook if all your friends are on Facebook. Further consequences of a provider being dominant are that the amount of personal data processed will usually be far higher and there will be far more data to potentially combine across different sources. As a result, the insight that such a company may have into individuals’ (private) activities and personal characteristics can be far more comprehensive and intrusive. (Here is an interesting read about an experiment to try to avoid any personal data processing by Google. Spoiler alert: it’s nearly impossible.)

Practical advice to improve compliance

Whatever the eventual outcome of the Google case, providing full transparency and obtaining truly informed consent will remain a continuous challenge. Describing technical processing operations and data flows, which are often highly complex, in a manner which is simultaneously easy to understand, accurate, complete, up-to-date, and provided at the right moment and in the right manner is no easy task. However, we can draw some specific points of advice from the CNIL’s explanations of Google’s shortcomings, which will be relevant to others as well:


  • Integrate privacy settings and information within initial setup procedures. The privacy settings, information, and consent process for specific processing purposes and operations should be part of the initial setup or registration process for digital services. This is because the GDPR requires information to be provided at the time that personal data are obtained. For valid informed consent, providers are also required to present information with the request for consent and not after. A substantial challenge is doing this in a user-friendly way, without overwhelming users with information or asking them so much that they will simply click on without reading.
    • The setup process itself should include sufficient information and explanations. Don’t merely refer to other documents where further information can be found. The challenge is providing information about highly complex processing activities in such a way that users, even those without technical knowledge, can understand the impact on their privacy. Very difficult, we know, but crucial to give true best efforts to achieve.
    • If you want to rely on consent, don’t pre-check boxes or group many purposes and processing activities together. This increases the risk that consent will not be deemed valid. The bar for consent under the GDPR is high, particularly as interpreted by the CNIL. Any box that is checked in advance is likely to raise concerns that consent is sought in an invalid way. Relying on a single consent action for various processing activities and purposes at once is also risky, because consent is required to be “specific”. To avoid having to bombard users with copious amounts of (not pre-checked) checkboxes, pop-ups, or other mechanisms to obtain informed, specific, unambiguous, freely given, affirmative (i.e. opt-in) consent — a high bar, indeed — it may be advisable to consider other legal grounds where possible, e.g. performance of a contract or legitimate interest (although those grounds have their own pitfalls).
    • Give users the possibility to configure privacy settings after the initial setup process as well. Users should be able to review and adjust their privacy choices at any time, not only during onboarding.
  • Provide information in layers but minimise steps and choose topics and highlights carefully. Providing information in several layers, from a very concise, high-level overview to more detailed information provided after clicking “read more”, is considered a best practice. This approach tries to meet the difficult challenge of providing information that doesn’t require too much time and effort from the user to access and understand, while also being sufficiently accurate and complete. Particular care should be taken, however, to minimise the number of clicks or actions required. If it takes too many clicks to reach the detailed information, or if you group information under the wrong topic, you risk non-compliance and fines. Note: this point will be particularly interesting to follow in Google’s appeal. In the process of creating a new Google Account, it appears that far fewer steps are needed than stated by the CNIL to arrive at ad personalisation and other settings — at least via the web pages for services like Gmail or YouTube. Looking at this case from the outside, it is unclear whether this is due to a potential error by the CNIL, to changes made by Google after the fact, or to the Google Account creation process being different in the specific context of setting up a new Android phone. It is also interesting that the CNIL’s criticism of Google’s “ergonomic choices” to provide more information upon clicking “read more” does not appear to acknowledge that such layering is generally considered a best practice, including by the European Data Protection Supervisor.
  • Include videos to make privacy information more accessible and easier to understand and provide guides on how these settings can be used. Videos can be a good medium to make information more easily accessible and understandable than long explanatory texts and could therefore help you meet a higher standard of transparency and informed consent. (It may be relevant to note, however, that Google is already doing this and still received the fine from the CNIL.)
  • Perform a data protection impact assessment (DPIA) for your services to help determine which information you should provide and how. As the Google case shows, supervisory authorities may set the bar for consent and transparency higher when services include processing activities that have a higher potential impact on users. In other words, if you process a larger amount of data and/or the data categories which you process are more sensitive in nature, your privacy notices and consent procedures may be held to a higher standard. Performing a DPIA, even when it may not be mandatory under article 35 of the GDPR, can help you better understand the impact and risks of your services to users’ privacy and help you determine which information you should provide them and how.
  • Take particular care concerning personalised advertising, profiling, and data sharing.
    • The information that is usually provided today with respect to personalisation of advertisements and how cookies work or impact user privacy — or other kinds of device IDs and technologies, like programmatic advertising or real-time bidding (RTB) — may be insufficient to meet GDPR standards. (An interesting report on this can be found here.)
    • Creating profiles to inform (automatic) decisions about individuals, including which ads or content to show them, can trigger important privacy concerns and may raise the bar for transparency and consent. It is important to provide sufficient information and choices about whether or not personalisation will occur, which data categories may be included in the profile, how long data will be retained in the profile, whether other parties will receive profile data, and whether the profile data may result in different treatment by others, even if they don’t receive the profile data itself. It is also important to be transparent about the underlying logic used to create profiles and to explain why certain characteristics (inferred or otherwise) will result in being shown certain types of content or advertisements. The information typically provided about this, e.g. when clicking “Why am I seeing this ad?”, currently appears to be highly generic in many cases, which may cause suspicion that the explanation is incomplete or inaccurate and that far more is going on beneath the surface. It is also advisable to make clear whether the profiles for content and/or ad personalisation will be deleted upon switching this off. (At present, Google still does not make this clear.)
    • Sharing personal data with others is obviously a privacy concern. It is particularly important to be clear and transparent about which parties may receive the data, as well as what they may do with the data and why. Whenever possible, providers should give a complete list of organisations that will receive or obtain access to personal data and for what purposes, particularly if consent is used as the legal basis for the transfer.
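For teams implementing the consent points above, the structure that the CNIL’s reading of the GDPR implies (one distinct, unchecked, affirmative choice per purpose, recorded with a timestamp) can be sketched in a few lines of code. This is only an illustrative sketch, not a reference implementation; all names and fields are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, Optional

@dataclass
class PurposeConsent:
    purpose: str                          # e.g. "ad_personalisation"
    granted: bool = False                 # never pre-ticked: the default is "no"
    timestamp: Optional[datetime] = None  # when the user made an affirmative choice

@dataclass
class ConsentRecord:
    user_id: str
    purposes: Dict[str, PurposeConsent] = field(default_factory=dict)

    def grant(self, purpose: str) -> None:
        # Consent is recorded per purpose ("specific") and only via an
        # explicit user action ("unambiguous" and "affirmative").
        self.purposes[purpose] = PurposeConsent(
            purpose, granted=True, timestamp=datetime.now(timezone.utc)
        )

    def has_consent(self, purpose: str) -> bool:
        pc = self.purposes.get(purpose)
        return pc is not None and pc.granted

record = ConsentRecord(user_id="example-user")
record.grant("ad_personalisation")  # the user actively ticks this one box
print(record.has_consent("ad_personalisation"))  # True
print(record.has_consent("geo_tracking"))        # False: no bundled consent
```

The key design choices mirror the CNIL’s objections: no purpose is granted by default, and granting one purpose never implies consent for another.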

Conclusion and further developments

The CNIL’s imposition of the €50 million fine on Google is certainly a landmark event in the enforcement of the GDPR. It will be very interesting to see how the Conseil d’État, the French administrative court that is to hear Google’s appeal, will rule on the matter. The Conseil d’État may in turn refer certain questions to the Court of Justice of the European Union (CJEU), and such a referral seems quite possible. Indeed, the clash of opinions between the CNIL and Google about two fundamental GDPR concepts — transparency and consent — appears to provide an excellent opportunity to seek guidance from the highest court in the EU.

While the final outcome of the case is not easy to predict, it appears likely that at least parts of the CNIL’s decision will be upheld. On the one hand, it is clear that Google has indeed made significant efforts to improve the privacy information and choices provided to its users, as the CNIL has also acknowledged. On the other hand, even more transparency and better consent procedures may be expected under the GDPR, particularly from a tech giant like Google, and particularly in relation to personalisation of ads and content, profiling, and combining or sharing data across different services or companies. One thing is certain: there are interesting times ahead.