25 May 2021 marks the third anniversary of the GDPR taking effect. As we have moved from preparing for the GDPR to treating compliance with it as business as usual, regulators have focused on different issues in different jurisdictions. Although GDPR compliance has now been part of day-to-day operations for three years, it is clear that the interpretation of, and expectations regarding, compliance from the courts and regulators continue to evolve and develop, which can have a significant practical impact.
In this article, we reflect on key developments and trends across a number of jurisdictions, in particular looking ahead to issues that are coming into focus. You can find more information for each of the following jurisdictions by scrolling below or clicking the links to relevant jurisdictions: France, Germany, Spain, Italy, Netherlands, Poland, Austria, Belgium, Sweden and the UK.
Data Transfers and Schrems II
As covered in our Data Protection Day article earlier this year, international data transfers remain a top priority for data protection compliance in light of the Schrems II judgement. The European Data Protection Board (“EDPB”) published its recommendations on supplementary transfer tools for consultation in November last year. In parallel, the European Commission published a draft version of the updated Standard Contractual Clauses in November last year. At the time of writing, an update from the EDPB following its consultation, as well as the final version of the new standard contractual clauses from the European Commission, are expected shortly. Both will play a key part in efforts to address the practical implications of the Schrems II judgement.
France

Each year, the Commission Nationale de l’Informatique et des Libertés (“CNIL”) publishes its annual control strategy, guiding stakeholders on its areas of focus in light of new developments and topical issues involving the processing of personal data. The controls carried out under this strategy represent approximately 15% of the formal control procedures conducted by the CNIL during the year; the rest of the CNIL’s inspections are generally triggered by complaints and data breach notifications. In 2020, the CNIL carried out 6,500 investigations, including 247 formal control procedures (only 74 of which were dawn raids, due to COVID restrictions). In 2021, the CNIL has focused its inspection strategy on three priority areas:
- Cybersecurity of the French web: The CNIL’s objective is to monitor the level of security of the most widely used French websites in various sectors. The focus will be on personal data collection forms, the use of the HTTPS protocol and compliance with the CNIL’s recommendation on passwords. The CNIL will also ask organizations about their strategies for protecting themselves from ransomware. In 2020, of the 2,825 data breach notifications to the CNIL (up 24% from 2019), more than 500 were the result of a ransomware attack.
- Health data: In the current COVID-19 crisis, and given the ever-increasing challenges related to the digitization of the health sector (including the management of access to computerized patient records within health institutions, online medical appointment scheduling platforms, and the management of personal data breaches in health facilities), the CNIL decided to maintain health data as a priority and to continue the controls it initiated in 2020. Beyond checking compliance, the inspections are also intended to raise the level of security of health data. The CNIL has launched a first “personal data sandbox” session, in the form of a call for projects in the field of health: twelve projects will benefit from CNIL support in 2021, four of which will receive enhanced support in order to come up with a solution that respects the privacy of individuals.
- Cookies and other trackers: This topic was initiated in 2020 in order to ensure compliance with the obligations regarding targeted advertising and profiling of individuals. The CNIL will continue to conduct inspections with an extended scope since, as of April 2021, the guidelines and recommendations adopted by the CNIL on 1 October 2020 clarifying the rules relating to the collection of consent started to apply. With these new guidelines and recommendations, the CNIL addresses the consequences of the decision of the French Council of State dated 19 June 2020. That decision partially annulled the CNIL’s previous guidelines on cookies and other tracking tools, on the ground that the CNIL could not, in a “soft law” legal instrument, introduce a blanket prohibition on the use of “cookie walls”, a practice which consists of blocking users’ access to a website where they refuse to consent to cookies and other tracking tools. By focusing its inspection strategy on cookies, the CNIL is responding to the high expectations of individuals, who are increasingly sensitive to Internet tracking, as evidenced by the constant stream of complaints it receives on this matter.
In addition, in line with the past two years, the CNIL has announced it will continue to cooperate with its European counterparts on cross-border processing activities. Therefore, it will use both methods of cooperation provided for by the GDPR: (i) mutual assistance, which allows for the sharing of all useful information between data protection authorities; and (ii) the conduct of joint operations, which allows for inspections to be carried out in France or in other EU Member States in the presence of the competent authorities’ agents. For example, in 2020, more than 1,000 European cooperation cases concerned complaints or inspections. The CNIL was the lead supervisory authority in approximately 100 cases and the supervisory authority concerned in nearly 400 cases. This cooperation mechanism has made it possible to settle nearly one hundred individual French situations that were the subject of complaints and has led the CNIL’s restricted panel to issue a first sanction under the “one-stop shop” procedure.
Finally, in its activity report published on 18 May 2021, and in light of the pandemic, which has brought payment data to the forefront and accelerated some of the transformations in the payment industry, the CNIL announced that it intends to address this issue in a graduated manner and will soon publish a white paper on the topic.
Germany

Enforcement actions in the aftermath of Schrems II
- The Bavarian Data Protection Authority ordered a German company to cease using an email newsletter service which involved a transfer of customer email addresses to a US provider. According to the Bavarian Data Protection Authority, the use of the tool and the transfer of the email addresses to the US provider were unlawful under data protection law. This was because the German company had not examined whether, in addition to the EU standard data protection clauses which were used, “additional measures” within the meaning of the Schrems II decision were necessary in order to make the transfer compliant with data protection requirements, and had not implemented such additional measures. The Bavarian Data Protection Authority chose not to impose a fine since the German company stated it would stop using the service immediately.
- As part of an information campaign, the data protection authority of Rheinland-Pfalz wrote to dozens of companies, associations and government agencies in Rheinland-Pfalz. The letter strongly recommended checking the lawfulness of data processing activities involving third countries against the checklist provided by the authority, in order to identify action items to stop or prevent data protection violations. The aim of the campaign is to raise awareness among controllers and processors and thereby to strengthen the rights of individuals. The authority announced that the campaign will be followed by random spot checks.
Right of access
The scope of the right of access is still debated in Germany, in particular whether it requires providing copies of files, documents and emails. There are a number of court decisions on the scope of the right of access. For example, the Regional Court of Heidelberg denied a former board member’s claim against his former employer for copies of email correspondence, whereas the Regional Court of Cologne ruled that an individual has a right of access against his/her insurer to all personal data relating to him/her, including conversation notes and call memos. Recently, the Federal Labor Court rejected a claim for copies of emails because it was not sufficiently precise pursuant to Section 253 Paragraph 2 Nr. 2 of the German Civil Procedure Code. According to the court’s press release, the plaintiff was employed by the defendant. After termination of the employment, the plaintiff requested information about his personal data processed by the defendant, as well as a copy of this data pursuant to Article 15 Paragraph 3 of the GDPR, in particular requesting a copy of any emails sent by him or mentioning him in the course of the employment. According to the press release, it was not clear exactly which emails the request referred to. The Federal Labor Court therefore did not make a decision on the scope of the data subject access request.
In light of a number of court decisions granting data subjects the right to obtain a full copy of their data according to Article 15 Paragraph 3 of the GDPR, data subject access requests are increasingly becoming a standard tool used in employee termination cases.
In negotiations with works councils, GDPR compliance has become a standard topic in practice.
Non-material damages
Until recently, German courts appeared to take a narrow interpretation of non-material damages, ruling that a person claiming non-material damages must have suffered a noticeable disadvantage and that a mere infringement of the GDPR does not automatically entail a claim for damages. In the last year, however, several court decisions have ordered companies to pay non-material damages for data protection violations. For example, the Labour Court of Düsseldorf ruled that a company must pay 5,000 EUR to a former employee because, according to the court, the company’s response to a subject access request was late and not comprehensive enough.
A recent decision by the Federal Constitutional Court overturned the dismissal of a claim for 500 EUR pursuant to Article 82 of the GDPR and directed the lower court to reconsider the issue. According to the Federal Constitutional Court, it was not clear how Article 82 of the GDPR was to be understood due to the lack of requirements in the GDPR and the Court of Justice of the European Union (“CJEU”) case law. The Federal Constitutional Court therefore decided that the CJEU must clarify the requirements and scope of Article 82 of the GDPR and that the local court could not make a decision without making a referral to the CJEU.
Spain

In our Data Protection Day article earlier this year, we mentioned two important penalties imposed on financial entities by the Spanish Data Protection Agency (“AEPD”). Since then, the trend has continued and further million-Euro penalties have been imposed, with the penalties in the first quarter of 2021 alone already exceeding the total amount imposed in 2020.
The wide margin of interpretation in the application of the GDPR by local regulators is generating significant divergences that organizations must take into account to avoid potential sanctions.
In 2020, the AEPD issued 393 sanctioning resolutions (16% more than the previous year), although of these only 172 imposed a financial penalty.
Some of the main conclusions that can be extracted from these resolutions are:
- The AEPD considers the use of generic and unclear expressions in privacy policies and informative clauses, such as “getting to know you better”, to be “imprecise and vague terminology” that breaches the duty of transparency. It also considers it punishable for data controllers to suggest that their data protection policy benefits the customer in such a way as to imply that not accepting it will mean the loss of advantages.
- The AEPD appears reluctant to accept legitimate interest as a legal basis for processing personal data. To rely on legitimate interest, the AEPD requires that the interest is justified and explained to the user, and not confused with a simple purpose; having an interest is not in itself enough. The validity of legitimate interest as a legal basis depends on the outcome of a balancing test, weighing the interests of both parties to see which prevails. In practice, the AEPD rarely finds that the entity’s interest prevails, and therefore seldom accepts legitimate interest as a basis for processing.
- Consent obtained through the user’s inactivity is not considered valid. The user must be able to give separate consent for each distinct processing of his/her personal data by an entity, even for the same purpose; where there are different purposes, consent must also be given for each one.
- With respect to the roles of data processor and data controller, the AEPD has disregarded the legal qualifications made by the parties, stating that data processor and data controller are not formal concepts but functional ones. On this basis, the regulator has treated collaborating entities of a sanctioned company as its data processors and held the company responsible for their infringements.
- When calculating penalties, the AEPD has treated the responsible entity’s status as a large company, and its turnover, as aggravating circumstances. In one resolution, the fact that the company was implementing a procedure to comply with the GDPR was used to argue negligence as an aggravating circumstance: the Agency believed that the company could not have been unaware of the requirements contained in the clauses. The AEPD does not justify in its resolutions the specific amount of the penalties imposed within the wide range set by the GDPR.
Given the track record of the Spanish regulator, it is expected that both the frequency and amounts of sanctions will remain high in the future, with particular pressure on large companies and multinationals. In this context, it is particularly relevant to take into account the nuances in the application of the GDPR in Spain and to pay special attention to procedural issues if any potential sanctioning procedures were to arise.
Italy

The new Committee of the Italian data protection authority (Garante) has been fairly active over recent months.
In particular, the Garante has been busy with COVID-related initiatives, specifically the national COVID certificate and the possibility for private companies to offer vaccinations to workers.
- COVID Certificate: The EU Commission has issued a draft Regulation for a European COVID certificate to facilitate the safe free movement of citizens within the EU. Italy has already issued its digital certificate law, and the Garante has raised a number of concerns regarding this law, mainly in relation to the amount of information collected, the purposes of processing, and the data retention period. In addition, the Garante had not been consulted before the enactment of the law, even though such consultation is mandatory, albeit not binding.
- Vaccinations: In relation to vaccination at the workplace, the Garante has provided guidelines on how this can be implemented, with focus on the central role of the company’s doctor. Any information relating to vaccination (from the identity of workers willing to be vaccinated to the fact that vaccination has occurred) cannot be collected or processed by the employer and should instead be managed only by the company’s doctor, acting as an independent data controller. The specific implementation rules of this initiative are yet to be defined, but it is anticipated that companies willing to offer workers the possibility to be vaccinated should carefully consider the guidelines from the Garante.
- Minors and the online ecosystem: There is significant attention on the issue of minors participating in online activities, and on the measures that should be in place to verify the age of users and monitor compliance.
- DPO: The Garante has recently published updated guidance on the role and functions of the DPO. In Italy, there are no circumstances beyond those set out in Article 37 of the GDPR in which the appointment of a DPO is mandatory. The Garante’s position is that even where the appointment of a DPO is not mandatory, it is advisable, especially in light of the accountability principle. In the case of a group DPO, it is good practice to have local contact points that liaise with the group DPO and also serve as contact points for the Garante and data subjects. Where each company in a group has its own DPO, the suggestion is to have a network of all the DPOs, which may be coordinated by the headquarters’ DPO.
- Workers’ Data and Company IT Systems: A recurring theme is the processing of workers’ data in relation to their use of the company’s IT systems and relevant devices. This often overlaps with remote monitoring provisions, and the Garante has clearly indicated, in recent orders and decisions, that the main points of focus are transparency and information to workers, proportionality in the kinds of data processed and the means of processing, data retention periods, and the extent of data sharing.
- Commercial Information Code of Conduct: The Code of conduct for the processing of commercial information has been approved by the Garante. This Code of conduct refers to commercial information which is publicly available (for example, published in public registers) or collected from the data subject and that is collected, organized and made available by companies specialized in providing commercial services to third parties. This Code of conduct is not applicable to credit information systems, which are covered under other specific measures of the Garante.
- Enforcement: In terms of enforcement, the main areas of focus have been the online environment (spanning from online platforms to the IoT), direct marketing and profiling, employment monitoring, and transparency and information to data subjects. Specifically in relation to transparency, the Garante has launched a campaign under the slogan “Easy privacy information via icons? Yes, you can!” to call for solutions which can render information notices “simpler, clearer and immediately understandable” so that they can effectively achieve the purpose of providing information to data subjects.
Netherlands

At the end of 2019, the Dutch Data Protection Authority (in Dutch: Autoriteit Persoonsgegevens, hereafter “Dutch DPA”) announced its focus areas for 2020-2023, with the overarching theme being “data protection in a digital society”. In its announcement, the Dutch DPA identified three key topics that it will keep an extra close eye on and enforce as a matter of priority. The Dutch DPA’s focus areas are built around its risk-based approach to supervision; in addition to its regular enforcement activities such as investigating data breaches, handling complaints from individuals and supporting DPOs, the Dutch DPA puts focus on subjects that it considers come with a high privacy risk for the general public. The Dutch DPA may use various instruments to effect its supervisory focus, including issuing regulatory guidance, information campaigns aimed at the general public and enforcement action.
The three key topics that form part of the Dutch DPA’s focus for 2020 – 2023 are (i) data brokering, (ii) digital government, and (iii) artificial intelligence and algorithms.
The data brokering focus area includes the sub-topics of (re)sale of data, the internet of things, profiling and behavioural advertising. Although the Dutch DPA recognizes that digitization and the data economy may have their benefits, it also flags that individuals are not always fully aware that their personal data are collected, sold and used for commercial purposes, and are unable to exercise control over their personal data. This is a reminder that transparency and respecting data subjects’ rights are key aspects of compliance in relation to the commercial use of personal data.
The digital government focus area addresses the following sub-topics: data security, smart cities, data exchange, and elections and micro-targeting. For example, in April 2021, the Dutch DPA fined the Dutch municipality of Enschede for using Wi-Fi tracking to measure the size of the crowd in the city centre. The tracking technique could be used to follow individuals in the city centre, and although there was no indication that people were actually followed on an individual basis, the Dutch DPA considered that the technique used was disproportionate to its purpose and therefore had no legal basis under the GDPR.
In respect of its third focus area, the Dutch DPA announced that together with various stakeholders it will work on the development of a supervisory framework in relation to artificial intelligence and algorithms that make use of personal data, in particular where these techniques are used for automated decision making and profiling. Again, the Dutch DPA flags that transparency and data subjects’ rights are key elements of compliance when it comes to AI and the use of algorithms.
As mentioned above, the Dutch DPA’s focus areas do not replace its “regular” enforcement activities. While the Dutch DPA’s emphasis when the GDPR was introduced was on education and prevention, over time it has sharpened its teeth and now increasingly uses its powers to take corrective and punitive action in response to GDPR violations. To date, the Dutch DPA has issued around a dozen administrative fines and a similar number of administrative orders subject to a penalty, in response to various types of violations. We note that the Dutch DPA maintains its own policy rules on administrative fines, on the basis of which its fines have not reached, and are unlikely to reach, the maximum amounts possible under the GDPR.
Poland

Increasing enforcement activity
The first few months of the GDPR can be described as slow. During the first year of the GDPR, the Polish Data Protection Authority (Prezes Urzędu Ochrony Danych Osobowych, or “PUODO”) focused on reminding people about the GDPR. Now that this initial period has ended, there is an increasing number of both inspections and decisions, often with penalties imposed. The penalties focus mostly on data breaches: the biggest case, involving a well-known e-commerce brand, resulted in a fine of 2.8 million PLN (approximately 600,000 EUR). The subsequent appeal to the Voivodeship Administrative Court was dismissed. Similarly, the authority has been tough on the reporting of breaches – see the recent penalty of 1.1 million PLN (approximately 240,000 EUR) imposed on the local TV company Cyfrowy Polsat. These cases clearly confirm the tough approach of the Polish authority, which does not seem to be softening with time.
In addition to security concerns, mass processing has also been targeted. One of the biggest penalties so far (943,000 PLN – approximately 220,000 EUR) was imposed on a company that was processing data available in public registers on a mass scale and which, in the opinion of the authority, failed to fully fulfil its notification duties. However, this decision has been partially overturned by the administrative court and will be considered by the PUODO again.
An additional point to note is that the PUODO is not the only authority to be aware of. The second, and fairly active, player is the Office for Protection of Competition and Consumers (Urząd Ochrony Konkurencji i Konsumentów), which sometimes refers to data protection rules in its decisions. In practice it provides some general guidelines, and these do not necessarily align with the approach suggested by the PUODO. The third authority is the Office for Electronic Communication (Urząd Komunikacji Elektronicznej). So far its focus has been on traditional telecoms, but in light of the introduction of the European Electronic Communications Code (“EECC”), a number of providers of electronic services may fall within its remit. Since the EECC provides for additional personal data protection terms (as well as cybersecurity requirements), this is yet another player which companies need to watch out for. In practice, the decisions of these authorities will shape compliance practice in the consumer and electronic communications areas, beyond the theoretical areas of interest mentioned above.
Austria

Enforcement by the DPA
The largest fine imposed by the Austrian Data Protection Authority (“DPA”) to date, 18 million EUR against the Austrian Postal Service for processing data concerning the likely political affiliation of its 3+ million customers without consent, was voided by the Federal Administrative Court in December 2020. The DPA had failed to make any findings concerning the culpability of any particular manager within the Austrian Postal Service, which, under the plain language of the Austrian Data Protection Act, is a formal requirement for imposing any fine.
Private enforcement and damage claims
In a number of decisions, including from the Austrian Supreme Court, Austrian courts have clarified the rules on liability for immaterial damages under the GDPR. Through these decisions, the courts have also established a first baseline of approximately 600–900 EUR per data subject for immaterial damages in cases of serious GDPR violations.
Belgium

During this third year of the GDPR being in force, the Belgian Data Protection Authority (“DPA”) has continued to focus on and implement its Strategic Plan 2020-2025, working on its priorities and areas of focus, including important topics such as online data, sensitive data and images/CCTV.
The Authority also published new tools as part of its toolbox for data controllers and processors, to assist companies and organizations in implementing the GDPR. Of particular note is the Recommendation on data cleansing and media destruction techniques published by the DPA in December 2020. Questions relating to the “secure” disposal of data or data carriers are recurrent, and reference documents on this topic at the international level are scarce.
In light of COVID-19 and the challenges organizations face in balancing privacy rights with health and safety obligations, the Belgian DPA has also published extensive guidance on the processing of personal data in the context of COVID-19.
One important point to note is the adoption, on 20 May 2021, by the DPA of its first-ever European code of conduct (the “EU Cloud CoC”). At the same time, the DPA also accredited Scope Europe as the supervisory body for the EU Cloud CoC. This supervisory body will ensure that members of the EU Cloud CoC comply with its provisions.
This is the first transnational code of conduct adopted in the European Union, following a positive opinion on the EU Cloud CoC issued by the European Data Protection Board (EDPB) on 19 May 2021. By approving the EU Cloud CoC, the Belgian Data Protection Authority contributes to a harmonized interpretation of the GDPR for the cloud sector in the EU and to better protection of personal data processed in the cloud.
Enforcement action is expected to continue.
As for other EU jurisdictions, Adtech data protection compliance is high on the agenda and continues to be a key focus area in Belgium. A decision from the Belgian DPA is expected later this year in relation to real-time bidding (RTB) for digital advertising and the Transparency and Consent Framework (TCF), a standard developed by IAB Europe to enable companies to comply with certain requirements of the GDPR and ePrivacy Directive.
Sweden

2020 in regulator numbers
The Swedish regulator, the Swedish Authority for Privacy Protection (“SAPP”), has published the following numbers:
- Administrative sanction fees in 15 cases, totalling 150 million SEK (approximately 15 million EUR);
- 4,600 personal data breach notifications;
- 3,200 data subject complaints; and
- 1,000 decisions on camera surveillance permit cases.
Enforcement actions have spanned various areas, including healthcare providers and their technical and organizational measures, search engine operations, camera surveillance, and the use of facial recognition to check children’s presence in school.
The SAPP continues to expand its resources and has announced that it will increase its supervisory activities, in particular by following up on complaints.
United Kingdom

Impact of Brexit
Following the end of the Brexit transition period at the end of 2020, the GDPR no longer directly applies in the United Kingdom. However, data protection laws in the United Kingdom remain closely aligned with the GDPR, at least for the time being.
Following the end of the Brexit transition period, the GDPR was incorporated into UK domestic law, known as the “UK GDPR”. As the UK GDPR is derived from the GDPR, there has not been a significant change to the obligations and requirements companies are expected to comply with in the UK from a data protection perspective. However, this may change over time. There have been some public statements from the Department for Digital, Culture, Media and Sport regarding the future of the UK’s data protection laws post Brexit and how these could diverge from the EU GDPR in the future. However, the Government has not yet published any legislative proposals or draft legislation to significantly change the UK’s data protection framework.
In terms of data transfers from the EU to the UK, following the end of the Brexit transition period, and pending an adequacy decision from the European Commission in relation to the UK, transfers of personal data from the EU to the UK have been subject to a temporary bridging mechanism agreed in the EU-UK Trade and Cooperation Agreement. This has prevented disruption to data flows from the EU to the UK, but is only an interim solution. In February this year the European Commission published draft adequacy decisions in respect of the UK under the GDPR and Law Enforcement Directive, which at the time of writing are yet to be approved.
One practical impact of Brexit has been the question of whether companies outside the UK need to appoint a UK representative under Article 27 of the UK GDPR and, conversely, whether UK companies are required to appoint an EU representative under Article 27 of the EU GDPR. This is something companies should consider and address, if they have not done so already.
A key issue that is sharply coming into focus is the use of children’s data in relation to online services. The 12-month transition period for complying with the ICO’s Age Appropriate Design Code ends on 2 September 2021. With only a few months left to prepare, compliance with the 15 standards of the Code is an important area for companies to address. In addition, going forward, companies will need to adapt to business-as-usual compliance with the Code and its practical implications. The Code has a broad scope and applies to online services that are “likely” to be accessed or used by a child, meaning anyone under the age of 18.
The ICO’s investigation into Adtech and real time bidding was paused from May 2020 due to COVID-19. However, in January this year the ICO announced it is resuming its investigation. Therefore, Adtech data protection compliance is back on the agenda and continues to be a key focus area in the UK.
The ICO’s Data Sharing Code of Practice was laid before Parliament on 18 May 2021. This is a statutory code of practice which the ICO is required to publish under the Data Protection Act 2018, which the ICO is also required to take into account when considering if an organisation has complied with data protection law when sharing personal data. You can read more about this in our update here.
Enforcement action in relation to significant data breach and cyber security issues has historically been a focus for the ICO, which we expect will continue. In addition, the focus on enforcement regarding direct marketing compliance will also likely continue.
In addition, we expect more focus on issues on which the ICO has been particularly active in terms of guidance, including AI and its data protection implications, as well as the use of biometric data and facial recognition technology.