Welcome to our Data Protection bulletin, covering the key developments in data protection law from April 2021. 

Data protection

Cyber security

Regulatory enforcement

Civil litigation

 

Data protection

European Commission releases Proposal for a Regulatory Framework on AI

On 21 April 2021, the European Commission released a proposal for a regulatory framework on AI. This marks the EU's latest attempt to establish Europe as a global leader in the implementation and development of trustworthy AI. The proposed regulation applies not only to providers of AI systems and services but also to users of AI systems located inside the EU or in a third country, where the output produced by the system is used in the Union. AI has been defined in the proposed regulation as "software that is developed with one or more of the techniques and approaches listed in Annex I and can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing environments they interact with."

It is intended that the regulatory framework on AI would regulate AI systems depending on their level of risk and the proposal states that: "AI should be a tool for people and be a force for good in society with the ultimate aim of increasing human well-being."

The proposed regulatory framework refers to a system which classifies risk according to the use, intended purpose and reach of the AI applications, in addition to the potential harm and outcomes caused by its use. The proposal refers to four categories of risk:

  • Unacceptable risk: the specified uses falling within this category contravene EU values and, as such, will be banned. The EDPB has provided a Q&A on the new regulatory framework and the following are listed as unacceptable risks that will be banned: "social scoring by governments, exploitation of vulnerabilities of children, use of subliminal techniques, and – subject to narrow exceptions – live remote biometric identification systems in publicly accessible spaces used for law enforcement purposes."
  • High-risk: the uses of AI that fall within this category are specified in an Annex attached to the proposal. Any item falling within this category is deemed to be high-risk in that it would create an adverse impact on people's safety or their fundamental rights. As such, providers of high-risk systems will be required to implement quality and risk management systems and complete conformity assessments. The EDPB Q&A specifies that all AI systems used (or to be used) for remote biometric identification of persons fall within this category.
  • Limited risk: for uses of AI applications that fall within this category (for example the use of chatbots), providers will be obliged to meet specific transparency requirements.
  • Minimal risk: there are no additional obligations on the uses of AI applications that fall within this category.

Further, the proposal states that "high quality data, documentation and traceability, transparency, human oversight, accuracy and robustness, are strictly necessary to mitigate the risks to fundamental rights and safety posed by AI and that are not covered by other existing legal frameworks."

The European Commission also released a proposal for a machinery products regulation and a revised Coordinated Plan on AI at the same time. The Proposed Machinery Products Regulation seeks to provide clarity and assist in the safe integration of AI systems, while the revised Coordinated Plan seeks to further enable the development of excellent AI and to establish the EU as a global leader in this field.

The proposals will now be considered and debated by the European Parliament and Council and, given the controversial nature of the topic and the interests at stake, it would be fairly safe to assume that we are at a very early stage in what is likely to be a long drawn-out and highly contested process. In fact, a group of 40 cross-party EU lawmakers, in addition to other lawmakers and civil society groups, have already called for the adoption of tougher provisions and a more hard-line approach.

UK set to implement National AI Strategy later this year

On 12 March 2021, the UK's Digital Secretary, Oliver Dowden, set out his 10 Tech Priorities. Among those priorities, it was announced that a National AI Strategy would be published later this year, marking a significant step towards the establishment of the UK as a world leader in the development, commercialisation and use of responsible AI. The UK has already made progress in this area: Tortoise Media's Global AI Index ranks the UK third globally for its level of investment, innovation and use of AI.

In the Government's press release, it is stated that "The new AI strategy will focus on:

  • Growth of the economy through widespread use of AI technologies
  • Ethical, safe and trustworthy development of responsible AI
  • Resilience in the face of change through an emphasis on skills, talent and R&D."

Further, Dowden stated: "Unleashing the power of AI is a top priority in our plan to be the most pro-tech government ever. The UK is already a world leader in this revolutionary technology and the new AI Strategy will help us seize its full potential - from creating new jobs and improving productivity to tackling climate change and delivering better public services."

It is not clear whether and to what extent the UK will adopt a similar approach to the EU in its plans for AI-specific legislation. The Government's announcement follows the publication of an AI Roadmap by the UK AI Council, an independent expert committee that provides advice to the UK Government on AI. The AI Roadmap stated that the development of an AI strategy was essential and put forward 16 recommendations to assist in its development. Those 16 recommendations can be broadly split into the following areas of focus: (i) Research, Development and Innovation; (ii) Skills and Diversity; (iii) Data, Infrastructure and Public Trust; and (iv) National, Cross-Sector Adoption. The establishment of a national AI strategy was included within these recommendations; however, the extent to which the remaining recommendations will be adopted remains to be seen.

EDPB provides opinion on draft UK adequacy decisions

On 14 April 2021, the EDPB announced that it had adopted two opinions on the draft UK adequacy decisions at its 48th plenary session. Of particular note, the EDPB recognised that the UK's data protection framework largely aligned with the EU's. EDPB Chair, Andrea Jelinek said:

"The UK data protection framework is largely based on the EU data protection framework. The UK Data Protection Act 2018 further specifies the application of the GDPR in UK law, in addition to transposing the LED, as well as granting powers and imposing duties on the national data protection supervisory authority, the ICO. Therefore, the EDPB recognises that the UK has mirrored, for the most part, the GDPR and LED in its data protection framework and when analysing its law and practice, the EDPB identified many aspects to be essentially equivalent. However, whilst laws can evolve, this alignment should be maintained. So we welcome the Commission's decision to limit the granted adequacy in time and the intention to closely monitor developments in the UK.”

However, the EDPB identified a number of items requiring further assessment and/or close monitoring. In particular, the EDPB referred to the UK's Immigration Exemption (Schedule 2, Part 1, Data Protection Act 2018) as "broadly formulated" and has called on the European Commission to provide further information in relation to the necessity and proportionality of such an exemption. In addition, the EDPB identified certain aspects of the UK's legal framework that could potentially undermine the protections afforded to personal data transferred from the EEA. The EDPB identified the UK's approach as being potentially in conflict with Article 44 GDPR, which establishes that the onward transfer of data should only take place if the levels of protection afforded by the GDPR are not undermined. On this second point, however, the EDPB welcomed the role of the Investigatory Powers Tribunal in hearing cases on the use of investigatory powers by the intelligence services, and the introduction of Judicial Commissioners to ensure adequate oversight for national security purposes.

At the same time, the EDPB also adopted:

  • Guidelines on the application of Article 65(1)(a) GDPR, a "one-stop-shop" dispute resolution mechanism requiring the EDPB to issue a two-thirds majority decision whenever a Lead Supervisory Authority refers a draft decision (within the meaning of Article 60(3) GDPR) that has been objected to by a Concerned Supervisory Authority (a recent example of the application of this provision can be found in relation to Twitter's data breach here).
  • Guidelines on the targeting of social media users, attempting to clarify the roles and responsibilities of social media providers and those legal or natural persons who use data to target social media users.
  • A statement on the exchange of personal data between public authorities under existing international agreements.

Irish DPC to investigate Facebook's potential data breach

On 3 April 2021, the personal data (including phone numbers) of 533 million Facebook users (of whom a significant number are estimated to be EU users) was leaked onto a free online database, having previously been accessible since at least January 2021 to those willing to pay a fee.

Crucially, Facebook did not report the data breach to its users – the issue only came to light after news reporters uncovered it. In response, Facebook published a blog post which sought to justify its decision not to disclose the apparent breach. Facebook explained that the data in question had not been obtained by a hacker; instead, it had been scraped from the platform prior to September 2019. Speaking on behalf of Facebook, Mike Clark, Facebook's product-management director, stated that the data in question was obtained by exploiting a particular vulnerability in Facebook's contact importer. That vulnerability was previously reported on and rectified in August 2019. Therefore, in Facebook's opinion, the data in question did not constitute a new breach and so it was not obligated to inform users.

Ireland's DPC has announced its intention to investigate the dataset further so as to clarify the accuracy of the explanation provided by Facebook.

The establishment of the ADGM Authority

On 12 April 2021, the Board of Abu Dhabi Global Market ("ADGM"), an international financial centre located in Abu Dhabi, announced that it had approved the establishment of the ADGM Authority (the "Authority") and that it would be led by Mark Cutis as Chief Executive Officer. It is hoped that the Authority will provide ADGM with vital insights that support ADGM's ongoing growth. It has also been revealed that the Authority's key functions will include the Office of Strategy and Business Development, Corporate Services, the Office of Information Security and Enterprise Risk Management. In a statement, H.E. Ahmed Al Sayegh, Minister of State (UAE) and Chairman of ADGM, referred to this period as a "crucial juncture in ADGM’s growth as we move to our next phase of development as an innovative global financial centre and digital business hub. ADGM will continue to play a key role in attracting strong capital inflows for the country as we promote the diversification of the economy spurred by the ongoing energy transition."

Cutis has 40 years of experience in global banking and investment management. Speaking on his new role, Cutis stated that: "Building a centre of excellence where financial counterparties can come together to transact business while enjoying the rule of law with independent courts, as well as enlightened and savvy regulation is the hallmark of ADGM."

Portugal's DPA bans the use of Cloudflare

On 28 April 2021, Portugal's DPA issued a decision requiring Portugal's National Statistics Office to stop using the US website security company, Cloudflare, and to suspend, within 12 hours, the transfer of any personal data to the US (or to any other country identified in the 2021 Census survey as lacking an adequate level of protection). The DPA came to this decision over concerns that personal data was being transferred to the US in contravention of EU data protection law. This decision follows the CJEU's decision in Schrems II, where the EU-US Privacy Shield Framework was invalidated because it did not provide sufficient safeguards against US government surveillance of transferred personal data.

Cyber security

DCMS announces plans for new cybersecurity law to protect smart devices

The DCMS recently announced Government plans to introduce Secure by Design legislation, with the aim of providing greater protection against cyber-attacks and making smart devices more secure.

The introduction of this cyber-security law would ensure that the vast majority of smart devices meet certain requirements, including:

  • a requirement for manufacturers to provide information to customers at the point of sale about the length of time that security software updates will be guaranteed on devices;
  • a ban on manufacturers setting easily guessed default passwords on devices; and
  • provisions to assist and enable the reporting of software vulnerabilities.

Supporting evidence was provided for the new law in two further Government research reports (accessible here and here).

The UK Government's 2021 Cyber Security Breaches Survey

The UK Government's 2021 Cyber Security Breaches Survey (the "2021 Survey") demonstrates the significant and alarming threat posed to businesses and charities by cyber-security breaches. In contrast to the 2020 Cyber Security Breaches Survey (the "2020 Survey"), in which 46% of businesses and 26% of charities reported suffering security breaches or attacks in the preceding 12 months, the 2021 Survey reported that 39% of businesses had suffered security breaches or attacks, while the results for charities remained unchanged. Although this initially seems positive, the effects of Covid-19 on trading activity in the last 12 months must be considered. Other evidence from the study indicates that the risk to businesses is potentially higher than ever as a result of the pandemic. In addition, the 2021 Survey reported a decrease in the number of businesses deploying security monitoring tools (35% as opposed to 40% last year) or undertaking any form of user monitoring (32% as opposed to 38% last year). It may therefore be the case that businesses this year are far less aware of the breaches they suffer.

Businesses and charities alike are likely to suffer costs and negative outcomes in relation to the reported breaches and attacks. On a more promising note, while the average cost of cyber-security breaches remains high (estimated to be £8,460 for all businesses) the 2021 Survey showed a decrease in the proportions experiencing such negative resulting outcomes (in comparison to the 2020 Survey). This may be due to more businesses and charities implementing successful cyber-security measures, in addition to the more frequent use of cloud storage and backups. Finally, it should be noted that the 2021 Survey reported that cyber-security remains a high priority for the directors and managers of 77% of businesses.

Regulatory enforcement

Dartington Hall Trust list breaches data protection rules

Dartington Hall Trust ("DHT") has apologised for disclosing a list of individuals campaigning to stop development on its land. The list, which appears to have been populated with people who were particularly active on the Save Dartington Facebook page, contained details of staff members, DHT members, alumni, employees, local residents, and other interested parties. The information on the list also included details of where individuals lived and their memberships of organisations such as Extinction Rebellion.

The ICO has said that DHT: did not have a valid reason to list details such as membership of other organisations; published the information unnecessarily; and did not take appropriate security measures to protect the individuals' personal data. The decision brings Articles 5 and 6 of the UK GDPR into play, as DHT appears not to have had a lawful basis for processing the relevant personal data and failed to take adequate security measures. The ICO has previously taken action against organisations, such as construction companies, which create lists containing sensitive personal data.

Italian DPA issues its fifth-largest fine

The Italian DPA, the Garante per la Protezione dei Dati Personali (“GPDP”), has imposed a fine of €4,501,868 on the internet service provider Fastweb S.p.A (“Fastweb”). This is the fifth-largest fine handed down by the GPDP under the GDPR. In its decision notice of 25 March 2021, the GPDP identified various breaches of the GDPR, including Articles 5, 6, 7, 12, 13, and 21, finding that Fastweb had “not adopted a system that facilitates the exercise of the rights of the data subject”.

The GPDP investigated Fastweb following hundreds of complaints by members of the public about promotional calls made without data subjects’ consent. The GPDP found that Fastweb acquired various contact lists from external partners who themselves had not obtained the consent of the users to share their personal data. In addition, the security measures of the customer management systems were not sufficiently robust, which allowed people claiming to be from Fastweb to contact these individuals.

In its decision, the GPDP acknowledged various mitigating factors, including Fastweb’s cooperation with the investigation, participation in roundtables to combat aggressive telemarketing, and its intention to improve relevant control systems. We have previously reported on GPDP enforcement action in relation to aggressive telemarketing in our July update.

Booking.com fined for delay in reporting data breach

The Dutch DPA has fined Booking.com (“BC”) €475,000 for a 22-day delay in reporting a data protection breach. The breach, which occurred in December 2018, involved a phishing attack which led to staff giving cyber-criminals the details of their BC staff account. The personal data accessed included telephone numbers, addresses, names and details of the bookings of over 4,000 data subjects. Credit card information of 283 data subjects was also obtained, together with, in 97 cases, the CVV code.

Under Article 33 GDPR, organisations must notify the relevant Supervisory Authority of a personal data breach without undue delay and, where feasible, not later than 72 hours after having become aware of it. BC discovered the data breach on 13 January 2019 but only notified the Dutch DPA on 7 February 2019.
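The "22-day delay" can be reconciled with the dates reported: the 72-hour Article 33 window ran from discovery on 13 January to 16 January 2019, and notification on 7 February came 22 days after that deadline. A quick illustrative calculation:

```python
from datetime import date, timedelta

discovered = date(2019, 1, 13)               # BC became aware of the breach
notified = date(2019, 2, 7)                  # notification to the Dutch DPA
deadline = discovered + timedelta(hours=72)  # Article 33 outer limit: 16 January 2019

days_late = (notified - deadline).days
print(days_late)  # 22 -- the delay the Dutch DPA penalised
```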

The Dutch DPA stressed the importance of reporting breaches expeditiously:

“Taking rapid action is essential, not least for the victims of the breach. After receiving a report the DPA can order a company to immediately warn those affected. This can prevent criminals having weeks in which to attempt to defraud customers.”

Beyond the issue of data breach notification, this enforcement action also highlights the importance of vigilance against phishing attacks which, according to the 2021 Survey (discussed above in the cyber security section), are by far the most common type of cyber-security breach in the UK.

The CNIL fines a data processor for the first time

The French DPA (“CNIL”) has fined both a data controller and a data processor for failure to ensure data security, in breach of Article 32 GDPR. The decision, which was published on a ‘no-names’ basis, is notable in that it is the first time that the CNIL has imposed a fine on a data processor (having previously sanctioned only data controllers in relation to data breaches involving potential liability on the part of both data processors and data controllers pursuant to a policy which was applicable until 27 January 2021).

The data controller, an e-commerce platform, was the subject of multiple “credential stuffing” attacks on its website, which was operated by the data processor, over the course of an 18-month period. While security measures were eventually implemented to address these attacks, it took over a year to do so, at which point the personal data of approximately 40,000 customers had been compromised. The CNIL noted in this respect that more effective measures could have been implemented sooner, including limiting the number of requests allowed per IP address on the website, and introducing CAPTCHA authentication.
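The per-IP throttling the CNIL pointed to is a standard defence against credential stuffing. As an illustration only (the decision does not disclose the parties' actual implementation, and the class and parameter names below are hypothetical), a minimal sliding-window rate limiter might be sketched as follows:

```python
import time
from collections import defaultdict, deque

class PerIpRateLimiter:
    """Allow at most `max_requests` per `window_seconds` from each IP address.

    Illustrative sketch only of the per-IP request limiting described by
    the CNIL; a request that is rejected could instead be routed to a
    CAPTCHA challenge rather than refused outright.
    """

    def __init__(self, max_requests=5, window_seconds=60.0):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self._hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        hits = self._hits[ip]
        # Discard timestamps that have aged out of the sliding window.
        while hits and now - hits[0] >= self.window_seconds:
            hits.popleft()
        if len(hits) >= self.max_requests:
            return False  # over the limit: reject or escalate to CAPTCHA
        hits.append(now)
        return True
```

Because credential-stuffing attacks replay leaked username/password pairs at high volume, even a generous per-IP cap sharply raises the attacker's cost while leaving legitimate users unaffected.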

The CNIL found that both the data controller and data processor had failed to comply with their obligations in respect of data security under Article 32 GDPR. The decision emphasised that the data processor cannot simply play a passive role when it comes to data security. While the data controller must “decide on the implementation of measures and give documented instructions to its processor”, the data processor must “also seek the most appropriate technical and organisational solutions to ensure the security of personal data and put them forward to the controller”. In other words, the data processor should not simply take the word of the data controller as gospel: it should actively suggest technical and organisational measures, especially in circumstances where existing arrangements have been shown to be inadequate.

The decision serves as an important reminder of the ability of supervisory authorities to impose sanctions on both data controllers and data processors where they are both culpable for breaches of data protection legislation, and that such parties should keep under careful consideration the extent to which existing security measures are appropriate in all the circumstances.

Summary of recent fines issued by the Spanish DPA

The Spanish DPA (“AEPD”) has recently issued three fines for GDPR breaches, continuing the uptick in enforcement activity by the AEPD highlighted in our March Bulletin:

  • Vodafone Spain (“VS”) was fined €150,000 (reduced to €90,000) for processing personal data in breach of Article 6(1) of the GDPR. VS was sending messages to data subjects who had previously asked for their data to be erased, and was found not to have a legitimate basis for processing their data. VS's fine was reduced following its assumption of responsibility and early payment.
  • Kutxabank, S.A (“KB”) was fined €100,000 (reduced to €60,000) for failing to adequately deal with a request for erasure under Article 17 of the GDPR. The individual requested that his data be erased, and KB therefore blocked his data from being used. When the individual tried to re-open an account, KB required him to sign a form stating that he was revoking his right to erasure and allowing the previous data to be used by KB. The AEPD emphasised that this temporary blocking did not comply with the right to erasure even if the new contractual relationship used the same data. KB's fine was reduced following its assumption of responsibility and immediate payment.
  • Flexográfica del Mediterráneo, S.L. (“FDM”) was fined €3,000 for installing third-party cookies without the requisite consent of individuals (in breach of Article 7 of the GDPR) and for not explaining the purpose of the cookies (in breach of Article 13 of the GDPR). The AEPD found that users could not consent specifically to the types of cookies which were used. In addition, no privacy and cookies policy was available to users of FDM. The AEPD required that FDM adapt its website to include the necessary information in a privacy policy and to provide a way for users to specify their consent.

Civil litigation

Supreme Court considers the lawfulness of opt-out data protection group claims

In a landmark hearing, whose significance cannot be overstated, Google’s appeal of the Court of Appeal's decision in Lloyd v Google LLC [2019] EWCA Civ 1599 (detailed coverage of which can be found here) was heard by the Supreme Court on 28 and 29 April 2021 (videos of the hearing can be found here). The Supreme Court’s eventual decision will have very significant implications for the viability of opt-out representative claims brought under CPR 19.6 for breaches of data protection law.

The claim is brought by Richard Lloyd, a former executive director of Which?, as representative of a class of affected data subjects (the "Data Subjects") whose personal data was unlawfully processed by Google, which harvested, without consent, browser-generated information from Apple’s “Safari” browser on iPhones between June 2011 and February 2012.

The Court of Appeal, overturning the first instance decision of Warby J, held that Mr Lloyd had demonstrated that: (a) the Data Subjects were entitled to recover damages pursuant to s.13 of the Data Protection Act 1998, based on the loss of control of their personal data alone, regardless of whether they had suffered pecuniary loss or distress; (b) the Data Subjects represented in the claim did, in fact, have the same interest for the purposes of CPR 19.6(1); and (c) the Court should exercise its discretion to permit Mr Lloyd to act as a representative for the Data Subjects; and accordingly should be permitted to serve the claim on Google out of the jurisdiction.

The Supreme Court was asked by Google to find that the Court of Appeal's decision on each of these three foundational issues was wrong and accordingly that Mr Lloyd should be refused permission to serve the claim out of the jurisdiction.

If the Supreme Court, whose decision is expected this Autumn, upholds the Court of Appeal's judgment, it would potentially open the floodgates to US-style class actions by representative Claimants on behalf of classes of affected data subjects in relation to breaches of data protection law. Indeed, since the claim was issued, numerous similar claims have been issued (for example against Marriott, TikTok, YouTube, Experian, etc.), which are presently stayed pending the Supreme Court's decision.

Ticketmaster granted stay in appeal proceedings

In Ticketmaster UK Limited v Information Commissioner [2021] UKFTT 0083, the First-tier Tribunal (General Regulatory Chamber) granted Ticketmaster’s application to stay its appeal (the "Appeal") against a Monetary Penalty Notice ("MPN") issued by the ICO imposing a £1.24 million fine for breaches of Articles 5(1)(f) and 32 of the GDPR (covered in detail here), pending the outcome of civil proceedings, Jack Collins and Others v Ticketmaster UK Limited (BL-2019-LIV-000007) (the "Civil Proceedings"), involving related issues of fact.

The data breach to which the MPN relates did not affect Ticketmaster’s own systems but rather those of a third-party data processor, Inbenta Technologies Ltd (“Inbenta”), which was under contract to supply a chatbot to Ticketmaster. The stay was granted because related issues of fact are due to be considered by the High Court in the Civil Proceedings, which are due to be heard at trial in September 2022, in determining claims arising from the same data breach by: (1) affected data subjects against Ticketmaster; (2) Ticketmaster against Inbenta; and (3) Inbenta against Ticketmaster.

The Tribunal held that it "will be materially assisted by a substantive judgment in the High Court proceedings" as findings of fact by the High Court would narrow the scope of liability and would help avoid the risk of any overlap in determinations by the Tribunal and the High Court:

"whilst the length of the delay is likely to be substantial, and noting that the delay strikes against the principles of minimising litigation delays and that every litigant has a right to expeditious justice, the prejudice caused to the respondent by such a delay will be minimal. This is to be contrasted with the significant factual and/or legal assistance likely to be gained by the Tribunal in awaiting the decision in the High Court. In particular, it is of relevance that the High Court has, as a party before it, an integral player in the Incident, Inbenta. This party will not be playing a direct part in the Tribunal proceedings and I have no doubt that justice will be enhanced in the Tribunal by awaiting a judgment of the High Court that has considered Inbenta’s evidence and submissions. In addition, as alluded to above, there is also a novel legal issue which strides across both sets of proceedings, the High Court’s conclusion in relation to which will also materially assist the Tribunal."

German Federal Labour Court reaches decision on employee information requests under Article 15(3) GDPR

On 27 April 2021, the German FLC reached a landmark decision (2 AZR 342/20) regarding requests for information under Article 15(3) GDPR and, specifically, the extent to which an employer is obliged to provide its employees with copies of their personal data under this provision. The German FLC ruled that blanket data subject requests which are not sufficiently specified (for example, those that do not specify the exact emails required) will not be enforceable under German procedural rules. While not binding in the UK, the decision may be of interest as an indication that some courts are alive to the tactical use of DSARs in employment disputes.