Welcome to our data protection bulletin, covering the key developments in data protection law from November 2022. Please note that this is our last bulletin of the year. We wish you all a happy and healthy festive season and, as usual, we will be back in 2023 with our bumper December/January bulletin.

REMINDER - 27 December 2022 is the deadline for updating your old EU SCCs, as after that date they will no longer be valid as a safeguard for transfers of personal data under EU GDPR. If your live contracts contain the old SCCs, do get in touch about our remediation services – including our automated software tools – for assistance in remediating your transfer documentation, whether for the new EU SCCs or the post-Brexit UK GDPR transfers safeguards. Our timeline pullout (here) has the key international transfers deadlines for your calendar. Keep a lookout for news of our automated transfer remediation tools via our Data Protection law hub.

Data protection

Cyber security

Enforcement

Civil litigation

Data protection

ICO publishes updated guidance on international transfers

On 17 November 2022, the Information Commissioner's Office ("ICO") released its highly anticipated updated guidance on international transfers, including a new section on transfer risk assessments ("TRAs") and a new TRA tool (together the "Transfer Guidance").

This Transfer Guidance follows the ICO's publication of the UK's draft transfer mechanisms, the International Data Transfer Agreement ("IDTA") and the Addendum to the European Union Standard Contractual Clauses ("UK Addendum") as detailed in our February 2022 article New UK Data Transfer Tools.

The Transfer Guidance provides a detailed, practical application of the provisions on international transfers of personal data in Chapter V of the UK General Data Protection Regulation ("UK GDPR") and outlines the alternative approaches taken by the ICO and the European Data Protection Board ("EDPB") in relation to international transfers of personal data.

Our key takeaways from the Transfer Guidance are as follows.

Restricted transfers and TRAs

The Transfer Guidance confirms the ICO's existing position that a transfer of personal data will be a "restricted transfer" if the receiver is in a country outside of the UK without UK adequacy regulations and is a different legal entity to the exporter. Article 46 of the UK GDPR requires appropriate additional safeguards, such as the IDTA, the UK Addendum or Binding Corporate Rules ("BCRs"), to be put in place for any restricted transfers.

The Transfer Guidance makes it clear that organisations that rely on one of the Article 46 transfer mechanisms must complete a TRA before making a restricted transfer of personal data, to assess whether the "relevant protections for people under the UK data protection regime will be undermined" by the transfer.

This approach differs from the EDPB's requirement for the completion of a Transfer Impact Assessment, which compares the laws and practices of the EU exporting jurisdiction against the laws of the intended recipient country. The ICO has confirmed in the Transfer Guidance that, for UK GDPR purposes, both approaches are acceptable for assessing the risks prior to a transfer.

The TRA tool

The ICO has also provided a substantially reworked version of its TRA tool, in an effort to simplify, and provide practical guidance on, the process of conducting TRAs. The new TRA tool requires organisations to consider six key questions that relate to the specific circumstances of the transfer, the protections contained in the chosen Article 46 transfer mechanism and the associated level of risk in the destination country.

The ICO confirms that restricted transfers should not take place if, at the conclusion of the TRA, it is decided that the chosen transfer mechanism alone will not provide the appropriate safeguards or protect the rights of data subjects.

Risk-based approach

The ICO's TRA tool identifies which measures and steps are appropriate to safeguard a restricted transfer, using a risk-based approach. Question 2 of the TRA tool asks organisations to assign a risk level to the personal data being transferred. To assist with this exercise, the Appendix to the TRA tool includes a list of standard processing activities with an initial risk score – low, moderate or high. The tool clarifies that low-risk transfers may proceed without further investigation of the protections available in the destination jurisdiction. The level of investigation required for moderate and high-risk transfers should be tailored in proportion to the level of risk.
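The proportionality logic described above can be sketched as a simple triage. The data categories and initial scores below are hypothetical stand-ins, not the ICO's actual Appendix entries, and a real TRA involves legal judgement rather than a lookup:

```python
# Illustrative triage in the spirit of Question 2 of the ICO's TRA tool.
# The categories and initial risk scores below are hypothetical examples,
# NOT the ICO's actual Appendix; a real TRA requires legal judgement.

INITIAL_RISK = {
    "staff contact details": "low",
    "customer financial records": "moderate",
    "health records": "high",
}

def investigation_needed(data_category: str) -> str:
    """Map an initial risk score to a proportionate level of investigation."""
    level = INITIAL_RISK.get(data_category, "high")  # default cautiously
    if level == "low":
        # Low-risk transfers may proceed without further investigation
        # of the protections available in the destination jurisdiction.
        return "proceed without further investigation"
    if level == "moderate":
        return "proportionate review of destination-country protections"
    return "detailed review of destination-country laws and practices"

print(investigation_needed("staff contact details"))
# proceed without further investigation
```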

Mitigating steps

If a TRA concludes that the safeguards are not adequate, then the restricted transfer should not take place. However, in such circumstances, the tool encourages organisations to consider whether extra steps and protections can be put in place to mitigate the risks. The TRA could then be re-run, taking into account any extra protections, to see whether the transfer could proceed. The TRA tool provides a non-exhaustive list of example measures that can be taken to mitigate risks in relation to a restricted transfer (such as access controls, changes to the personal information, organisational changes or contractual changes). It should be noted that an organisation may be required to amend the Article 46 transfer mechanism to add the extra steps and protections and ensure that they are legally binding.

Clarification on responsibility for the transfer safeguards

The Transfer Guidance provides welcome clarification that it is the entity (controller or processor) that agrees to and instigates the restricted transfer that must meet the relevant obligations under the UK GDPR. Previously, there was some confusion within organisations about who was responsible for complying, with responsibility tending to fall heavily on controllers.

The Transfer Guidance explains that it is usually the entity that enters into the relevant contract that is the one "initiating" the transfer. This places responsibility for safeguarding a transfer onto the contracting entity, rather than the entity that is actually sending the data (which is relevant where a different group entity or third party transmits it).

This does not absolve processors that initiate and agree to send restricted transfers to their sub-processors from complying with the transfer rules. Here, the processor should put the safeguards in place, and their controller need not do so in respect of the same transfer.

The Transfer Guidance also provides welcome reassurance that it is never a restricted transfer for a processor to send or return data to its own controller of that same data.

Whilst the ICO's clarification could increase processors' responsibilities for safeguarding transfers, it also provides an opportunity for processors to take full control of their data flows with sub-processors.

Impact and next steps

The European Commission is likely to consider the Transfer Guidance when reviewing its adequacy decisions in respect of the UK, in order to ensure that onward transfers of EU personal data sent to the UK will still be adequately protected. The Transfer Guidance does take a more risk-based approach than seen in certain EU member states, so it remains to be seen how the EU will react.

The Transfer Guidance also reminds organisations that the deadline for moving to the UK Addendum plus the EU Standard Contractual Clauses ("SCCs") or the IDTA as a safeguard for existing UK GDPR international transfer arrangements is 21 March 2024.

The ICO is currently producing further guidance on how to use the IDTA and the UK Addendum. Organisations can also expect the publication of worked examples of how to use the ICO's TRA tool.

UK finalises adequacy regulations for data transfers to South Korea

The UK Government has laid before Parliament adequacy regulations for the Republic of Korea (the "Adequacy Decision"). These are the UK Government's first independent adequacy regulations since leaving the EU, aside from the existing EU adequacy decisions that the UK simply adopted. The Adequacy Decision follows the designation of South Korea as a priority country for adequacy assessments in August 2021 and a data adequacy agreement in principle reached in July 2022. The regulations show that the UK Government has concluded that South Korea's data protection laws are sufficiently protective of inbound personal data transfers, while upholding the rights of UK data subjects whose data is transferred.

The Adequacy Decision will facilitate the transfer of personal data between UK companies and companies within South Korea, without the requirement for additional safeguards such as SCCs, BCRs and IDTAs. The European Commission has already granted adequacy status to South Korea, making the UK's regulations uncontroversial. However, the UK Government's Adequacy Decision is broader than the adequacy decision granted to South Korea by the European Commission, in that it will also allow credit information to be shared with entities in South Korea. It is expected that the free movement of credit information will facilitate customer identification and payment verification processes, with resultant economic benefits.

The Statutory Instrument implementing the Adequacy Decision has now been introduced to Parliament and is expected to come into force on 19 December 2022. The Statutory Instrument can be read in full here.

EU's Digital Services Act in force, imposing new obligations regarding digital platforms and services

On 16 November 2022, the Digital Services Act ("DSA") came into force after its initial publication in the Official Journal of the EU on 27 October 2022. Directly applicable across the EU, the DSA introduces new responsibilities for digital services, enhanced safeguards for users' fundamental rights online and new supervisory powers for the European Commission, providing a new transparency and accountability framework for digital platforms (including social networks and app stores), as discussed in our July 2022 bulletin.

The DSA applies to all digital services that connect consumers to goods, services, or content and places comprehensive new obligations on online platforms to reduce and combat online risks. The obligations in the DSA are structured to match the size, role and impact of the organisation offering digital services, with micro and small enterprises (broadly, companies with fewer than 50 employees and an annual turnover or balance sheet total not exceeding €10 million) benefitting from certain exclusions under the DSA. Very Large Online Platforms ("VLOPs") and Very Large Online Search Engines ("VLOSEs"), with more than 45 million average monthly users in the EU, will need to comply with further obligations, such as conducting annual assessments of the risks of their services causing online harms.
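The DSA's tiered structure can be illustrated with a toy classifier. The thresholds below are illustrative assumptions (the micro/small figures follow the EU SME Recommendation 2003/361, and 45 million is the DSA's average-monthly-user mark for VLOP/VLOSE designation); actual classification is a legal determination by the European Commission, not an arithmetic test:

```python
# Toy sketch of the DSA's tiered obligations. Thresholds are illustrative:
# micro/small figures follow EU SME Recommendation 2003/361, and 45 million
# is the DSA's average-monthly-user mark for VLOP/VLOSE designation.
# Actual classification is a legal determination, not a computation.

def dsa_tier(employees: int, turnover_eur_m: float, monthly_users_m: float) -> str:
    if monthly_users_m > 45:
        return "VLOP/VLOSE: further obligations, incl. annual risk assessments"
    if employees < 50 and turnover_eur_m <= 10:
        return "micro/small enterprise: certain exclusions apply"
    return "standard platform obligations"

print(dsa_tier(employees=30, turnover_eur_m=5, monthly_users_m=0.2))
# micro/small enterprise: certain exclusions apply
print(dsa_tier(employees=10_000, turnover_eur_m=5_000, monthly_users_m=120))
# VLOP/VLOSE: further obligations, incl. annual risk assessments
```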

The DSA complements existing rules under the EU General Data Protection Regulation ("EU GDPR"); including with respect to users' consent and individuals' rights to object to targeted digital marketing, and does not seek to modify any existing data protection safeguards. For example:

  • Transparency obligations: the DSA requires users of digital services to be informed about how the technology is being used, including how algorithms are being used to recommend content. This is a similar obligation to the EU GDPR's transparency principle; and
  • Targeted advertising: the DSA bans profiling children for the purpose of targeted advertising and prevents the profiling of any individuals based on special category data, such as race or sexuality.

Non-compliance with the DSA could lead to fines of up to 6% of global turnover for offenders or a ban on operating in the EU single market for repeat serious offenders.

EDPB adopts recommendations on EU Controller BCRs

At its November plenary, the EDPB adopted Recommendations on the application for approval and on the elements and principles to be found in Controller Binding Corporate Rules (the "Recommendations"). The Recommendations merge the standard application form for Binding Corporate Rules for controllers ("BCR-C") with the current BCR-C referential, which identifies the elements required for BCR-C approval under the EU GDPR, and update both documents.

BCR-Cs are transfer mechanisms which, under the EU GDPR, can be used by multinational corporate groups, groups of undertakings, or enterprises engaged in a joint economic activity (each a "Group"). They allow a Group to establish a level of data protection essentially equivalent to that provided by the EU GDPR when transferring personal data from a controller subject to the EU GDPR to a controller or processor in the same Group based in a non-EEA jurisdiction without an adequacy decision. The UK GDPR also provides for BCR-Cs, but the EDPB's guidance does not apply to them.

The Recommendations aim to:

  • Update the current standard application form for approval of BCR-Cs;
  • Clarify the content of BCR-Cs with further explanation; and
  • Distinguish between content of the BCR-C and content of the application to be presented to the BCR Lead Supervisory Authority.

Of particular note is the clarification in the Recommendations that, when a Group seeks to rely on BCR-Cs for transfers of personal data from the EEA, a Group headquartered in the EEA should ensure that the application form is completed by an entity located in the EEA. If headquartered outside the EEA, the Group should appoint a Group member within the EEA with "delegated data protection responsibilities" to submit the form on behalf of the Group.

The Recommendations shall replace the previous Article 29 Working Party Recommendations on the Standard Application for Approval of Controller Binding Corporate Rules for the Transfer of Personal Data (WP264).

The EDPB is currently developing another set of recommendations for BCRs for processors.

The EDPB is inviting comments on the Recommendations until 10 January 2023.

Cyber security

EDPS releases opinion on the EU cybersecurity rules

The European Data Protection Supervisor ("EDPS") has published Opinion 23/2022 (the "Opinion") on the European Commission's Proposal for a Regulation of the European Parliament and of the Council on horizontal cybersecurity requirements for products with digital elements and amending Regulation (EU) 2019/1020 (the "Proposal"). The Proposal applies to a broad range of digital products, such as firewalls, browsers, operating systems and smart meters, and aims to introduce a number of EU-wide cybersecurity requirements for hardware and software products and their remote data processing solutions. The Opinion provides further insight on the key features of the Proposal and the EDPS' reactions and recommendations.

The EDPS emphasises its approval of the Proposal, and support of the European Commission's general objective to establish a uniform legal framework for essential cybersecurity requirements for placing products with digital elements on the market in the EU and improve the functioning of the internal market.

A key focus of the Opinion was on the interaction between data protection obligations and cyber-security requirements, with the EDPS suggesting a need to clarify the synergies between the Proposal and existing European data protection laws, specifically in the area of market surveillance and enforcement.

Of particular note is the EDPS' strong recommendation that the Proposal include the Data Protection by Design and by Default principle as an integral part of the cybersecurity requirements of products with digital elements. This would ensure that data protection is built into processing operations and information systems prior to the commencement of processing activities. The EDPS supports the Proposal's requirement for data protection principles to be considered throughout the development of technologies which process personal data, therefore encouraging the inclusion of data minimisation techniques and privacy enhancing technologies (such as pseudonymisation and encryption) within a digital product's infrastructure.

The Opinion highlights the EDPS' view that the proposed European cybersecurity certificate, under the cybersecurity certification scheme, should not serve as a replacement for EU GDPR certification, further recommending that it be made clear in the Proposal that obtaining a European cybersecurity certification does not guarantee that a particular product with digital elements is compliant with the EU GDPR.

Finally, the Opinion welcomes the application of penalties similar to those of the EU GDPR for a breach: under the Proposal, non-compliant organisations would face a range of penalties, with the highest being administrative fines of up to the greater of €15 million or 2.5% of the organisation's global revenue.
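The "greater of" cap works as a simple floor-and-percentage calculation; a minimal sketch with made-up revenue figures (for illustration only, not a statement of how any fine would actually be set):

```python
# The Proposal's top fine cap: the greater of EUR 15 million or 2.5% of
# global annual revenue. Revenue figures below are made up for illustration.

def max_fine_eur(global_revenue_eur: float) -> float:
    return max(15_000_000.0, global_revenue_eur * 2.5 / 100)

print(max_fine_eur(200_000_000))    # 15000000.0 -- 2.5% is only 5m, so the floor applies
print(max_fine_eur(2_000_000_000))  # 50000000.0 -- 2.5% exceeds the floor
```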

The Opinion can be read in full here.

Enforcement

Department for Education reprimanded by ICO for children's information data breach

The Department for Education ("DfE") has been reprimanded by the ICO for a data breach arising from the unlawful processing of personal data, including children's data contained in approximately 28 million records, between 2018 and 2020. The DfE had provided the screening company Trust Systems Software UK Ltd ("Trustopia") with access to the Learning Records Service ("LRS"), a database containing pupils' learning records used by schools and higher education institutions. Despite not being a provider of educational services, Trustopia was allowed access to the LRS and used the database for age verification services, which were offered to gambling companies (to confirm their customers were over 18).

The ICO determined that the DfE had failed to protect against the unauthorised processing of data contained in the LRS. As the data subjects were unaware of the processing and unable to object or withdraw consent to the processing, the ICO deemed that DfE had breached Article 5(1)(a) UK GDPR. Additionally, the DfE had failed to ensure the confidentiality of the data contained in the LRS in breach of DfE's security obligations pursuant to Article 5(1)(f) UK GDPR.

In the reprimand the ICO noted that, but for the DfE being a public authority, the ICO would have fined the DfE just over £10 million. The reprimand from the ICO sets out the remedial actions that the DfE needs to take to improve its compliance with the UK GDPR, including: (1) improving the transparency of the LRS so that data subjects are able to exercise their rights under the UK GDPR; and (2) reviewing internal security procedures to reduce the likelihood of further breaches in the future. The DfE has since removed access to the LRS for 2,600 of the 12,600 organisations which originally had access to the database.

The ICO's reprimand to the DfE can be read in full here.

Meta receives further €265 million fine from Irish supervisory authority over EU GDPR breaches

The Irish Data Protection Commission ("DPC") has again fined Meta Platforms Ireland Limited ("Meta") for breaches of the EU GDPR. The DPC commenced its inquiry following reports that 530 million Facebook users' personal data had been made available on a hacking forum after it was scraped from the social media platform.

In issuing its fine to Meta, the DPC found that Meta had failed to ensure that sufficient technical and organisational measures were in place to protect users' personal data, in breach of Article 25 EU GDPR.

In addition to the €265 million fine issued to Meta, the DPC also issued a reprimand requiring Meta to take a number of remedial actions to ensure that such a breach does not occur in the future.

The DPC's press release announcing the issuance of the fine to Meta can be read here.

Discord fined €800,000 by French supervisory authority for failure to safeguard user data

The French data protection supervisory authority ("CNIL") has fined software company Discord Inc. ("Discord") €800,000 for various breaches of the EU GDPR.

The CNIL found that Discord did not have a written data retention policy and that there were a large number of user accounts that had not been used for over three years. Discord's failure to remove these accounts and delete the personal data associated with them arose from Discord's failure to identify and enforce a data retention period appropriate to the purpose of the data collection, in breach of Article 5(1)(e) EU GDPR.

Discord was also found to have breached several other provisions of the EU GDPR including:

  • Failing to provide information to data subjects in accordance with Article 13;
  • Failing to ensure data protection by design in accordance with Article 25(2), as Discord's voice recording features remained active even when users thought they had closed the application;
  • Failing to ensure the security of personal data pursuant to Article 32, as Discord permitted passwords under its password management policy that were deemed by the CNIL to be weak; and
  • Failing to carry out data protection impact assessments under Article 35.

During the investigation by the CNIL, Discord took a number of steps to remedy its non-compliance with the EU GDPR, resulting in the fine that the CNIL imposed being reduced.

The CNIL's statement detailing the action taken against Discord can be read in full here.

Civil litigation

Meta sued in England for breach of individuals' right to object to processing for direct marketing

Human rights campaigner Tanya O'Carroll has brought proceedings against Meta over its continued processing of her personal data for digital marketing. O'Carroll is not seeking damages; instead, she seeks only injunctive relief restraining Meta from processing her personal data for digital marketing, which she claims is in breach of her right to object under Article 21 UK GDPR.

The basis of O'Carroll's claim is that Meta does not provide its users with any option to object to data processing for marketing purposes and instead proceeds on the basis that, in agreeing to use its platforms, users have contractually agreed to the processing of their data in this manner. O'Carroll claims that this is a breach of the fundamental right to object under the UK GDPR and therefore that Meta is in breach of its data privacy obligations.

O'Carroll's claim further asserts that Meta was still assigning 'sensitive ad interests' to its users, based on sensitive data such as political interests and relationship data, as recently as October 2022, despite announcing in 2021 that it would no longer collect sensitive personal data.

Should O'Carroll's claim be successful, it would set a precedent for social media users in both the UK and the EU in relation to their fundamental right to object to their personal data being processed under privacy laws.

The Claim Form and Particulars of Claim submitted for Tanya O'Carroll in the High Court can be read in full here.

CJEU rules public registers of beneficial ownership breach fundamental Charter rights

The Court of Justice of the European Union ("CJEU") has ruled that local laws establishing publicly accessible registers listing the personal information of companies' beneficial owners, as required under the EU anti-money-laundering directive ("AML Directive"), breach Articles 7 and 8 of the EU Charter of Fundamental Rights ("Charter").

Following two requests for a preliminary ruling to the CJEU by the Luxembourg District Court in cases involving claims against the Luxembourg Business Registers, the CJEU ruled that laws granting the public a general right of access to beneficial ownership information constitute a "serious interference with the fundamental rights to respect for private life and to the protection of personal data" under the Charter. The CJEU held that, whilst the objective of general interest of the legislation enacted pursuant to the AML Directive could theoretically justify the interference with the fundamental rights under the Charter, the legislation enacted in Luxembourg did not limit such interference to that which is strictly necessary or proportionate to the pursuit of the objective.

Whilst the judgment by the CJEU has direct implications on EU beneficial ownership registers and the application of anti-money-laundering provisions in line with EU GDPR, it is also likely to lead to indirect consequences in the UK. The UK's public beneficial ownership register (the PSC register) and the Register of Overseas Entities, which was introduced earlier this year, could both face challenge under analogous principles under English law and, in particular, the UK GDPR.

The CJEU's judgment can be read in full here.

Meta seeks to overturn record fine in Irish High Court

Following the record €405 million fine issued to Meta under the EU GDPR by the DPC (which we reported on in our September update), Meta has applied for leave to appeal the fine in the Irish High Court.

In a brief hearing before the Irish High Court on 14 November, Meta outlined its grounds for appeal. Meta claims that the DPC's fine was in breach of the Charter and that parts of the Irish Data Protection Act, under which the fine was issued, are incompatible with EU law and the European Convention on Human Rights. Additionally, Meta has claimed that the DPC treated the decision given by the EDPB as binding, when it was in fact an expression of non-binding views. In its application to the Irish High Court, Meta has also requested that any appeal proceedings are heard in private.

Meta's application for leave to appeal the DPC's decision was adjourned by the Irish High Court until January. Meta has also announced that it will seek an annulment of the EDPB's decision in the CJEU.

The DPC's original decision against Meta can be read here.

Netherlands Data Protection Foundation prepares claim against Twitter over sale of personal data

The Netherlands Data Protection Foundation ("NDPF") has announced that it is preparing a mass claim against Twitter, Inc. ("Twitter") over the social media giant's collection and sale of data without permission.

The NDPF claims that Twitter gathered personal data from over 30,000 mobile apps through an advertising company called MoPub, which Twitter sold earlier this year. The personal data, some of which was sensitive health data taken from health tracking apps, is estimated to have included data from up to 11 million residents of the Netherlands.

The amount of the claim has not yet been made public by the NDPF. The claim is open to any resident of the Netherlands who installed any of the mobile apps during the eight-year period from 2013 to 2021.

Google settles with US States for nearly $400 million in privacy misrepresentation lawsuit

Google Inc. ("Google") has agreed a US$391.5 million settlement with forty US states over its alleged tracking of users' locations without consent. Attorneys general from these states had opened an investigation into claims that Google was illegally tracking its users' locations, in particular by misleading users into thinking that they had turned off their location sharing, when in fact Google was continuing to collect location data.

In addition to the financial settlement, Google will also be required to improve transparency over its location tracking by providing additional information to users on how their data is collected and used, and by signposting this information more clearly on its webpage.

The investigation was led by the Attorney General for Iowa, who announced the settlement in a statement that can be read here. The full settlement with Google can be read here.