Included in this issue of Data & Privacy News: the Swedish data protection authority fines Klarna Bank, political agreements reached on major legislation for EU's Digital Agenda and more.



On 23 April 2022, the European Commission announced that political agreement had been reached between the European Parliament and the EU Member States regarding the Digital Services Act (DSA).

First proposed by the Commission in December 2020, the DSA intends to establish a new accountability framework for the platform economy, and seeks to address a number of societal issues that have emerged through the proliferation of harmful and unlawful online content.

Whereas historically there has been very limited regulation of online content, with platforms able to adopt their own takedown policies, under the DSA "Intermediary Service Providers" (ISPs) would be compelled to act quickly to remove hate speech, terrorist propaganda and other material defined as illegal under EU laws. Organisations meeting the thresholds for "VLOP" status ("Very Large Online Platforms") will be subject to particularly rigorous new requirements, including:

  • conducting annual assessments to detect and mitigate systemic risks presented by their services;
  • sharing data with regulators and academic researchers to improve scrutiny; and 
  • providing full transparency about the decision-making processes behind "recommender" systems.

In addition, there will be a qualified prohibition on platforms deploying "dark patterns", which exploit cognitive biases in order to extract personal data from individuals and make it harder for users to unsubscribe from online services. The DSA will also introduce bans on using special category data (e.g. relating to health, sexual orientation or political views) for targeted advertising, and on using any personal data whatsoever to serve targeted ads to children.

The proposals also include substantial penalties for infringement, including fines of up to 6% of global turnover for substantive breaches and up to 1% of global turnover for failing to provide sufficient, accurate information to regulators. 

The political agreement between the European Parliament and the Council will now await formal approval by the co-legislators. It will become directly effective in all EU member states either 15 months from the date that it enters into force or on 1 January 2024, whichever date falls later. The rules for VLOPs (and for very large search engines) will apply four months from their designation as such by the Commission. 


In addition to reaching this key milestone on the DSA, on 25 March it was announced that the European Parliament and Council had also reached an agreement on a provisional text for the DSA's "sister" legislation: the Digital Markets Act (DMA). In essence, the DMA is a sector-specific competition law, seeking to sharpen existing antitrust tools to better police dominant companies operating in digital markets. It has been widely acknowledged that existing competition laws have proven ineffective in addressing market failures caused by special features of the digital economy, such as network effects and the mass concentration of valuable datasets.

The DMA introduces ex ante obligations on "Gatekeepers". These are organisations which operate core platform services and with whom business users have no option but to deal in order to access critical markets or obtain essential inputs. Gatekeepers include online marketplaces, app stores, search engines, social networks, cloud services providers and web browsers. Once designated as a Gatekeeper, an organisation will be required to comply with a detailed new rulebook of obligations and restrictions on its market conduct, many of which are informed by recent competition cases involving technology giants.

Failure to meet these onerous obligations could carry extreme penalties, particularly for repeat offenders. Recidivists could face administrative fines of up to 20% of their global annual turnover, prohibitions on acquiring other companies, or obligations to divest businesses they have already purchased. 

As with the DSA, the next stage is for the political agreement to be formally approved by the European Parliament and the Council. It will be directly applicable across the EU six months after its entry into force. 

In addition to the DSA and DMA, European lawmakers are also progressing a raft of additional proposals relating to the Digital Agenda through the legislative process. These include the Data Governance Act and EU Data Act (designed to promote dissemination of industrial data to fuel innovation and growth), and the Artificial Intelligence Act. The relatively rapid progress of this ambitious programme of reform demonstrates the consensus among the EU's member states regarding their priorities for the digital economy, and their desire to hold the world's most powerful technology companies accountable for their impact on EU markets.


On 28 March 2022, the Swedish Authority for Privacy Protection (IMY) issued an administrative fine of SEK 7,500,000 (approx. €725,510) against Klarna Bank AB, after an investigation showed that the company had failed to comply with several articles of the GDPR.

In the spring of 2020, the IMY began an audit of the FinTech company to investigate Klarna's personal data processing operations. The main issue identified was that Klarna could not provide a clear explanation of how it handles personal data, as the information it provided during the investigation was constantly changing.

The IMY found that Klarna did not provide the relevant information to explain the purpose(s) for which personal data was being processed, nor the legal basis on which personal data was being processed in relation to one of the company's services. Further, the documentation provided by Klarna gave incomplete and misleading information about the recipients of different categories of personal data. Klarna also failed to provide information about the countries outside the EU/EEA to which personal data was transferred, or where and how individuals could obtain information on the safeguards that applied to those transfers. Notably, Klarna had edited its privacy policy eleven times since the IMY opened its investigation.

Klarna has confirmed that it will appeal against the decision, and has issued a statement regarding the IMY's findings. The statement highlights the significant improvements made to Klarna's privacy notices since the commencement of the IMY investigation, and argues that the frequent changes to its privacy information were designed to ensure full transparency regarding its constantly evolving data processing activities. Klarna also points to the challenge faced by data controllers in ensuring that privacy notices are both fully transparent and sufficiently concise to be readily understood by data subjects. This challenge, however, was first presented by the GDPR in 2018 and, since then, data controllers have generally been able to strike an appropriate balance between these competing goals in their policies.

The fine is the latest in a number of data protection compliance issues Klarna has experienced in recent years. Klarna also suffered a material data breach in May 2021, and was investigated by the UK's Information Commissioner in 2020 following complaints from individuals who received newsletters from Klarna, despite not being customers and never having provided their contact information to Klarna.


The Irish Data Protection Commissioner (DPC) has fined Meta (Facebook's parent company) €17 million following an inquiry into 12 data breaches between June and December 2018. The DPC considered that Meta had failed to comply with GDPR Articles 5(2) (the accountability principle) and 24(1) (the requirement to implement appropriate technical and organisational measures to ensure and demonstrate that personal data is processed in compliance with the GDPR).

An interesting aspect of the decision is that it relates not to the data security breaches themselves, but to Meta's inability to demonstrate that the preventative measures were implemented effectively in practice. While the GDPR's accountability and governance obligations are broad and onerous, it is unusual for data protection regulators to issue large financial penalties for breaches of this nature. As such, the decision serves as a useful reminder to data controllers that they do not just need to pursue compliance with the requirements of the GDPR; they also need to show their work.

A spokesperson for Meta said: "This fine is about record keeping practices from 2018 that we have since updated, not a failure to protect people's information". This is consistent with the DPC's position that, despite the occurrence of data breaches, the penalty relates to Meta's failure to demonstrate appropriate technical and organisational measures, rather than to the security of the data itself.

Another noteworthy aspect of the decision is that it is the first time that the cooperation provisions under Article 60 GDPR have been used to achieve consensus among affected supervisory authorities (SAs). As Meta is headquartered in Dublin, the DPC led the inquiry as Meta's Lead Supervisory Authority under the GDPR's one-stop-shop mechanism. However, the investigation concerned cross-border personal data processing by Meta which affected data subjects in numerous EU member states, and objections to the DPC's draft decision were initially raised by SAs in Germany and Poland. It took further engagement between the affected SAs to reach agreement on the DPC's proposed approach; an outcome which may be all the more satisfying to the DPC in light of recent criticism it has faced for perceived failures to rein in the tech giants that choose Ireland as their EU residence.


Brussels Airport Company (BAC) has been fined €200,000 by the Belgian data protection regulator (APD) for processing special category personal data without a valid legal basis. As part of BAC's efforts to prevent the transmission of COVID-19 among travellers, thermal cameras were deployed from June 2020 to January 2021 to check whether passengers had a body temperature of 38ºC or above.

Another organisation, Ambuce Rescue Team (ART), carried out a secondary check on passengers by administering a questionnaire which collected information regarding possible COVID symptoms and other health data. ART has also been fined €20,000.

Data concerning health (including body temperature measurements) constitutes special category personal data under the GDPR. As such, lawful processing of such data requires one of the conditions under GDPR Article 9(2) to be satisfied. These conditions include:

  • where the processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued (Article 9(2)(g)); and 
  • where processing is necessary for reasons of public interest in the area of health, such as protecting against serious cross-border threats to health…on the basis of Union or Member State law which provides for suitable and specific measures to safeguard the rights and freedoms of the data subject (Article 9(2)(i)). 

The APD reasoned that, since the checks were conducted pursuant to a non-binding "Protocol" rather than a Member State law, neither BAC nor ART was able to rely on the exceptions relating to public health or substantial public interest. The data controllers were also unable to demonstrate that the processing was necessary for compliance with a legal obligation, or for a task carried out in the public interest. In addition, the APD stated that neither the controllers' lawful basis, purposes, nor conditions for processing were sufficiently clear, precise or foreseeable to data subjects.

Throughout the pandemic, data protection regulators have emphasised that the GDPR should not serve to frustrate Member States' ability to collect and process personal data that is necessary to deal with emergencies such as public health crises; but that data controllers must nevertheless continue to operate within its framework. In the absence of the ability to rely on either Article 9(2)(g) or 9(2)(i) of the GDPR, BAC and ART were required to ensure that their processing satisfied an alternative condition under Article 9, such as explicit consent.


The Bank of Ireland (BoI) has been fined €463,000 by the DPC in relation to numerous personal data breaches, and other GDPR infringements relating to BoI's breach response.

Between November 2018 and June 2019 BoI notified 22 breach incidents to the DPC, 19 of which were found to meet the GDPR definition of a Personal Data Breach. The incidents arose in connection with the supply of data by BoI to the Central Credit Register (CCR), a system operated by the Central Bank of Ireland to process information relating to loans. Within the CCR, borrowers can request their credit report to check what information banks have submitted about their loans, and banks can use these reports to ascertain a person’s existing loans and credit history.

BoI learned that its data feed to the CCR had been corrupted, leading in some cases to unauthorised, accidental disclosures of personal data to the CCR. In other cases, inaccurate data was disclosed, leading to some data subjects' records erroneously reflecting that they were in financial distress. This was likely to have a significant adverse impact on many of the 47,000 data subjects ultimately found to have been affected by the breach, who may have been denied loans as a result (though BoI initially reported to the DPC that only one data subject was affected).

In addition to BoI's breaches of Article 32 of GDPR (obligation to implement appropriate measures to ensure the level of security is appropriate to the risk of processing), BoI was also found to have committed multiple breaches of Article 33 (obligation to notify data breaches to the supervisory authority without undue delay), and Article 34 (obligation to notify data subjects where a breach is likely to result in a high risk to their rights and freedoms). 

The DPC's decision highlights that the GDPR's definition of "Personal Data Breach" is not limited to breaches of confidentiality of personal data, but also breaches affecting the availability or integrity of that data. This includes situations where personal data is accidentally or unlawfully altered, destroyed, or lost (even temporarily) in addition to situations of unauthorised access or transmission.