Executive summary
Welcome to the latest edition of Updata!
Updata is an international report produced by Eversheds Sutherland's dedicated Privacy and Cybersecurity team. It provides you with a compilation of key privacy and cybersecurity regulatory and legal developments from the past quarter.
This edition covers October to December 2021 and is full of newsworthy items from our team members around the globe, including:
the emergence of new national cybersecurity strategies and new cyber-related legislation, including developments in Austria, China, the US, the EU and the UK
draft proposals from China for the security assessment of cross-border data transfers, and a ruling from a German court regarding cross-border data flows
an increasing volume of significant case law decisions relating to the GDPR across Europe, including in Austria, Belgium, France, Germany and the Netherlands
a continued emphasis on transparency and redress in the use of artificial intelligence and automated decision-making, including a new measure from New York City
new guidance issued by regulators: in the EU on the interplay between the application of GDPR's Article 3 and the rules on international transfers; in China on network data classification and grading; in France on data protection officers; and in Hong Kong on the implementation of rules criminalising acts of doxxing
developments in the regulation of data gathering and sharing practices amidst the COVID-19 pandemic, including in Germany, Ireland, the Netherlands, Slovakia and the UK
important new guidance on cookies in France and Germany
continued monitoring and enforcement of the implementation of adequate security measures, including in France, Lithuania, the Netherlands and Sweden
proposed legislation on data sovereignty and the cloud in South Africa
We hope you enjoy this edition of Updata.
Follow us on Twitter at:
@ESPrivacyLaw
Paula Barrett Co-Lead of Global Cybersecurity and Data Privacy T: +44 20 7919 4634 [email protected] eversheds-sutherland.com
Michael Bahar Co-Lead of Global Cybersecurity and Data Privacy
T: +1 202 383 0882 [email protected] eversheds-sutherland.com
EU and International
Contributors
Paula Barrett Co-Lead of Global Cybersecurity and Data Privacy
T: +44 20 7919 4634 [email protected] eversheds-sutherland.com
Lizzie Charlton Senior Associate PSL (Data Privacy)
T: +44 20 7919 0826 [email protected] eversheds-sutherland.com
A resolution calling for AI safeguards passes through European Parliament
In a resolution adopted by 377 votes in favour, 248 against and 62 abstentions, the European Parliament has called for stronger safeguards where artificial intelligence is used in law enforcement. The resolution highlights algorithmic bias in AI applications and stresses that human supervision and strong legal powers are necessary to prevent discrimination by AI, especially where such tools are used for law enforcement or border crossing. MEPs supporting the resolution demanded that human operators always make the final decisions and that those who will be monitored by AI systems should have a route to seek remedy. Whilst this will not impact UK firms directly, both the UK Government's Centre for Data Ethics and Innovation and the ICO (see below) have confirmed they are looking into these EU safeguards, so we may see something similar in the UK soon.
Date: 6 October 2021
Links: European Parliament News
EDPB releases guidelines on the interplay between the application of GDPR's Article 3 and rules on international transfers
The EDPB has issued draft guidance on the interplay between the application of the EU GDPR's territorial scope provisions and rules on international transfers for consultation. The consultation closes on 31 January 2022.
Date: 19 November 2021
The guidance is particularly relevant to any organisations subject to the EU GDPR and transferring personal data internationally.
The guidance explains that the following three elements are required for there to be a transfer of personal data to a third country or to an international organisation:
i. a controller or a processor is subject to the GDPR for the relevant processing;
ii. this controller or processor ("exporter") discloses by transmission or otherwise makes personal data, subject to the processing, available to another controller, joint controller or processor ("importer"); and
iii. the importer is in a third country or is an international organisation. The guidance clarifies that this element applies irrespective of whether or not the importer is subject to the EU GDPR in respect of the relevant processing in accordance with Article 3.
This draft guidance is noteworthy in that it clarifies that a cross-border transfer will take place even if the importer is subject to the EU GDPR (pursuant to Article 3(2)), a query we raised in the client briefing we published when the new EU standard contractual clauses ("new EU SCCs") were released in June 2021.
Recital 7 of the new EU SCCs specifies that the clauses can only be used if the importer is not subject to the EU GDPR. Where an importer is subject to the EU GDPR (pursuant to Article 3(2)) but based in a third country, an appropriate safeguard is required to protect the transfer of personal data. However, there are no official standard contractual clauses currently available to address this. The guidance confirms that any safeguard implemented to cover this scenario should be adapted in order not to duplicate the EU GDPR obligations but rather address the elements and principles that are "missing". It is possible that a set of specific clauses covering this scenario will be released by the European Commission in due course. In its guidance, the EDPB says it "encourages and stands ready to cooperate in the development of a transfer tool, such as a new set of standard contractual clauses" in this regard.
On a related note, readers will be aware that the UK's ICO has recently consulted on tools and guidance to be used in the context of transfers of personal data out of the UK. The ICO's consultation included a question regarding whether processing by the importer must not be governed by the UK GDPR for a restricted transfer to occur. The ICO is expected to publish its final guidance and transfer tools in the new year.
Links: Draft guidance
EDPB launches first coordinated action proposal regarding cloud-based services in the public sector
Following the EDPB's establishment of its Coordinated Enforcement Framework, it has launched its first coordinated action, on the use of cloud-based services in the public sector.
Date: 18 October 2021
In a coordinated action, the EDPB will prioritise a certain topic which supervisory authorities then consider and work on at a national level. The outcome of the national actions will be collated and analysed, allowing a deeper insight into the relevant topic and providing the opportunity for more specific follow-up actions to be taken at both a national and an EU level.
Links: Coordinated Enforcement Framework; EDPB Press Release
EDPB finalises guidelines on restrictions of data subject rights under Article 23 GDPR
The EDPB has finalised its Guidelines on restrictions of data subject rights under Article 23 GDPR, following public consultation.
Date: 13 October 2021
Article 23 GDPR provides that EU or member state law may restrict certain rights contained in the GDPR when such a restriction respects the essence of the fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society to safeguard matters including (among others): national security; defence; the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security; and the protection of judicial independence and judicial proceedings.
The guidelines seek to provide guidance on the application of Article 23 and provide an analysis of:
- the criteria to apply restrictions;
- the relevant assessments;
- how data subjects can access their rights once a restriction is lifted; and
- the consequences of infringing Article 23.
According to the EDPB, "the guidelines analyse how the legislative measures setting out the restrictions need to meet the foreseeability requirement and examine the grounds for the restrictions listed by Art. 23(1) GDPR, and the obligations and rights which may be restricted."
Links: EDPB Press Release; Guidelines on restrictions under Article 23 GDPR
European Commission publishes joint statement on first review of EU-Japan adequacy agreement
Date: 26 October 2021
The European Commission and the Personal Information Protection Commission of Japan ("PPC"), along with other relevant authorities of Japan and EU data protection authorities, published a joint statement on the first review of the EU-Japan mutual adequacy agreement which has been in place since 2019.
The review covers all aspects of the functioning of the adequacy decisions adopted by the EU and Japan, including consideration of broader legal developments in the data protection sphere and of government access to data. The review also provided an opportunity for the sharing of information and experience on issues of common interest.
During the meeting, participants stressed the shared commitment by both parties to high privacy standards, which are considered to be an essential element of the human-centric approach to the opportunities and challenges of our digital age.
Following the meeting and to conclude the review process, the European Commission and PPC will publish separate reports on the functioning of the decisions.
Links: Joint statement on the first review of the EU-Japan mutual adequacy arrangement
ENISA publishes ninth edition of Threat Landscape report
The European Union Agency for Cybersecurity ("ENISA") released the ninth edition of its Threat Landscape Report, which identifies the prime threats and trends in relation to cybersecurity and reports on relevant mitigation measures that may be implemented.
Date: 27 October 2021
Nine categories of threat are reported on, with ransomware identified as the prime threat for 2020-2021. By contrast, there has been a decline in other types of malware throughout 2021. Monetisation has been reported as one of the key motivations for cybercriminals, with a record high in cryptojacking infections in the first quarter of 2021, thought to have been incentivised by the associated financial gains. Other threat categories covered include threats against data, cryptojacking and e-mail related threats.
The report focuses on four key threat actors: state-sponsored actors; cybercrime actors; hackers-for-hire actors; and hacktivists.
The report suggests COVID-19 was the "dominant lure" in email attack campaigns and that there has been a surge in healthcare-related breaches. COVID-19 also led to an increase in non-malicious incidents caused by human error and system misconfiguration.
Links: ENISA Report; Press release
European Commission strengthens the cybersecurity of wireless devices and products
The European Commission ("Commission") took action to improve the cybersecurity of wireless devices, including mobile phones, smart watches, fitness trackers and wireless toys, available on the European market, through a delegated act to the Radio Equipment Directive
29 October 2021
The Act aims to make sure that all wireless devices are safe before they are sold on the market. The Act sets out legal requirements, in relation to cybersecurity safeguards, for manufacturers when designing and producing relevant products.
The new measures will help to:
- improve network resilience: through features which avoid harming communication networks and prevent devices from being used to disrupt the functionality of websites or other services;
- better protect consumers' privacy: through features to guarantee the protection of personal data; and
- reduce the risk of monetary fraud: through features to minimise the risk of fraud when making electronic payments.
The measures also include making the protection of children's rights essential to legislation in this area.
The Act will now be subject to a two-month scrutiny period after which, provided no objections are raised by the European Council or European Parliament, it will come into force. Once in force, manufacturers will have 30 months to start complying, giving the industry sufficient time to adapt relevant products.
The Commission has also produced a Q&A setting out further detail as to why they are strengthening cybersecurity in this area.
Links: Q&A: Strengthening cybersecurity of wireless devices and products
EDPB adopts statement on the Digital Services Package and Data Strategy
The EDPB adopted a statement on the European Commission's proposed Digital Services Package and Data Strategy, including the Digital Services Act, the Digital Markets Act, the Data Governance Act and the Regulation on a European approach for Artificial Intelligence.
Date: 18 November 2021
The statement sets out three key areas of concern:
- lack of protection of individuals' fundamental rights and freedoms;
- fragmented supervision; and
- risks of inconsistency.
The EDPB considers that without further amendments the proposals will lead to a significant level of legal uncertainty which would undermine the current and any future legal framework; the EDPB suggests this may in turn result in a failure to create the conditions for innovation and economic growth envisaged by the proposals.
Moreover, the EDPB sets out the risk of parallel supervision structures in which different competent authorities supervise the same entities without proper cooperation between the authorities.
In addition, the EDPB stated that there is a lot of potential for the operative text of the proposals to create ambiguity.
The EDPB has suggested that the Commission avoid ambiguities; that the proposals should not affect or undermine the application of existing data protection rules, and should ensure that data protection rules prevail whenever personal data is processed; and that forthcoming legislative proposals include specific, defined data protection safeguards at the outset.
Links: EDPB statement
Provisional agreement reached on proposed Data Governance Act
The Council of the European Union ("Council") and the European Parliament announced that they had reached a provisional agreement on the Data Governance Act ("DGA"), which is designed to promote the availability of data. The DGA's purpose is to improve data protection standards whilst also introducing new mechanisms to enable better use of data going forward. This will involve public sector data being made available for reuse over a defined period, with a single register being used.
Date: 30 November 2021
The DGA will provide a framework to foster "data intermediation services", a new business model aimed at providing companies and individuals with a secure environment within which they can share data. Such services can take the form of digital platforms, which support voluntary data sharing, and which can be used by companies without the risk that data will be misused or result in loss of competitive advantage.
The DGA is designed to foster "data altruism for the common good" by making it easier for individuals and companies to make data available voluntarily for key areas, such as medical research projects.
The DGA provides for voluntary certification by service providers, to make it easy to identify providers of data intermediation services and data altruism organisations. The DGA will also create safeguards for public-sector data, data intermediation services and data altruism organisations to prevent the unlawful transfer of non-personal data and/or governmental access to such data, mirroring the safeguards for personal data that already exist under the EU GDPR.
Alongside this, a European Data Innovation Board will be created to assist and advise the European Commission on key issues in this area, including "the interoperability of data intermediation services" and how to facilitate the development of data spaces.
The provisional agreement is subject to approval by the Council, and if approved the DGA will apply 15 months after its entry into force.
Links: European Commission press release; European Parliament press release; European Council press release
Proposal for harmonised rules on AI
The Council of the European Union has released proposed legislative text for harmonised rules on Artificial Intelligence ("AI"). This follows the European Commission's proposals on the same in April 2021.
In brief, the legislative text will ensure that national security remains within the remit of Member States, in order to comply with Article 4(2) of the Treaty on European Union. The legislative text has been amended to clarify that it applies to more recent technology that is classified as AI. The text also sets out which AI practices are to be prohibited, such as those relating to remote biometric identification systems in public spaces, and extends protection to stop AI presenting a threat to vulnerable people, whether due to societal or economic factors. The text also defines what high-risk AI systems are and how to identify them. In addition, the European Commission will be responsible for monitoring the enforcement of the rules, with reporting required every two years.
The legislative text is to be further reviewed before progressing to the next stage.
Date: 29 November 2021
Links: Proposed legislative text
EU Council agrees a general approach on measures for common level of cybersecurity
Date: 3 December 2021
The EU Council has agreed a general approach on measures for a common level of cybersecurity across the EU, further improving the resilience and incident response capacities of the EU in both the public and private sectors. This forms part of the EU's continuing focus on building resilience to ever-evolving cyber threats and keeping the digital society and economy safe and secure.
The new directive, "NIS2", will replace the current NIS directive, on security of network and information systems and "aims to remove divergences in cybersecurity requirements and in implementation of cybersecurity measures in different member states". It sets the baseline
3 December 2021
Links
Proposed legislative text
Press release
Updata Edition 14 October to December 2021 | EU and International
8
General EU and International
Development
Summary
for cybersecurity risk management measures and reporting obligations across all sectors covered by the directive, including energy, transport, health and digital infrastructure. NIS2 sets minimum rules for a regulatory framework and lays down mechanisms for effective cooperation between member states. It will also formally establish the European Cyber Crises Liaison Organisation Network (EU-CyCLONe) to support the co-ordinated management of large-scale security incidents.
NIS2 also introduces a size-cap rule which aims to ensure that all medium-sized and large entities operating within the relevant sectors or providing the relevant services will fall within its scope. There are also additional provisions to ensure proportionality, a higher level of risk management, and clear-cut criticality criteria for determining which entities are covered.
The Council has also aligned the text with sector-specific legislation, streamlined the reporting obligations and introduced a voluntary peer-learning mechanism to increase mutual trust and learning from good practices and experiences.
The Council will now start negotiations with the European Parliament, to agree the final text.
Links: Press release
Negotiating mandate agreed by European Parliament on Digital Markets Act
The Digital Markets Act is directed at the 'gatekeepers' to the single market and aims to ensure fair markets in the digital sector. At the European Parliament plenary session on 15 December 2021, the European Parliament failed to adopt its first reading position on the proposal. The Parliament instead introduced amendments to the text originally proposed by the Commission and this amended text has been sent back to the lead committee for informal trialogue negotiations.
The position adopted by the Internal Market and Consumer Protection Committee of the European Parliament on 23 November 2021 was approved by European Parliament at the plenary. The key changes approved by Parliament to the Commission's proposal are summarised in the accompanying Press release.
Date: 15 December 2021
Links: Press release; Digital Markets Act
EDPB December 2021 plenary outcomes announced by EDPB
Date: 14 December 2021
Following its December 2021 plenary, the EDPB has published its contribution to the review of Directive (EU) 2016/680 (the Data Protection Law Enforcement Directive ("LED")), which seeks to harmonise individuals' data protection across the EU. In particular, the EDPB emphasised that for effective implementation of the LED, member states will need to have appropriate resources available. The EDPB also reaffirmed its commitment to provide continued guidance on how the LED should be interpreted. Following the establishment of a Support Pool of Experts ("SPE"), the EDPB has agreed that the SPE will aim to provide material support to EDPB members through expertise and to enhance cooperation and solidarity between EDPB members.
The EDPB also adopted a formal response to MEP Ujhelyi in relation to hacking spyware Pegasus, confirming that the EDPB will be vigilant in relation to developments that interfere with fundamental privacy rights and data protection.
Further, the EDPB adopted a final version of the Guidelines on examples regarding data breach notifications, which aim to help controllers decide how to handle data breaches and to navigate the factors that should be considered in risk assessments.
Links: Press release
EU adopts adequacy decision for transfer of personal data to the Republic of Korea
The Commissioner for Justice Didier Reynders and Chairperson of the Personal Information Protection Commission Yoon Jong have announced that the European Commission has adopted an adequacy decision under the EU GDPR in relation to transfers of personal data from the European Union to the Republic of Korea. This is the final step in the process, since adequacy talks were concluded in March 2021.
As a result, personal data will be able to be transferred from the EU to the Republic of Korea without the need for the adoption of further authorisations or additional tools. The decision also supports the EU-Republic of Korea Free Trade Agreement.
Date: 17 December 2021
Links: Press Release
ENISA releases report on security of machine learning algorithms
ENISA has published a new report on Securing Machine Learning Algorithms, which presents a taxonomy of machine learning ("ML") techniques and core functionalities and maps out the threats targeting ML techniques and the vulnerabilities of ML algorithms. The report also includes a list of recommended security controls to enhance cybersecurity in systems relying on ML techniques.
Date: 14 December 2021
Links: Press release
Austria
Contributors
Georg Roehsner Partner
T: +43 15 16 20 160 [email protected] eversheds-sutherland.at
Manuel Boka Partner
T: +43 15 16 20 160 [email protected] eversheds-sutherland.at
Michael Roehsner Legal Director
T: +43 15 16 20 160 [email protected] eversheds-sutherland.at
Rémy Schlich Associate
T: +33 15 57 34 206 [email protected] eversheds-sutherland.at
European Court of Human Rights: Online platforms may be required to disclose users' identity only after a balancing of interests
In its decision Standard Verlagsgesellschaft MBH v. Austria (app no. 39378/15) the European Court of Human Rights ("ECtHR") ruled that Austria had violated the complainant's right to freedom of expression by requiring an online newspaper platform to disclose the identity of certain users of the platform.
The users had posted harsh criticism against an Austrian political party, including polemic allegations of corruption as well as insinuating a connection between the party and extreme right ideology and national socialist actions (which are prohibited by law in Austria).
The party and one of its politicians requested that the platform disclose the individuals' identity in order to be able to initiate civil and criminal proceedings against them. The platform deleted the posts but refused to disclose the users' identity. The party and the politician then filed a lawsuit against the platform. The platform was ultimately ordered by the Austrian Supreme Court to disclose the users' identity.
The platform filed a complaint at the ECtHR for violation of the Fundamental Right of Freedom of Speech and Freedom of the Press (Article 10 ECHR).
Date: 7 December 2021
The ECtHR considered that such comments and the identity of the user were not protected under Freedom of the Press, as this case did not concern the protection of journalistic sources.
However, the ECtHR stated that anonymity on the internet can be an important value under the Right to Freedom of Expression. While this anonymity must yield on occasion to other legitimate overriding interests, the ECtHR stated that this balancing of interests must be assessed on a case-by-case basis.
In the present case, the ECtHR ruled that there were no such overriding interests. The ECtHR considered the statements did not amount to hate speech or incitement to violence and were not otherwise clearly unlawful. Furthermore, the comments concerned politicians and a political party and were expressed in the context of a public debate on issues of legitimate public interest.
As the Supreme Court had not conducted the required balancing of interest, the ECtHR ruled that the Supreme Court's order to disclose the users' identity was a violation of Article 10 ECHR.
Links: Decision (English)
Austrian Federal Administrative Court: No right to appeal against determination of lead supervisory authority under Article 56 GDPR
The Austrian Federal Administrative Court decided an appeal by a complainant against a decision by the Austrian data protection authority ("DPA").
The complainant had filed a complaint against an international streaming provider for alleged violation of the right to access (Article 15 GDPR) at the Austrian DPA.
Date of Decision: 7 December 2021
Published: 20 December 2021
The Austrian DPA had stated that the matter was subject to the competence rules of Article 56 GDPR and halted the proceeding to start the consistency mechanism and decide the competent lead supervisory authority. In this mechanism, the Austrian DPA agreed with the Irish DPC that the Irish DPC would be lead supervisory authority.
The complainant contested the competence of the Irish DPC as lead supervisory authority and following a formal dismissal of this contest filed an appeal to the Federal Administrative Court.
The Federal Administrative Court upheld the decision by the DPA. It ruled that GDPR does not grant a right of appeal against agreements between supervisory authorities on the competent lead supervisory authority. The fact that a supervisory authority is not considered to be competent could only be made subject to an appeal against the final decision on the matter by the DPA.
Furthermore, the complainant could have filed an appeal against the DPA's decision to halt the proceeding in the first place, arguing that Article 56 GDPR was not applicable to this case. As the complainant had not done so, the Federal Administrative Court deemed Article 56 GDPR to be applicable.
Links: Decision (German)
Higher Regional Court Linz: data subject's representation costs in successful complaint can be claimed as damages under Article 82 GDPR
The Higher Regional Court of Linz ruled that a complainant in a DPA procedure in which the unlawfulness of the data processing by the defendant controller is established may claim their representation costs (including appropriate lawyer fees) as compensation under Article 82 GDPR from the defendant in a civil law proceeding.
It remains unclear from the decision whether this applies to all DPA cases or if this is limited to cases where the violation of GDPR had to have been obvious to the defendant.
The court also stated obiter dictum that this may not apply to the defendant's representation costs where a DPA complaint is unsuccessful. According to the Court, it may be argued that data subjects filing a complaint at the DPA do not have to bear the defendant's representation costs, even if the complaint is unsuccessful.
Date: 10 December 2021
Links: Decision (German)
Austrian DPA: Purchase of Customer Data Base via asset deals does not entitle purchaser to send direct marketing messages to these customers
The Austrian DPA decided on a complaint claiming a violation of the direct marketing rules implemented by Austria under the EU's ePrivacy Directive.
The defendant acquired assets of an insolvent company, including goodwill and customer base. The defendant then sent email newsletters to the former customers of the insolvent company listed in the purchased customer base for direct marketing purposes. The recipients of the messages had not consented to receiving these direct marketing messages by the purchaser of the assets.
The Austrian DPA ruled that this direct marketing was a violation of the ePrivacy Directive and its Austrian implementation.
Date of Decision: 10 December 2021
Published: 3 November 2021
According to these rules, the sending of electronic mail for direct marketing purposes without consent is only lawful if the sender has received the contact information for the messages in connection with a sale or service to its customers. In this case, since the purchaser had no direct contractual relationship with the recipients, the defendant was considered to have violated the rules of the ePrivacy Directive and its Austrian implementation.
The decision is not yet legally binding; the defendant can appeal to the Federal Administrative Court.
In a similar case, the Austrian Supreme Court recently voiced doubts as to whether this legal assessment by the DPA was correct. However, the Austrian Supreme Court only made these comments informally, so there is no final binding decision on this matter by the Austrian Supreme Court yet.
Links: Summary of Decision in DPA Newsletter (German)
Austrian Federal Administrative Court and Higher Regional Civil Court of Vienna issue contradicting judgments on permissibility of teacher evaluation app under data protection law
The defendant operates a teacher evaluation app. In this app, students can give scores to their teachers and rate them in different categories. This is supposed to provide students and parents with a better picture of the teaching quality at school.
Several teachers filed complaints and a lawsuit against this app based on alleged violations of the GDPR. As GDPR violations can be subject to both administrative complaints at the DPA and the administrative courts as well as to civil lawsuits filed at the civil courts, the matter was handled in separate proceedings both at the administrative courts and at the civil courts.
Following appeals in both proceedings, the Federal Administrative Court and the Higher Regional Civil Court of Vienna issued contradicting rulings.
The Federal Administrative Court upheld a decision by the Austrian DPA, according to which the app did not violate GDPR. It ruled that public interest would overrule the teachers' interest in preventing these evaluations. It also ruled that the security measures taken by the app to prevent abuse were sufficient. While the Court considered abuse of the app not unthinkable, it considered it very unlikely. Therefore, the Federal Administrative Court ruled that the app does not violate GDPR. It is not known whether the complainant has filed an appeal to the Administrative Supreme Court.
In a parallel proceeding, a teacher filed a lawsuit against the app at the Civil Court of Vienna, requesting deletion of the claimant's personal data from the app under Article 17 GDPR. The Civil Court of Vienna deemed the app to be compliant with the GDPR; the claimant appealed to the Higher Regional Court of Vienna. The Higher Regional Court overruled the decision, stating that the app violates GDPR, as the security measures taken were not sufficient to prevent abuse of the app and to prevent harassment via the app. The Court ordered the app to delete the claimant's personal data within 14 days after enforceability of the judgment. It is not known whether the defendant has filed an appeal to the Supreme Court.
Date of Decision by the Federal Administrative Court: 10 December 2021
Date of Decision by the Higher Regional Civil Court: 7 December 2021
Links: Decision by the Federal Administrative Court (German); Media report on decision by Regional Civil Court of Vienna (German) (NB the decision itself is not yet published)
Austrian Federal Administrative Court: Information about assumed political affinity is special category personal data (Article 9 GDPR)
Date of Decision: 10 December 2021; Published: 20 December 2021
The Austrian Federal Administrative Court decided on an appeal against a decision by the Austrian DPA.
The defendant had used publicly available data and other collected data to calculate certain assumptions about data subjects. This included assumed affinity to certain political parties.
The claimant filed a complaint against this practice at the Austrian DPA. The DPA ruled in favour of the complainant, and the defendant appealed.
The Austrian Federal Administrative Court followed prior case law of the Austrian Supreme Court in a civil court case. It ruled that data about an assumed affinity to certain political parties is personal data, as even assumptions about an identified or identifiable person can be personal data. As the data referred to political opinion, it was considered special category personal data pursuant to Article 9 GDPR.
As the defendant had not obtained the complainant's consent for the processing of the relevant data, the Court ruled that the defendant had violated the rights of the complainant under GDPR.
Links: Decision (German)
Austrian Federal Administrative Court rules on required detail of privacy policies and responses to data access requests
Date of Decision: 10 December 2021; Published: 21 October 2021
The Austrian Federal Administrative Court decided on an appeal against a decision by the Austrian DPA regarding the alleged violation of GDPR's right to information (Article 14) and right to access (Article 15) by the defendant credit agency.
The Court ruled that under Article 15(1)(g) GDPR the controller has to inform the data subject about the sources of the processed personal data only insofar as this information is still available to the controller. The controller is not obliged to document the data sources. It is sufficient for the controller to provide general information about the sources from which the controller collects the personal data.
The Court also ruled that the defendant violated Article 15 GDPR regarding the provided information about storage periods, as the defendant merely stated that personal data is stored for as long as the content was correct, there was no legal reason for deletion and the storage fulfilled the purpose of the processing. The Court ruled that insofar as possible, the controller has to name the different storage periods for the different categories of data and/or different processing purposes. If these storage periods are not mentioned, this violates the right to access and the principle of transparency.
Furthermore, the Court ruled that under Article 14 GDPR, the credit agency has to proactively inform the data subject when the data subject's personal data is transferred to a new recipient (e.g. a customer conducting a credit check on the data subject).
Links: Link to the decision (in German)
Austrian Supreme Court stays class action proceeding regarding algorithm-based credit rating assessment until CJEU preliminary ruling on Article 80 GDPR
In a class action by an Austrian consumer protection body regarding several contract clauses and business practices under the consumer credit umbrella, the use of an automated in-house scoring system based on information provided by a credit agency, i.e. an algorithm for a first-level credit rating assessment was challenged under Article 22 GDPR (rules on profiling).
The plaintiff based the class action on alleged violation of consumer protection legislation, which they deemed to be a breach of the GDPR. However, the Austrian legislator has not implemented the mandate under Article 80(2) GDPR that would allow organisations such as this consumer protection body to enforce GDPR violations aside from specific mandates from data subjects.
Date of Decision: 10 December 2021
Published: 15 December 2021
The first two instances, therefore, have dismissed this part of the class action, stating that the organisation is not authorised under Article 80 GDPR. The Supreme Court is staying the proceeding until the Court of Justice of the European Union has issued its decision in C-701/20, a similar Austrian case concerning the same plaintiff organisation, covered in Updata Edition 11.
Links: Decision (German)
Federal Administrative Court: Processing of personal data can be based on more than one lawful basis; DPA has to review all possible lawful bases
The Federal Administrative Court decided on an appeal by the operator of a loyalty programme against a decision by the Austrian DPA. The loyalty programme had based its processing of personal data on the data subjects' consent. The DPA considered this consent not to be valid and therefore ruled that the loyalty programme violated GDPR.
Date of Decision: 10 December 2021
Published: 8 December 2021
The Federal Administrative Court repealed this decision and referred the matter back to the DPA for further investigation and the issuing of a new decision. It ruled that the DPA violated procedural law by only assessing the loyalty programme against the lawful basis of consent. The Court stated that processing of data can be based on more than one lawful basis. Even if the controller only claimed consent as the lawful basis, the DPA would have been obliged to assess whether the processing could be based on another lawful basis (including legitimate interests). Only if the DPA concludes that the processing cannot be based on any applicable lawful basis may it rule that the processing violates the GDPR's principle of lawfulness. While referring to the wrong lawful basis may be a violation of the principle of fairness and transparency, this does not automatically render the processing unlawful.
Therefore, the matter was referred back to the DPA for further investigation and issuing of a new decision.
Links: Decision (German)
Austrian Strategy for Cybersecurity 2021 published
Date: 22 December 2021
The Austrian Federal Chancellery has published the Austrian Strategy for Cybersecurity 2021 (Österreichische Strategie für Cybersicherheit 2021).
Links: Austrian Strategy for Cybersecurity 2021 (German)
Austrian DPA fines company EUR 3,000 for lack of cooperation
Date of Decision: 22 December 2021
The Austrian DPA has issued a fine of EUR 3,000 to a company for violation of Article 31 GDPR. In a proceeding in which several employees had filed complaints against the company at the DPA, the DPA requested that the company file a statement regarding these complaints within two weeks. The company did not comply with the request, and only responded months later.
The DPA considered this a violation of Article 31 GDPR and issued a fine of EUR 3,000 to the company for its lack of cooperation. The DPA considered this low fine to be sufficient, taking into account the company's size and financial position.
Published: 17 December 2021
Links: Decision (German)
Belgium
Contributors
Koen Devos Partner
T: +32 2 737 9360 [email protected] eversheds-sutherland.be
Stefanie Dams Associate
T: +32 2 737 9364 [email protected] eversheds-sutherland.be
Caroline Schell Senior Associate
T: +32 2 737 9353 [email protected] eversheds-sutherland.be
Belgian DPA finds taking photographs of visitors of gaming establishments disproportionate in an opinion on the EPIS-system and access register
The Belgian Data Protection Authority ("DPA") issued an opinion on a Draft Royal Decree amending two Royal Decrees of 15 December 2004 concerning the Excluded Persons Information System ("EPIS-system") listing the persons who should be denied access to gaming establishments and the access register.
Date: 4 October 2021
The DPA stated that it would be disproportionate and unnecessary to take a photograph of each player each time they visit a gaming establishment and to store the photograph in the access register. The DPA further recommended an alternative technological solution based on the use of the electronic authentication module of a player's identity card instead of taking a photocopy of the card on each visit. This would prevent the Gaming Commission from being aware of (i) the identity of persons visiting gaming establishments as well as (ii) the exact time of such visits.
Links: Decision (Dutch); Decision (French)
Belgian Supreme Court rules that the loss of discount must be taken into account in relation to the free nature of consent
Date: 7 October 2021
In 2019, a merchant was fined by the Belgian DPA for requiring customers to present their Belgian electronic identity card ("eID") in order to benefit from a loyalty scheme. In the case, the claimant refused to have her eID read in the merchant's computer system to create a customer loyalty card and, as a result, and in the absence of an alternative (such as providing the strictly necessary personal data on paper), she was unable to benefit from the advantage of certain discounts. Later, the Court of Appeal of Brussels annulled the fine, particularly because the personal data of the claimant was not actually processed, as she refused to provide her eID to the merchant.
The Belgian Supreme Court ruled on two aspects: (i) that the loss of discounts must be taken into account when evaluating the nature of consent under Article 4(11) GDPR and whether it has been freely given; and (ii) it confirmed that the Belgian DPA is allowed to act on a complaint of a data subject, even if no personal data of the data subject is being processed. This ruling was on the basis that a data subject's rights under the GDPR are triggered when they are obliged to have their personal data processed in order to benefit from an advantage or service.
The Belgian Supreme Court referred the case back to the Belgian Court of Appeal (the Market Court in a different composition). It remains to be seen how that court will deal with these aspects when it re-assesses the merits of the case.
Links: Judgment (Dutch)
Belgian DPA dismissed complaint due to lack of personal interest
The Belgian DPA issued a decision in a procedure on the merits regarding a complaint about the unsecured connection of a hospital website. In particular, the hospital website made use of a contact form and a form for the hospital ombudsman, both of which could be filled in by website visitors in unencrypted format. According to the complainant, third parties could learn about the (health) data of the data subjects if they filled in these forms using an unsecured connection.
Date: 22 October 2021
The Inspection Service of the Belgian DPA established several breaches of the GDPR, among them Articles 32(1), 31(2) and 31(4) GDPR, for failing to take adequate measures to ensure the security of special category personal data processed through the website.
Of particular note is that - in order to avoid the insecure processing of their health data - the complainant never made use of these website forms. Therefore, the Litigation Chamber of the Belgian DPA dismissed the complaint as inadmissible due to lack of personal interest.
This decision differs from the judgment of the Belgian Supreme Court on 7 October 2021. In both cases, the personal data was not processed. However, in this case, the complainant was unable to argue that they were not able to make use of a service, because there were other options (e.g. telephone contact or filling in the forms on the hospital premises). Therefore, it was decided that the complainant had no personal interest because he was not able to demonstrate any disadvantage.
Links: Decision (Dutch)
Belgian DPA shares a draft ruling on the IAB Europe Transparency and Consent Framework with other DPA's
In October, the Interactive Advertising Bureau, IAB Europe, announced that the Belgian DPA planned to share with other European data protection authorities a draft ruling that will conclude its investigation of IAB Europe and its role in the Transparency & Consent Framework ("TCF"). The IAB Europe TCF is the global cross-industry effort to help publishers, technology vendors, agencies and advertisers meet the transparency and user choice requirements (i.e. collecting consent to cookies) under the GDPR.
The draft ruling is expected to identify that the TCF infringes the GDPR, because the Belgian DPA considers IAB Europe to be a controller, and the digital signals the TCF creates to establish data subjects' consent to cookies are personal data under the GDPR. Until now, the IAB Europe has not considered itself a controller in this regard and therefore does not comply with the obligations for controllers under the GDPR. The ruling is also expected to state that the infringements should be remedied within six months following the issuing of the final ruling.
On 25 November, the Belgian DPA shared its draft ruling with the other European data protection authorities. The data protection authorities had 4 weeks to provide the Belgian DPA with feedback. If the data protection authorities object to the draft ruling, the EDPB may issue a binding decision on the matter. The draft ruling is expected to be released at the beginning of 2022.
Date of the announcement: 22 October 2021
Date of draft ruling: 25 January 2022
Links: Press release (Dutch); Press release (French)
Brussels Court of Appeal rules that a controller can refuse an access request if the data subject abuses this right
Date: 1 December 2021
The Brussels Court of Appeal (Market Court) ruled on the question of whether a controller always needs to comply with a data subject's access request under Articles 14 and 15 GDPR. In this case, the data subject requested access to files of the Belgian Ministry of Finance in response to the mentioning of their name in various files concerning tax investigations. The Ministry of Finance rejected this request. As a consequence, the data subject filed a complaint with the Belgian DPA, which ordered the Ministry of Finance to comply with the access request.
The Ministry of Finance appealed the Belgian DPA decision at the Market Court which made a ruling on the following points:
1. a controller can refuse a request to access if the data subject abuses their right of access;
2. the Belgian DPA is obliged to examine whether a complaint regarding access rights is abused if used for another purpose other than what the right of access is intended for. Therefore, the DPA's decision should be annulled; and
3. a controller is allowed to exceed the term of one month to fulfill the request of access (Article 12(3) GDPR) when it provides a reasonable justification. Of particular importance is that a controller replies within a "reasonable" term, referring to recital 86 GDPR.
Links: Judgment (Dutch)
Belgian DPA publishes recommendation on the processing of biometric data
The Belgian DPA published its recommendation on the processing of biometric data.
The processing of such data is a priority in the Belgian DPA's Strategic Plan for 2020-2025.
Biometric data (e.g. fingerprints, iris scans, facial images, gait recognition and navigation habits) are unique and sensitive data belonging to the special categories of personal data as set out in Article 9(1) GDPR. In principle, the processing of such data is prohibited, unless one of the lawful bases in Article 9(2) GDPR (in addition to Article 6(1) GDPR), is invoked.
The Belgian DPA distinguished two main lawful bases which are (i) consent; and (ii) substantial public interest. In Belgium, to date, the lawful basis of substantial public interest can only be invoked in the context of the eID and passport.
Date: 6 December 2021
The recommendation also specifies that controllers always need to verify whether it is essential to use biometric data for the purpose(s) they are pursuing.
In addition, the Belgian DPA listed certain examples of legitimate purposes for the processing of biometric data, including among other things: authentication for security purposes; recording working time in a professional context; direct marketing; and screening of public places to prevent criminal activity.
Links: Recommendation (Dutch); Recommendation (French)
Act transposing the EU Electronic Communications Code and amending provisions on electronic communications
The Belgian legislator has issued an Act transposing the Directive (EU) 2018/1972 concerning the establishment of the European Electronic Communications Code in Belgium and amending various provisions of the Act of 13 June 2015 on electronic communications ("Telecom Act").
Date: 21 December 2021
The Act, among other things, sets out an introduction of a new exception to the general principle of the secrecy of electronic communications in the Telecom Act. The Act aims to combat phishing and `smishing' (i.e. phishing carried out over mobile text messaging) by targeted machine screening of the content of those messages. Operators must be transparent to the end-users, the data concerned may only be processed by persons charged with anti-fraud activities on behalf of the operator and the processing of the data is limited to the acts and duration necessary to combat fraud or to the end of the period during which a judicial challenge is possible.
In addition, and importantly, Article 129 of the Telecom Act concerning the storage of information or gaining access to information (i.e. concerning cookies) is repealed; its content will be moved to the new Article 10/2 in Belgium's primary data protection statute, the Act of 30 July 2018 (although without any material changes at this point).
Links: Act (French and Dutch)
China
Contributors
Jack Cai Partner
T: +86 21 61 37 1007 [email protected] eversheds-sutherland.com
Jerry Wang Associate
T: +86 21 61 37 1003 [email protected] eversheds-sutherland.com
Sam Chen Of Counsel
T: +86 21 61 37 1004 [email protected] eversheds-sutherland.com
Measures for Security Assessment of Cross-border Data Transfer (Draft for Comment)
On 29 October 2021, the Cyberspace Administration of China ("CAC") issued the Draft Measures for Security Assessment of Cross-border Data Transfer (the "Draft Measures") for public consultation. Whilst the Draft Measures are yet to be finalised, the key points are summarised as follows:
- data processors will be subject to mandatory CAC-led data cross-border transfer security assessments in the following circumstances:
- transfer of personal information and important data collected and generated by Critical Information Infrastructure Operators ("CIIOs");
- transfer of important data;
- transfer of personal information by data processors who process 1 million or more individuals' personal information;
- cumulatively transferring personal information of 100,000 or more individuals or the sensitive personal information of 10,000 or more individuals; or
- other conditions to be specified by the CAC.
- before transferring data outside of China, all data processors will be required to conduct an internal risk assessment, regardless of whether they are subject to a CAC-led security assessment.
- the responsibilities and obligations in data security protection in respect of cross-border data transfers will need to be fully stipulated in the relevant contract with the overseas data recipient. A standard form contract is yet to be published by the competent government authority.
Date: 29 October 2021
Links: Measures for Security Assessment of Cross-border Data Transfer (Draft for Comment)
Administrative Measures for Data Security in the Field of Industry and Information Technology (for Trial Implementation) (Draft for Comment)
On 30 September 2021, the Ministry of Industry and Information Technology ("MIIT") published the Draft Administrative Measures for Data Security in the Field of Industry and Information Technology (for Trial Implementation) (the "Draft Measures") for public consultation.
The Draft Measures intend to capture data processing activities in the industrial and telecommunication sectors. The industrial sector includes businesses engaged with the processing and use of raw materials, equipment, electronic information manufacturing, software and information technology services and civil explosives.
The Draft Measures take a risk-based approach, with stricter obligations and controls imposed when important and core data is involved. Further, the Draft Measures propose a "department-locality-enterprise" three-tier protection mechanism. Responsibilities for pre-emptive controls, detection and emergency management are divided among the MIIT, its local counterparts and the local telecommunications regulatory body, and the data processors themselves, leveraging the unique traits of different bodies to build a comprehensive data safety working mechanism.
A key highlight is the classification of core, important and ordinary data. The Draft Measures classify data based on the degree of harm following any tampering, destruction, leakage, illegal access or illegal use.
30 September 2021
Links
Administrative Measures for Data Security in the Field of Industry and Information Technology (for Trial Implementation) (Draft for Comment)
Updata Edition 14 October to December 2021 | China
25
China
Development
Regulations on Network Data Security Management (Draft for Comment)
Summary
The Draft Measures set out data safety management requirements for each stage of the data life cycle, with key measures including:
- "Main body" responsibilities: Processors shall assume the general responsibility for solidifying data safety control.
- Work systems: When important data and core data are involved, the processor will need to establish a specialised data security regulatory body and ascertain the responsibilities of the personnel in charge.
- The term "data destruction" and its details (including subject and procedures) are newly defined, whereby, upon destruction, important and core data shall not be recovered for whatever reason or means.
- The MIIT will establish a filing platform for data processors handling important and core data, covering content such as quantity, category, processing purpose and method.
Date
On 14 November 2021, the CAC released the Draft Regulations on Network Data Security Management (the "Draft Regulations") for public consultation.
As with the Personal Information Protection Law ("PIPL") and the Data Security Law, the Draft Regulations have an extraterritorial reach. They capture data processing activities of individuals and organisations located outside the PRC: (i) for the purpose of providing products or services in the PRC; (ii) to analyse and evaluate the behaviour of individuals and organisations within the PRC; or (iii) which involve the processing of important data in the PRC.
While the term "important data" has been used for some time, it had not, until this set of Draft Regulations, been clearly defined. The Draft Regulations summarise it as data which may cause harm to national security or the public interest following a data compromise. They helpfully set out a non-exhaustive list of what this may include, such as undisclosed government affairs-related data, national economic operation data and data concerning the operation of critical information infrastructure.
The Draft Regulations also impose on data processors and internet platform operators enhanced or even novel obligations under certain circumstances. These include, but are not limited to:
- Special obligations on processors of important data and processors of personal information of more than 1 million users to: (i) designate a data protection officer and team; (ii) file with the city-level CAC within 15 business days of identifying important data; (iii) arrange regular trainings; and (iv) conduct annual security assessments.
- Special obligations on internet platform operators regarding: (i) terms, privacy notices and algorithms; (ii) liability for third party products and services; (iii) annual security audits; (iv) instant messaging interfaces; and (v) not manipulating data to discriminate against, mislead or prevent mid or small scale enterprises from fairly obtaining their data.
- Network security assessments: The Draft Regulations propose a new type of regulatory approval that processors will require from the CAC prior to (i) mergers, restructurings and spin-offs occurring, if national security may be endangered as a result; (ii) listing on overseas stock exchanges, if the processor handles personal information of more than one million data subjects; and (iii) listing in Hong Kong, if national security may be endangered.
Date
14 November 2021
Links
Regulations on Network Data Security Management (Draft for Comment)
Development
Regulations on Data of Shanghai Municipality
Summary
On 25 November 2021, the General Office of the Standing Committee of the Shanghai Municipal People's Congress published the Regulations on Data of Shanghai Municipality (the "Shanghai Regulations"). Key takeaways from the new regulations include:
- Personal data: There is a chapter covering the special protection of personal data, the provisions of which are generally in line with the PIPL of the PRC.
- Public data: Similarly, the Shanghai Regulations also discuss public data at length in a separate chapter, referring to the data collected and generated by government authorities, public institutions, authorised organisations performing public administration functions, and organisations providing public services (e.g. water/power/gas supply) in the course of the performance of their duties or services. Public data shall be shared through the big data resources platform and can be operated and managed by government-authorised organisations.
In light of the Pudong New Area and the Yangtze River Delta Region's tactical positions, the Shanghai Regulations dedicate two separate chapters to support their innovative and pivotal roles in data development as follows:
- establishing a data exchange in the Pudong New Area, including exploring a transaction mechanism integrating regulatory audits and information disclosure, in order to progress China's position in the data exchange arena;
- formulating a catalogue of low-risk data for cross-border flow in the Lingang Special Area to facilitate freer, while legally compliant, cross-border flow of data; and
- launching data cooperation initiatives in the Yangtze River Delta Region, including formulating resource catalogues, implementing a data sharing platform and associated quality and safety controls, so as to support long term digital enhancement and transformation across industries within the region.
Date
1 January 2022
Links
Regulations on Data of Shanghai Municipality
Development
Measures for Cybersecurity Reviews
Summary
On 28 December 2021, the CAC, the National Development and Reform Commission, the MIIT and other government authorities jointly passed the final version of the Measures for Cybersecurity Reviews (the "Measures") with the Measures coming into force on 15 February 2022.
The Measures reiterate the priority of safeguarding national security from cybersecurity risks, with the key takeaways being:
- Scope of application: The Measures will apply to the data processing activities of internet platform operators and the procurement of network products and services by critical information infrastructure operators. This contrasts against the previous draft which proposed to capture data processing activities of data processors instead of internet platform operators.
- Overseas listing: The Measures retain the requirement from the revision draft that internet platform operators in possession of personal information of more than 1 million users seeking an overseas listing must apply for a cybersecurity review. Affected operators will need to submit the relevant application before putting forward a listing proposal to the foreign securities regulator.
As such, there are three possible scenarios in relation to applications for cybersecurity review: (i) no cybersecurity review is required; (ii) the entity may proceed with the overseas listing if a cybersecurity review determines that national security is not endangered; or (iii) cessation of the overseas listing procedures if a cybersecurity review concludes that national security may be endangered.
- Special review period: Following the extension of the review period to 3 months in the previous revision draft, the Measures standardise the period at 90 business days, regardless of the number of days in the months concerned.
Date
15 February 2022
Links
Measures for Cybersecurity Reviews
Development
Practical Guide to Cybersecurity Standards Guide to Network Data Classification and Grading
Summary
On 31 December 2021, the National Information Security Standardization Technical Committee published the Practical Guide to Cybersecurity Standards Guide to Network Data Classification and Grading (the "Practical Guide"). The Practical Guide has a classification model for different categories and levels of data, with a recommended flow/process for systematically classifying data using the model.
The Practical Guide encourages a multi-faceted categorisation approach setting out five different dimensions to be considered, as essential components of how data should be categorised. These are as follows: (i) individual civilians; (ii) public management; (iii) information dissemination; (iv) industrial sectors; and (v) organisational management.
For its level-based grading, the Practical Guide reinforces the mechanism under the Data Security Law and ranks data into core, important and ordinary data based on the target and degree of harm following any tampering, destruction, leakage, illegal access and use.
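Purely as an illustration of the level-based grading described above (this sketch is not taken from the Practical Guide itself, and the type names and thresholds are hypothetical), the core/important/ordinary split could be expressed as a simple mapping from the assessed target and degree of harm:

```ts
// Hypothetical sketch of a three-level grading step.
// The inputs and cut-offs are illustrative only, not from the Practical Guide.

type HarmTarget = "national_security" | "public_interest" | "organisations_or_individuals";
type HarmDegree = "serious" | "ordinary" | "minor";
type DataGrade = "core" | "important" | "ordinary";

function gradeData(target: HarmTarget, degree: HarmDegree): DataGrade {
  // A compromise that would seriously harm national security or key public
  // interests points towards "core" data.
  if (target === "national_security" && degree === "serious") return "core";
  // Harm to national security or the public interest short of that level
  // points towards "important" data.
  if (target === "national_security" || target === "public_interest") return "important";
  // Everything else is treated as "ordinary" data.
  return "ordinary";
}

console.log(gradeData("national_security", "serious")); // "core"
console.log(gradeData("public_interest", "ordinary")); // "important"
console.log(gradeData("organisations_or_individuals", "minor")); // "ordinary"
```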
31 December 2021
Links
Practical Guide to Cybersecurity Standards Guide to Network Data Classification and Grading
Development
Administrative Provisions on Algorithm Recommendation of Internet Information Services
Summary
On 31 December 2021, the CAC, the MIIT, the Ministry of Public Security and the State Administration for Market Regulation ("SAMR") jointly published the final version of the Administrative Provisions on Algorithm Recommendation of Internet Information Services (the "Administrative Provisions").
The Administrative Provisions aim to require service providers to commit to promoting mainstream values and a healthy, orderly and fair algorithms landscape, as well as protect the rights of public users (including the right to know and right to choose and special protection accorded to certain groups).
We highlight below the key updates and enhanced protection conferred under the Administrative Provisions compared to the previous draft:
- Special protection for the elderly: In the last quarterly update, we covered that minors are to be shielded from information detrimental to their physical and psychological health. The Administrative Provisions now extend this protection to the elderly, encouraging the provision of smart elderly-oriented services in light of their travel, medical, consumption and work needs, and the detection of and defence against internet fraud.
- Service providers offering internet news recommendation services would be required to obtain the relevant legal license. The Administrative Provisions expressly prohibit the generation of disinformation and the dissemination of news published by units falling outside the state's regulatory scope.
- Updates to the filing timeline: In the event of any update to information filed, relevant service providers shall undergo the formalities within 10 business days of such change. Deregistration procedures shall be conducted within 20 business days upon the date on which service is terminated.
- Service providers would be required to retain a web log in order to facilitate safety evaluation and checks by the authorities.
The Administrative Provisions will take effect on 1 March 2022.
Date
1 March 2022
Links
Administrative Provisions on Algorithm Recommendation of Internet Information Services
France
Contributors
Gaëtan Cordier Partner
T: +33 1 55 73 40 73 [email protected] eversheds-sutherland.com
Emmanuel Ronco Partner
T: +33 6 15 40 00 47 [email protected] eversheds-sutherland.com
Vincent Denoyelle Partner
T: +33 1 55 73 42 12 [email protected] eversheds-sutherland.com
Charlotte Haddad Associate [email protected] eversheds-sutherland.com
Camille Larreur Associate
[email protected] eversheds-sutherland.com
Edouard Burlet Associate [email protected] eversheds-sutherland.com
Clémence Dubois Ahlqvist Associate [email protected] eversheds-sutherland.com
Mélanie Dubreuil-Blanchard Associate [email protected] eversheds-sutherland.com
Killian Lefevre Associate [email protected] eversheds-sutherland.com
Naomi Bellaiche Associate [email protected] eversheds-sutherland.com
Development
CNIL publishes white paper on data and payment instruments
Summary
On 6 October 2021, the CNIL (the French data protection authority) published a white paper on data protection and payment methods, with the aim of improving both the general public's and professionals' understanding of the related issues in these areas.
The white paper addresses a wide range of topical issues, including the international circulation of payment data; anonymity and the use of cash; the new risks arising from the increasing digitisation of payment transactions; the use of "crypto-currencies"; and the application of the main principles of the GDPR in the field of payments.
The white paper sets out the CNIL's concerns in this field, in the form of eight key messages for both professionals in the field and the general public:
1. the protection of the anonymity of payments, the use of cash and the freedom to choose methods of payment;
2. the importance of protecting the confidentiality of transactions from the outset in the ongoing Digital Euro project, launched by the European Central Bank in July;
3. the importance of focussing on mobile payments, which have considerable potential for development;
4. the advantages of making compliance with the GDPR an asset of trust for customers who are encouraged to entrust their data for new uses;
5. the main points of application of the GDPR, on which the CNIL wishes to provide legal certainty;
6. the importance of the security of payment data and the work on the "tokenisation" of this data as good practice;
7. a debate on the location of payment data in Europe, as a contribution to the ongoing debate on European digital sovereignty; and
8. recommendations for the future European Payments Initiative ("EPI"), which is the European card network currently being created.
The CNIL wishes to develop a reference framework for compliance with the GDPR for all stakeholders in the field. To this end, the white paper is accompanied by an online public consultation, open until 15 January 2022.
Date
6 October 2021
Links
CNIL's statement (in French)
CNIL's white paper (in French)
CNIL publishes guidance on alternatives to third-party cookies for targeted advertising
The CNIL released a statement on the alternatives to third-party cookies used for online targeting and advertising. It stresses that such technologies must comply with the data protection legal framework, including the rules regarding consent and the rights of data subjects.
The CNIL describes what third-party cookies are and how they are used on the Internet by the advertising industry. In particular, it explains how third-party cookies allow a user's specific online behaviour to be tracked (e.g. what they have clicked on, search history, device details and shopping preferences) and thereby enable advertisers to target consumers at an individual level. These tracking practices have led to the development of the "adtech" industry (short for advertising technologies). The CNIL then chronicles the tracking limitation initiatives introduced by certain web browsers since 2017 (e.g., Safari and Firefox) as well as Google's commitment to phase out third-party cookies on its Chrome web browser by 2023.
In this context, the CNIL notes the development of several alternatives to the use of third-party cookies including (1) fingerprinting, which aims to identify users by leveraging the characteristics of their web browser, (2) single sign-on (a single user account and single authentication for websites, applications and services), (3) unique identifiers, which aim to identify users with indicative hashed data collected during the users' web browsing, and (4) cohort-based targeted advertising, which focuses on building a group of users with similar characteristics and identifying users of a same group.
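To make the "unique identifier" technique more concrete, the sketch below (not taken from the CNIL's guidance; the function name and inputs are hypothetical) derives an identifier by hashing an email address. The point the CNIL makes still applies: hashing is pseudonymisation rather than anonymisation, so such identifiers remain personal data and tracking based on them still requires consent.

```ts
// Illustrative sketch only: deriving a "unique identifier" by hashing an
// email address captured at sign-in. Names and inputs are hypothetical.
import { createHash } from "node:crypto";

// Hash the normalised email to produce a stable cross-site identifier.
// The same email always yields the same identifier, so the output remains
// personal data under the GDPR even though the raw email is not shared.
function deriveAdId(email: string): string {
  const normalised = email.trim().toLowerCase();
  return createHash("sha256").update(normalised).digest("hex");
}

// The same user produces the same identifier on every participating site,
// which is precisely why consent is still required before such tracking.
console.log(deriveAdId("[email protected]"));
console.log(deriveAdId(" [email protected] ")); // identical output
```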
However, the CNIL stresses that these alternatives must not be developed at the expense of the users' right to privacy and their right to the protection of their personal data. They must also comply with the legal framework in force such as the GDPR and the ePrivacy Directive. In particular, the CNIL underlines that any tracking for advertising purposes must rely on the informed consent of the user, even though it may be based on information/characteristics from a user's web browser or device.
The CNIL's guidelines and recommendation on cookies and other tracking devices also apply to the use of these techniques. The CNIL further stresses that digital stakeholders should make other key data protection considerations when developing alternatives to third-party cookies, including (i) allowing and facilitating the exercise of all data subjects' rights, (ii) avoiding the processing of sensitive data and (iii) assessing their roles and responsibilities in the implementation of these techniques in order to determine their respective obligations.
The CNIL will continue to monitor ongoing developments and may publish more detailed analyses moving forwards.
Date
13 October 2021
Links
CNIL's guidelines (in French)
Paris public transportation organisation fined EUR 400,000 for several breaches in relation to employee data
The CNIL has fined a public transportation operator in the Paris region EUR 400,000 after finding that several bus centres had included the number of strike days in employee evaluation files which were used to prepare promotion choices. It also found that the data had been kept for an excessive period of time and that there had been breaches of data security.
Following a complaint from a trade union organisation in May 2020, the CNIL carried out several investigations at the bus centres and found three breaches of data protection laws. This case highlights the enhanced level of scrutiny by trade unions in France, in the event of non-compliance with the GDPR in relation to employee data.
CNIL's statement (in French): 13 October 2021
CNIL's deliberation (in French): 29 October 2021
The three main areas in which the transport operator breached data protection laws were:
1. Excessive collection of personal data
As part of its internal promotion process, each bus centre relies on a file prepared by the HR department. This file should only contain the information necessary for employee evaluation purposes. However, the CNIL found that in the bus centres it investigated, an additional column had been added to the employees' file regarding the number of days on which employees had been on strike for every given year. During the CNIL procedure, the transport operator recognised that this practice was unlawful and contrary to its general policy.
Pursuant to the principle of data minimisation set out in the GDPR, the CNIL found that the use of the data was not necessary for promotion evaluation purposes and that the total number of absent days would have been sufficient for those purposes.
2. Excessive data retention periods
The CNIL's investigations revealed that the transport operator was retaining employee data for longer than necessary to achieve the intended purposes. In particular, the relevant data
was stored on HR systems such that a significant number of individuals had access to the data.
In addition, the operator retained employee evaluation files for over three years after the period during which those files were used (instead of 18 months).
However, the CNIL took into consideration the fact that the operator implemented remediation measures during the procedure.
3. Breach relating to data security
The CNIL found that the operator did not sufficiently distinguish between the different levels of agent clearance. Authorised agents had access to all categories of data contained in the application (in particular all HR data) without distinguishing between the agents' functions or missions. In addition, such agents had access to the data relating to all bus centres (not limited to their own centre). The agents also had the possibility of extracting all the data contained in the application.
The CNIL determined that this configuration created a risk that the data would be misused and meant that the data was not sufficiently confidential.
During the procedure, the operator announced that it had taken measures to remediate the breach identified by the CNIL.
Links
CNIL's statement (in French)
CNIL's deliberation (in French)
Development
Supreme Court decides that unlawfully collected CCTV footage may be used in judicial proceedings in certain circumstances
Summary
On 10 November 2021, the French Supreme Court (Cour de cassation) made a ruling relating to the use of unlawfully collected CCTV footage. In particular, it held that, subject to certain conditions, CCTV footage may be used in judicial proceedings to justify the dismissal of employees even where such footage has been collected unlawfully.
In this case, a cashier at a pharmacy was dismissed for gross misconduct. The dismissal was substantiated using CCTV footage. However, the dismissed employee argued that the use of such CCTV footage as evidence was unlawful. The employees had not been informed that the CCTV system would be used for employee monitoring purposes, and this was therefore a breach of privacy
and data protection laws applicable at the time of the dismissal. When rolling out the CCTV system, the employer issued an internal note, signed by employees, stating that CCTV would be used for safety purposes but did not mention any use for employee monitoring purposes. Similarly, the company's works council was not informed of the use of CCTV for these purposes. As a result, the production of the unlawfully collected CCTV footage as part of the wrongful termination judicial proceedings constituted unlawful evidence. As a general principle under French law, unlawful evidence may not be used in judicial proceedings.
However, this principle is limited by several exceptions. In this case, the French Supreme Court held that unlawful evidence should not automatically be dismissed from the judicial proceedings. The court must assess whether the use of such evidence could hinder due process, by balancing the employee's right to respect for their personal life against the right to evidence. The latter may justify the use of evidence that infringes an employee's privacy, provided that it is essential to the exercise of the right to evidence and that the infringement is strictly proportionate to the aim pursued.
As a result of this decision, first-instance courts must assess on a case-by-case basis whether unlawfully collected CCTV evidence should be admitted as part of the parties' pleadings.
Date
10 November 2021
Links
French Supreme Court decision (in French)
Development
CNIL publishes guidance on DPOs
Summary
In November 2021, the CNIL published a guide on data protection officers ("DPOs") containing practical advice and best practices to which both companies with DPOs and DPOs themselves can refer.
The CNIL's guide provides information and answers to frequent questions on the following topics, including:
1. The role of the DPO The CNIL explains in detail how the DPO has to provide advice and support to the company, control enforcement of the GDPR, be the point of contact for both the CNIL and data subjects, and maintain the documentation on personal data processing;
2. The designation of a DPO The CNIL provides guidance on how to interpret the list of entities as set out in the GDPR who are required to appoint a DPO, who can be designated
as DPO (including practical advice on where there can be a conflict of interests incompatible with the activity of DPO), how the DPO function can be externalised or mutualised, and how the DPO should be designated in practice; and
3. The activities of a DPO This section contains details on the day-to-day role of the DPO, their required independence and status, and how to handle their departure or annual leave.
The CNIL also answers certain frequently asked questions, including on the DPO's location, use of foreign language and training.
Date
16 November 2021
Links
CNIL's statement (in French)
CNIL's guide (in French)
Development
CNIL continues cookie banner sweep and sends formal notices to additional organisations
Summary
On 14 December 2021, the CNIL published a statement indicating that it is continuing to verify compliance with cookie regulations. Online checks conducted by the CNIL have revealed that a number of organisations still do not enable internet users to refuse cookies as easily as they can accept them.
As of December 2021, the CNIL has sent about 30 additional formal notices, taking the total number of formal notices issued to various organisations since May 2021 to approximately 90.
In the CNIL's recent online checks they found the following issues:
1. cookies subject to consent being automatically deposited on the user's device before consent is provided;
2. cookie banners which do not enable users to reject cookies as easily as to accept them; and
3. cookies subject to consent being deposited even though the user has rejected them.
Formal notices have been issued to various types of organisations (including public institutions, universities, companies in the clothing, transportation and retail sectors). These organisations have one month to update their practices so they are compliant. Failure to comply may result in fines of up to 2% of the global turnover of these organisations.
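Purely by way of illustration of the "as easy to refuse as to accept" requirement (this is not code endorsed by the CNIL, and the element and cookie names are hypothetical), a banner might offer symmetrical one-click choices and defer any consent-based cookies until the user has actually accepted:

```ts
// Illustrative sketch only: "Reject all" is as easy as "Accept all",
// and no consent-based cookies are written before the user chooses.
type Choice = "accepted" | "rejected";

function renderCookieBanner(onChoice: (choice: Choice) => void): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");

  const accept = document.createElement("button");
  accept.textContent = "Accept all";

  const reject = document.createElement("button");
  reject.textContent = "Reject all"; // same level, same single click as "Accept all"

  accept.addEventListener("click", () => { banner.remove(); onChoice("accepted"); });
  reject.addEventListener("click", () => { banner.remove(); onChoice("rejected"); });

  banner.append(accept, reject);
  document.body.append(banner);
}

renderCookieBanner((choice) => {
  // Only after an explicit "accepted" choice may consent-based cookies be set.
  if (choice === "accepted") {
    document.cookie = "ads_id=u-42; path=/; max-age=31536000; SameSite=Lax";
  }
});
```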
14 December 2021
Links
CNIL's statement (in French)
Development
CNIL publishes formal notice against facial recognition platform for unlawful use of sensitive data
Summary
After receiving complaints from data subjects and a warning from the Privacy International association, the CNIL opened an investigation into a US facial recognition platform in May 2020.
The CNIL found that the platform had breached the GDPR in two instances, for:
1. the lack of appropriate lawful basis for the processing of individuals' photographs and videos; and
2. the failure to comply with data subjects' requests.
The platform's facial recognition software makes use of a database comprising photographs and videos scraped from publicly available internet sites, including social media. The database contains over 10 billion images and fuels a search engine in which an individual's photograph can be used to attempt to identify them. A biometric template (i.e. a digital impression of the person's physical characteristics) is created for every individual whose image is collected by the facial recognition platform.
According to the CNIL, the facial recognition platform did not rely on any lawful basis to process the photographs or videos it collects. No consent was collected from the data subjects, and the CNIL concluded that the platform was not entitled to rely on legitimate interests, since the millions of French internet users whose images were collected (from social media or other websites) did not expect their photographs or videos to be used for this purpose (especially given that the company indicated that its software was commercialised to law enforcement agencies).
In addition, the CNIL found that the platform:
- limited the ability of data subjects to exercise their rights under the GDPR (e.g. by restricting the scope of the right of access to data collected during the previous year without justification, and by limiting the number of times per year a data subject may exercise this right);
- did not respond to some requests relating to the rights of access or erasure; and
- only provided partial responses to some requests and/or did not respond within the timelines provided by the GDPR.
The CNIL has therefore formally ordered the platform to:
- stop collecting and using personal data of individuals located in France, in the absence of an appropriate lawful basis; and
- facilitate the exercise of data subjects' rights and comply with the erasure requests it receives.
The platform must comply with the injunctions within two months of the CNIL's decision or the CNIL may impose a sanction (including an administrative fine).
Date
CNIL's statement (in French): 14 December 2021
CNIL's deliberation (in French): 26 November 2021
Links
CNIL's statement (in French)
CNIL's deliberation (in French)
Germany
Development
Competent court for actions for damages under the GDPR
Contributors
Alexander Niethammer Managing Partner
T: +49 89 54 56 52 45 [email protected] eversheds-sutherland.com
Nils Müller Partner
T: +49 89 54 56 51 94 [email protected] eversheds-sutherland.com
Lutz Schreiber Partner
T: +49 40 80 80 94 444 [email protected] eversheds-sutherland.com
Sara Apenburg Senior Associate
T: +49 40 80 80 94 446 [email protected] eversheds-sutherland.com
Constantin Herfurth Associate
T: +49 89 54 56 52 95 [email protected] eversheds-sutherland.com
Isabella Norbu Associate
T: +49 89 54565 191 [email protected] eversheds-sutherland.com
Philip Kuehn Associate
T: +49 40 80 80 94 413 [email protected] eversheds-sutherland.com
Jeanette da Costa Leite Associate (PSL)
T: +49 89 54 56 54 38 [email protected] eversheds-sutherland.com
Summary
According to a ruling by the Berlin-Brandenburg Fiscal Court, the civil courts have jurisdiction over disputes concerning the existence of a claim for damages under the GDPR. This applies even if the data subject wishes to assert a claim for damages against the tax office. According to the court, a split allocation of legal proceedings would lead to an additional burden for the data subject and is therefore not constitutionally justified.
Date
27 October 2021
Links
Judgment
Development
Supervisory authorities are not authorised to order the appointment or dismissal of a data protection officer
Summary
According to the Cologne Administrative Court, data protection authorities are not authorised to order the appointment or dismissal of an in-house data protection officer. Article 58 of the GDPR does not provide a lawful basis to do so. Rather, the supervisory authority may order the controller to bring processing operations into compliance with the GDPR in a specific manner and within a specific period of time.
Date
10 November 2021
Links
Decision
Use of the cookie consent preference tool found unlawful due to data transfer to US
A service making it possible (in collaboration with third party tools) to obtain and store user consent preferences regarding the use of cookies has been found to be unlawful. The tool enables monitoring of all cookies used and automatically blocks those for which there is no consent. However, according to the Administrative Court of Wiesbaden, the use of this tool is unlawful, because a third-country transfer of personal data to the US takes place in which the user's IP address is transferred, which makes the data subject identifiable. The transfer requires the consent of the user and transparent information to be provided about the associated risks, which was not obtained in this case. Companies should check their existing consent management tool and carefully select providers going forward.
1 December 2021
Admissibility of customer satisfaction surveys by e-mail
The Thuringian data protection commissioner states in his activity report that customer satisfaction surveys sent by e-mail without the explicit consent of the data subject not only constitute a competition infringement, but also a GDPR infringement. This is because there is no lawful basis for the respective data processing. In particular, legitimate interests cannot be used as a justification, as the considerations of competition law regarding unreasonable harassment must be taken into account in the context of the balancing of interests under Article 6(1)(f) GDPR. This view is in line with existing case law.
1 October 2021
Misuse of the GDPR right to information
The plaintiff had health insurance with the defendant insurance company and argued against premium increases by the insurance company, which he considered to be unjustified. During the legal dispute, the plaintiff also demanded information from the defendant pursuant to Article 15 GDPR.
The Higher Regional Court commented on the question of when a GDPR request for information is an abuse of rights and therefore the requested company can reject the request: "When interpreting what constitutes abuse of rights in this sense, the protective purpose of the GDPR must also be taken into account. As can be seen from recital 63 to the Regulation, the purpose of the right of access is to enable the data subject to become aware, easily and at reasonable intervals, of the processing of personal data concerning him or her and to be able to verify the lawfulness of that processing. However, according to the plaintiff's own statement of claim, it is not at all about such an awareness for the purpose of a review of the permissibility of the processing of personal data under data protection law. Sense and purpose is rather ... the examination of possible premium adjustments. However, such a procedure is not covered by the protective purpose of the GDPR ...".
Date
15 November 2021
Links
Judgment
Activity report
Not yet published
Development
Compensation for damages in the event of an unjustified credit report
Summary
A company had reported a claim to a German credit reference agency without authorisation. The Regional Court of Mainz then awarded the data subject damages in the amount of EUR 5,000 pursuant to Article 82 GDPR. This claim was justified due to the immaterial damage suffered by the data subject as a result of the unjustified entry. In the court's view, a serious violation of personal rights, as previously required by German courts, is not necessary. In the specific case, the social stigmatisation caused by a negative credit score/entry was sufficient.
Companies should note that the issue of grounds for damages under the GDPR is heavily disputed in Germany. Until a final decision is made by the CJEU (decisions pending), companies should defend against such claims, in particular if the data subject cannot substantiate damage at all.
12 November 2021
Development
Compensation for damages due to delayed or incomplete GDPR information
Summary
The Hanover Regional Labour Court ruled that incomplete or delayed information pursuant to Article 15 GDPR can give grounds for a claim for damages by the data subject. According to the court, a materiality threshold partially assumed by other courts is not to be applied. In this case, the court awarded a former employee a claim of EUR 1,250 against his former employer.
Companies should note that the issue of grounds for damages under the GDPR is heavily disputed in Germany. Until a final decision is made by the CJEU (decisions pending), companies should defend against such claims, in particular if the data subject cannot substantiate damage at all.
Date
22 October 2021
Links
Judgment
Judgment
Development
No liability of the external data protection officer for GDPR violations of the client
Summary
In its ruling, the Munich Higher Regional Court found that an external data protection officer cannot be held liable if their client violates the GDPR. This is because the external data protection officer is not the controller for the purposes of the GDPR.
27 October 2021
Assignability of a GDPR claim for damages
The Essen Regional Court commented on the question of whether a claim for damages under the GDPR can be assigned or not due to its highly personal nature. Provided that the assignment is sufficiently defined, the court considers an assignment of a claim for immaterial damages to be permissible.
23 September 2021
Scope of the GDPR right to information
The Munich Higher Regional Court ruled on the scope of the right to information under Article 15(3) GDPR. The court held that the right to information is to be understood comprehensively, so that it also includes telephone notes, file notes, minutes of conversations, emails and letters. This is in accordance with previous rulings made by other German courts.
4 October 2021
EUR 900,000 fine for nontransparent information of data subjects
The Hamburg Data Protection Commissioner imposed a fine of more than EUR 900,000 on an energy supplier due to a breach of the transparency obligations under data protection law from Articles 12 and 13 GDPR. The company had compared data of new customers with data from previous customer contracts in order to identify "frequent changers" who would like to take advantage of new customer bonuses. In assessing the amount of the fine, it was taken into account that the infringement affected around 500,000 people and that the company cooperated fully.
24 September 2021
Technical and organisational measures are not at the disposition of parties
The Data Protection Conference stated in a decision that the technical and organisational measures to be provided by the controller pursuant to Article 32 GDPR are based on objective legal obligations and are not at the disposition of the parties involved. For this reason, a waiver of the technical and organisational measures or a lowering of the legally prescribed standard on the basis of consent pursuant to Article 6(1) GDPR is not permissible. In compliance with the right of self-determination of the data subject and the rights of other data subjects, it may, however, be possible in individual cases (to be documented by the controller) and at the express, own-initiative request of the informed data subject, not to apply certain technical and organisational measures.
Date
24 November 2021
Links
Judgment
Judgment
Judgment
Press Statement
Decision
Development
New law on cookies applies as of 1 December 2021
Summary
On 1 December 2021, the Telecommunications and Telemedia Data Protection Act came into effect. The Act, also known as the TTDSG (Telekommunikation-Telemedien-Datenschutzgesetz), regulates privacy and data protection in Germany and provides rules on cookies and similar technologies.
The rules are not new to German companies, as various court decisions in the past had already confirmed that the use of non-essential cookies requires the opt-in of the user. Section 25 of the TTDSG now states that cookies can only be used on a website if the visitor has given their informed and clear consent, in accordance with the GDPR requirements for consent (Articles 4 and 7 GDPR). In essence, the TTDSG now incorporates this case law as statutory law.
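As a minimal illustrative sketch of the opt-in rule described above (this is not an implementation prescribed by the TTDSG; the storage categories, cookie names and consent lookup are hypothetical assumptions), non-essential storage could be gated on a recorded consent decision:

```ts
// Minimal sketch: gate non-essential storage on prior opt-in consent.
// The consent lookup and cookie names are hypothetical examples.

type ConsentCategory = "essential" | "analytics" | "marketing";

// In practice this would read the user's recorded consent decision,
// e.g. from a consent management platform; here it is a simple stub.
function hasConsent(category: ConsentCategory): boolean {
  if (category === "essential") return true; // strictly necessary storage needs no consent
  return window.localStorage.getItem(`consent:${category}`) === "granted";
}

function setCookie(name: string, value: string, category: ConsentCategory): void {
  if (!hasConsent(category)) {
    // No opt-in: do not write to the user's terminal equipment.
    return;
  }
  document.cookie = `${name}=${encodeURIComponent(value)}; path=/; max-age=31536000; SameSite=Lax`;
}

// Essential cookie: always allowed.
setCookie("session_id", "abc123", "essential");
// Analytics cookie: only written if the user has opted in.
setCookie("analytics_id", "u-42", "analytics");
```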
Companies should also be aware that the TTDSG not only covers cookies but extends to any "storage of information in the terminal equipment of an end user". The maximum fine for a violation of Section 25 TTDSG is EUR 300,000.
Another key change is the broader scope of the TTDSG, which now also covers over-the-top ("OTT") services. A consequence of this for companies is the application of telecommunications secrecy when providing, for example, chat functions and collaboration tools as part of their services.
1 January 2022
New guidelines on the use of cookies
The German Datenschutzkonferenz (DSK) issued new guidelines on the use of telemedia, in particular cookies, in light of the TTDSG, the new law governing the use of cookies (see above).
The guidelines distinguish clearly between two acts of processing cookie data: (i) the technical act which is subject to the TTDSG; and (ii) the processing of personal data act which is subject to the GDPR.
The DSK provides details on how to obtain consent lawfully and when consent may not be needed. The DSK also concludes that
personal data that is processed in connection with the regular tracking of user behaviour on websites or in apps cannot, in principle, be transferred to a third country on the basis of consent.
Date
1 December 2021
Links
Link
Guidance from the supervisory authorities for telemedia providers
Development
New Guidelines on processing COVID-19 data
Summary
The DSK issued comprehensive guidelines on the processing of data in the context of COVID-19. In the guidance, the DSK provides opinions and examples on various scenarios, such as controlling the 3G status of employees. Companies should check whether their internal processes are in line with the guidelines and amend them, where needed.
20 December 2021
Links
Guidelines on processing Covid data
Hong Kong
Contributors
John Siu Partner
T: +852 2186 4954 [email protected] eversheds-sutherland.com
Cedric Lam Partner
T: +852 2186 3202 [email protected] eversheds-sutherland.com
Rhys McWhirter Partner
T: +852 2186 4969 [email protected] eversheds-sutherland.com
Jennifer Van Dale Consultant
T: +852 2186 4945 [email protected] eversheds-sutherland.com
Duncan Watt Consultant
T: +852 2186 3286 [email protected] eversheds-sutherland.com
Philip Chow Associate
T: +852 3918 3401 [email protected] eversheds-sutherland.com
Clive Lam Trainee Solicitor
T: +852 2186 3283 [email protected] eversheds-sutherland.com
Catherine Wang Trainee Solicitor
T: +852 2186 4939 [email protected] eversheds-sutherland.com
Development
Personal Data (Privacy) (Amendment) Ordinance 2021 takes effect
Summary
On 8 October 2021, the Personal Data (Privacy) (Amendment) Ordinance 2021 ("Amendment Ordinance"), previously published in the gazette, came into effect.
The objectives of the Amendment Ordinance include the criminalisation of doxxing acts, empowering the Privacy Commissioner to carry out criminal investigations and institute prosecutions for doxxing and related offences, and conferring on the Privacy Commissioner statutory powers to demand the cessation of disclosure of doxxing messages.
Pursuant to the Amendment Ordinance, anyone who discloses the personal data of another person without consent, whether recklessly or with intent to cause specified harm to the person or his or her family, such as harassment, molestation, pestering, threat, intimidation, bodily or psychological harm or damage to property, commits the offence of doxxing.
The Office of the Privacy Commissioner for Personal Data ("PCPD") has published the Implementation Guideline for the Amendment Ordinance ("Implementation Guideline") on the same day to provide guidance to the public in relation to the key requirements of the Amendment Ordinance.
Specifically, the Implementation Guideline outlines:
- the scope of doxxing offences under the new section 64 of the Amendment Ordinance
- the criminal investigation and prosecution powers of the Privacy Commissioner
- the powers of the Privacy Commissioner to serve cessation notices and apply for injunctions
- the procedures for lodging a complaint to the Privacy Commissioner regarding doxxing acts
Date
8 October 2021
Links
Personal Data (Privacy) (Amendment) Ordinance 2021
PCPD's Implementation Guideline
Development
PCPD makes first arrest for suspected doxxing offence
Summary
On 13 December 2021, the PCPD arrested a Chinese male aged 31 in the West Kowloon region for a suspected contravention of section 64(3A) of the Personal Data (Privacy) Ordinance (PDPO) relating to "disclosing personal data without consent". The act originated from a money dispute. The PCPD seized one smartphone during the operation. This is the first arrest made by the PCPD pursuant to the new doxxing regime introduced under the Amendment Ordinance.
13 December 2021
Links
PCPD's Media Statement
Ireland
Contributors
Marie McGinley Partner
T: +35 31 64 41 45 7 [email protected] eversheds-sutherland.ie
Leona Chow Associate
T: +35 31 66 44 25 8 [email protected] eversheds-sutherland.com
Sophie Delaney Associate
T: +35 31 66 44 36 5 [email protected] eversheds-sutherland.ie
Development
Summary
Date
DPC publishes updated Guidance on the Use of Domestic CCTV
The DPC has updated their guidance on the use of domestic CCTV to incorporate the DPC's general approach to handling complaints made in respect of the use of domestic CCTV. The DPC's approach will include assessing the complaint to see if there is evidence of personal data processing, to engage with the parties to try to resolve the matter and to identify the relevant data protection issue(s) for the parties and provide appropriate advice. The update aligns with the DPC's forthcoming Regulatory Strategy 2022-2027.
1 November 2021
DPC publishes updated Guidance on processing COVID-19 vaccination data in the context of employment
The DPC has published updated guidance for employers on the processing of employee COVID-19 vaccination data.
The DPC has clarified that advice from public health authorities in Ireland should indicate what data processing is necessary and legitimate in the context of managing COVID-19 in the workplace and that the primary source of such information is in the Work Safety Protocol: COVID-19 National Protocol for Employers and Workers. The DPC notes clearly that the Protocol does not currently require employers to collect or process the vaccination status of its employees.
1 November 2021
DPC publishes guidance on vaccine certificate checks
The DPC has published guidance for both controllers and data subjects on the checking of vaccination certificates by controllers prior to granting data subjects access to certain premises.
1 November 2021
The DPC notes that a controller must identify a lawful basis for checking the vaccination status of its customers and that such lawful basis should be determined with reference to current public health advice. The DPC has provided a list of premises that are currently required to check vaccination status, in line with current public health guidelines.
Links
DPC Guidance
DPC Guidance
DPC Guidance
Advocate General issues Opinion on Case C-140/20 Commissioner of the Garda Síochána and Others relating to the Communications (Retention of Data) Act 2011
Advocate General Campos Sánchez-Bordona has issued his Opinion in Case C-140/20, relating to the Irish regime established under the Communications (Retention of Data) Act 2011 which governs the retention of, and access to, telecommunications metadata by national authorities in Ireland and, in particular, by the Irish police force (An Garda Síochána) in the course of the detection, investigation and prosecution of serious crime.
18 November 2021
DPC welcomes the resolution of proceedings relating to the Irish Public Services Card
Legal proceedings in which the Department of Social Protection ("DSP") appealed against an Enforcement Notice issued by the DPC in relation to the DSP's processing of personal data when issuing Public Services Cards have been resolved. The DPC has welcomed the fact citizens are now being provided with significantly enhanced levels of information to explain what personal data is processed when an individual applies for a Public Service Card and how the personal data is processed.
The DPC also welcomed the DSP's acknowledgment that, in the absence of legislation making specific provision for this, other public sector bodies cannot compel any individual to acquire a Public Service Card as a precondition to the provision of access to public services. At least one other option must now be provided in any case where an individual is required to verify their identity before accessing public services.
Significant changes are also to be made to the DSP's approach to the retention of applicants' personal data.
10 December 2021
DPC publishes final version of Children Front and Centre: Fundamentals for a Child-Oriented Approach to Data Processing guidance
The DPC has published the final version of its guidance 'Children Front and Centre: Fundamentals for a Child-Oriented Approach to Data Processing' ("Fundamentals"). The Fundamentals introduce child-specific data protection interpretative principles and recommended measures that will enhance the level of protection afforded to children against the data processing risks posed to them by their use of/access to services in both an online and offline world. The Fundamentals will also assist organisations that process children's data by clarifying the principles and obligations under the GDPR, to which the DPC expects such organisations to adhere.
The Fundamentals have immediate application and operational effect, forming the basis for the DPC's approach to supervision, regulation and enforcement in the area of processing of children's personal data.
The full guidance is available at the link provided.
17 December 2021
Links
Advocate General's Opinion
DPC Statement
DPC Guidance
DPC publishes Regulatory Strategy for 2022-2027
The DPC has published its Regulatory Strategy for 2022-2027 (the "Strategy"), in which it sets out its vision for what it expects to be five crucial years in the evolution of data protection law, regulation and culture. In preparing the Strategy, the DPC engaged in a period of consultation with a broad range of internal and external stakeholders to collate insights and experiences of how the application of the GDPR has impacted individuals and organisations across a wide range of sectors.
A copy of the press release and the Strategy are available at the links provided.
22 December 2021
Links
DPC Statement
DPC Strategy
Lithuania
Contributors
Rimtis Puisys Partner
T: +370 5 239 2391 [email protected] eversheds.lt
Neringa Bubnaitytė Senior Associate
T: +370 612 49179 [email protected] eversheds.lt
Development
Summary
Date
Fine imposed on a car rental company for breaching data security under the GDPR
On 29 November 2021, the State Data Protection Inspectorate imposed a fine of EUR 110,000 on a car rental platform operator for disclosing customer personal data, including personal identification numbers.
29 November 2021
The breach lasted for almost three years, from 27 February 2018 to 16 February 2021 and involved 110,302 individual claimants, whose personal data was disclosed and made public. The data was obtained from a BACPAC file (DB file) of a non-secure database backup which was created on 27 February 2018.
The Inspectorate concluded that the confidentiality of personal data stored in the DB file should have been protected by basic security measures, as a minimum. However, these were absent and enabled the breach to occur, which led the Inspectorate to impose a sanction for the breach.
Lithuanian Data Protection Inspectorate approves Standard Contractual Clauses for controllers and processors, pursuant to Article 28(8) GDPR
The Lithuanian Data Protection Inspectorate has approved Standard Contractual Clauses ("SCCs") for controllers and processors, pursuant to Article 28(8) of the GDPR.
It is stipulated that the SCCs may not be changed when used for processing of personal data in accordance with Article 28 of GDPR, where the agreement is between a controller and a processor. This requirement is without prejudice to the controller's right to request the inclusion of additional provisions and/or safeguards, provided that they do not directly or indirectly conflict with the SCCs and/or infringe the fundamental rights or freedoms and protections of GDPR.
The approved clauses are available in both Lithuanian and English.
27 December 2021
Links
Inspectorate press release (Lithuanian)
Press release
Netherlands
Contributors
Olaf van Haperen Partner
T: +31 6 1745 6299 [email protected] eversheds-sutherland.nl
Judith Vieberink Senior Associate
T: +31 6 5264 4063 [email protected] eversheds-sutherland.nl
Frédérique Swart Junior Associate
T: +31 6 4812 7136 [email protected] eversheds-sutherland.nl
Robbert Santifort Senior Associate
T: +31 6 8188 0472 [email protected] eversheds-sutherland.nl
Sarah Zadeh Associate
T: +31 6 8188 0484 [email protected] eversheds-sutherland.nl
Development
Class action against software companies declared inadmissible by Court of Amsterdam
Summary
Date
In this class action, the Privacy Collective ("TPC") was acting on behalf of Dutch internet users, arguing that two major software companies had violated the privacy of 10 million Dutch Internet users.
Damages of up to EUR 11 billion were claimed under the Settling of Large-scale Losses or Damage (Class Actions) Act ("WAMCA"). The preliminary question was whether TPC had standing in its claims under the rules of the WAMCA.
One requirement for bringing a claim is that the claimant foundation must be able to demonstrate that it is sufficiently representative. However, the District Court in Amsterdam ruled that TPC cannot demonstrate that its claims are adequately supported by interested parties. It stated the following on its website: "(...) We are suing two large tech companies to demand compensation for the large-scale invasion and sale of data of millions of Dutch citizens, without valid consent (...)". By clicking
on the text "support with 1 click" with the thumbs up, internet users could indicate their support.
According to TPC, it obtained more than 75,000 'likes' this way. The District Court is of the opinion that these 'likes' do not establish that TPC has standing on behalf of a sufficiently large part of the group of injured parties affected. It had also not been established whether the persons who supported the representative action in this way belonged to the group of injured parties. Moreover, no contact details were registered, so TPC could not maintain contact with its supporters, which is required by law.
The District Court did not offer TPC the opportunity to remedy this defect, but declared it inadmissible due to a lack of standing. In this case, the court did not assess the relationship between the GDPR and the WAMCA.
Date
29 December 2021
Links
Court Ruling (Dutch)
Development
Dutch Chamber of Commerce has no database right to the trade register
Summary
The District Court of Midden-Nederland ruled that the Dutch Chamber of Commerce has no database right to the trade register in a case between the Association for Business B2B Information ("VVZBI") and the Chamber of Commerce.
The reason for the proceedings was that the Chamber of Commerce had a number of long-term contracts with providers of business information for credit management, risk and compliance and marketing purposes. These parties use the data in the trade register for their own services to their clients.
The Chamber of Commerce had cancelled current contracts so that new terms of use (more favorable to the Chamber of Commerce) could eventually take effect. The new terms of use would stipulate that the re-use of (substantial parts of) the trade register would require permission under database law.
In these proceedings VVZBI claimed a declaratory judgment that the Chamber of Commerce has no database right to the information in the Commercial Register.
The District Court considered that the rationale of database law is to protect substantial investments, by offering the person who bears the risk of the investment the certainty that they will be compensated for it. Therefore, database law has an economic incentive. The District Court then considered that this economic
incentive did not exist for the Chamber of Commerce. After all, the Chamber of Commerce is simply carrying out a statutory task.
The District Court held that the trade register is a "database," but that the Chamber of Commerce bears no risk for substantial investments and therefore does not qualify as a "producer" within the meaning of the Databases Act and so is not entitled to database protection on the trade register.
Date
22 December 2021
Links
Court Ruling (Dutch)
Development
Dutch Supreme Court rules that legitimate interest is the lawful basis for the Dutch Credit Registration Office (BKR) registration
Summary
The Supreme Court has ruled that the appropriate lawful basis for data processing in the Dutch Credit Registration Office ("BKR") is legitimate interest. This means that the data subject has the right to object under the GDPR. This right to object does not exist for processing operations that arise from a legal obligation. The Supreme Court emphasised in the case that the data subject can use other means to challenge disproportionate processing operations.
By way of background, BKR registrations are the subject of many legal proceedings. The question regularly arises what the lawful basis for BKR registrations is. To date, this has been interpreted differently by various courts. In one of the many BKR proceedings, the Amsterdam District Court therefore referred preliminary questions directly to the Dutch Supreme Court.
The Supreme Court had already established a principle ruling on BKR registrations back in 2011, known as the Santander framework, which emphasised that with all processing of personal data, regardless of the lawful basis, the requirements of proportionality and subsidiarity must always be met in order to uphold the protection of privacy as a fundamental right.
In this current judgment, the Supreme Court confirmed that the Santander framework is still valid. In order to rely on legal obligation as a lawful basis for the processing of personal data (Article 6(1)(c) GDPR), the legal obligation must be based on a sufficiently clear and predictable general binding regulation.
Under the Financial Supervision Act, credit providers are obliged to participate in and consult a credit registration system, but these statutory provisions are not sufficiently clear and precise and their application is not sufficiently predictable (as Article 6(3) GDPR requires). It is not clear from these legal provisions which personal data should be registered in the Central Credit Information System, what the conditions for registration are, or the conditions under which and the time limits within which personal data must be removed.

The Supreme Court therefore turned to legitimate interest: if the processing is not based on a legal obligation, and the other lawful bases do not apply either, it must be processing under Article 6(1)(f) GDPR. Having concluded that the basis for the processing is legitimate interest, the consequence is that data subjects have the right to object and the right to erasure of their personal data.

Links: Court Ruling (Dutch)

Tax administration fined for discriminatory and unlawful processing activities
The Dutch Data Protection Authority ("DDPA") has imposed a penalty of EUR 2.75 million on the Dutch tax and customs administration. For years the tax and customs administration has been processing the (dual) nationality of applicants for childcare benefits in an unlawful, discriminatory and therefore improper manner. The DDPA considers it to be a serious violation of the GDPR.
Unlawful processing:
The tax and customs administration should have deleted the data on the dual nationality of Dutch citizens back in January 2014. The dual nationality of Dutch citizens plays no role in the assessment of an application for childcare benefits. Nevertheless, the administration retained and made use of this data. In May 2018, a total of 1.4 million people still had dual nationalities registered in the administration's systems.
In addition, the administration processed the nationality of applicants to combat organised fraud. The data was not necessary for this purpose.
Finally, the administration used the nationality of applicants (Dutch or non-Dutch) as an indicator as part of a system which automatically determined the risk rating of certain applications. This data was also not necessary for the purpose.
In all cases - assessing applications, combating fraud and the risk system - the processing was deemed unlawful and prohibited.
Discriminatory processing:

By unnecessarily including data on nationality in all kinds of systems, the administration acted in a discriminatory manner. In this case, the discriminatory processing of nationalities led to an infringement of the fundamental right not to be discriminated against under the GDPR.

Investigation conclusion:

The administration updated its internal systems in line with the DDPA's investigation and, by the summer of 2020, the dual nationalities of Dutch citizens had been completely removed from its systems. The fine was imposed on the Minister of Finance, as the person with responsibility for the processing of personal data at the administration. The administration is able to lodge an appeal against the decision.

Date: 25 November 2021

Links: DDPA Statement (Dutch)
Court of Appeal rules that Dutch UBO legislation does not need to be set aside
In preliminary relief proceedings, the foundation Privacy First sought to render the Dutch Ultimate Beneficial Owners ("UBO") legislation inoperative. The Court of Appeal of The Hague dismissed this claim.
The UBO register is a part of the trade register that contains certain personal data of the UBOs of Dutch companies and other legal entities. Part of the UBO information recorded in the UBO Register is accessible to everyone. Another part of the UBO information is accessible only to competent authorities and the Financial Intelligence Unit.
The Privacy First foundation argued in court that the UBO register, and in particular public access to UBO information, violates the right to privacy and goes beyond what is necessary to achieve the purpose of the UBO register. Privacy First argued that it is not necessary for the public to have access to UBO information in order to combat money laundering and terrorist financing with a UBO register.
The Court rejected these claims and ruled that it saw no reason to render the obligation to register UBO information and the (partly) public nature of the UBO register inoperative. The Court came to this conclusion because the EU-derived UBO legislation is in force and the European Court of Justice has yet to rule on its legality. As long as no such CJEU judgment has been given, the Court cannot put the Dutch State in a position where it would be clearly in conflict with EU legislation. In a similar case, on 13 November 2020, a Luxembourg court referred questions to the CJEU about the legality of the UBO register. The CJEU's ruling is expected in mid-2022.

Date: 16 November 2021

Links: Court Ruling (Dutch)

DDPA fines airline as a result of data security failings
The DDPA has issued a fine of EUR 400,000 to an airline for inadequate security of personal data.
The facts were that a hacker was able to penetrate the airline's systems in 2019, gaining access to systems containing the personal data of 25 million passengers, including names, dates of birth, gender, email addresses, telephone numbers and flight and booking details. There is no indication that the hacker actually viewed or copied this data, but they would have been able to do so due to the insufficient security controls. The unauthorised access lasted until the end of November 2019, when the airline resolved the breach.
Three key features of this breach were:
1. Weak password security. The hacker penetrated the airline's systems in September 2019 via two accounts belonging to the company's IT department. Security was lacking in three respects:
- the passwords were easy to guess;
- passwords alone were sufficient to enter the system, in the absence of multi-factor authentication; and
- once the hacker had control of these two accounts, they gained access to a large number of other systems (i.e. the access rights granted to the two IT accounts were not limited to those necessary for those members of IT staff).
After discovering the data breach, the airline reported the breach to the DDPA in a timely manner and informed the data subjects, taking remedial measures immediately to better protect personal data going forward.
2. Personal data downloaded. The hacker downloaded personal data relating to approximately 83,000 data subjects, including a list of passenger data from 2015 containing names, dates of birth and flight information, and the medical data of 367 individuals retained for onboard assistance purposes.

3. International investigation. The investigation was international in nature given the airline's international customer base. As the airline has its registered office in the Netherlands, the DDPA was authorised to conduct the investigation but, due to its international nature, the DDPA also coordinated the investigation with other European data protection supervisory authorities.

The fine is final, as the airline has not lodged an appeal.

Date: 12 November 2021

Links: DDPA Statement (in Dutch)
Passenger personal data is lawfully processed by rail operator
The Council of State has ruled that the Dutch rail operator processes passenger personal data in a lawful manner. This ruling follows an individual's appeal to the Council of State, stating that they did not want the rail operator to process their personal data during their journeys using their public transport travelcard (which contains a chip).
The individual asked the DDPA to investigate whether it was necessary for the rail operator to process the passenger's personal data from the travelcard. They also asked the DDPA to take enforcement action, but the authority rejected this request.
The District Court of Gelderland had previously ruled in favour of the DDPA, after which the individual appealed to the Council of State. According to the District Court and the Council of State, a transport contract is formed at the moment a passenger boards the train. It must then be assessed whether the processing of personal data is necessary for the performance of that contract.
The Council ruled that it is necessary for the rail operator to process the personal data, given that "the purpose of these data processing operations is to establish the determining obligations of the parties to the agreement". According to the Council, the alternative of a paper ticket proposed by the individual is less verifiable and can lead to increased fraud, misunderstandings and altercations with train conductors.
The Council also ruled that the travelcard keeps travel data and personal data separate, as bringing the data together is only permitted for sending invoices and for the operator's services, with a code of conduct applicable to the carrier.

Date: 10 November 2021

Links: Court Ruling (in Dutch)
Anti-fraud helpdesk not permitted to process criminal data
The DDPA has rejected a second licence application from the Dutch national anti-fraud helpdesk to collect and store personal data of possible perpetrators of fraud. The decision was taken on the basis that the helpdesk is not an investigative body and does not need criminal data to fulfil its task. Moreover, individuals whose personal data ended up in the helpdesk's systems would not have the same rights as fraud suspects would have with the police, and could be wrongly labelled as fraudsters.
The helpdesk was established in 2011 as a government initiative to advise fraud victims and to provide information to citizens and businesses in order to help prevent fraud. The helpdesk is established as a foundation and not a government body, and the collection of criminal personal data is not required for its purposes. The helpdesk has filed an objection to the DDPA's decision.
Date: 26 October 2021

Links: DDPA Statement (in Dutch)
Testing for access does not lead to a violation of fundamental and human rights, including under the GDPR
As of 1 June 2021 the Temporary Regulation on COVID-19 certification entered into force, establishing temporary rules on the use of coronavirus entry passes.
On 16 September 2021, the House of Representatives voted in favour of this regulation, as a result of which, as of 25 September 2021, hospitality facilities, venues for art and culture, and other larger scale events will only be accessible to members of the public who are in possession of a valid coronavirus entry pass in combination with a valid identity document.
In preliminary relief proceedings, the District Court ruled that, in view of all the context indicators used to determine the level of risk, the State could reasonably decide that the use of the coronavirus entry pass was an appropriate and proportionate measure.
The District Court did not consider the question of whether the coronavirus entry pass conflicts with the GDPR. However, according to the DDPA's Protection of Individuals advice with regard to the Temporary Regulation on COVID-19 certification, the provision of personal data must be limited to what is strictly necessary. In view of this, the coronavirus entry pass is not considered to contravene the GDPR.

Date: 6 October 2021

Links: Court Ruling (in Dutch)

GDPR violation leads to non-material damages (damages for distress)
The District Court of Rotterdam determined that an applicant was entitled to non-material damages (damages for emotional distress) because her medical records had been unlawfully processed in breach of the GDPR.
Date: 12 July 2021
The District Court found that the medical data was unlawfully processed and that such processing was sufficient to award non-material damages.
It is important to note that the special category personal data was retained by the respondent for approximately ten years, despite several requests by the applicant to destroy it. The District Court deemed it to be sufficiently plausible that in the ten years that the applicant's personal data was processed several persons and/or authorities were able to access the contents without authorisation and consequently the applicant had suffered non-material damage on that basis. The court estimated the damage at EUR 2,500.
The District Court determined, unlike other Court decisions, that the mere violation of the GDPR entitled the victim to non-material damages. Of note is that the amount was determined by considering a number of factors, such as the duration and type of violation. In previous rulings, non-material damages on the basis of a GDPR violation have not been awarded without question in this way, and claimants - when successful - only received limited compensation.
Links: Court Ruling (in Dutch)
Slovakia

Contributors
Jana Sapkov Counsel
T: + 421 2 3278 6411 [email protected] eversheds-sutherland.sk
Dasa Derevjanikov Associate
T: + 421 2 3278 6411 [email protected] eversheds-sutherland.sk
Employer's right to know about employees' vaccination status and COVID-19 test results
In its Decision No 01340/2021-OS-10 of 24 November 2021, effective on 10 December 2021, the Data Protection Authority of the Slovak Republic found that a violation of the principle of lawfulness under Article 5(1)(a) GDPR had occurred. The case concerned an employer who (during June and July 2021) obtained information about the vaccination status of its employees without a lawful basis and in breach of the principle of transparency, due to the failure to provide information within the meaning of Article 13 GDPR.
Date: 24 November 2021
Of particular note are two statements from the Data Protection Authority:
- Firstly, the provisions of Article 6(1)(d) GDPR in conjunction with Article 9(2)(i) GDPR cannot be used by an employer as a lawful basis for processing employees' vaccination status personal data, even if the employer's intention is to ensure a safe working environment (which is also a legal duty of the employer under Slovak legislation) by increasing the vaccination coverage of its employees.
- Secondly, if there is no proof that the employer systematically recorded data from inspection of the results of AG tests/PCR tests, there is no fulfilment of the conditions under Article 2(1) GDPR and the activity would not fall within the scope of the GDPR.
Despite the stated violations, the Data Protection Authority did not impose a corrective measure or a fine on the employer, as the subsequent Act No 412/2021, which amended certain acts in connection with the third wave of the COVID-19 pandemic (i.e. Labour Code and Decree No 264 of the Slovak Public Health Office), effective as of 15 November 2021, gives the employer the right to request vaccination information from employees in the workplace.
Links: N/A
South Africa

Contributors
Grant Williams Partner
T: +27 11 575 3647 [email protected] eversheds-sutherland.co.za
Matthew Anley Senior Associate
T: +27 10 003 1382 [email protected] eversheds-sutherland.co.za
Guide on how to use the Promotion of Access to Information Act
The Information Regulator issued a Guide on How to Use the Promotion of Access to Information Act ("PAIA"). The purpose of the guide is to provide information that is needed by any person who wishes to exercise any right contemplated in PAIA and in the Protection of Personal Information Act, 2013 ("POPIA").
The guide is intended to specifically assist a data subject access their personal information in accordance with section 23 of POPIA.
The guide is also intended to assist requesters in:
1. understanding PAIA, its benefits and background;
2. learning the step-by-step process by which to make a request and additional tips for making that process easier;
3. learning the types of information which can be requested using PAIA;
4. understanding the process by which a requester can challenge a decision taken in relation to their request; and
5. being introduced to the changes that will occur to PAIA once POPIA is fully operational.
Date: 1 October 2021

Links: Guide on How to Use the Promotion of Access to Information Act
Draft National Policy on Data and Cloud
In March 2021, the Minister of Communications and Digital Technologies issued a draft National Policy on Data and Cloud for public comment.
The deadline for public comments was 11 June 2021. The Minister has not provided further comments following the receipt of the public comments.

Amongst other things, the draft policy states that national security and the need for data sovereignty demand changes to legislation. It goes on to state that South African data belongs to South Africa, that any cross-border transfers must be carried out in adherence with South African privacy protection policies and legislation (POPIA), the provisions of the Constitution and international best practice, and that a copy of such data must be stored in South Africa for the purposes of law enforcement.

Date: 1 April 2021

Links: Draft National Policy on Data and Cloud
Sweden

Contributors

Torbjörn Lindmark Partner

T: +46 8 54 53 22 27 [email protected] eversheds-sutherland.se

Sina Amini Associate

T: +46 8 54 53 22 17 [email protected] eversheds-sutherland.se

Swedish DPA initiates audit on a debt collection agency

The Swedish Authority for Privacy Protection ("Swedish DPA") has commenced an audit of a debt collection agency operating in Sweden which allegedly sent out misleading advertisements with the appearance of an invoice on behalf of a client company.
The Swedish Consumer Agency, who reported the activity to the Swedish DPA, has received over 800 complaints from customers of the advertised company which the debt collection agency claims to represent.
The primary purpose of the audit is to determine what systems and processes the debt collection agency has in place when their client company has no lawful basis to send out advertisements to individuals.
Date: 1 October 2021

Links: Press statement (in Swedish); Audit statement (in Swedish)

Swedish DPA initiates audit on Sweden's Equality Ombudsman
The Swedish Equality Ombudsman ("DO") has notified the Swedish DPA of a personal data breach concerning a web form used for submitting tips and complaints on the DO's website.
According to the DO, an analysis tool used to improve the website's functionality has in some cases been able to collect and store personal data, including data from the web form. The Swedish DPA has, as a result of the DO's notification, decided to initiate an audit. The Swedish DPA has further stated that the DO, as a controller, has the responsibility to continuously monitor the use of its IT systems and to ensure that appropriate security measures have been implemented.
As part of the audit process, the Swedish DPA will conduct an on-site inspection to investigate the extent of the personal data breach and to determine what measures have been taken by the DO to prevent it from occurring again.

Date: 12 October 2021

Links: Press statement (in Swedish)
Swedish DPA initiates collaboration with AI Sweden, the Swedish national center for applied artificial intelligence
The Swedish DPA and AI Sweden have started collaborating to provide support and guidance on issues relating to AI and data protection.
The collaboration is part of an assignment that the Swedish DPA received from the Swedish government in early 2021, to raise the general level of knowledge about privacy and data protection issues among innovation actors. Together, the Swedish DPA and AI Sweden will identify recurrent issues and themes relating to data protection and AI.
The Swedish DPA and AI Sweden will implement efforts to provide support and guidance on the issues identified. The project is intended to run until 31 March 2023.

Date: 14 October 2021

Links: Press statement (in Swedish)
Swedish DPA initiates audit on a Swedish municipality for their use of CCTV
The Swedish DPA has launched an audit of a Swedish municipality's use of CCTV in public places without first having applied for a CCTV license in accordance with national data protection laws.
In the audit statement, the Swedish DPA has asked the municipality to provide an explanation of:
- why they have not applied for a CCTV license;
- where the cameras are positioned; and
- whether the municipality has implemented technical measures to anonymise the personal data collected by the use of CCTV.
Date: 21 October 2021

Links: Press statement (in Swedish); Audit statement (in Swedish)
Swedish DPA initiates audit on the Swedish Association of Local Authorities and Regions
The Swedish DPA has launched an audit of the Swedish Association of Local Authorities and Regions ("SKR"), an employers' organisation that represents all of Sweden's municipalities and regions.
SKR manages a national waiting time database used to produce statistics on waiting times in public healthcare; the Swedish DPA will be reviewing the SKR's processing of this personal data.

The database consists of information collected from patient records that have been transferred from the Swedish regions to the SKR. The main purpose of the audit is to obtain clarification on the lawful basis that SKR, as a private organisation that is not a healthcare provider, relies on to process personal data from patient records for the purposes of the database.

Date: 9 November 2021

Links: Press statement (in Swedish); Audit statement (in Swedish)

Swedish DPA warns the Swedish Migration Agency
The Swedish DPA has warned the Swedish Migration Agency that the public authority's processing of personal data relating to the national implementation of the Visa Information System ("VIS") risks breaching the GDPR. VIS is a central IT system that exchanges personal data between EU member states when facilitating checks for the issuance of visas to individuals. The system can perform biometric matching, primarily of fingerprints, for identification and verification purposes.
Date: 18 November 2021
According to the Swedish DPA, accurate and updated documentation of the public authority's IT architecture for VIS is needed, with the IT documentation playing a crucial role in protecting the personal data processed by the system by reducing potential risks and vulnerabilities, for example when staff are replaced.
The Swedish DPA concluded that it is likely, due to inconsistencies in the applicable IT-documents, that errors occurred during the development and management of VIS. Consequently, the Swedish Migration Agency may not have taken sufficient technical and organisational measures to ensure an appropriate level of security pursuant to Article 32 GDPR.
Links: Press statement (in Swedish); Decision (in Swedish)
United Arab Emirates

Contributors
Nasser Ali Khasawneh Partner
T: +971 50 655 3198 [email protected] eversheds-sutherland.com
Christine Khoury Principal Associate
T: + 971 4 389 7064 [email protected] eversheds-sutherland.com
Geraldine Ahern Partner
T: +971 2 494 3632 [email protected] eversheds-sutherland.com
UAE enacts first federal data protection law
To mark the United Arab Emirates' 50th anniversary, forty new federal laws have been approved, representing the biggest legislative reform in the history of the United Arab Emirates ("UAE"). The reforms include the introduction of a federal data protection framework with the approval of Decree-Law No. 45 of 2021 on the Protection of Personal Data (the "Data Protection Law") and Decree-Law No. 44 of 2021 on the Data Protection Office ("DPO") Establishment (the "DPO Law").
Both laws were issued on 20 September 2021. The DPO Law took effect on 21 September 2021 and the Data Protection Law took effect on 2 January 2022. The Implementing Regulations are anticipated to be issued around March 2022. The Implementing Regulations will provide more detailed provisions for the Data Protection Law.
Please read our full client briefing for further information on the UAE's first federal data protection law.
Date: 21 September 2021 (DPO Law); 2 January 2022 (Data Protection Law)

Links: Eversheds Sutherland client briefing
United Kingdom
Contributors
Paula Barrett Co-Lead of Global Cybersecurity and Data Privacy
T: +44 20 7919 4634 [email protected] eversheds-sutherland.com
Lizzie Charlton Senior Associate PSL (Data Privacy)
T: +44 20 7919 0826 [email protected] eversheds-sutherland.com
NCSC releases updated guidance on secure communications principles
The UK's National Cyber Security Centre ("NCSC") has published updated guidance on secure communications principles for assessing the security of voice, video and messaging services. The guidance is aimed at organisations that wish to assess communication technologies, with the purpose of helping them achieve the right balance of functionality, security and privacy. It is particularly relevant for those working in Government (with official systems) and the public sector.
Date: 1 October 2021

Links: Updated guidance
Cyber Security Breaches Survey 2022
The UK Government has launched its Cyber Security Breaches Survey 2022. The survey will take place between October 2021 and February 2022. Its purpose is to gain information on the costs and impacts of cyber breaches and attacks and the results will be used to inform Government policy on cyber security. Participants are selected at random and interviews will take place by telephone.
Date: 28 September 2021

Links: Statement
ICO data sharing code of practice under DPA 2018 has now come into force
On 5 October 2021 the Information Commissioner's Office ("ICO") statutory data sharing code of practice, which was produced under Section 121 of the Data Protection Act 2018 ("DPA 2018"), came into force. The code provides practical guidance for organisations on how to share personal data in compliance with the requirements of the UK General Data Protection Regulation ("UK GDPR") and DPA 2018, including transparency, the lawful basis for processing, the accountability principle and the need to document processing requirements. It will be a key consideration for future data sharing between controllers (it does not cover the sharing of personal data between controllers and processors). ICO guidance on the code can be found on its website.
Date: 5 October 2021

Links: Code of practice
ICO embraces DCMS consultation reviewing UK data regime
On 7 October 2021 the ICO published its response to the Department for Digital, Culture, Media & Sport's ("DCMS") recent consultation, 'Data: A New Direction' (as mentioned in last month's edition of Commercially Connected). The response broadly welcomes the review of the UK data protection legal framework and regulatory regime, observing that it is imperative that the Government ensures the final package of reforms clearly upholds rights for individuals, lessens the burdens for businesses and safeguards the independence of the ICO. Furthermore, the response supports the proposal to introduce a more commonly used regulatory governance model for the ICO, which includes a statutory supervisory board with a separate Chair and CEO. It is important to note, however, that the Information Commissioner raised concerns regarding certain proposals for the Secretary of State to approve ICO guidance and to appoint the CEO, and insisted that the Government reconsider such proposals in order to uphold the independence of the ICO.
Date: 7 October 2021

Links: ICO press release; ICO Response
The first phase of Sandbox Report on Single Customer View published
The first phase of the Gambling Commission and ICO's Sandbox Report has been published. The report covers the challenges of developing a Single Customer View (SCV) and a cross-operator view to identify gambling harms in those holding accounts with more than one gambling company. The Gambling Commission and the ICO have agreed to work together to establish whether there is an appropriate lawful basis under Article 6 of the UK GDPR that allows for the sharing of behavioural or affordability data between online gambling operators via a SCV and to consider the processing of special category personal data and the appropriateness of Article 9 conditions for processing under the UK GDPR.
Date: 8 October 2021

Links: Sandbox Report
CMA response to the government's consultation `A new pro-competition regime for digital markets'
The Competition and Markets Authority ("CMA") has released its response to the Government's consultation on a proposed new "pro-competition regime for digital markets". The Government is concerned that a lack of competition in digital markets is harming businesses, consumers and new start-ups: prices are higher, choice is poorer, and start-ups face barriers to entry into the market.
To counteract these issues, the Government had already established a "Digital Markets Unit" ("DMU") within the CMA in April 2021, and now wants a new regulatory regime for the DMU to enforce. This would involve a new "Strategic Market Status" test, whereby firms with "substantial" and "entrenched" market power in digital technology markets would be designated as having Strategic Market Status ("SMS"). As a result, where SMS firms are involved in acquisitions, the CMA would have greater powers to scrutinise those deals, with a new code of conduct introduced to set out SMS best practice rules. The CMA would also be able to intervene to ensure strong competition. The DMU would work with the ICO (as well as Ofcom and the Financial Conduct Authority) to develop and enforce regulations.

The CMA has commented on aspects of these proposals, stating that the code of conduct should be DMU-led and tailored to different firm sizes, and that it welcomes the proposals for levy funding and for working with other regulators. The CMA as a whole welcomes the powers it would gain and the impact these would have, stating that it will maintain its support for the Government's proposals and work to develop the proposed regime.

Date: 4 October 2021

Links: CMA response report

Consultation on the ICO's AI and data protection risk toolkit
The ICO has released a consultation on an early beta version of an AI and data protection risk toolkit. Its purpose is to help persons such as risk practitioners in the identification of risks to data protection which AI systems can pose (where they process personal information) and provide suitable mitigation options for consideration. The ICO is looking for a wide range of views from people involved in the technical and compliance aspects of AI systems. The ICO requires responses to be submitted by 1 December 2021.
Date: 12 October 2021

Links: ICO consultation

ICO call for views: Anonymisation, pseudonymisation and privacy enhancing technologies guidance
The ICO is currently in the process of drafting new guidance entitled "Anonymisation, pseudonymisation and privacy enhancing technologies" and following the release of its first chapter draft in May 2021, the ICO has now released the second chapter draft. The second chapter focuses on what steps can be taken to ensure that anonymisation is effective.
As part of this, the guidance sets out what identifiability is and its key indicators, what current data protection law states about assessing the identifiability risk, and what factors to take into account when carrying out an assessment of the identifiability risk and how to manage any identification risks. The guidance provides a diagrammatic summary of this in order to help businesses determine their "data release model".
The deadline for responses to this second chapter of the guidance was 28 November 2021, with further draft chapters to be released for consultation at "regular intervals".

Date: 8 October 2021

Links: ICO press release; Draft ICO guidance
ICO consultation on the draft journalism code of practice
The ICO has opened a consultation on its draft journalism code of practice. The consultation is open until 10 January 2022, with online workshops being run in November to discuss the code. The draft code concerns the processing of personal data for journalism purposes, and the ICO is consulting on the proposed code so that it is easy for people to understand, with the intention that this leads to better compliance. The code will be supplemented by additional resources for "smaller organisations", with the code itself aimed at data protection officers, lawyers and editorial staff at media organisations. Aside from seeking opinions on the clarity of the draft code, the consultation also seeks to understand whether the code could be more practical, whether it is relevant to current issues, whether it has any unintended consequences and whether it would effectively protect the public interest in data protection. The ICO wants a wide range of individuals to engage with the consultation: not only media organisations and journalists, but also more peripheral figures such as regulators, campaign groups and lawyers. The ICO is also consulting on a draft assessment of the economic impact of introducing such a code, on which respondents can also provide responses.
Date: 13 October 2021

Links: ICO press release; ICO consultation survey
ICO opinion: Age assurance for the `Children's code'
The Information Commissioner has published a formal opinion with regard to age assurance for the Children's Code (see last month's edition of Commercially Connected). The opinion outlines what the ICO expects Information Society Services ("ISSs") to be doing in order to meet the Children's Code's age appropriate design standard, and advocates for a risk-based approach to age assurance measures. This should be through a combination of age verification, age estimation, account confirmation and self-declaration.
An ISS is defined as "any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of service" so, like the Children's Code, this formal opinion will cover most for-profit online services.
Date: 14 October 2021

Links: ICO press release
Call for evidence on the use of age assurance
Linked to the above, the ICO has issued a call for evidence in relation to the age appropriate section of the ICO's Children's Code, to support the ICO's work.

This includes seeking evidence on the effectiveness and fairness of current age estimation approaches, the economic impact of age assurance approaches and the data protection risks involved with the approaches taken. The ICO is keen to understand what new age assurance approaches are emerging in the industry, where there could be further development and how the ICO could support this. The deadline for responding is 9 December 2021, with an online survey being the primary format for respondents to answer the ICO's various questions.

Date: As above

Links: ICO call for evidence survey

Court held that use of security cameras and video doorbell breached data protection law
In the County Court case of Fairhurst v Woodard (Case No: G00MK161) (12 October 2021), the claimant and defendant were neighbours, living in a row of terraced houses with private parking in a private car park behind their respective rear garden boundaries.
Date: 12 October 2021
The defendant mounted a floodlight and sensor onto a shed in his garden and a video and audio surveillance camera pointing in the direction of the private car park. The defendant also installed: (i) a doorbell / video / audio surveillance system on his front door; (ii) a video and audio system on the end wall of the neighbouring property, facing down the driveway towards the car park; and (iii) a camera inside the front windowsill of his house facing outwards.
The claimant asserted that the defendant "consistently failed to be honest or open" about the cameras and had "unnecessarily and unjustifiably invaded her privacy", and claimed harassment, nuisance and breach of the Data Protection Act 2018 ("DPA").
The judge held that the defendant had breached the DPA's first principle (lawfulness, fairness and transparency) and second principle (purpose limitation) by failing to process the claimant's personal data in a fair and transparent manner and by failing to collect the data for a specified or explicit purpose. The claimant was therefore entitled to compensation and to orders preventing the defendant from continuing to breach her rights under data protection legislation in future.
The case turned on its facts and is not binding given that it was brought in the County Court, but it will be of general interest as a practical example of how video surveillance technologies can breach data protection legislation. We await the outcome of the court's compensatory award.

Links: Judgment
Court dismisses claim for damages under the GDPR and DPA 2018, alongside damages for misuse of confidential information, breach of confidence and negligence
In Rolfe and others v Veale Wasbrough Vizards LLP [2021] EWHC 2809 (QB), the defendants were instructed by a school to contact the first two claimants (the parents of a child at the school) to demand payment of outstanding school fees. The defendants sent an email with two attachments: a letter and a copy of the statement of account of the third claimant (the child). The email contained the claimants' names and home address, but no further details. The only financial details were the invoice for fees, which was publicly available on the school's website, and the statement of account.

Due to a one-letter difference in the mother's email address, the defendants sent the email and attachments to an incorrect recipient. The incorrect recipient responded promptly on the same day, indicating that they believed they had received the email in error. The defendants replied confirming this and asking the recipient to delete the email, which the recipient confirmed they had done. The incorrect recipient and the claimants were not known to each other.

The claimants brought a claim for damages for the misuse of confidential information, breach of confidence, negligence and damages under Article 82 GDPR and section 169 of the DPA. The court summarily dismissed the claims, citing, amongst other things, that the claimants exaggerated and lacked credible evidence of distress and that the claim was speculative given its de minimis nature. In deciding that the case did not have a "more than fanciful" prospect of success, the court noted the minimally significant information which was shared, the prompt steps taken to ensure deletion of the misdirected email and the absence of evidence of further transmission or misuse of the information.

The judgment has been welcomed as a pragmatic response to a "trivial" claim involving a one-off email misdirection which was quickly remedied.
Date: 7 September 2021

Links: Judgment
ICO publishes recommendations for video teleconferencing companies
The ICO has published a statement on video teleconferencing companies ("VTC"). The statement refers to a separate report setting out its conclusions and observations following an open letter in July 2020, which was signed by six data protection and privacy authorities from Australia, Canada, Gibraltar, Hong Kong (China), Switzerland and the UK. The letter highlighted in particular concerns that privacy safeguards may not be keeping pace with the rapid increase in the use of these services during the COVID-19 pandemic. The joint signatories invited the VTC companies to reply to their letter. The ICO noted that the activity provides an example of "constructive engagement" between regulators and the organisations they regulate and that the dialogue between the two was "effective, efficient and mutually beneficial".
The report also set out a number of recommendations to further enhance and improve measures, including encryption and responsible secondary use of data and use of data centres. In addition, use of `share screen' and `recording' functionality raises particular issues in regard to privacy and should be considered carefully.
Date: 27 October 2021

Links: Open letter: Global privacy expectations of video teleconference providers; Joint statement on global privacy expectations of Video Teleconferencing companies; Observations following the joint statement on global privacy expectations of video teleconferencing companies
National Audit Office releases new "Cyber and information security: Good practice" guide
The National Audit Office ("NAO"), the UK's independent public spending watchdog, has published an updated good practice guide for audit committees to use when reviewing an organisation's cyber and information security risk management processes and training. The NAO previously published a good practice guide in 2017, but felt a review was needed given the increase in online working due to the COVID-19 pandemic, the ongoing demand to digitise (and move to cloud-based services) and the increasing threat of cyber incidents to public bodies. The guide emphasises that audit committees need to monitor cyber risks to comply with relevant rules, standards and legislation. The guide comprises a set of high-level and more detailed questions for audit committees to use to assess compliance, and signposts useful resources from the NCSC.

Date: 28 October 2021

Links: Explanatory Note; Guidance
ICO and the Office of the Australian Information Commissioner conclude joint investigation into Clearview AI Inc
The ICO and the Office of the Australian Information Commissioner ("OIAC") have concluded their joint investigation into the personal information handling practices of Clearview AI Inc ("Clearview") and published a statement in respect of their investigation.
Date: 3 November 2021
Clearview has a facial recognition app which allows users to upload a photo of an individual's face, which is then matched to photos of that individual collected from the internet. It also links to where the photos appeared. The system includes a database of more than three billion images that have been taken or "scraped" from various social media platforms and other websites. The investigation, which opened in July 2020, focused on the use of scraped data and the use of biometric facial recognition.
Through the joint investigation, the ICO and OIAC worked together with regard to evidence-gathering. However, the outcomes were considered separately, due to the different legal frameworks which apply to each regulator. Each authority also considered their respective police force's use of the technology separately.
Having concluded the investigation, the ICO is considering the next steps and whether any formal regulation under UK data protection laws might be appropriate or necessary. The OIAC has also released its determination.
Amongst other things, the joint investigation has demonstrated the value and importance of data protection regulators cooperating and collaborating globally, given the international nature of the regulatory environment in this sphere.

Links: OIAC determination; Press release (ICO)
All-party Parliamentary Group proposes new legislation to curb surveillance technologies and algorithm-determined targets
Further to its enquiry in May this year, the All Party Parliamentary Group on the Future of Work (the "APPG") has published The New Frontier: Artificial Intelligence at Work, setting out its findings and recommendations in relation to the use of AI and related systems in the workplace.
The report's key findings are that a key source of anxiety is a "pronounced sense of unfairness and lack of agency" relating to automated decisions. Workers may not understand how their information is used to make decisions relating to the work they do, and therefore have a low level of trust in the ability of AI technologies to support working environments. There is, however, a sense of optimism that policy-makers can address these concerns with a robust regulatory response.
The APPG's key recommendations are that:
i. an Accountability for Algorithms Act should be passed to establish a "simple, new corporate and public sector duty to undertake, disclose and act on pre-emptive Algorithmic Impact Assessments" ("AIA");
ii. digital protection should be updated to increase the essential protection for workers; this would include: (a) providing an accessible explanation of the purpose, outcomes and significant impacts of algorithmic systems used in workplaces, a summary of the AIA and means for redress; and (b) establishing a right to be 'involved' in shaping the design and use of algorithmic systems at work;
iii. unions and specialist third sector organisations should be provided with additional collective rights in recognition of "the collective dimension of data processing" and to enable them to exercise new duties on members' or other groups' behalf;
iv. the joint Digital Regulation Cooperation Forum should be expanded with new powers for the creation of certification schemes, to suspend use or impose terms and issue guidance, to supplement the work of individual regulators and sector-specific standards; and
v. the APPG principles of Good Work should be seen as fundamental values incorporating fundamental rights and freedoms under law to guide the development and application of a human-centred AI strategy which works to serve the public interest.

Date: 12 November 2021

Links: The New Frontier: Artificial Intelligence at Work
Government responds to call for views on supply chain cybersecurity and announces plans to boost cybersecurity of UK digital supply chains
The Government has published a response to its May 2021 call for views on improving cyber security in supply chains and managed service providers, finding that there was "broad agreement" with the Government's analysis of the challenges around supply chain cyber security, as well as with its view that additional support from Government is required, including the provision of guidance and a more interventionist approach to improve resilience across supply chains. Regulation was perceived to be "very effective" by more respondents than any other suggested intervention.
The Government also states that, as part of the forthcoming National Cyber Strategy, it will continue to work with industry experts to develop policy solutions aimed at increasing cyber security resilience of digital solutions, including increasing uptake of the Government's Cyber Essentials Scheme and working with the National Cyber Security Centre to plug any gaps in its Supplier Assurance Questions. It also aims to implement legislation to ensure that managed service providers undertake reasonable and proportionate cyber security measures and to engage with international partners to foster a joined-up global approach. However, no timescale has yet been indicated.
The government has also announced plans to boost the cyber security of the UK's digital supply chains. Draft proposals include plans to require IT service providers to follow new cyber security rules such as the NCSC's Cyber Assessment Framework, new procurement rules to help ensure good cyber security standards are baked into outsourced public services and improved advice and guidance campaigns to help businesses manage security risks.
Date: 15 November 2021

Links: Response; Government announcement
Government responds to call for views on amending NIS Regulations 2018
The Government published a response to its call for views on amending the Network and Information Systems ("NIS") Regulations 2018 to move incident reporting thresholds for digital service providers (in-scope online search engines, online marketplaces and cloud computing services) out of legislation and into guidance issued by the ICO. The incident reporting thresholds are currently set out in retained EU law but do not work for the UK, as they are set on the basis of EU market size.
Date: 17 November 2021
Overall respondents were either in favour of or neutral in respect of the Government's proposals, with the most frequently cited reason for opposing the proposal being that the ICO should not have the power to amend thresholds without consultation. The Government notes this view but states that, although the ICO has no statutory duty to consult on its guidance, in practice the ICO is committed to engaging with industry including consulting on changes.
The ICO had already consulted on proposed notification thresholds. Consultation closed on 14 October 2021 and we are awaiting its outcome.
The Network and Information Systems (EU Exit) (Amendment) Regulations have now been made and come into force in January 2022. These will amend the NIS Regulations to reflect the new approach to incident reporting thresholds.

Links: Response
Open Life Data Framework report published to encourage discussion around how data can be used to improve health levels of UK population
The All-Party Parliamentary Group for Longevity ("APPGL") has published the Open Life Data Framework report, which aims to stimulate conversation and encourage collaboration between the public and private sectors, and to assist researchers, policymakers and entrepreneurs in establishing what types of health-relevant data (outside of the NHS and care system) provide the most insight into helping disadvantaged individuals keep healthy, as well as enhancing overall health resilience at a population health level.

The APPGL describes the core challenge on which it aims to stimulate debate as "how the expansive and diverse explosion of data flows may be best managed to the benefit of the individual and the public, and supported by the public, private and third sectors". By way of background, the APPG formed an expert group to define the requirements for an open health system (drawing from pre-existing models like the Open Banking ecosystem) to harness data-intensive technologies to extend the healthy life spans of British citizens.

The APPGL plans to work with a range of organisations to develop the framework. It is seeking funding and other support for the next stage, including testing the framework's assumptions in the real world on use cases, sandboxes and pilot projects.
Date: 18 November 2021

Links: Report
Updated Surveillance Camera Code of Practice published and laid before Parliament
An updated Surveillance Camera Code of Practice was published by the Home Office and then laid before Parliament on 16 November 2021. Subject to receiving Parliamentary approval, this updated code is due to come into force on 12 January 2022.
Date: 16 November 2021
The Code has been simplified and updated, including for the recent decision of R (Bridges) v Chief Constable of South Wales Police and others in relation to the use of live facial recognition technology.
The Code applies to the use of surveillance cameras (including CCTV) by relevant authorities (including local authorities and the police), but other operators and users of surveillance camera systems in England and Wales are encouraged to adopt the Code voluntarily.
Before laying the Code before Parliament, the Home Office conducted a statutory consultation. A summary of the consultation responses is available.

Links: Amended Code; Summary of consultation responses; Explanatory memorandum

National Data Strategy Mission 1 Policy Framework published
The DCMS has published the National Data Strategy Mission 1 Policy Framework: Unlocking the value of data across the economy.
The idea behind this framework is to set the right conditions to allow private and third sector data to be more usable, accessible and available across the UK economy whilst simultaneously protecting individuals' data rights and private enterprises' intellectual property. There are two main sections of the framework:
1. Principles for intervention (a set of principles the Government will be able to use to guide "interventions seeking to unlock data across the economy" for public benefit).
2. Priority areas for action (which identifies specific areas for action to address some of the key barriers to data sharing for public benefit).
The principles for intervention are:
- improve knowledge and understanding of data sharing
- reduce data sharing costs through better data foundations
- support new ways of addressing the risks of data sharing
- improve/demonstrate incentives for data sharing
- reduce perceived regulatory burden associated with data sharing
- mandate data sharing in the public interest by identifying datasets of national importance or public interest
The priority areas for action are:
- establish foundations by promoting the development and use of good data standards so that data is held, processed and shared according to FAIR (findable, accessible, interoperable and reusable) principles
- support infrastructure by encouraging the development and uptake of Privacy Enhancing Technologies, supporting the development of a thriving intermediary ecosystem, and infrastructure that promotes the availability of data for research and development purposes and enabling responsible data sharing
24 November 2021
Links
National Data Strategy Mission 1 Policy Framework
Development
ICO publishes Opinion on novel adtech and privacy
Summary
Date
The ICO has published its Opinion on novel adtech and privacy in collaboration with the CMA. The Opinion assesses developments in adtech against the ICO's data protection expectations and provides guidance to market participants about how they can adhere to the principles of data protection by design and by default and address data protection and privacy harms when assessing their approaches and making proposals.
The Opinion follows on from the ICO's 2019 report into adtech and real-time bidding, which called for the industry to make changes. Since the 2019 report, industry initiatives have been developed to address data protection concerns. The ICO has encouraged market participants to demonstrate how their proposals meet the expectations outlined in the Opinion.
The Opinion states that any proposals must address the risks that adtech poses and take account of data protection requirements from the outset. It sets the ICO's expectations for proposals, including that they should:
- include data protection by default in the design of the initiative
- provide users with the option to receive adverts without tracking, profiling or targeting based on personal data
- be transparent on any processing activities taking place
- specify the purposes for processing personal data and show how this is fair, lawful and transparent
- address existing privacy risks and mitigate any new risks that may arise in relation to the proposals
25 November 2021
Links
Opinion
Development
Product Security and Telecommunications Infrastructure Bill published
The ICO publishes paper on end-to-end encryption
Summary
Date
The Product Security and Telecommunications Infrastructure Bill has been published, Part I of which creates a legal framework to improve the cyber security of internet or network connectable ("smart") consumer products made available in the UK. The definition of products is wide and could potentially catch products supplied to businesses as well as to consumers.
24 November 2021
The publication of this Bill follows on from the 2018 code of practice for security of consumer Internet of Things devices and the consultation conducted between 2019 and 2021, which demonstrated a consensus that regulation is required in this area.
The Bill imposes obligations on the manufacturer, importer or distributor of an in-scope product to include a "statement of compliance" with the product, which will confirm that the product complies with specified security requirements. These persons are also required to investigate and take action in respect of compliance failures. The Bill will only apply to products that are new to the market and so will not have retrospective effect.
The detail of the security requirements will be set out in secondary legislation, which has not yet been published, but the explanatory notes to the Bill say that the Government intends these to be technical in nature and note that specific conformity assessment procedures may be mandated. The intention is that the requirements will include a ban on universal default passwords, compel the provision of a public point of contact for reporting of flaws/bugs in products, and require customers to be informed either of the minimum amount of time for which the product will receive security updates and patches or that the product does not come with security updates.
The fines that may be levied for non-compliance will be up to £10,000,000 or 4% of global turnover, or £20,000 a day for ongoing non-compliance.
The ICO published its framework to consider the impact of end-to-end encryption ("E2EE") on online safety, which summarises the ICO's current thinking on the governance of E2EE and its engagement with national and international stakeholders.
2 November 2021
Links
Draft Bill
Explanatory note
A Framework for Analysing End to End Encryption in an Online Safety Context v1 02/11/2021
E2EE is a "technical measure that encrypts content in communication channels so that only the sender or recipient can access it". It prevents third parties from accessing the content, including the provider of the communication platform and is increasingly used to support secure communications and content sharing between users.
The ICO's framework recognises that E2EE is "central to a safe and private online experience" and notes the ICO's long history of issuing guidance recommending encryption. However, as E2EE restricts the detection of harmful content, the framework recognises that it presents a challenge from an online safety and law enforcement perspective, including with respect to child safety.
The ICO is engaged in the Government's Safety Tech Challenge fund which is supporting innovative solutions to tackle key issues in E2EE environments and is leading a programme with support from the Government's Regulators' Pioneer Fund to stimulate the development of privacy enhancing technologies, as well as developing guidance for the use of such technologies.
The ICO is engaging with a variety of stakeholders, including Ofcom and the Financial Conduct Authority, to better understand their priorities and concerns. It will publish the outcomes of its work early next year.
The ICO has highlighted several key factors to consider in relation to reconciling safety and privacy. These include:
- the demand from consumers for services which safeguard privacy and support safety online
- the requirements that existing legislation place on businesses, for example on controllers to process personal data securely
- the effectiveness of existing legislation and technical tools in ensuring lawful access to data for both law enforcement and national security purposes, without weakening or breaking encryption standards
- the potential future development of technical solutions for detecting harmful content without weakening E2EE
- the necessity, proportionality, targeting and effectiveness of proposed legislative solutions
- the economic impact of proposed legislative solutions
Development
Summary
Date
Pioneering standard for algorithmic transparency launched by UK Government
The Central Digital and Data Office ("CDDO") has launched an algorithmic transparency standard for government departments and public sector bodies. The standard, which delivers on commitments made in the National AI Strategy and National Data Strategy, will be piloted by several public sector organisations.
1 December 2021
The aim of the standard (reportedly one of the first of its kind in the world) is to strengthen the UK's position as a world leader in AI governance and to enable transparency about how algorithmic tools are used to support decision making, especially where decision outcomes have a legal or economic impact on the individuals affected. This is intended to promote trustworthy innovation by enabling better visibility over the use of algorithms.
The standard provides for relevant information to be made available in a "complete, open, understandable, easily-accessible, and free format". Relevant bodies will be required to provide information to the CDDO about the algorithmic tools they use and their utility, which will be published in the Algorithmic Transparency Standard Collection. This will allow experts and the public to engage with and scrutinise the data provided.
For more information, please read our briefing on the standard for algorithmic transparency.
NCSC blog post on the use of secure messaging, voice and collaboration apps
On 7 December 2021, the National Cyber Security Centre ("NCSC") published a blog post on secure messaging, voice and collaboration apps in which it provides guidance on what organisations should consider before choosing apps for secure communications and collaboration.
Due to "hybrid working" (a combination of working from home and the office) becoming more common, the NCSC states it has been regularly asked whether various "secure voice and messaging" apps available from Google Play or the Apple App Store are suitable for use. The NCSC has not reviewed every single app, but has provided some risk management advice to guide organisations when choosing and using such apps.
The NCSC recommends that organisations take the following steps:
- Step one: establish the business context in which the app will be used and whether there is a genuine business case for the app. Organisations should ask if a solution is already available within the enterprise and, if so, why those existing solutions are not appropriate
- Step two: research the app that is intended for use, including its security and UK GDPR compliance
- Step three: configure the app in a way which minimises risk, such as limiting its use to only those users who have a business need, controlling system-level access permissions and turning on multi-factor authentication
- Step four: document decisions about why the organisation has chosen the app
7 December 2021
Links
Press release
Standards document
Blog Post
Development
Draft regulations amending DPA 2018 laid before parliament
Government announces new National Cyber Strategy
Summary
Date
The draft Data Protection Act 2018 (Amendment of Schedule 2 Exemptions) Regulations 2022 ("Regulations") have been laid before Parliament. The Regulations were drafted following the decision in R (Open Rights Group and another) v Secretary of State for the Home Department and another [2021] EWCA Civ 800 in which the Court of Appeal ruled that paragraph 4 of schedule 2 to the DPA 2018 did not comply with the requirements set out in Article 23(2) of the UK GDPR and was therefore unlawful. See Updata edition 12 for more information about this case.
10 December 2021
The Government has announced a new National Cyber Strategy to protect the UK from cyber threats and to protect and promote the UK's interests in a "rapidly evolving online world".
The strategy also sets out plans for how the Government intends to keep the public safe from cybercrime, which include:
- bolstering law enforcement through significant funding;
- increasing investment in the National Cyber Force;
- expanding GCHQ's National Cyber Security Centre's research capabilities;
- implementing the Product Security and Telecommunications Infrastructure Bill to enforce minimum security standards in new consumer smart products; and
- investing in public sector cyber security to ensure that key public services are resilient to threats and can continue to deliver services for those who need them.
15 December 2021
Links
Draft legislation
Draft explanatory memorandum
Press Release
Development
Summary
Date
ICO launches consultation on draft guidance on the right of access for competent authorities
The ICO has launched a consultation on two pieces of draft guidance in relation to the right of access for competent authorities (as defined in the DPA 2018). The two pieces of guidance relate to Part 3 of the DPA 2018 and consider:
- the right of access to personal data processed by competent authorities for a law enforcement purpose ("Right of Access Guidance"); and
- how authorities should deal with manifestly unfounded or excessive requests ("Manifestly Unfounded or Excessive Requests Guidance").
The consultation is open until 1 March 2022. Interested parties can respond through an online survey available on the ICO website or by completing the survey in Word and sending it to [email protected]
15 December 2021
ICO launches consultation on draft Regulatory Action Policy and statutory guidance on regulatory action and PECR powers
The ICO is seeking views on three documents which set out how the ICO aims to carry out its mission to uphold information rights for the UK public in the digital age, namely its Regulatory Action Policy; Statutory Guidance on our Regulatory Action; and Statutory Guidance on our PECR Powers. The consultation closes on 24 March 2022.
20 December 2021
Network and Information Systems (EU Exit) (Amendment) Regulations 2021 to come into force on 12 January 2022
The Network and Information Systems (EU Exit) (Amendment) Regulations have been made. They aim to address failures of retained EU law to operate effectively, and other deficiencies arising as a result of the UK's withdrawal from the EU. In particular, the regulations amend and remove certain criteria for managing and reporting cyber risks which apply to digital service providers but are no longer appropriate for the UK, and remove the thresholds for reporting cyber incidents; these will instead be set in guidance.
15 December 2021
Links
ICO consultation
Press release
Right to access guidance
Manifestly unfounded and excessive requests guidance
Press release
Regulations
United States
Contributors
Michael Bahar Co-Lead of Global Cybersecurity and Data Privacy
T: +1.202.383.0882 [email protected] eversheds-sutherland.com
Sarah Paul Partner
T: +1.212.301.6587 [email protected] eversheds-sutherland.com
Alexander Sand Counsel
T: +1.512.721.2721 [email protected] eversheds-sutherland.com
Mary Jane Wilson-Bilik Partner
T: +1 202.383.0660 [email protected] eversheds-sutherland.com
Brandi Taylor Partner T: +1.858.252.6106 [email protected] eversheds-sutherland.com
Tanvi Shah Associate
T: +1.858.252.4983 [email protected] eversheds-sutherland.com
Rebekah Whittington* Associate
T: +1.404.853.8283 [email protected] eversheds-sutherland.com
(*Not admitted to practice. Application submitted to the Georgia Bar)
Development
Two states take steps to increase genetic privacy protections
Summary
Date
Both California and Florida have taken steps to protect the privacy and security of genetic data. In California, Governor Gavin Newsom signed the Genetic Information Privacy Act ("GIPA"). The GIPA applies to direct-to-consumer genetic testing companies and requires such companies to comply with certain privacy requirements, including the following:
1. provide consumers with notices explaining the company's privacy practices and obtain consent for the collection of genetic data;
2. restrict service providers' use of genetic data;
3. develop procedures and practices enabling consumers to exercise privacy rights with respect to their genetic data;
4. implement reasonable security procedures and practices to protect consumers' genetic data; and
5. refrain from giving genetic data to any entity that is responsible for administering or making decisions regarding health insurance, life insurance, long-term care insurance, disability insurance, or employment, or to a company that provides advice to an entity responsible for performing those functions.
Florida's Protecting DNA Privacy Act ("DPA") came into effect on 1 October 2021. The DPA criminalizes the willful collection, retention, analysis, and disclosure of a Florida resident's DNA sample or analysis without express consent. People who violate the act are guilty of either misdemeanors or felonies, depending on the precise violation.
6 October 2021
Links
GIPA Text
DPA Text
Development
Summary
Date
US Treasury Department announces results of sanctions policy review and OFAC releases compliance guidance for the virtual currency industry
The US Department of the Treasury ("Treasury") has released the results of its review of economic and financial sanctions first announced in December 2020 by then President-elect Biden ("Report"). From that review, Treasury has issued recommendations to "preserve and enhance" the effectiveness of sanctions. This long-anticipated announcement came out the same week as the Office of Foreign Assets Control ("OFAC") issued new "Sanctions Compliance Guidance for the Virtual Currency Industry" ("Guidance") as a resource to help participants in the virtual currency industry navigate and comply with US sanctions administered by OFAC.
Ransomware attacks using virtual currency continue to rise, and the United States is employing a "whole of government" approach to try to stop ransomware attacks. While banks have traditionally been the "front line" for US sanctions enforcement, the most recent Guidance similarly deputizes the virtual currency exchange industry to monitor and police sanctions violations and sanctions evasion. These actions, coupled with Treasury's recognition of the harm virtual currencies and alternative payment platforms could have on the efficacy of US sanctions and its intent to improve institutional knowledge and capabilities in these areas, suggest that companies should expect increased sanctions enforcement activity targeting the virtual currency industry. Accordingly, to mitigate such emerging sanctions risks, companies involved in this sector are advised to establish and maintain a tailored, risk-based sanctions compliance program.
15 October 2021
Links
Sanctions Compliance Guidance for the Virtual Currency Industry
Development
Summary
Date
Judicial panel on MDL denies centralization in data breach cases
In October, the Judicial Panel on Multidistrict Litigation ("MDL") denied an insurer's request to centralize class actions stemming from a data breach. The insurer is fighting five proposed class actions across the country. All five class actions stem from a data breach in which attackers used illegally obtained personal data to gain access to the insurer's online sales system in early 2021. The insurer publicly announced the breach in April 2021 and estimated that 150,000 individuals were affected.
Plaintiffs filed three of the class actions in the Eastern District of New York with the other two in federal courts in Maryland and California. While only one set of plaintiffs opposed the centralization, the court found that the small number of class actions favored an "informal coordination among the parties" rather than centralization. After the ruling by the panel, the three New York cases were consolidated and the court in Maryland transferred the Maryland case to the New York court, leaving only the California case as separate.
21 October 2021
CFPB orders big tech companies to turn over information
The Consumer Financial Protection Bureau ("CFPB") issued a series of orders to several large technology companies that operate payment systems. The CFPB is hoping to better understand how these companies use personal payments data and manage data access to users. The CFPB has statutory authority under section 1022(c)(4) of the Consumer Financial Protection Act to order participants in the payments market to turn over information to help the Bureau monitor for risks to consumers and to publish aggregated findings that are in the public interest.
The CFPB may send additional orders in the future. The orders require the companies to provide information on their data harvesting and monetization, access restrictions and user choice, and other measures the companies are taking to comply with consumer protection laws.
21 October 2021
Links
Order Denying Transfer
CFPB Press Release
Development
FTC updates Gramm-Leach-Bliley Act Safeguards Rule
Summary
Date
In October, the Federal Trade Commission ("FTC") announced significant updates to the Gramm-Leach-Bliley Act ("GLBA") Safeguards Rule. While most of the requirements under the amended Safeguards Rule are not immediately effective, the amendments are significant changes to the current regulatory scheme.
27 October 2021
The updated Safeguards Rule makes five material modifications to the existing rule. First, it requires specific elements in an information security program. Notably, the Safeguards Rule now requires financial institutions to designate a single "Qualified Individual" to oversee, implement, and enforce a written and accessible information security program and to ensure certain safeguards are in place to control risk, including, with some exceptions, encryption of all customer information and implementation of multi-factor authentication. Second, the amendments add periodic reporting requirements to boards of directors and governing bodies. Third, acknowledging that applying the full requirements to institutions with a small number of customers would be unduly burdensome without providing an extensive benefit to consumers, the amendments exempt companies that maintain customer information concerning fewer than five thousand consumers from many of the requirements of the Safeguards Rule.
Fourth, the amendments expand the definition of "financial institution" to include any institution significantly engaged in activities that are incidental to financial activities, as described in section 4(k) of the Bank Holding Company Act of 1956, 12 U.S.C. 1843(k). The FTC explained that this amendment is designed to bring only one activity into the definition that was not included previously: "finding". A finder is an entity that brings together "one or more buyers and sellers of any product or service for transactions that the parties themselves negotiate and consummate." Finally, the amendments add several definitions and related examples to the rule that were previously incorporated by reference from another FTC rule. These additions will have little practical effect on the rule.
In addition to the current amendments, the FTC is seeking comment on whether to make an additional change to the Safeguards Rule to require financial institutions to report certain data breaches and other security events to the FTC. The FTC issued a Notice of Supplemental Rulemaking that proposes adding a requirement that financial institutions notify the Commission of certain security events.
Links
Federal Register Notice of the Amended Rule
Development
Summary
Date
New York City council passes measure implementing bias audits for employers that use AI in the hiring process
The New York City Council passed a measure on 10 November 2021 that requires employers that use artificial intelligence to screen job candidates or to screen employees for promotion to undergo bias audits every year and to notify candidates if AI was used to make hiring decisions. The bias audits must be completed by an independent auditor and assess the AI system's disparate impact on decisions based on protected categories, such as race and sex. Employers who violate the law are subject to $500 penalties for first-time violations and up to $1,500 for repeat offenses. This measure is part of a broader push across the nation to address potential biases in automation technology.
10 November 2021
Federal Reserve, Federal Deposit Insurance Corporation, and Office of the Comptroller of the Currency issue joint agency incident response rule for banks
Acknowledging the increasing cybersecurity threats to financial institutions, the Federal Reserve, the Federal Deposit Insurance Corporation ("FDIC"), and the Office of the Comptroller of the Currency ("OCC") adopted final rules on 18 November 2021 requiring banking organizations to notify their primary federal regulators and customers of certain cybersecurity incidents (the "Cyber Incident Reporting Rules"). Banks must comply with the rules by 1 May 2022.
18 November 2021
The Federal Reserve, the FDIC, and the OCC all promulgated similar but separate rules. The Cyber Incident Reporting Rules require banks to alert their primary federal regulator of certain computer-security incidents as soon as possible and no later than 36 hours after the banking organization determines that a cyber incident has occurred. Under the Cyber Incident Reporting Rules, computer-security incidents are defined as "an occurrence that results in actual harm to the confidentiality, integrity, or availability of an information system or the information that the system processes, stores, or transmits." Regulators must be notified of a computer-security incident that has "materially disrupted or degraded, or is reasonably likely to materially disrupt or degrade, a banking organization's" ability to carry out banking operations or deliver banking products and services to a material portion of customers, the organization's business lines, or the organization's operations where the failure would pose a threat to the United States.
Links
Text of the Measure
Federal Register Notice of the Cyber Incident Reporting Rules
Development
National Institute of Standards and Technology ("NIST") releases concept paper for an artificial intelligence risk management framework and draft report on blockchain for access control systems
Summary
Date
NIST has released two documents for public comment. The first is a concept paper to guide the development of an Artificial Intelligence Risk Management Framework. The framework will be for voluntary use and aims to build characteristics of trustworthiness into the design, development, use, and evaluation of AI products, services, and systems.
NIST also released an internal report (NISTIR 8403), Blockchain for Access Control Systems, for public comment. The report presents analyses of blockchain access control systems and discusses considerations for implementation, focusing on such systems from the perspective of blockchain system properties, components, functions, and support for access control policy models. The initial draft notes that blockchain features such as decentralization, high confidence and tamper resistance offer advantages in addressing challenges that traditional network access control mechanisms face, including auditability, resource consumption, scalability, central authority and trust issues.
Comments are due 25 January 2022 for the concept paper and 7 February 2022 for the blockchain report.
20 December 2021
Links
Concept Paper
Blockchain Report
For further information, please contact:
Paula Barrett Co-Lead of Global Cybersecurity and Data Privacy T: +44 20 7919 4634 [email protected]
@ESPrivacyLaw
Michael Bahar Co-Lead of Global Cybersecurity and Data Privacy
T: +1 202 383 0882 [email protected]
Editorial team
Lizzie Charlton Senior Associate PSL (Data Privacy) T: +44 20 7919 0826 [email protected]
Ruth Haynes Trainee Solicitor
T: +44 20 7919 0599 [email protected]
Thomas Holt Trainee Solicitor
T: +44 121 232 1360 [email protected]
Leighann Mountain Trainee Solicitor
T: +44 1473 284 530 [email protected]
Lauren Pettit Trainee Solicitor
T: +44 1223 44 3831 [email protected]
Tom Elliott Project Co-ordinator
T: +44 1223 44 3675 [email protected]
Joan Cuevas Legal Technologist
T: +44 20 7919 0665 [email protected]
eversheds-sutherland.com
© Eversheds Sutherland 2022. All rights reserved. Eversheds Sutherland (International) LLP and Eversheds Sutherland (US) LLP are part of a global legal practice, operating through various separate and distinct legal entities, under Eversheds Sutherland. For a full description of the structure and a list of offices, please visit www.eversheds-sutherland.com. This information is for guidance only and should not be regarded as a substitute for research or taking legal advice. Updata Edition 14