Welcome to the November 2018 edition of our Data Protection bulletin, our monthly update covering key developments in data protection law.
Brexit withdrawal agreement and data protection
Although the status of the draft EU-UK withdrawal agreement remains uncertain, the agreement has important potential implications for the UK's data protection arrangements during Brexit and beyond.
The withdrawal agreement outlines the arrangements for the UK's withdrawal from the EU on 29 March 2019 and proposes a transition period from this date until 31 December 2020, during which EU legislation will continue to apply to the UK ("Transition Period"). This means that the GDPR will continue to apply directly to the UK throughout the Transition Period. The UK will be treated in the same way as any other member state, meaning there will be no restrictions on the transfer of personal data between the EU and the UK throughout the Transition Period.
During the course of the Transition Period, the UK government will be seeking to agree an adequacy decision with the European Commission. If reached, the UK would join the group of countries deemed by the Commission to provide an "adequate" level of personal data protection: that is, a level of protection which, in the Commission's view, is essentially equivalent to that of the EU. An adequacy decision would allow the unrestricted transfer of personal data between organisations established in the EU and the UK.
The political declaration released alongside the withdrawal agreement records a clear commitment by the European Commission to begin an assessment of the UK's data protection legislation, with the intention of making an adequacy decision before the end of the Transition Period.
Although the wording of the withdrawal agreement is not entirely clear, commentators read it as intending that, if an adequacy decision is not made by the end of the Transition Period, EU citizens' data processed in the UK during and after the Transition Period will continue to be processed in line with the GDPR. The withdrawal agreement also makes clear that EU member states will continue to process the data of UK citizens in line with the provisions of the GDPR. If an adequacy decision is reached during or after the Transition Period, it will supersede this arrangement.
The role of the Information Commissioner's Office ("ICO") on the European Data Protection Board ("EDPB") during the Transition Period will be diminished. Article 128(5) of the withdrawal agreement gives the ICO the right to attend meetings of the EDPB by invitation only and only in particular circumstances, and the ICO will not have a right to vote in EDPB meetings. The UK will also no longer participate in the "one-stop-shop" procedures under the GDPR, under which the ICO currently acts as lead supervisory authority for the EU-wide processing activities of controllers whose main establishment is in the UK. For businesses operating in both the UK and other EU countries, exit from the "one-stop-shop" procedures could mean facing parallel proceedings brought by the ICO and other EU data protection supervisory authorities.
In general, the withdrawal agreement should offer some comfort for UK and EU businesses, as it appears that under the agreement legal consistency in data protection laws and the free flow of personal data between the UK and the EU will be guaranteed. However, as you will be well aware, there is still a risk that the withdrawal agreement will not be approved by Parliament. If it is not, the UK will leave the EU on 29 March 2019 without any transitional arrangements in force. Organisations should therefore still consult the guidance issued by the Department for Digital, Culture, Media & Sport in September 2018 on what will happen to data protection arrangements if the UK leaves the EU without securing a deal, covered in our September 2018 data protection update here.
ICO publishes provisional Regulatory Action Policy
The ICO has refreshed its Regulatory Action Policy ("Policy") which sets out the situations in which the ICO will take criminal and civil regulatory enforcement action against organisations in breach of data protection legislation. The refreshed Policy comes on the back of the extended powers granted to the ICO following the implementation of the GDPR and the Data Protection Act 2018 in May 2018. The Policy is subject to Parliamentary consultation.
The Policy provides guidance on when and how the ICO will take action for breaches of information rights. When deciding whether and how to respond to such breaches, the ICO will consider certain criteria, including the nature, gravity and duration of the breach; the categories of personal data affected; and the number of individuals affected.
Where applicable, the ICO will also consider any aggravating or mitigating factors. Aggravating factors could include whether there is an intentional, wilful or negligent approach to compliance; whether advice or warnings from the ICO and/or the Data Protection Officer have been ignored; and the relevant individual or organisation's regulatory history.
Mitigating factors could include any actions taken by the relevant individual or organisation to mitigate the damage suffered by individuals; the state and nature of any protective or preventative measures and technology available; and early notification by the relevant individual or organisation to the ICO of the breach.
See here for the full Regulatory Action Policy.
UK Information Commissioner elected chair of global forum of data protection and privacy authorities
The UK's Information Commissioner, Elizabeth Denham, has this month been elected Chair of the International Conference of Data Protection and Privacy Commissioners ("ICDPPC"). The ICDPPC is the leading international forum of data protection and privacy authorities. The forum has more than 120 members spanning all continents.
The ICDPPC seeks to resolve global data protection and privacy issues by working with governments and policymakers worldwide. It also coordinates a highly successful conference that has been held annually for four decades.
When accepting her role, Elizabeth Denham commented:
“The ICDPPC is a truly unique global forum, championing strong and independent authorities. Key to this is ensuring that authorities can share cutting edge policy and enforcement experience. I am keen to ensure that ICDPPC can continue to support our member authorities with experiences, strategies and best practice that are inclusive of diverse legal frameworks and cultural backgrounds.”
For more information about the work of the ICDPPC, see here.
DCMS publishes review of data protection fee exemptions to the ICO
The Department for Digital, Culture, Media & Sport ("DCMS") has published the results of its evaluation of the exemptions from paying the data protection fee to the ICO.
The Data Protection (Charges and Information) Regulations 2018 (SI 2018/480) require controllers to pay the data protection fee to the ICO unless an exemption applies. Current exemptions include processing personal data only for staff administration; advertising, marketing and public relations; or personal, family or household affairs. The ICO has published a self-assessment tool (see here) to help controllers determine whether they are liable to pay the fee.
The DCMS has decided not to alter any of the current data protection fee exemptions, given the impact their removal could have on micro organisations, the voluntary sector and small and medium sized businesses. As part of its review, however, the DCMS decided to introduce a new exemption from payment of the fee for elected representatives and members of the House of Lords, on the basis that requiring those engaged in democratic activity to pay a charge would represent a "barrier to democracy".
For the full DCMS consultation see here.
ICO updates Parliament on investigation into use of data analytics for political purposes
The ICO has published a report that aims to update Parliament on its investigation into the use of data analytics in political campaigning. The report covers the investigation, findings that have been revealed and enforcement actions to date.
The investigation began in May 2017 following allegations that personal data had been misused by social media platforms and companies such as Cambridge Analytica in an attempt to sway voters in relation to the Brexit referendum and President Trump's election victory.
The ICO has taken enforcement action against a number of organisations to date, including issuing:
- Facebook with the maximum fine under the Data Protection Act 1998 ("DPA 1998"), £500,000, for breaches of this legislation, which is currently being appealed – see our enforcement section;
- a notice of intent to Leave.EU for contraventions of the Privacy and Electronic Communications Regulations 2003 ("PECR"); and
- formal warnings to 11 political parties, requiring action to be taken in order to comply with data protection legislation.
The ICO additionally makes a number of recommendations. The report asks the government to comment on whether there are any gaps in the current data protection regime which, if remedied, could improve safeguards for the protection of personal data in the electoral landscape. The ICO has also recommended that a Code of Practice for the use of data analytics in political campaigns be put on a statutory footing in order to help combat the misuse of personal data for these purposes.
The full ICO report can be found here.
European Parliament adopts resolution regarding Cambridge Analytica scandal
The European Parliament has recently adopted a resolution (see here) on Cambridge Analytica's use of Facebook users' personal data and called for Facebook to allow a full and independent audit of the social media platform.
The resolution makes clear that the European Parliament expects all online social media platforms to provide users with information on how their personal data is used in targeted advertising and to ensure that companies have effective controls in place. This encompasses having separate consents to different kinds of processing, increased transparency around privacy settings, and encouraging platforms to reconsider the prominence of their privacy notices.
The European Parliament also made suggestions on how to combat the issues associated with the use of data analytics in political campaigns. It commented that electoral laws needed to be reformed for a digital age, that profiling for political purposes should be prohibited in order to prevent interference in democratic processes and that social media platforms should be responsible for monitoring and informing the relevant authorities if any illicit behaviour occurs.
With regard to Facebook, the European Parliament encouraged Facebook to allow the European Union Agency for Network and Information Security and the EDPB to carry out an audit of the social media platform and subsequently present the findings of this audit to the European Commission, the European Parliament and national parliaments. The European Parliament also urged national supervisory authorities to investigate Facebook's practices.
Finland adopts GDPR-style legislation
On 13 November 2018, Finland adopted a new Personal Data Act, which repeals the Personal Data Act 1999 and is largely in line with the GDPR. Adoption of the new Act was delayed by disagreement over the imposition of administrative fines: the Data Protection Ombudsman in Finland argued that having a single person determine very high sanctions would not be in keeping with Finland's legislative tradition. This point of contention has been resolved by establishing a three-member board to decide on the issuing of financial penalties.
Other national divergences from the GDPR include setting the age of consent for information society services offered to children at 13 rather than 16; allowing organisations other than public bodies to process personal data pertaining to criminal convictions; and excluding public bodies from the GDPR's administrative fines.
The date on which the Personal Data Act enters into force is to be confirmed at a later date.
ICO publishes guidance on encryption and passwords
The ICO has recently published security guidance focussing on encryption and passwords under the GDPR. It is a key principle of the GDPR that personal data must be processed "in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures". Where appropriate, encryption and passwords are among the technical measures that can help organisations comply with this principle.
The guidance highlights the following considerations to take into account when implementing encryption:
- the algorithm should be appropriate for its purpose and regularly reassessed;
- the key size should be sufficient to defend against attack;
- the controller, or a data processor acting on the controller's behalf, should implement software that meets current standards; and
- keys should be kept securely, as they provide the ability to decrypt the data; the ICO advises that organisations should have procedures in place to generate new keys when necessary.
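By way of illustration, the "sufficient key size" and key generation points above might look like the following minimal Python sketch. This is our own illustrative code, not code from the ICO guidance, and the function names are assumptions:

```python
import secrets

KEY_BITS = 256  # a commonly recommended size for symmetric ciphers such as AES-256

def generate_key() -> bytes:
    """Generate a fresh symmetric key from a cryptographically secure RNG."""
    return secrets.token_bytes(KEY_BITS // 8)

def rotate_key(old_key: bytes) -> bytes:
    """Produce a replacement key. In practice, data encrypted under
    old_key would then need to be re-encrypted under the new key."""
    new_key = generate_key()
    assert new_key != old_key  # fresh random keys should never repeat
    return new_key

key = generate_key()
print(len(key))  # 32 bytes = 256 bits
```

A production system would, of course, perform the actual encryption with a vetted library that meets current standards, in line with the third consideration above.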
In relation to passwords, the GDPR does not include any specific requirements. They are nevertheless a commonly-used method of securing personal data. The ICO's key points on passwords include:
- passwords should not be stored in plaintext. A suitable hashing algorithm should be used instead;
- login pages should be protected with HTTPS;
- the only restrictions that should be imposed on choice of passwords are a minimum password length and blacklisting common and/or weak passwords. Passwords should be screened against a 'password blacklist' containing the most commonly used passwords, leaked passwords from data breaches and common words or phrases relating to the service. Special characters should be permitted, but not required; and
- there should be a limit on the number of login attempts.
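As a rough sketch of these password points (again our own illustrative Python, not code from the ICO guidance; the blacklist entries and parameter choices are assumptions), salted hashing and blacklist screening could look like this:

```python
import hashlib
import hmac
import secrets

# Illustrative blacklist only; a real one would include commonly used and
# breached passwords plus words or phrases relating to the service itself.
PASSWORD_BLACKLIST = {"password", "123456", "qwerty", "letmein"}
MIN_LENGTH = 8  # the only restrictions imposed: length and blacklist

def is_acceptable(password: str) -> bool:
    """Enforce a minimum length and screen against the blacklist;
    special characters are permitted but never required."""
    return len(password) >= MIN_LENGTH and password.lower() not in PASSWORD_BLACKLIST

def hash_password(password: str) -> tuple:
    """Store a salted hash rather than plaintext (PBKDF2 shown here as one
    suitable algorithm; bcrypt, scrypt and Argon2 are common alternatives)."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

Serving the login page over HTTPS and rate-limiting login attempts sit outside this sketch but cover the remaining two points.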
Cathay Pacific data breach
Hong Kong's privacy commissioner will launch an investigation into Cathay Pacific's data breach involving 9.4 million passengers' personal data. Cathay Pacific notified the ICO of the breach and the supervisory authority is currently making enquiries. The airline confirmed that the personal information stolen included passport details, email addresses and expired credit card details.
The Cathay Pacific breach follows a number of data breaches by aviation industry giants including British Airways where hackers stole data relating to 380,000 transactions (see our September 2018 update here) and a £120,000 fine handed down by the ICO to Heathrow Airport for failing to ensure the security of personal data held on its network (see our October update here).
Unlike the fine against Heathrow Airport which was imposed under the DPA 1998, Cathay Pacific is likely to be fined under the GDPR. The airline could therefore face fines up to the higher of €20 million or 4% of global annual turnover.
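The cap works out as a simple maximum of the two figures. A quick illustration in Python (the turnover numbers are hypothetical, not Cathay Pacific's actual figures):

```python
def max_gdpr_fine(global_turnover_eur: float) -> float:
    """GDPR Article 83(5) cap: the higher of EUR 20 million or
    4% of total worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * global_turnover_eur)

# Hypothetical turnovers for illustration only:
print(max_gdpr_fine(100_000_000))    # turnover EUR 100m: the EUR 20m floor applies
print(max_gdpr_fine(2_000_000_000))  # turnover EUR 2bn: 4% = EUR 80m exceeds the floor
```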
Eurostar data hack
Eurostar has responded to attempts by hackers to break into an unspecified number of accounts by resetting its customers' login passwords. Eurostar has notified those customers whose accounts were targeted. All other customers with Eurostar accounts will be asked to reset their login details when they next attempt to access their account.
Eurostar has not confirmed whether the hackers were successful, but has stated that payment details were not stolen. The ICO has been notified of the incident and is investigating the matter.
Radisson Hotel Group hit by data breach
A "small percentage" of members of the Radisson Rewards scheme had their data stolen by hackers in a data breach. The stolen information included member names and addresses, email addresses, company names, phone numbers, Radisson Rewards member numbers and frequent flyer numbers. The company denied, however, that any payment details, stay details or passwords were subject to the breach.
Suspicious activity was detected on 1 October, several weeks after the hack occurred on 11 September. Radisson Rewards members were only notified of the incident at the start of November.
First ICO prosecution under Computer Misuse Act 1990 leads to prison sentence
The ICO has secured its first prosecution under the Computer Misuse Act 1990 ("CMA"). The individual in question, who had worked for an accident repair firm, used a colleague's log-in details to gain unauthorised access to thousands of customer records containing personal data. He was charged under s.1 CMA, which makes it a criminal offence to cause a computer to perform any function with intent to secure access to data held on that computer. A guilty plea was entered, and the individual was sentenced to six months' imprisonment.
The case marks a notable departure from the ICO's previously stated policy that it would prosecute such cases under data protection law and not under the CMA. The ICO were evidently of the opinion that the nature and extent of the offending in this case warranted prosecution under the CMA, which carries heavier sanctions.
Mike Shaw, the ICO's Group Manager Criminal Investigations Team, said: "Although this was a data protection issue, in this case [we] were able to prosecute beyond data protection laws resulting in a tougher penalty to reflect the nature of the criminal behaviour. Members of the public and organisations can be assured that we will push the boundaries and use any tool at our disposal to protect their rights."
Two jailed for involvement in TalkTalk hacking
Two men have been jailed for a combined total of 20 months for offences under the CMA relating to their involvement in a cyber-attack on TalkTalk in October 2015. The data breach affected 156,959 customer accounts and involved the theft of personal information, banking details, and sensitive data.
TalkTalk lost an estimated £77m as a result of the breach, and received a £400,000 fine from the ICO relating to security failings that allowed the hackers to access customer data "with ease".
High Court finds that suspicious activity reports may amount to "personal data" for the purposes of the Data Protection Act 1998
The High Court has granted an order requiring a bank to disclose a suspicious activity report ("SAR") to a customer, in so doing observing that SARs may amount to "personal data" for the purposes of the DPA 1998.
In December 2017, NatWest made a SAR to the National Crime Agency in relation to a number of accounts held by a customer, Mr Lonsdale. The accounts were subsequently frozen. A month later, Mr Lonsdale made a subject access request under s.7 DPA 1998 seeking disclosure of documents relating to the bank's decision to freeze his accounts. The bank provided only limited documentary evidence, and Mr Lonsdale therefore commenced proceedings seeking an order for disclosure on the basis that the bank had withheld personal data to which he was entitled under the DPA 1998.
NatWest sought to strike out Mr Lonsdale's claim on the basis that the information requested was not "personal data" but "mixed data", which did not need to be disclosed pursuant to the DPA 1998. It further contended that disclosure was not required because the information fell within the exemption in s.29 DPA 1998, having been prepared for the "prevention or detection of crime".
The judge dismissed NatWest's application, holding that it had demonstrated a "flawed understanding" of the scope of the concept of "personal data". Mr Lonsdale had a "strong" claim that the data processed in the course of determining whether to make the SAR and freeze his accounts amounted to "personal data". As to the bank's reliance on the exemption at s.29 DPA 1998, the judge deemed this a matter for trial; no evidence was provided to suggest that the provision of further personal data to Mr Lonsdale would prejudice the prevention and detection of crime.
The judgment should make financial institutions think twice about the SAR-related personal data that may become disclosable pursuant to a subject access request, and about how to demonstrate that the relevant prevention of crime exemptions apply to that data.
A copy of the judgment is available here.
ICO enforcement action
ICO narrows enforcement notice issued against AggregateIQ following an appeal
An ICO enforcement notice issued against AggregateIQ ("AIQ"), which was the first enforcement notice issued pursuant to the GDPR and the Data Protection Act 2018 ("DPA 2018"), has been amended following an appeal by AIQ pursuant to s.162(1)(c) DPA 2018.
AIQ had been implicated in the Cambridge Analytica investigation earlier this year in relation to the use of data analytics in political campaigning. The ICO had concluded that AIQ failed to comply with Articles 5 (1)(a)-(c), 6 and 14 GDPR. Personal data had been processed in a way that data subjects were not aware of, without a lawful basis, and for purposes they would not have expected.
The original ICO enforcement notice, dated 6 July 2018, ordered AIQ to cease processing "any personal data of UK or EU citizens obtained from UK political organisations or otherwise for the purposes of data analytics, political campaigning or any other advertising purpose".
AIQ appealed, arguing, amongst other things, that the ICO had no jurisdiction over AIQ pursuant to GDPR and DPA 2018, the notice was too broad and was inadequately reasoned, and the individuals whose personal data AIQ had processed had, in fact, consented to such processing.
An amended enforcement notice was subsequently issued by the ICO on 24 October 2018 with notably narrower terms: AIQ was ordered, pending completion of Canadian data protection investigations, to erase the personal data of any UK individuals held on its servers as of May 2018. AIQ then withdrew its appeal.
Facebook appeals against ICO's £500,000 fine
Facebook has confirmed that it intends to appeal the £500,000 fine it received from the ICO for breaching the DPA 1998 by unfairly processing users' personal information "by allowing application developers access to their information without sufficiently clear and informed consent, and allowing access even if users had not downloaded the app, but were simply 'friends' with people who had" (further details regarding the basis for the fine can be found here).
Facebook intends to argue on appeal that, whereas it was originally being investigated by the ICO in relation to concerns about harm to UK citizens, during its investigation the ICO's focus was improperly shifted to an examination of Facebook's approach to the sharing of data, and the reasoning for the fine set out in the enforcement notice "challenges some of the basic principles of how people should be allowed to share information online".
Anna Benckert, Facebook's associate general counsel in Europe, said in a statement: "we believe the ICO's decision raises important questions of principle for everyone online, which should be considered by an impartial court based on all the relevant evidence".
Separately, a cache of documents which had been disclosed by Facebook in the course of US proceedings was seized from an executive of US tech firm Six4Three, the recipient of that disclosure, by the House of Commons Serjeant at Arms exercising rarely-used Parliamentary powers. It remains unclear whether these documents will be provided to the ICO by the DCMS, on whose instruction they were seized.
ICO fines Uber £385,000 over data protection failings
Uber has been fined £385,000 for failing to protect 2.7 million customers' and drivers' personal information, including full names, email addresses and phone numbers, during a cyber-attack in late 2016. The ICO found that the attack was made possible by avoidable data security flaws, in breach of Uber's obligations under the seventh principle of the DPA 1998.
The customers and drivers affected were not told about the incident for more than a year. Instead, Uber paid the attackers responsible $100,000 to destroy the data they had downloaded.
ICO Director of Investigations Steve Eckersley said: "This was not only a serious failure of data security on Uber’s part, but a complete disregard for the customers and drivers whose personal information was stolen. At the time, no steps were taken to inform anyone affected by the breach, or to offer help and support. That left them vulnerable…Paying attackers and then keeping quiet about it afterwards was not, in our view, an appropriate response to the cyber attack."
If this breach had occurred after the GDPR came into force and Uber had adopted a similar approach, it could well have been fined hundreds of millions of dollars, both for the data security flaws the ICO identified and for failing to report the breach promptly to the ICO and affected data subjects (the DPA 1998 contains no specific sanction for failing to do so, although it may be reflected in the fine levied by the ICO).
Two home security firms fined for making nearly 600,000 nuisance calls to numbers registered with the Telephone Preference Service
Two home security firms, who were together responsible for making nearly 600,000 marketing calls, have been fined a total of £220,000 by the ICO.
Secure Home Systems was fined £80,000 for making calls to 84,347 numbers registered with the Telephone Preference Service ("TPS"), using call lists bought from third parties without screening them. ACT Response Ltd made 496,455 nuisance calls to TPS subscribers and was fined £140,000.
The ICO found that both firms had used a public electronic communications service to make unsolicited direct marketing calls contrary to regulation 21 of PECR. The maximum fine that the ICO can issue under the PECR in relation to nuisance marketing is £500,000.
These fines demonstrate the importance of checking the TPS register of subscribers and using it to screen call lists.
ICO fines marketing company Boost Finance Ltd £90,000 for nuisance emails about pre-paid funeral plans
Boost Finance Ltd ("BFL"), a London-based marketing company, has been fined £90,000 for sending over four million nuisance emails in relation to pre-paid funeral plans. The emails were sent to individuals who had subscribed to websites of BFL's affiliates.
The ICO emphasised that, generally speaking, organisations cannot send marketing emails in the absence of consent from the recipient. Such consent must be freely given, specific and informed, and involve a positive indication signifying the individual's agreement. Consent will not be valid if individuals merely agree to receive marketing from "similar organisations", "partners" or "selected third parties". In the present case, the ICO concluded that the consent relied on by BFL was not sufficiently informed and, as such, did not amount to valid consent for the purposes of regulation 22 of the PECR.
Privacy International files complaints against data brokers and credit scorers
Privacy International has filed complaints against a number of data brokers, credit referencing agencies, and ad tech companies in relation to alleged GDPR breaches. Acxiom, Oracle, Experian, and Equifax have been referred to the UK ICO, whilst data watchdogs in Ireland and France have received complaints in relation to Tapad, Quantcast and Criteo.
Privacy International alleges that these companies' reliance on consent and legitimate interests as legal bases for their data processing is invalid. It also alleges breaches of the GDPR principles of transparency, fairness, lawfulness, purpose limitation, data minimisation and accuracy, owing to the absence of direct contact with individuals and the opacity of the processing.
The ICO has confirmed that it is "aware of concerns raised about the compliance [with] data protection laws by big tech companies, data brokers and credit referencing agencies".