Information Commissioner's Office ("ICO")
6 October 2015 ICO issues statement in response to Schrems ruling
On 6th October 2015, the ICO issued an immediate, measured reaction to the invalidation of Safe Harbor.
The ICO recognised the significance of the decision, the need for regulators/legislators to provide a unified response and the fact that it will take some time for data controllers to make alternative arrangements where they had previously relied on Safe Harbor for US transfers. In other words, it represented a 'don't panic, but review your position' message for organisations.
It stressed the fact that Safe Harbor was just one of the available legal bases for EU-US transfers and that the ruling does not affect: (1) the use of EU approved standard contractual clauses as part of data protection agreements; (2) the use of Binding Corporate Rules; (3) reliance on statutory exemptions such as the consent of individuals; and (4) the ability of controllers to 'self-assess' adequacy under this fourth option. The ICO has recently released detailed guidance on what it means to 'self-assess', which can be found here. Organisations should note that this data transfer solution is specific to the UK and not permissible in all EEA states.
The full statement can be found here.
19 October 2015 ICO announces changes in structure and personnel
Christopher Graham has announced that the ICO will move to a broader senior leadership model. From November, a 12-person senior leadership team will assist the Commissioner. There will be one Deputy Commissioner and a Deputy CEO (a promotion for current Deputy CEO Simon Entwistle).
The change is partly in response to the retirement of Deputy Commissioner David Smith and the move of Deputy Commissioner Graham Smith, alongside the Leveson Report recommendations that the ICO should no longer be an individual appointment, but should have a more corporate structure.
The full announcement, published on the ICO blog, can be read here.
26 October 2015 ICO signs up to the Global Privacy Enforcement Network alert system
The ICO has signed up to the GPEN Alert system - a secure and confidential information sharing tool for international members of the Global Privacy Enforcement Network (“GPEN”). The system uses the existing Consumer Sentinel Network (“CSN”) platform operated by the US Federal Trade Commission (“FTC”) (although the GPEN Alert is distinct from the rest of the CSN platform).
By joining this initiative, the ICO hopes to mitigate existing challenges in global privacy enforcement co-operation. Until now, such collaboration has proven difficult because the ICO has found it "almost impossible" to share information between authorities.
GPEN Alert members may notify other privacy enforcement authorities of their investigations and enforcement actions, specifically those with an international element, for the purposes of co-operation or co-ordination. It also allows them to proactively investigate whether other members are investigating or acting against the same company, persons or practices.
However, the system is subject to safeguards. Shared information may only be contributed and accessed by GPEN member authorities that have signed a “memorandum of understanding” and whose relevant staff have “appropriate security credentials”. Furthermore, it does not presently facilitate the sharing of detailed and confidential non-public enforcement matters or customer complaints relating to privacy.
Information Commissioner Christopher Graham stated that he hopes the “secure and confidential” GPEN Alert system will prove to be a “key tool” in both “building on the international co-operation the GPEN network has [already] developed” and ensuring the public understand that “privacy authorities around the globe are watching over their information.” His full statement can be found here.
More information on the GPEN Alert is also available here.
28 October 2015 ICO publishes guide to disclosing information safely
This user-friendly guide seeks to assist the many organisations that may disclose information derived from personal data in their day-to-day operations. Such disclosures may be voluntary or obligatory. For example:
when responding to subject access requests under the Data Protection Act 1998;
when responding to relevant requests, or proactively making information available, in relation to the Freedom of Information Act (FOIA) or the Environmental Information Regulations (EIR);
when making personal data available for re-use under the Reuse of Public Sector Information Regulations (RPSI); or
when otherwise making data publicly available.
In such cases, the disclosing organisation may need to remove certain personal data from information before its release to ensure that individuals cannot be identified from it. In the case of subject access request responses, for example, disclosing organisations should not provide information relating to another individual, unless the other individual has given his consent, or it is reasonable in all the circumstances. Public authorities should also not publish third-party personal data in response to a FOIA or EIR request unless this is in accordance with the DPA. In most cases, this will require a careful balancing exercise between the legitimate public interest in disclosure and any potential prejudice to the third-party data subject’s rights.
The guide provides illustrative examples of potential issues and suggested solutions in order to minimise the chances of inappropriate personal data disclosures. Covered scenarios range from identifying personal data "hidden in plain sight" (i.e. hidden due to formatting styles/document set-up) to addressing hidden metadata and ensuring effective redaction. Common mistakes in different file formats (Word, Excel, PDF, photo, video, and email) are addressed, with accompanying illustrations.
The full publication can be found here.
ICO confronts "Nuisance Call" offenders
Not only has Christopher Graham reiterated his call for new ICO powers to compulsorily audit companies in the lead generator and list broker sectors, but the ICO's enforcement team has also now written to more than 1,000 companies believed to be involved in buying and selling people's names and numbers for the purposes of cold calling. This is part of its ongoing crackdown on nuisance calls. Such companies will be expected to explicitly set out how they comply with the law: including what data they share; how they get people's consent to share; and a list of the companies they have worked with over the last six months.
Where companies do not respond, the ICO will look to issue them with Information Notices. These legally oblige the production of requested information, with the threat of court action if it does not occur. A company was recently fined £2,500 for this offence (see "UK Enforcement Action" section below).
Please see here for the full statement.
Other UK News
Culture, Media and Sport Committee launches inquiry into cybersecurity and the protection of personal data online in response to TalkTalk breach
A Department for Culture, Media and Sport ("DCMS") Committee has launched an inquiry into cybersecurity and how best to protect personal data online, following the recent and highly publicised cyber-attack on TalkTalk's website. This has been reported as the third cyber-attack this telecoms and internet service provider has suffered in 2015, and apparently led to the loss of the personal details of nearly 157,000 of its customers, including 15,656 bank account numbers/sort codes and 28,000 "obscured" credit/debit card numbers (Source: BBC).
Although not as serious as first feared, DCMS nevertheless considered this breach to raise questions and concerns over the ways in which companies store and secure information about their customers. The consequent public consultation aims to investigate the "circumstances around the TalkTalk data breach and the wider implications for telecoms and internet service providers".
Prior to the deadline for submissions on 23rd November 2015, the Committee invited views on the topics set out below. Surprisingly, the requests specifically ask for views about the attack on TalkTalk and its handling of the incident, and frame a number of questions by reference to personal data rather than information and systems generally – perhaps the DCMS' recent acquisition of responsibility for UK data protection matters and the Information Commissioner's Office influenced the latter:
The nature of the cyber-attacks on TalkTalk’s website and TalkTalk’s response to the latest incident;
The robustness of measures that telecoms and internet service providers are putting in place to maintain the security of their customers’ personal data and the level of investment being made to ensure their systems remain secure and anticipate future threats;
The nature, role and importance of encryption in protecting personal data;
The adequacy of the supervisory, regulatory and enforcement regimes currently in place to ensure companies are responding sufficiently to cyber-crime;
The adequacy of the redress mechanisms and compensatory measures for consumers when security breaches occur and individuals’ personal data are compromised; and
Likely future trends in hacking, technology and security.
Details of the consultation can be found here. The date by which the results of the consultation will be published has not yet been publicised.
4 November 2015
Home Secretary announces draft Investigatory Powers Bill
In November 2015, the UK Government published a draft Investigatory Powers Bill. The Bill is a response to three independent reviews of investigatory powers conducted this year, and to the Investigatory Powers Tribunal and court cases this year which have
considered the lawfulness of the UK regime. The Government has also launched a consultation on the provisions of the Bill, which will
take place before the Bill is introduced to Parliament in 2016. Among other things, the draft Bill:
consolidates the (currently scattered) investigatory powers available to law enforcement, security and intelligence agencies in one place – including setting out specific rules for bulk access to data and bulk interception;
requires judicial authorisation (by specially appointed Judicial Commissioners) for interception warrants;
creates a new Commissioner to supervise these powers (replacing the current split oversight arrangements); and
modernises the rules relating to retention of communications data. The Bill also consolidates the various obligations which apply to CSPs and strengthens their rights of appeal.
The Bill will allow for secondary legislation to be brought forward, covering similar activities to the current Lawful Business Practice Regulations (i.e. monitoring for quality control purposes and to detect breaches of company policies).
Appleton v Gallagher [2015] EWHC 2689 (Fam)
On 28 September 2015, Mostyn J delivered a judgment considering what can be reported by journalists in private court proceedings which, following a change to the Family Procedure Rules in 2009, may be attended by the press, exercising a role as “watchdog” on behalf of the public at large.
Mostyn J’s judgment emphasises the highly confidential nature of ancillary relief proceedings given the very full disclosure parties are required to make. He found that the 2009 Rule change had not been intended to upset the core privacy protections provided by the implied undertaking of confidentiality in relation to hearings in chambers. He rejected the argument put by the press that this implied undertaking bound the parties, but did not bind journalists attending in the role of “watchdog”. In his opinion, this would “throw the baby of privacy of confidential information out with the bath water of a supposedly secretive judicial process”.
Holding that the implied undertaking to treat as confidential material disclosed in private proceedings collaterally binds observing journalists (so that journalists publishing material disclosed in such proceedings would do so in contempt of court), Mostyn J went on to find that, even if he was wrong on that point, the press would nevertheless be prevented in this case from reporting confidential evidence on the basis of a pure balancing exercise between privacy and freedom of expression. In this particular case, the basic presumption of privacy in ancillary relief proceedings was not displaced; this was not a case where there had been extensive inaccurate speculation, and neither party had spoken publicly to the press regarding the detail of the divorce.
Mostyn J recognised however that a high degree of confusion has arisen in this area since the 2009 Rule change. He noted that “to say that the law about the ability of the press to report ancillary relief proceedings which they are allowed to observe is a mess would be a serious understatement". Granting permission to appeal, he expressed the hope that the Court of Appeal will resolve “the unhappy divergence of judicial approach” evident in previous case law in this area.
W, X, Y and Z v Secretary of State for Health, Secretary of State for the Home Department and British Medical Association [2015] EWCA Civ 1034
This case examines the status of medical treatment data in the context of information sharing by NHS entities. It includes a detailed analysis of the scope and extent of the common law protection for private and confidential information.
14 October 2015
By way of background, persons not ordinarily resident in England and Wales may be charged for health services in accordance with the National Health Service (Charges to Overseas Visitors) Regulations 2011 ("the Charging Regulations"). In 2011, guidance was issued on implementing the Charging Regulations, which included provisions requiring NHS bodies to transfer certain non-clinical patient information (including name, gender, DOB, current address, travel document number/expiry dates, the amount/date of the debt and the NHS body to which it is owed; together "the Information") to the Secretary of State, who may then pass it on to the Home Office. Such transfers are intended to facilitate enforcement of immigration sanctions, which may be imposed when non-resident individuals seek to enter or remain in the UK with unpaid NHS debts of at least £1,000.
In judicial review proceedings, four non-UK claimants challenged these information sharing requirements. They had lost in the High Court where it was considered that the Information was not private or confidential. The CA considered the following questions relevant for privacy purposes:
Does disclosure of the Information breach the claimants' rights to privacy?
The CA confirmed that the approach to whether information is private depends upon Lord Nicholls' test in Campbell v MGN [2004] UKHL 22, namely whether the individual in question had a 'reasonable expectation of privacy.' Although the court accepted that the Information was 'inherently private' since it allowed inferences as to an individual's health (i.e. that they had been unwell and possibly even the nature of their illness in relation to references to some NHS entities), it concluded that it would generally not be considered 'private information vis-à-vis the Secretary of State and Home Office'. This was justified on the basis that the guidance to the Charging Regulations stated that non-resident NHS patients would be told of the possibility of such transfers for the purpose of immigration control. As a result, they would not normally have a 'reasonable expectation of privacy' in relation to the Information when it was used in this way. It may be questioned whether such analysis is more relevant to the question of justifying a privacy infringement than to whether the information is private in itself.
The Court then held, in the alternative, that even if the claimants did have a right to privacy in the Information, this would not be infringed by disclosure consistent with the guidance. Such analysis required a 'balancing exercise' between the harm caused and the public benefit achieved by the disclosure. The fact that (1) the privacy intrusion was 'at the lower end of the spectrum' of seriousness; (2) overseas patients are advised of the potential disclosure; (3) the disclosure pursues the legitimate aim of improving NHS debt recovery; and (4) the Information is securely transmitted to a limited group of civil servants meant that no infringement had occurred here.
Does disclosure of the Information unjustifiably infringe claimants' Article 8 ECHR Rights?
The CA held that the 'modest interference' with the claimants' Article 8(1) rights would be justified under Art 8(2) as being 'in accordance with the law' and thus 'necessary in a democratic society'. This was because such Information sharing was subject to the principles and procedures stated in the guidance (it did not matter that this was non-statutory), as well as safeguards within the Data Protection Act 1998. The combination of these satisfied the court that the Information would be sufficiently protected against arbitrary or abusive disclosure.
Weller and Ors v Associated Newspapers Limited [2015] EWCA Civ 1176
The Court of Appeal has upheld the High Court's decision in Weller v Associated Newspapers ([2014] EWHC 1163 (QB)), which held that by publishing unpixellated photographs of three of Paul Weller's children taken while they enjoyed a family day out in Los Angeles, the publishers of the Mail Online had infringed their privacy rights.
The claim relates to an article published by Mail Online, entitled "a family day out". Accompanying pictures showed Paul Weller and the children out shopping and relaxing in a café. Although taken in a public place, they showed the children's faces, were taken without parental consent and after a specific request to stop. In the High Court, Dingemans J agreed with the claimants' argument that such publication represented both a "misuse of private information" and/or a breach of the Data Protection Act. He granted an injunction restraining further publication of the relevant photographs and awarded damages totalling £10,000.
Mail Online argued that:
the children did not have a "reasonable expectation of privacy" in relation to the unpixellated images of their faces. They contended that the photographs were "innocuous", taken in a public place and did not show anything "inherently private" – as English law "does not recognise an image right", their publication should not therefore be actionable; and
Dingemans J should have considered the fact that under Californian law, the taking and publication of the photos would have been entirely lawful.
The CA applied the two-stage test from Campbell v MGN Ltd ([2004] UKHL 22), now firmly established as the correct approach to determining "whether a publication is in breach of an individual's privacy rights". This: (1) asks whether the claimant has a "reasonable expectation of privacy" in relation to the disclosed facts; and if so (2) requires a balancing exercise between their Article 8 privacy rights and the publisher's Article 10 right to freedom of expression.
In confirming that the children had a "reasonable expectation of privacy" on these facts, the CA reiterated that all of the objective circumstances should be considered. The following factors, and associated obiter comments, are of particular note:
The claimant's attributes (including age): Although a child does not have an automatic expectation of privacy, their age and consequent inability to fully exercise their autonomy mean such an expectation may sometimes be "reasonable" where it would not be for an adult.
The nature and location of the activity: These have "particular relevance" in the case of young children since they may not "choose to be in a particular place or interact with the public in a particular way" – a matter often dictated by their parents. As such, "a child's reasonable expectation of privacy must be seen in the light of the way their family life is conducted" on their behalf. Activities with a "family element," as opposed to everyday activities like "popping out to the shops for a bottle of milk" are more likely to engage Article 8 protection.
The nature and purpose of the intrusion: The court's approach to these factors should be "generally no different" whether the claimant is a child or an adult.
Parental consent: The parents' lack of consent, if it was known to the publishers, will carry "particular weight" as it is they who "set the context for a child's family life".
Celebrity status: The fact that a child's parents are celebrities should not, without more, be used to argue that a child should have a lower reasonable expectation of privacy. This may be the case, however, where parents have "courted publicity for the child".
Effect on the claimant: Although "highly material" to the second aspect of the Campbell approach (i.e. the balancing test), this factor should also be considered at the first stage. Despite the fact that very young children may not be aware of any intrusion, interference with a child's Article 8 rights may "give rise to greater security concerns" than it would with an adult, particularly when their parents are in the public arena. Bullying concerns and even the prospect of embarrassment may also be considered.
Nature of local law: Although relevant in all cases where the intrusion has an international element, "the weight to be accorded to it is a matter for the judge to decide".
In relation to the balancing exercise, the CA also came down on the side of the children's Article 8 rights. It noted that:
although the engagement of a child's Article 8 rights should not automatically "trump" all countervailing Article 10 arguments, any adverse effect on a child's interests should be given "considerable weight";
privacy infringements do not necessarily require evidence of the harm that may be caused to a child. The application of the balancing test is a question of fact for the court to decide based on its own "common sense and experience"; and
the fact that a publication makes a contribution to "a debate of general interest" is just one of the criteria the court must consider when striking the balance. In this case, the photographs did not do so - they "simply showed a figure on a private family outing in a public space". Other relevant factors (including the fact that: (1) the claimants were children with limited public profiles; (2) the parents had never previously allowed similar images to be published in the press; (3) the claimants and their parents were negatively affected by the publication; and (4) the photographer had been asked to stop and given assurances as to pixellation) also supported the High Court's decision.
Associated Newspapers plans to appeal the decision to the Supreme Court.
Secretary of State for the Home Department v Davis [2015] EWCA Civ 1185
On 20th November 2015, the UK Court of Appeal ("CA") referred two questions to the Court of Justice of the EU ("CJEU") in the context of ongoing litigation surrounding the legality of the Data Retention and Investigatory Powers Act ("DRIPA") 2014. DRIPA is the UK's communications data retention law, controversially passed in the aftermath of a previous CJEU ruling against the UK's laws in this area.
Timeline of Events
April 2014: In Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources (C-293/12) [2015] Q.B. 127, the CJEU declared the EU Data Retention Directive (2006/24/EC) invalid, citing its disproportionate interference with individuals' privacy rights.
July 2014: The UK government immediately implemented emergency national data retention laws, DRIPA 2014 (including requirements on telecoms providers to retain communications data and hand it over to law enforcement bodies), designed to remedy the CJEU's criticisms in the Digital Rights Ireland case. Two MPs (one Conservative, one Labour) joined forces to swiftly bring judicial review proceedings in respect of this legislation, arguing that the speed at which it was rushed through Parliament meant the CJEU's concerns had not been properly addressed.
July 2015: The High Court ([2015] EWHC 2092 (Admin)) ruled that s.1 DRIPA 2014 was inconsistent with EU law and should be disapplied in certain respects (although this disapplication was suspended until after 31 March 2016 to allow remedial legislation to be implemented). This was primarily on the basis that (1) "it did not lay down clear and precise rules for providing access to and use of communications data" and (2) access to such data was not "dependent on prior review by a court/administrative body whose decision limited access to and use of the data to what was strictly necessary for the purpose of attaining the objective pursued."
The Secretary of State appealed against this decision, notwithstanding the fact that the government has pursued a parallel overhaul of alternative surveillance legislation (the "Investigatory Powers Bill") and plans for this to come into force by the end of 2016 in place of DRIPA.
In its initial response to this appeal, the CA has referred two questions to the CJEU to facilitate its eventual decision. They seek clarification on whether:
(1) the Digital Rights Ireland decision laid down mandatory requirements of EU law with which national legislation had to comply (e.g. the need for prior approval for access by a court or other administrative body). Contrary to the High Court's opinion, the CA nevertheless noted a "provisional view" that it did not, and was not intended to, since it was only identifying and describing protections that were absent from the harmonised EU regime; and
(2) the Digital Rights Ireland decision expanded the scope of Articles 7 and/or 8 of the EU Charter of Fundamental Rights beyond the scope of Article 8 ECHR as interpreted in the case law of the European Court of Human Rights.
The judges asked for the CJEU to expedite the preliminary reference process for this case. In light of present EU security concerns, a response may be expected in the near future.
For a full copy of the judgment, please click here.
Article 29 Working Party publishes opinion on Cloud Select Industry Group (C-SIG) Code of Conduct on Cloud Computing
On 21 January 2015, the Cloud Select Industry Group (C-SIG) submitted a revised version of the draft Data Protection Code of Conduct for Cloud Service Providers (CoC) to the Article 29 Working Party (A29WP). On 22 September 2015, the A29WP published its Opinion on the CoC, highlighting a number of gaps in the draft code, with a view to contributing to its improvement and adoption.
The Opinion makes the following recommendations:
Governance: the CoC should elaborate on the transition towards the General Data Protection Regulation (GDPR), the powers of the governance body and the differences between self-assessment and third party certification.
Sensitive personal data: the CoC should illustrate relevant scenarios relating to the processing of sensitive personal data by cloud service providers (CSPs) in order to demonstrate that such processing requires 'additional safeguards'.
Location of processing: the CoC should require that CSPs provide specific and easy-to-access information concerning the location of processing, so that customers (as data controllers) may fulfil their obligation to monitor the processing they have handed over to CSPs as processors. This requires more information than simply the countries in which the data will be processed, sub-processed and/or transferred, or by whom the data will be processed.
Personal data: the CoC should make specific reference to the notion of personal data and should apply the definition provided in
Directive 95/46/EC. Reference to the notion of personal data appears to be omitted in the existing draft on the premise that, as processors, CSPs typically do not 'identify' the personal data.
International transfers: the CoC should state the specific requirements for transfers and disclosures to non-EEA companies, as based on the A29WP's interpretation of article 43A of the GDPR. The current draft of the CoC is superficial in this respect.
Liability: the Code should prevent the adoption of terms of services that are disadvantageous to customers by unduly limiting CSPs' obligations and liability and restricting customers’ rights. The CoC should specify what areas of the processing are managed by the CSP as a co-controller, controller or processor, and should provide information on the allocation of liability between the CSP and its customer.
Security: the CoC should require that the CSP provide the customer with a sufficient level of detail on the security measures it has implemented and on the threats on, and vulnerabilities of, the CSP service and infrastructure and on the risk management decisions taken by the CSP. The set of minimum security objectives contained in the draft code was too generically formulated.
Right to audit: this right should be generally guaranteed and not strictly limited to cases where the CSP has not been certified by an independent body.
Users' rights: the right to portability should be explicitly referenced.
EDPS adopts a second opinion on the European Commission's proposal for a Directive on the use of Passenger Name Record (PNR) data
A PNR is the record made when an airline – or other travel – booking is made. The core information is about who is travelling and the ticket and itinerary details, but the PNR typically includes other information such as method of payment and meal preferences and can contain sensitive personal information, for example about medical conditions.
In 2007, the Council of Ministers of the EU proposed to adopt a Decision on a PNR system which would require carriers operating flights to third countries to transmit PNR data to European law enforcement bodies for the purpose of preventing, detecting, investigating and prosecuting terrorist offences and serious crime. But in February 2011 this proposal was overtaken by a Commission proposal for a Directive to achieve the same purpose. The then European Data Protection Supervisor (EDPS), questioned the necessity and proportionality of the Commission’s proposal, but the Council adopted a general approach on 23rd April 2012 with a view to starting negotiations with the Parliament. These negotiations stalled when the Parliament’s LIBE Committee rejected the proposal
in April 2013. They have been revived following the Charlie Hebdo attacks in Paris in January this year, and trilogue discussions have been taking place with a view to reaching agreement by the end of 2015. A key difference between the Council and the Parliament is whether the system should apply to intra-European as well as 'international' flights.
Accordingly, on 24 September 2015, the current EDPS, Giovanni Buttarelli, issued a second opinion on the proposal.
Buttarelli, after reciting the history of the proposed Directive, acknowledges that 'Europe is facing serious terrorist threats and has to take meaningful action.'
The EDPS emphasises, however, that ‘necessity and proportionality are essential prerequisites’ and that ‘available information does not justify why the massive, non-targeted and indiscriminate collection of passengers' personal information is necessary and why it is urgently needed.’ He relies on the decision in the Digital Rights Ireland case for his view that objective criteria are required to identify the persons and places to be targeted, the rules for access, the uses of the data and the retention periods.
The EDPS considers the detail of particular proposals such as the definition of ‘serious crime’, the operation of the proposed Passenger Information Units (PIUs), the role of Europol, information exchange between states and the monitoring of the system. He welcomes the stronger data protection provisions proposed by LIBE such as the requirement for PIUs to appoint Data Protection Officers. He suggests, however, that the scheme might be inconsistent with the package of Data Protection reforms which are currently in trilogue discussions, and in particular with the proposed Law Enforcement Data Protection Directive; he recommends that the PNR proposal should be delayed until the Data Protection Package has been agreed.
The Opinion concludes that in the absence of a justification for mass collection of data, PNR data should be used only on a case-by-case basis. The EDPS calls on the Council and the Parliament to ‘explore the effectiveness of new investigative approaches as well as of more selective and less intrusive surveillance measures based on targeted categories of flights, passengers or countries.’
We suspect that this Opinion will have been overtaken by events following the November attacks in France.
Article 29 Working Party confirms that "Do Not Track" is not sufficient to meet EU Data Protection law
1 October 2015
On 1st October, the Article 29 Working Party (A29WP) sent a critical submission to the World Wide Web Consortium's public consultation, launched on 14 July 2015, on its Do Not Track Specification.
The A29WP raised two substantial points, as well as a number of more minor observations.
First, since EU law applies to all processing of personal data, the concept should be extended from 'do not track' to 'do not collect'. Second, EU law requires both first and third parties to obtain consent to engage in online behavioural advertising, and Do Not Track should be extended to cover this.
EU Council reaches common position on draft Data Protection Directive for the police and criminal justice sector
As part of the Data Protection Package - of which the proposed Regulation has received most attention – the European Commission also published on 25 January 2012 a proposed Directive to harmonise the processing of personal data by law enforcement bodies. The draft Directive will replace a Framework Decision adopted in 2008 and is intended to facilitate the exchange of personal data between law enforcement authorities within the European Union whilst maintaining a high level of personal data protection. The Framework Decision, which applied only to cross-border processing, was criticised by privacy lobbyists for failing to provide adequate data protection, and the Commission proposal sought to apply the usual data protection standards whilst recognising the special requirements of law enforcement.
The new Directive would apply both to the cross-border processing of personal data and to the processing of personal data by the police and judicial authorities at a purely national level. It also extends to processing for public security purposes.
On 9 October 2015 the Council adopted a Common Position thereby agreeing its negotiating position on the draft directive and enabling the Luxembourg presidency to start trilogue discussions on this part of the Data Protection Package.
The Council has given lengthy consideration to this Directive and substantial amendments have been adopted in the Council text. There has been a very substantial reworking of the recitals and many significant articles have been deleted or significantly amended. The general tenor of all these changes has been to protect the flexibility of law enforcement investigations and to secure the data from undue access. ‘Competent Authorities’ to whom the Directive would apply can include private sector bodies ‘entrusted by national law to perform public duties or exercise public powers for the purposes’ set out in the Directive.
Anyone carrying out law enforcement activities should read the full text, available here.
Article 29 Working Party issues statement in response to the Schrems decision
16 October 2015
The Article 29 Working Party, comprising representatives of Member State DPAs, the European Data Protection Supervisor and the European Commission, issued a non-binding press release on the implications of the Schrems decision on 16 October 2015. Although they stressed the need for a robust common position, the lack of clear, immediate guidance as to next steps for organisations transferring data suggests a possible divergence of opinion amongst national authorities. The Article 29 Working Party stated that:
the existence of mass, indiscriminate surveillance within the US underpins the Court's reasoning and is incompatible with the European legal framework, and existing tools for the transfer of personal data are not the solution to this issue;
whilst DPAs continue to analyse the scope of the decision, SCCs and BCRs will remain a valid legal basis for EEA-US data transfers. This does not, however, remove individual DPAs’ ability to exercise their powers to protect individuals on a case-by-case basis;
transfers that are made in reliance on Safe Harbor are, however, immediately unlawful and DPAs may take steps to reach out to companies known to rely on Safe Harbor; and
there is a suggestion that a three-month 'grace period' for enforcement action may be recognised to allow for political solutions to be reached. However, the statement emphasises that this will not stop individual DPAs taking actions they consider necessary to protect individuals (for example, this could be refusing to authorise transfers or suspending data flows). If the Commission and US fail to reach agreement on a suitable replacement for Safe Harbor, and depending on the ongoing assessment of other transfer tools, European DPAs are committed to take appropriate steps, which may include coordinated enforcement action in respect of those who fail to implement alternative, valid methods of transfer.
Please see here for the full statement.
Eurojust issues analysis of EU Member States’ legal framework and current challenges on data retention
On 26 October 2015, Eurojust published its analysis of EU Member States' (MS) legal framework and current challenges on data retention following the 8 April Digital Rights Ireland CJEU decision annulling the 2006 Data Retention Directive (DRD).
The DRD was adopted to harmonise EU efforts in the investigation and prosecution of serious crimes. It required operators to retain
certain categories of traffic and location data for a period of 6-24 months and to make them available, on request, to law enforcement
authorities for the purposes of detecting, investigating, and prosecuting serious crime. The CJEU declared the DRD invalid in its
entirety on the following grounds, despite acknowledging that the retention of data satisfied an objective of general interest in the fight
against serious crime:
The DRD breached Articles 7 (respect for private and family life) and 8 (protection of personal data) of the Charter of Fundamental Rights (CFR) because the limits imposed by the principle of proportionality had not been respected.
Specifically, the DRD scheme was deemed non-compliant with the test of strict necessity, as it did not lay down clear and precise rules regarding the scope of, and justified limitations to, the rights to privacy and data protection. Further, the DRD lacked sufficient procedural safeguards for the protection of the data.
In analysing the legal and practical implications of the CJEU judgment on Member States, Eurojust found the following:
National legislation on data retention:
The judgment itself does not directly or automatically affect the validity of domestic transposing laws of the DRD.
As a consequence of the judgment, MS are simply no longer obliged to maintain data retention regimes.
Following the judgment, the domestic transposing law has been struck down in at least 11 MS, but remains in force in 14.
Admissibility and reliability of evidence:
The judgment raises questions as to whether evidence gathered through data retention schemes that essentially replicate the DRD is consistent with the right to a fair trial. Issues relating to the admissibility and reliability of evidence may, therefore, arise.
18 MS have experienced no cases regarding the effect of the judgment on the admissibility of data retained and retrieved under the invalidated domestic legislation.
5 MS have had case law in this respect.
Amongst the MS where the national law on data retention had been struck down, in 4 countries illegally obtained evidence is not admissible in court, while in 3 MS illegally procured evidence could be admissible under certain circumstances.
Amongst the MS where the transposing law remains in force, in 10 countries access to (and use of) retained data requires previous authorisation by a judicial authority. However, in 2 countries, no mandatory authorisation regarding access to, and use of, retained data exists.
The analysis concludes that the fragmented regulation in place on data retention undermines criminal investigations and prosecutions
as well as judicial cooperation in the fight against serious crime. Indeed, there have been a significant number of challenges to the
admissibility of evidence in criminal proceedings since the judgment.
European Commission publishes its 2016 work programme
27 October 2015
In 2014, European Commission President Juncker announced 10 political priorities, of which 3 have particular relevance to data protection:
A Connected Digital Single Market - this priority area included data protection reform;
A Reasonable and Balanced Free Trade Agreement with the U.S. - the TTIP would preserve EU data protection standards; and
An Area of Justice and Fundamental Rights Based on Mutual Trust - this priority included the PNR Directive, the EU-US Data Protection Agreement and the Data Protection Package.
The 2016 Work Programme published by the European Commission reiterates these priorities and proposes new key initiatives.
Progress on the Digital Single Market Strategy, to which the proposed data protection Regulation contributes, will include reviews of the Regulation on consumer protection cooperation, the telecoms regulatory framework, the audiovisual and media services Directive and a legislative proposal on the free flow of data. The TTIP negotiations are to be pursued, but in the area of Fundamental Rights emphasis is placed on security, with proposals to amend the Framework Decision on terrorism, to improve rules on firearms and to combat fraud and counterfeiting of non-cash means of payment.
Data protection specialists will not be able to concentrate entirely on the implementation of the Data Protection Package, which the Commission expects to be adopted soon. The promised reviews of telecoms and audiovisual regulation will undoubtedly have implications for the e-Privacy Directive.
28 October 2015
EDPS issues recommendations on the Directive for data protection in the police and justice sectors
Following the adoption by the Council of Ministers of a Common Position on the law enforcement data protection Directive, the European Data Protection Supervisor (EDPS), Giovanni Buttarelli, has issued a further Opinion on the proposal. He points out that a
previous Opinion criticised the inadequate level of protection in the proposed Directive and indeed the latest Opinion is also critical of
the Council’s position.
The stance of the EDPS is that ‘a high level of protection is the consequence of the embedding of the right to data protection in EU primary law, particularly in Article 16 TFEU and in Article 8 of the Charter of the Fundamental Rights of the Union.’ He asserts that the CJEU, in Digital Rights Ireland and in Schrems, confirmed the importance of a high level of protection in connection with law enforcement, and contends both that data protection in law enforcement should be consistent with the proposed General Data Protection Regulation (GDPR) and that the Directive should contain only those variations necessary because of the specific nature of law enforcement.
The EDPS sees the Council amendments as ‘changing the nature of the Directive into an instrument providing minimum harmonisation.’ Although Member States may provide higher safeguards, he believes that the EU legislators are required to provide a high level of protection in the Directive: they should ensure that the standard of data protection is not lowered and respects EU law and Council of Europe standards.
As well as criticising the Council’s general approach, the EDPS comments on a number of specific matters, amongst which is the delaying of the transposition deadline to three years, rather than the two years proposed by the Commission. The EDPS considers it important that the whole Data Protection Package should commence at the same time.
The purpose limitation principle, a cornerstone of data protection law, seems to have been weakened by deeming some uses as compatible. The EDPS tells us that ‘data processed by competent authorities acting within the scope of the Directive [should not be] further used for a totally different purpose’ and he cites immigration control as an example.
A too liberal approach has been taken to processing special categories of personal data, and both the right of subject access and the duty to provide information are too weak.
Insufficient powers are granted to DPAs which should be on the same footing as in the General Regulation.
Not all judicial activity should be excluded from supervision, but only that which is a ‘genuine’ exercise of a judicial activity.
In the view of the EDPS, ‘the performance of law enforcement tasks by non-public entities and organisations should be subject to the GDPR’ and not regulated by the Directive; he gives as examples the collection by airlines and telecommunications providers of information which they are required to pass to law enforcement bodies. The scope of the Directive should be narrowly confined to criminal matters rather than extending it to cover the vague concept of threats to public security.
The proposed rules on international transfers of personal data should be aligned with the CJEU ruling in Schrems, and existing agreements for the transfer of personal data concluded by the Member States should be amended within a fixed time limit and not preserved as proposed by the Council.
And finally, the EDPS takes another opportunity to call on the Commission to propose a new instrument for data protection at the level of the EU institutions and bodies to replace Regulation 45/2001.
6 November 2015 EU Commission issues Communication on EU-US Data Transfers following Schrems
The European Commission has adopted a formal communication to the EU Parliament and Council following the CJEU’s decision in Schrems v Data Protection Commissioner (C-362/14: see article below), which invalidated the Safe Harbor framework as a legal basis for data transfers from the EEA to the United States.
This document notes the ensuing lack of clarity and seeks to (1) provide a non-binding overview of the alternative mechanisms available to justify continuing transatlantic data transfers to countries or territories not covered by a Commission adequacy decision, such as the US; and (2) “briefly describe” the consequences of the judgment for existing adequacy decisions.
According to the Commission, “transfers of personal data are an essential element” of the EU-US relationship. It therefore reiterates its commitment to agreeing a new data transfer solution with the US and confirms that it has “immediately…stepped up its talks with the US government to ensure that any new arrangement…fully complies with the standard set by the Court”. Until such agreement is reached, however, the Commission offers the following guidance:
Alternative Bases for Transfers of Personal Data to the US (and other countries not covered by an Existing Adequacy Decision)
Contractual solutions: As Commission decisions are binding "in their entirety" on DPAs, incorporating approved Standard Contractual Clauses ("SCCs") into a contract obliges a DPA to accept them "in principle". DPAs may not therefore "refuse the transfer of data to a third country on the sole basis that SCCs do not offer sufficient safeguards". However, they may investigate relevant complaints from individuals and refer issues to national courts in judicial review proceedings (who may then refer on to the CJEU). DPAs may also take "additional measures" to ensure adequacy or suspend transfers in response to information received from a data importer on changes in their legal system that prevents the fulfilment of SCC obligations.
Ad-hoc contracts which have been approved on a case-by-case basis by national DPAs may also constitute a legal basis for transfer.
Intra-group transfers: The use of Binding Corporate Rules ("BCRs") to facilitate international transfers within a corporate
group is largely unaffected by the Schrems decision. However, this is not a "quick fix." There is a lengthy approval process by a
lead and two supporting authorities. Even after this, most Member States require data transfers on the basis of BCRs to be authorised by the DPA in each Member State home to a data exporter, leading to a long and complex authorisation process.
Derogations: Although the Schrems judgment does not remove the ability to rely on a statutory derogation to justify non-EEA transfers, the Commission notes Article 29 Working Party guidance which considers that these derogations must be "strictly interpreted". In their view, derogations are unlikely to justify transfers "which might be qualified as repeated, mass or structural," in which case "a specific legal framework such as SCCs or BCRs" should be utilised. The guidance contains examples of circumstances in which relevant derogations may be relied upon (e.g. the transfer of information to a hotel after a reservation may be permitted as "necessary for the performance of a contract or the implementation of pre-contractual measures in response to the data subject's requests"). In relation to consent, this must be unambiguous (i.e. not implicit), freely given, given in advance and in relation to a specific transfer (or a particular category of data), and accompanied by a notice stating that data will be transferred to the specific third country and the specific risk inherent in that transfer.
Consequences for Existing Adequacy Decisions
The Commission states that existing adequacy decisions will remain binding on all Member States and their organs, including DPAs, until such time as they may be withdrawn, annulled or declared invalid by the CJEU, "which alone has jurisdiction in this regard". The Schrems decision did not affect the validity of any other adequacy decision.
However, this does not affect the ability of DPAs to examine individual claims arguing that transfers to countries covered by an adequacy decision violate their rights. Where necessary, the DPA may then institute proceedings before a national court, who may then refer the question of validity on to the CJEU.
Other notable aspects state that:
the CJEU expressly noted that an adequacy finding may still be issued in relation to a country that operates a system of self-certification (similar to Safe Harbor), provided that there are "effective detection and supervision mechanisms that make it possible in practice to identify and sanction any infringements of the data protection rules";
data transfers between the EU and US can no longer be carried out by invoking adherence to the Safe Harbor privacy principles;
although each of the other existing adequacy decisions include restrictions on the powers of national DPAs identical to those found to be invalid in the Safe Harbor decision, the Commission will shortly prepare a decision, to be adopted by the
applicable comitology procedure, which replaces the offending provision in all existing adequacy decisions; and
the Commission will regularly assess existing and future adequacy decisions through joint reviews with the competent authorities in the third countries in question.
Please see here for the full communication.
EDPS issues opinion on "Meeting the challenges of big data"
In its Opinion 7/2015, the EDPS issued recommendations as to how EU data protection law should be applied "innovatively" in the context of "big data". It considers how best to facilitate the "significant benefits and efficiencies for society and individuals" that processing large amounts of data can bring, while simultaneously addressing "serious" associated privacy concerns. The EDPS argues that the EU Single Digital Market should show leadership in this area by critically assessing data-driven technologies and prevailing business models, including internet user tracking.
In summary, although the EDPS suggests that existing data protection principles (including transparency, proportionality and purpose limitation) remain the "base line" for privacy protection in the "world of big data", they must be "complemented by new principles." To this end, he believes that "responsible and sustainable" big data processing and development should be based on "four essential elements":
Transparency: The opinion notes that data processed in a "big data" context is often not knowingly handed over to organisations by individuals – much is observed or inferred (e.g. through recording of online activities). It recommends that individuals are given intelligible information as to all such processed data, are better informed as to the purposes for which it is used and told of the logic used in algorithms that determine assumptions or predictions about them.
Higher degree of data subject control: It is argued that more "powerful rights of access, data portability and effective opt-out mechanisms" will be necessary to empower individuals to better detect unfair biases, challenge mistakes and prevent the secondary use of data for purposes that do not match with their reasonable expectations. Organisations should not prioritise data exploitation over the need to develop innovative ways to deliver information, access and control to individuals.
Privacy by design: The EDPS reiterates the developing consensus that laws, regulations, contractual obligations, internal procedures and privacy policies "will not suffice" to ensure transparency and user control where big data is involved. Privacy-friendly engineering should build safeguards and user control into products, services, organisational arrangements and business practices.
Increased accountability: Privacy compliance should not be considered a "one-off exercise". Instead, the EDPS recommends "regular verification" that internal privacy-control systems remain fit for purpose and the continued production/preservation
of evidence so that this can be easily proved to external stakeholders, such as supervisory authorities. Implementing data
protection "by design", privacy impact assessments, auditing and certification, along with the appointment of a designated
data protection officer, are all mentioned as contributing to an accountable internal control system.
By making such recommendations, the EDPS aims to "stimulate a new and informed discussion in and outside the EU" - involving a wide range of stakeholders including designers, companies, academics, public authorities and regulators – on how best to harness "industry's creative potential" to innovatively protect privacy rights in this context.
To this end, it intends to establish an "external ethics board" to analyse the impact of big data, develop a model for honest information policies (for EU bodies) which can serve as a best practice guide for all data controllers, and organise a "Big Data Protection workshop" for policy makers, relevant representatives of EU institutions and external experts.
Bara v Presedintele Casei Nationale de Asigurari de Sanatate (C-201/14)
On 1st October 2015, the CJEU handed down its decision in Bara v Presedintele Casei Nationale de Asigurari de Sanatate, confirming that the disclosure of income data by the Romanian tax authority to the health insurance fund, without notice to the affected persons, breached the Data Protection Directive.
Romanian law contained provisions requiring payment of health insurance fees and mandating certain bodies to provide information to the Health Insurance Fund to allow the Fund to determine who was insured. A protocol had been agreed between the National Tax Administration Agency (ANAF) and the Health Insurance Fund, whereby ANAF provided information to the Health Insurance Fund, including information about income.
The applicants (Bara and others) complained that this information was disclosed without their knowledge or consent, in breach of Directive 95/46/EC. The CJEU agreed that the arrangements breached Arts. 10, 11 and 13 of the Directive (the CJEU made no comment about consent). Information about the sharing of such data should have been provided to the applicants.
The Romanian government tried to argue that the disclosure of such data, without notice, was permitted under the exemptions set out at Article 13 of the Directive. The CJEU disagreed: first, these exemptions apply only if Member State law so provides, and the arrangements in Romania were set out in an administrative Protocol, which fell short of setting them out in a law. Second, the relevant Romanian law only addressed the transfer of data proving that someone was insured - it did not require the transfer of data about income, as had been the practice of the ANAF.
The applicants also complained that the transfer of data breached Article 124 TFEU (provisions prohibiting privileged access to finance by public bodies). The Court held that this provision was not relevant.
Weltimmo s. r. o. v Nemzeti Adatvedelmi es Informacioszabadsag Hatosag (C-230/14)
On 1 October 2015, the CJEU delivered its ruling in Weltimmo v. NAIH (the Hungarian Data Protection Authority). This affirmed that a data controller which processes personal data and has its registered seat in one Member State may nevertheless be subject to another Member State's jurisdiction in certain circumstances.
The Data Protection Directive ensures this is possible where such processing is carried out 'in the context of activities of an establishment of the controller' located within that second territory. The CJEU held that the concept of 'establishment' is 'broad' and 'flexible', extending to 'any real and effective activity – even a minimal one – exercised through stable arrangements.' The presence of one representative outside of the registered territory, who sought to negotiate debts with third parties, was sufficient.
Although the nationality of those concerned by the data processing is irrelevant, the CJEU highlighted factors that could, in particular, be taken into account when determining whether processing falls within 'the context of activities of an establishment of the controller' within a second territory.
The court also found that where a DPA finds it does not have jurisdiction to respond to a complaint based on the above principles, although it may investigate immediately, it cannot impose penalties outside of its own territory. Requests would have to be made to the supervisory authority of that other Member State to ask them to establish infringements and impose penalties, where their applicable law permits.
See our full analysis of the decision here.
6 October 2015
Schrems v Data Protection Commissioner (C-362/14)
On 6th October 2015, the CJEU handed down its judgment in Case C-362/14 Maximillian Schrems v Data Protection Commissioner. This stated that:
a Commission decision on the 'adequate protection' offered by a non-EU member state cannot exclude or reduce the powers available to national data protection authorities to examine complaints brought to them by data subjects; and
data protection authorities do not, themselves, have the power to invalidate a Commission decision. However, data protection authorities and data subjects can refer questions of validity to national courts, which, in turn, can refer the question to the CJEU. The CJEU does have the authority to declare Commission decisions to be invalid.
The CJEU also found the Commission's US Safe Harbor Decision to be invalid because:
the decision contains a derogation which allows safe harbor-certified companies to share data for national security purposes. However, the agencies with whom data are shared fall outside the safe harbor scheme, and the Safe Harbor Decision does not address whether there is adequate protection for personal data so processed; and
the Safe Harbor Decision sets too high a bar for data protection authorities to be able to intervene. This undermines the independence of data protection authorities. The Commission does not have the authority to do this.
We have prepared a suite of materials following this decision:
Initial analysis of the decision – here.
Initial Reaction of EU DPAs – here.
Recorded Webinar discussing implications – here.
US Reaction to the decision – here.
Set of FAQs addressing key concerns – here.
CFI of Brussels orders Facebook to stop collecting personal data from non-Facebook users; imposes Europe's highest DP fine
9 November 2015
Introductory Note: Although this is not a CJEU or ECtHR case, it is included in this "European Cases" section as it is likely to be of interest to many readers.
The President of the Court of First Instance in Brussels has ordered Facebook Inc., Facebook Ireland Limited and Facebook Belgium SPRL to cease tracking internet users in Belgium who are not members of the social network via cookies and social plug-ins.
Facebook was given 48 hours to do this, or face a fine of up to €250,000 (approx. £180,000) per day of non-compliance. This order will remain in force even though Facebook has announced an intention to appeal the decision.
Processing of Personal Data?: Facebook employs "datr cookies" to track the online activities of those who visit the Facebook site or click a "Like" button. These are installed on the browser of internet users (regardless of whether they are a Facebook member) and may remain for 2 years. They record the IP address of individuals and allocate them a "unique identifier". The order noted that this concerns the 'processing of personal data', despite Facebook's arguments that "datr cookies" are only associated with browsers, not individual people.
"Manifest" violation of Belgian Data Protection Law: The order found that the fact that Facebook collects data on the web surfing behaviour of millions of people from Belgium who have decided not to become a member of Facebook’s social network is a “manifest” violation of Belgian data protection law, irrespective of the purposes for which Facebook uses the relevant personal data. This is because:
Facebook has not obtained individual consent to do so;
Facebook cannot invoke an agreement with people who do not have a Facebook-account;
Facebook cannot invoke a legal obligation to do so; and
the security interest cited as justification by Facebook is overridden by the fundamental right to privacy of people who do not have a Facebook account.
Quantum of Fine: The order notes that Facebook realised a turnover of 12.4 billion US dollars and a profit of 2.9 billion US dollars in 2014 and is one of the most financially sound companies in the world. The significant €250,000 per day quantum was therefore necessary in order for it to be a sufficient deterrent.
notice, undertaking, monetary penalty, or prosecution
Description of Breach
Summary of steps required (in addition to the usual steps)
The company failed to notify with the ICO in breach of
The company was fined £650, ordered
s.17 DPA and the sole director was negligent in allowing
to pay costs of £492.78 and a £65
the company to commit the offence under s.61.
(and its director
The director was fined £534, ordered to
pay £489.08 costs and a £53 victim
The ICO received numerous complaints, via the TPS and directly from individuals who are subscribers to specific telephone lines, alleging that they had received unsolicited marketing calls on those lines from NCBL. These calls were made by NCBL in order to sell a call-blocking service and a device to stop unsolicited calls – the same type of calls that NCBL itself was making. Each individual stated that they had previously notified NCBL that such calls should not be made on that line and/or had previously registered their number with the TPS. The ICO found that NCBL had breached Regulation 21 of PECR.

The Enforcement Notice requires the company, within 35 days of the date of the notice (19 November 2015), to stop making calls to lines of subscribers who have: (1) previously objected to such calls; or (2) registered with the TPS. NCBL has 28 days to appeal the notice.
The ICO found that Telecom Protection Service Ltd had breached Regulation 21 of PECR in exactly the same circumstances as outlined above in relation to NCBL.

The Enforcement Notice requires the company, within 35 days of the date of the notice (19 November 2015), to stop making calls to lines of subscribers who have: (1) previously objected to such calls; or (2) registered with the TPS. The company has 28 days to appeal the notice.
In connection with the facts justifying the above Enforcement Notice, the ICO found that between 26 September 2013 and 24 July 2015, Telecom Protection Service Ltd used a public telecommunications service for the purposes of making 839 unsolicited calls for direct marketing purposes to subscribers where the number allocated to the subscriber in respect of the called line was a number registered with the TPS, contrary to Regulation 21(1)(b) of PECR. The ICO found that the "seriousness" conditions in s.55A DPA had been met and therefore issued a monetary penalty on that basis.

The monetary penalty notice of £80,000 must be paid by 5 January 2016. The company has 28 days to appeal the decision.
In connection with the facts justifying the above Enforcement Notice, the ICO found that between 7 April 2015 and 22 July 2015, NCBL used a public telecommunications service for the purposes of making 309 unsolicited calls for direct marketing purposes to subscribers where the number allocated to the subscriber in respect of the called line was a number registered with the TPS, contrary to Regulation 21(1)(b) of PECR. The ICO found that the conditions in s.55A DPA had been met and therefore issued a monetary penalty on that basis.

The monetary penalty notice of £90,000 must be paid by 5 January 2016. The company has 28 days to appeal the decision.
The ICO was provided with a report from the Trust, as the data controller, that two letters containing sensitive personal data relating to one patient had been included in the response to another person's subject access request. The error occurred due to several factors: in the first instance, the letters in question were filed incorrectly, and several opportunities to identify the wrongly filed letters were then missed before the information was sent. The ICO further discovered that temporary staff did not necessarily receive any data protection training unless they were employed by the Trust for over three months. The Trust's policies also set out that Information Governance training, which includes data protection, was only refreshed every three years.

The Trust now undertakes to ensure that personal data is processed in accordance with the Seventh Data Protection Principle and, in particular, that all staff processing personal data on its behalf (whether permanent or otherwise) are provided with sufficient data protection training before they carry out work that involves regular contact with personal data, especially sensitive personal data.

The Trust was issued with an undertaking to secure compliance with the Seventh Data Protection Principle. This requires it to, inter alia:

ensure all staff involved in processing personal data receive sufficient and compulsory data protection training – such training to be logged, monitored and refreshed annually;

issue those handling subject access requests with dedicated training; and

implement such other security measures as are appropriate.
Falkirk Council (the "Council")
An individual made a subject access request to the Council. The bundle of documents sent to the individual by the Council in response contained some documents pertaining to an unrelated third party, which had been incorrectly filed. The documents were not checked to ensure that only information relevant to the individual who had made the request was disclosed.

During its investigations, the ICO learnt that only 11.4% of employees had completed one or more sections of the Council's data protection training modules.
The Council must:

within 9 months, provide training to all staff members handling personal data as part of their role. Training will be mandatory and refreshed annually;

within 6 months, implement a system for monitoring attendance/completion of training (including steps to be taken when training has not been attended/completed);

within 6 months, improve guidance to be issued to staff who routinely handle subject access requests (including details on how third party data should be dealt with); and

within 6 months, produce a high level Data Protection Policy, setting out the data controller's commitments to the protection of personal data and the general standards to which it will adhere. This will be communicated to all staff within 1 month of completion.
UKMS offers PPI compensation claim services. Between 6 April 2015 and 10 June 2015, 1,405 complaints were made to the GSMA's spam reporting service and 37 complaints were made to the ICO.

The monetary penalty of £80,000 must be paid by 18 December 2015. The company has 28 days to appeal the decision.
In June 2015, the ICO wrote to UKMS with questions
about UKMS’s PECR compliance and warning of the
ICO’s enforcement powers. UKMS responded that it had
purchased the data used for the text messages from third
party suppliers and that messages were only sent to
individuals who had opted-in.
In July 2015, the ICO wrote to UKMS explaining that it
was the responsibility of the sender of the direct
marketing material to ensure PECR compliance
(regardless of assurances received from third party
suppliers). The ICO also requested that UKMS produce
evidence of the consents upon which it relied in respect
of the April and June complainants.
UKMS provided the ICO with the consent wording relied
upon by the third party suppliers, but this was found to
be insufficient to amount to consent for the purposes of
regulation 22 PECR.
The ICO was satisfied that the contravention was
serious: a total of 1,320,000 direct marketing messages
were sent to individuals who had not consented to
receiving such communications.
Whilst the ICO did not find that UKMS had deliberately
contravened PECR, the ICO concluded that it was
reasonable to suppose that UKMS should have been
aware of its responsibilities in relation to direct
marketing (particularly given the nature of UKMS’s business and the fact that the problem of unsolicited text messages is widely publicised).
Aston James Consulting Ltd trading as ‘The CV Writers’ ("Aston James")

Aston James was prosecuted for failing to notify the ICO (in contravention of s.17 DPA 1998) and failing to respond to an information notice (in contravention of s.47 DPA 1998).

Aston James was fined £1,250 and ordered to pay costs of £619.85 plus a victim surcharge.
Sirona Care & Health (“Sirona”)
An email containing sensitive personal data (such as
names, dates of birth, NHS numbers, addresses and medical details) about three service users was sent to the wrong email address. The intended recipient was a member of staff, but the sender accidentally selected the email address of a former service user who shared the same first name. The email address of the former service user had been saved but not subsequently deleted.
Sirona had implemented data protection policies and procedures, but the ICO found that these did not offer substantial guidance on verifying email addresses and did not require employees to delete irrelevant email addresses, so were not deemed to be fully effective.
Whilst induction and annual mandatory information governance training is provided by Sirona, the ICO found that the employee in question had not received this training for over two years. The ICO also found that only 66% of staff were current with such training.
The ICO had previously expressed concerns that Sirona could not demonstrate that employees completed information governance training annually and that employees might not have been sufficiently aware of Sirona’s data protection policies and procedures. Consequently, the ICO found that Sirona may not have taken sufficient steps to act on previous advice.

Sirona was issued with an undertaking requiring it to, inter alia:

ensure mandatory annual data protection refresher training is implemented for all staff who routinely process personal data;

ensure that the completion rate of data protection training is monitored, with appropriate follow-up procedures for staff non-compliance; and

review policies to ensure that staff are provided with appropriate guidance on email-checking procedures. Policies should be readily accessible to employees.
The ICO received 214 complaints between 25 March 2015 and 28 April 2015 regarding automated calls offering debt management services. The sender or instigator of the communication was not identified in the calls.

The monetary penalty of £120,000 must be paid by 8 December 2015. The company has 28 days to appeal the decision.
Upon investigation, the ICO determined that the
telephone numbers used to make the automated calls
were those of Oxygen Ltd. In June 2015, the ICO asked
Oxygen Ltd questions relating to its compliance with
PECR and reminded the company of its regulation 19
obligation and the ICO’s enforcement powers.
In response, Oxygen Ltd claimed that the automated
calls were made by a third party organisation on Oxygen
Ltd’s behalf. Oxygen Ltd claimed that it had been told by
the third party that calls would be screened against the
Telephone Preference Service. Oxygen Ltd also stated
that it had purchased its call list from a third party and
had been informed that the data was “opted-in”.
The ICO subsequently wrote to Oxygen Ltd to re-state
that, to comply with regulation 19 PECR, consent is
required for automated marketing calls. As the instigator
of the calls, Oxygen Ltd is responsible for ensuring that
the necessary consent has been obtained; relying on the
undocumented assurances of a third party is insufficient.
The ICO was satisfied that the contravention was
serious: 1,015,268 calls were made in less than one
month to subscribers without their prior consent and the
automated calls were misleading (as they implied that
they were part of a government initiative). The ICO
determined that the automated calls were sent deliberately.
Space Systems Ltd
Space Systems Ltd, a storage solutions company, has been prosecuted at Manchester Magistrates' Court for failing to carry out an ICO registration (contrary to s.17 DPA).
The organisation pleaded guilty to the
charge under s.17 DPA and was fined
£500. It was also ordered to pay £440 costs and a £50 victim surcharge.
The CPS was fined £200,000 after laptops containing police interview videos were stolen from a private film studio. The lost data concerned interviews with 43 victims and witnesses, involving 31 investigations. The vast majority of these investigations were ongoing, and of a violent or sexual nature. Some related to historical allegations against a high profile individual.

The monetary penalty notice of £200,000 must be paid by 2 December 2015. The CPS has 28 days in which to appeal the decision.
The CPS had entered into an agreement whereby a
private studio edited videos of police interviews so that
they could be used in criminal proceedings. Unencrypted
DVDs containing the videos were sent using a national
courier firm. In urgent cases, the sole proprietor of the
studio would collect such DVDs and deliver them to the
premises himself using public transport.
The private studio was situated in a multi-occupied
residential block – this had a simplex lock on the
communal door, the CCTV did not work and the studio
was not alarmed. Three unencrypted laptops were stolen
when an opportunistic burglar gained access to the
studio. These had been left out on the desk.
On these facts, the CPS was found to have failed to take
appropriate technical and organisational security
measures, in contravention of the seventh data
protection principle. According to the ICO, it should
have monitored the outsourced processing activity,
obtained security guarantees and implemented a DPA
compliant processing agreement.
Help Direct UK

A marketing campaign consisting of thousands of unsolicited marketing text messages prompted 6,758 complaints in just one month. As Help Direct Limited is a "lead generation company", these related to a variety of services being offered, including the reclaim of PPI payments, bank refunds and loans.

The ICO previously issued Help Direct with an enforcement notice on 24 February 2015, after a finding that its sending of unsolicited marketing material without consent breached regulation 22 of PECR. The practice continued after this date.

The scale of the operation over a short period of time meant that this contravention was "serious". The ICO also found that Help Direct was "fully aware" of its regulation 22 requirements and had taken active steps to use methods known in the marketing industry as helping to avoid detection by mobile networks' spam detectors (i.e. using unregistered SIM cards and dongles).

The monetary penalty notice of £200,000 must be paid by 23 November 2015. Help Direct UK has 28 days to appeal the decision.
Monetary penalty notice

Pharmacy2U, the UK's largest NHS approved online pharmacy, was fined £130,000 for selling details of more than 20,000 customers to marketing companies in breach of the Data Protection Act.

The ICO was also satisfied that it constituted a "serious" contravention "due to the context in which the personal data was unfairly processed, the number of individuals affected (21,500) and the purposes for which the data was sold".

The monetary penalty notice of £130,000 must be paid by 16 November 2015. Pharmacy2U has 28 days to appeal the decision.
Nuisance Call Blocker Ltd, a company offering a cold calling prevention service, has been prosecuted for failing to respond to an ICO information notice.

The company was ordered to pay a fine of £2,500, a £120 victim surcharge and £429.85 prosecution costs.
Two separate security incidents dating back to 2011 led to the ICO imposing undertakings which required an improvement in security and privacy practices. Subsequent audits in 2013 and 2014 revealed that the undertaking requirements had not been fully implemented.

Although ACC stated that it had taken remedial steps in its response to the preliminary enforcement notice, the ICO had "limited confidence" in its commitment to implement these on an ongoing basis, given previous failures. The ICO concluded that ACC had failed to comply with the seventh data protection principle by failing to take appropriate security measures. The likelihood of distress to ACC's data subjects was considered to be "self-evident".
Within 3 months, ACC must take steps
to ensure that on an ongoing basis:
data protection KPIs and measures are monitored and acted upon;
there is a mandatory data protection training programme for all staff (including new starters) and refresher training annually – participation and completion to be recorded and documented;
policies are being read, understood and complied with;
information is backed-up to the external server on a daily basis;
back-ups are tested periodically to ensure that they have not degraded and that information is recoverable;
physical access rights are revoked promptly when staff leave and periodically reviewed to ensure that appropriate controls are in place;
the lack of adequate storage solutions for manual records is addressed; and
consistent and regular monitoring is undertaken to enforce a clear desk policy.
Home Energy & Lifestyle Management Ltd ("HELM")

Monetary penalty notice

An ICO investigation revealed that HELM had used an automated calling system to make over six million calls without subscriber consent as part of a massive automated marketing campaign offering "free solar panels". The investigation was triggered by over 200 complaints via the online reporting tool, and the campaign was found to amount to a serious and deliberate breach of regulation 19 of PECR.

The ICO had previously written to HELM, and received an assurance that it would not be running a similar campaign. In response to an ICO reminder that consent was required from each subscriber before making "automated calls", HELM admitted that it was not aware that a different PECR regulation applied to "automated" (rather than "live") marketing calls. It had therefore been adopting the same approach used for "live" calls.

HELM must pay a monetary penalty of £200,000 by 28 October 2015. If paid on time, this will be reduced by 20% to £160,000. HELM has 28 days in which to appeal.