The UK Information Commissioner’s Office (ICO) has published an updated handbook for Privacy Impact Assessments (PIAs). PIAs are designed to help organisations address any risks to personal privacy before implementing new initiatives and technologies and to ensure that privacy safeguards are built into systems at the outset.


The ICO published the original version of the PIA Handbook in December 2007 in response to HM Revenue & Customs' (HMRC) loss, in October 2007, of two CD-ROMs containing the unencrypted personal data of 25 million data subjects.

Version 2 of the PIA Handbook sets out: the PIA process; screening questions that enable an organisation to decide whether a full-scale PIA should be conducted; a Data Protection Act compliance checklist; an e-Privacy Regulations compliance checklist; and privacy strategies.

According to the ICO, the key benefits of a PIA are: identifying and managing risks; avoiding unnecessary costs; avoiding the introduction of inadequate solutions too late in a scheme's development; avoiding loss of trust and reputational damage; providing an opportunity to inform and seek feedback from stakeholders; and meeting and exceeding legal requirements.

The Handbook fleshes out these benefits and lays out the end results of an effective PIA. It concludes by drawing organisations' attention to two sets of privacy risks that a well-structured and well-implemented PIA can overcome: "Risks to the individual" and "Risks to the organisation".


The case involving Phorm, a digital technology company whose behavioural tracking software (Webwise) anonymously monitors internet users' behaviour as they move from site to site so that advertising companies can better target adverts, highlights a potential drawback to PIAs. Webwise tracks a user's browsing history by assigning each user an anonymous Unique Identifier (UID) and then using a duplicate cookie to record that UID's behaviour on each site, allowing it to follow the user across the internet.
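The mechanism described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the general principle only (the function and variable names are invented, not Phorm's): a random identifier is assigned on first contact, and every subsequent page view is logged against that identifier rather than against any directly identifying detail, which is precisely what makes the "anonymised" label contestable.

```python
import uuid

browsing_log = {}  # UID -> list of sites visited under that UID

def track_request(cookies: dict, site: str) -> dict:
    """Assign a UID on first contact, then record each visit against it."""
    uid = cookies.get("uid")
    if uid is None:
        # First time this browser is seen: mint an anonymous-looking UID
        # and store it in a cookie that travels with the user.
        uid = uuid.uuid4().hex
        cookies = {**cookies, "uid": uid}
    browsing_log.setdefault(uid, []).append(site)
    return cookies

# A single user moving across three unrelated sites:
cookies = {}
for site in ["news.example", "shop.example", "travel.example"]:
    cookies = track_request(cookies, site)
```

After the loop, one UID links the user's behaviour across all three sites, even though no name or address was ever recorded; whether such a number sequence is "personal data" is the question the regulators dispute.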

In 2006, BT’s trial of Phorm’s software, launched without informing internet users that they were being monitored, faced objections from campaigners and Members of Parliament on the basis that internet users do not expect to have their movements monitored to enable advertisers to hone their campaigns. Phorm responded that it had commissioned an independent data protection and privacy consultant to carry out a PIA on Webwise, which concluded that Phorm’s technology offers a high standard of privacy and data protection.

The ICO commented that the purpose of a PIA is to assess the privacy implications of, for example, new technological systems of processing personal data (such as Phorm's). New technological methods of processing personal data obtained via the internet are highly likely to process IP addresses. However, the European Commission, the Working Party, the United Kingdom and other EU Member States currently disagree as to whether IP addresses constitute personal data.

Furthermore, PIAs cannot assess whether a new technological system is in compliance with a Member State’s data protection law where that Member State has failed to implement correctly, or monitor, the original EU directive. The European Commission is currently suing the United Kingdom for failing to implement EU data protection law correctly and is citing Phorm as an example of the United Kingdom’s failure.

The Commission and the Working Party take the view that, because it is difficult to differentiate between dynamic and static IP addresses and because static IP addresses can become personal data, internet service providers should treat all IP addresses as if they were personal data and should not process such data without the consent of the individuals concerned. Conversely, the German and French courts take the view that the mere fact that IP addresses can be personal data should not force organisations to treat all IP addresses as if they were. Phorm argued that it does not process IP addresses. However, it is arguable that UIDs are a Phorm-specific variant of IP addresses—allowing Phorm to track users' behaviour across the internet—posing the same dilemma as IP addresses with regard to whether such "anonymised" number sequences are personal data.

Phorm envisaged its duplicate cookie as operating on an opt-out basis. Cookies are small, site-specific data files that allow a website owner to track a user's behaviour on its own site. Under the e-Privacy Regulations, cookies are an "opt-out": websites can assign them to users provided each user is given a clear method of rejecting them. However, in a statement issued in April 2008, the ICO commented that Phorm's software must be "opt-in" to comply with data protection law: users must consent specifically to the use of such duplicate cookies. Phorm's PIA flagged this ("Best privacy practice would normally be [opt-in]") but perhaps did not stress the point sufficiently.
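The difference between the two consent models turns on the default. A minimal sketch, with invented function names, of the contrast: under opt-out the tracking cookie is set unless the user has actively rejected it, whereas under the opt-in standard the ICO required for Phorm, it is set only where the user has specifically consented.

```python
def set_cookie_opt_out(cookies: dict, user_rejected: bool) -> dict:
    # e-Privacy Regulations baseline: assign the cookie by default,
    # provided the user was given a clear method of rejecting it.
    if not user_rejected:
        return {**cookies, "uid": "tracking-id"}
    return cookies

def set_cookie_opt_in(cookies: dict, user_consented: bool) -> dict:
    # The ICO's April 2008 position for duplicate cookies:
    # no cookie without specific prior consent.
    if user_consented:
        return {**cookies, "uid": "tracking-id"}
    return cookies
```

A user who has expressed no preference at all is tracked under the first model but not under the second, which is why the choice of default mattered so much to the regulator.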


On 14 April 2009, the European Commission announced that it had commenced infringement proceedings against the United Kingdom over privacy and personal data protection, in response to complaints by UK internet users and extensive communication with UK authorities about the use of Phorm's behavioural advertising technology.

The Commission stated that there are "several problems" with the United Kingdom's implementation of EU e-privacy and personal data protection rules relating to the confidentiality of communications and prohibiting interception and surveillance without the user's consent. Article 5 of the e-Privacy Directive (2002/58/EC) requires Member States to "prohibit listening, tapping, storage or other kinds of interception or surveillance of communications and related traffic data by persons other than users, without the consent of the users concerned."


The recent controversy over Phorm’s software shows that the value of PIAs may be somewhat limited for businesses at the cutting edge of the advertising and communications industries, particularly where the key data protection policy makers (the ICO, the European Commission, and the Article 29 Working Party) disagree on the scope of data protection law and what data is covered, or, in the case of the United Kingdom, where the Government has allegedly failed to implement data protection law correctly.