Privacy and Data Protection 2013 Year in Review
Developments at the Federal Trade Commission
Throughout 2013, the Federal Trade Commission (FTC) maintained its focus on privacy in the mobile environment, data collection and use among data brokers, and the impact of the “Internet of Things” as connectivity among everyday devices becomes more prevalent. Much as in 2012, the FTC continued to bring enforcement actions—reinforcing the principles in its 2012 privacy report—against companies that allegedly did not live up to their privacy policies, did not implement reasonable data security measures or violated the Children’s Online Privacy Protection Act (COPPA). Finally, in an effort to promote increased cross-border cooperation, the FTC entered into a memorandum of understanding (MOU) with Ireland’s Office of the Data Protection Commissioner.
REPORT ON MOBILE PRIVACY DISCLOSURES
Following its 2012 workshop entitled In Short: Advertising and Privacy Disclosures in a Digital World, which in part explored the need for new guidance for mobile advertisers, FTC staff issued its Mobile Privacy Disclosures report. The report makes a number of recommendations for mobile platform companies, app developers, advertising networks and analytics companies regarding timely, comprehensible disclosures to better inform consumers about the collection and use of their data. The report notes the unique privacy and security concerns that mobile devices raise by their personal and ubiquitous nature.
DATA BROKER INDUSTRY INQUIRY
Furthering its interest in the particular vulnerabilities posed by the data broker industry, the FTC, using its authority to subpoena “special reports,” issued orders to nine data brokerage companies to provide information on how they collect and use information about consumers. In particular, the FTC requested detailed information on the nature and sources of the consumer information data brokers collect; the use, maintenance and distribution of the information; and the extent to which data brokers allow consumers access to their information for purposes of correcting it or opting out of having it sold. The FTC will use this information to better understand the privacy practices of firms in this industry, and is expected to issue a report at the end of 2013 detailing its findings.
“INTERNET OF THINGS”
On November 19, 2013, the FTC held a public workshop to discuss the distinctive issues posed by the increased connectivity of everyday devices—or the “Internet of Things.” The workshop brought together academics, business and industry representatives, and consumer advocacy groups to explore this next frontier for privacy and data security. FTC staff sought input on a variety of topics geared towards understanding developments in this space, technologies used, companies that operate within this space, current and future uses of smart technologies, benefits and privacy risks associated with consumer use of smart technologies, and whether in some industries (such as health care) the societal benefits of gathering information outweigh the personal risks. The FTC is clearly trying to think ahead of (or at least apace with) the technology to help inform its enforcement policies and priorities.
In its first Internet of Things enforcement action, the FTC brought a complaint against TRENDnet, Inc., a marketer of internet-connected home security video cameras. Echoing the concept of privacy by design articulated in the FTC’s final 2012 omnibus privacy report, the agency alleged that TRENDnet failed to use reasonable security in the design and testing of its software, which resulted in consumers’ private video feeds being made public on the internet. The FTC also claimed that TRENDnet stored and transmitted users’ log-in information in readable text. In settling the FTC’s complaint, TRENDnet must establish an information security program; secure the information stored, captured, accessed or transmitted; and obtain independent audits of its security program every two years for the next 20 years.
Early this year, the FTC settled charges against Path, Inc., that it deceived users of its social networking app by collecting certain personal information from their mobile devices without their knowledge or consent. Allegedly, Path gave the impression that consumers could make meaningful choices about the collection of their personal information, when in fact the app collected and stored contact information from users’ mobile address books regardless of the choice selected.
In a health-information case, the FTC settled charges against Cbr Systems, Inc., alleging that although Cbr claimed to treat personal information securely, it did not employ reasonably adequate security measures, contributing to a breach of personal information. For example, Cbr purportedly transported personal information on unencrypted portable media and laptops. Names; genders; Social Security numbers; dates and times of birth; drivers’ license numbers; credit, debit and checking account details; and contact information of Cbr’s customers were exposed, as well as network passwords and protocols that could have given hackers access to Cbr’s network and other sensitive personal health information. The settlement requires Cbr to establish and maintain a comprehensive information security program and submit to independent security audits every other year for 20 years.
FTC ENFORCEMENT ACTIONS FOR UNFAIR PRACTICES OR LACK OF REASONABLE DATA SECURITY MEASURES
In another enforcement action in the mobile environment, the FTC announced a settlement with a mobile device manufacturer in connection with charges that it did not take reasonable steps to ensure the security of its software for smartphones and tablets, and that these defects made sensitive consumer information vulnerable to exposure. Again reflecting the principle of privacy by design, the FTC’s complaint alleged that the company did not use reasonable and appropriate security practices in the design and customization of its software by not training engineering staff, testing the software for potential security flaws, using common coding practices, or establishing a process for receiving and addressing vulnerability reports from third parties. The FTC did not allege that any sensitive personal data was in fact exposed, but that it could have been as a result of these security weaknesses. As part of the settlement, the company is required to patch the vulnerabilities and establish a security program.
The agency filed a complaint against LabMD, Inc., which provides laboratory testing on patient samples obtained by physicians, for not taking reasonable measures to secure consumers’ personal data, including medical information. LabMD billing information that contained sensitive personal information was found on a peer-to-peer network and in the possession of identity thieves. Among the facts alleged in the complaint, LabMD did not establish or maintain a security program, identify known and foreseeable security risks, prevent employees from accessing personal information not needed for their jobs, train employees on basic security practices, or use easily available measures to prevent and detect inappropriate access to personal information. This case is still pending before the FTC.
Stemming from the DesignerWare cases of 2012, the FTC settled its complaint against the rent-to-own retailer Aaron’s, Inc., for its role in its franchisees’ installation and use of software on rented computers that tracked customers’ locations, captured images of customers through the computers’ webcams and logged keystrokes, thereby capturing customers’ log-in credentials for e-mail, financial and social media accounts. Under the consent agreement, Aaron’s cannot use the monitoring technology except to provide technical support as requested by the consumer. Also, Aaron’s will be required to provide clear notice and obtain express consent from consumers at the time of rental to install the software that allows for location tracking of the computer, and to provide notice when the software is activated. Aaron’s will be prohibited from using the information obtained in connection with any debt collection efforts, and must delete or destroy any improperly collected data. The company will be required to conduct annual monitoring of its franchisees and make sure they
adhere to the requirements of the consent order. Franchisees that do not abide by the requirements of the settlement will be subject to termination.
MOU WITH IRISH PRIVACY ENFORCER
In June, the FTC signed an MOU with Ireland’s Data Protection Commissioner to enhance its ability to cooperate in cross-border enforcement of privacy violations. The MOU reflects the constant movement of consumer data across borders and the increasing complexity of information technology and the paths data take. The MOU does not create legally binding obligations for either the United States or Ireland, but recognizes a common interest in cooperating in enforcement efforts and facilitating a better understanding of ways to protect personal information. The parties will use best efforts to share information, provide investigative and potentially enforcement assistance, and explore joint training programs. Cooperation with other countries is critical to reaching privacy violations abroad, and MOUs such as this facilitate enforcement beyond U.S. and Irish borders.
FTC Issues Amendments to Children’s Online Privacy Protection Rule
Julia B. Jacobson and Han Jason Yu
The Children’s Online Privacy Protection Act (COPPA) is a federal statute enacted in 1998 that requires operators of commercial digital services (e.g., websites, mobile applications, social media apps) to provide parental notification and obtain verifiable parental consent prior to collecting personal information from children under the age of 13.1 To implement COPPA, the Federal Trade Commission (FTC) issued a set of regulations known as the Children’s Online Privacy Protection Rule (COPPA Rule).2 On December 19, 2012, the FTC released amendments to its COPPA Rule that became effective July 1, 2013.
The amended COPPA Rule enhances online privacy protection for children and makes digital services operators more accountable for data collection activities involving children under age 13. Notable for digital services operators is a new liability standard for third-party service providers. Specifically, effective July 1, 2013, the following changes apply:
1 15 U.S.C. §§ 6501-6508.
2 16 C.F.R. Part 312.
The operator of a “children-directed” (i.e., intended for children under age 13) online or mobile website or service is strictly liable for the actions of independent third parties—including social media plug-ins—on or through its website or mobile service if the third party is acting as its agent or service provider, or if the operator benefits from allowing the third party’s information collection.
A software plug-in, ad network or similar party that collects information on or through a third-party’s online or mobile website or service now is liable under COPPA if that party has actual knowledge that it is collecting personal information on a children-directed platform.
The amended COPPA Rule makes several other key changes to the old COPPA Rule, including the following:
An expanded definition of personal information to include geo-location information, a child’s photo or audio or video file, screen or user names and persistent identifiers, such as information held in a cookie, an IP address or a mobile device ID number, that can be used to identify an individual consumer over time and across different websites or online services
Further clarification about the test for determining whether an online service is children-directed (which remains a highly fact-specific inquiry that depends on the totality of the circumstances)
The addition of an age-screening safe harbor for online services that fit the children-directed criteria but do not target children as their primary audience
Streamlined disclosure requirements for parental notification and privacy statement regarding information practices with respect to children
Expanded acceptable methods for obtaining verified parental consent.
To ensure compliance with the amended COPPA Rule, digital services operators must evaluate their data collection activities with respect to children on their own digital services as well as third-party digital services to ensure that disclosures about data collection from children are accurate and up-to-date. Because operators of digital services directed to children are strictly liable for the actions of their third-party service providers, operators also should consider checking their services agreements to make sure that service providers’ compliance with the amended COPPA Rule (and data privacy and security laws in general) is covered by existing provisions.
U.S. Department of Health and Human Services Publishes Final Rule
Daniel F. Gottlieb and Edward G. Zacharias
HEIGHTENED PENALTIES; BREACH NOTIFICATION; DIRECT APPLICABILITY TO BUSINESS ASSOCIATES
On January 25, 2013, the Office for Civil Rights (OCR) of the U.S. Department of Health and Human Services (HHS) published a final rule (Final Rule) containing modifications to the privacy standards (Privacy Rule), security standards (Security Rule), interim final security breach notification standards and enforcement regulations under the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health (HITECH) Act. The Final Rule included changes required by the HITECH Act and other changes deemed appropriate by OCR in order to strengthen the privacy and security of health information. The changes affect HIPAA covered entities (which include certain health care providers, health plans and health care clearinghouses) and the business associates that provide services to them involving protected health information (PHI). Covered entities and business associates were required to comply with most requirements of the Final Rule by September 23, 2013.
Some of the most notable changes under the Final Rule include the following:
Business associates are directly liable for civil money penalties and criminal penalties for violations of the Privacy Rule and Security Rule.
The definition of business associate was expanded to include a subcontractor of a business associate so that subcontractors of a business associate are also liable for violations of the Privacy Rule and the Security Rule.
The definition of a breach of unsecured PHI was revised to make it more difficult for a covered entity or business associate to avoid reporting an unauthorized use or disclosure of PHI to the affected individuals and OCR.
Except in limited cases, a covered entity may not receive cash or other financial remuneration for marketing communications made about a third party’s products or services without the individual’s prior written authorization.
Notably, the Final Rule did not address the accounting for disclosures requirement of the HITECH Act, which OCR stated will be the subject of a future rulemaking.
For more information about the Final Rule, see McDermott’s White Paper “OCR Issues Final Modifications to the HIPAA Privacy, Security, Breach Notification and Enforcement Rules to Implement the HITECH Act.”
Since the publication of the Final Rule, OCR also has issued guidance regarding particular aspects of the Final Rule and template HIPAA documents, including guidance on the refill reminders exception to the prohibition on the use of PHI for marketing, guidance on the use and disclosure of PHI of deceased individuals, model notice of privacy practices and a model business associate agreement. It is likely that OCR will issue guidance on additional provisions of the Final Rule in the coming months.
New Rules for Telemarketing and Text Message Marketing Effective October 16, 2013
Julia B. Jacobson and Manoj Khandekar
In February 2012, the Federal Communications
Commission (FCC) adopted Report and Order 12-21 describing revisions to its telemarketing rules issued under the Telephone Consumer Protection Act, 47 U.S.C. § 227 (TCPA). Specifically, the FCC’s revised telemarketing rules apply to telephone calls to residential landline or wireless numbers, and to text messages, that are initiated for advertising purposes using an “automatic telephone dialing system” (ATDS) or an “artificial or pre-recorded” voice message.
The major changes reflected in the FCC’s revised rules are as follows:
Abandoned calls: telemarketers must ensure that no more than 3 percent of calls answered by a person are “abandoned” (i.e., not answered by the telemarketer within two seconds after the called person answers) during a 30-day calling campaign period.
Opt-out mechanism: artificial or pre-recorded telemarketing messages must include an automated, interactive mechanism that enables the called person to opt out of receiving future prerecorded messages.
Prior express written consent: “prior express written consent” (as described below) of the called person is required3 for the following:
Telemarketing calls to a wireless telephone number when an artificial or prerecorded message or ATDS is used
Telemarketing text messages sent using an ATDS
Telemarketing calls to a residential landline telephone number using an artificial or prerecorded message
“Prior express written consent” means a written agreement signed by the person called that clearly authorizes delivery of advertising or telemarketing messages using an ATDS or an artificial or prerecorded voice message. A written agreement may be “signed” electronically using any method recognized under the federal Electronic Signatures in Global and National Commerce Act (E-SIGN Act), 15 U.S.C. § 7001 et seq., or applicable state contract law. The E-SIGN Act recognizes a signature as an “electronic sound, symbol or process” that is “attached to or logically associated with” an agreement and “adopted by a person with the intent to sign” (15 U.S.C. § 7006).
3 The FCC does not require prior express written consent for non-telemarketing calls and health-care-related calls subject to HIPAA that are made to residential lines, but prior express consent is required for such calls or text messages made or sent to wireless telephone numbers.
The revised abandoned call rule became effective November 16, 2012; the opt-out mechanism rule became effective January 14, 2013; and the prior express written consent rule became effective on October 16, 2013. All calls using an artificial or pre-recorded message still must clearly state the identity and telephone number of the person/entity responsible for initiating the call, and telephone solicitations of any kind are permitted only during the hours of 8:00 am to 9:00 pm local time (for the called party).
Consents given under the old regulatory framework probably are not sufficient under the new FCC consent rule because (among other requirements) the “agreement” to which the consumer consents (i) must include a reference to the use of automated technology and (ii) “must be obtained without requiring, directly or indirectly, that the agreement be executed as a condition of purchasing any good or service.” Obtaining new opt-in consent consistent with the requirements of the new FCC consent rule is a best practice, because the sender bears the burden of proving that it obtained prior express written consent meeting the FCC standards. It is also a best practice to implement a record-keeping system that retains evidence of compliant consent for at least three years (i.e., the statute of limitations for contract claims) after the consumer opts out or after the last text message related to the consent is sent.
Update on State and Federal Enforcement Actions and Settlements
David Q. Gacioch
The past year has witnessed the continuation of a long-developing trend in U.S. privacy and data protection: increasing attention from state attorneys general and other enforcement authorities, coupled with a steady trickle of settlement announcements, but, outside the Federal Trade Commission (FTC), a dearth of government litigation over alleged compliance shortfalls when settlement cannot be reached.
More data-related incidents are being reported to
enforcement authorities as more state reporting requirements take effect and awareness of them penetrates the business and nonprofit communities. The Massachusetts Attorney General’s Office, for example, reports that it received 282 data breach notifications during the first quarter of 2013 (most recent data published), compared with 1,118 notifications in 2012 but 733 or fewer in each prior year back to 2008. Similarly, the California Attorney General’s Office received 131 data breach reports in 2012, the first year that California law required reporting, and as of the date of this publication, 2013 year-to-date breach report counts in California have surpassed that total.
Against the backdrop of such large numbers of reported breaches, 2013 saw an unremarkable number of non-FTC settlement announcements in the data privacy area. Most notably, a two-year, multi-state investigation of an information technology company concerning allegations of unauthorized collection of personal data from unsecured wireless networks resulted in a $7 million settlement in March. Led by Connecticut and Maryland, nearly 40 state attorneys general participated in the investigation and ultimate deal. The settling company denied any wrongdoing, explaining that the data was gathered by mistake and never examined. It also agreed to destroy all information that had been inadvertently collected, to drive a U.S.-wide consumer education campaign on data privacy and wireless networking security, and to establish additional internal data privacy controls and training for its employees. A parallel probe by the U.S. Department of Justice was dropped in 2011, but the Federal Communications Commission imposed a related $25,000 fine in 2012, while stopping short of calling the company’s data gathering practices illegal. The same company announced a $17 million settlement with 38 state attorneys general resolving allegations of improper practices with respect to tracking cookies.
Outside of that case, there were only a handful of breach-related settlements at the state level. For example, in July 2013, New Jersey regulators announced a $1 million consent order with PulsePoint Inc. resolving an administrative enforcement action under the New Jersey Consumer Fraud Act centering on display advertisement coding that purportedly would allow tracking of users’ online activity even if they had settings in place to block tracking cookies from websites they had not visited. PulsePoint did not admit any wrongdoing but agreed to discontinue the challenged practice, to post privacy practices and opt-out information on its website, and to submit to independent monitoring for five years (at its expense) in addition to the monetary payment.
In late 2013, the New Jersey Attorney General announced three settlements that ranged in amount from $25,000 to $1 million, each coupled with policy/procedure/disclosure enhancements, and resolved allegations of unauthorized collection of children’s personal information by mobile applications, improper “history sniffing” of website users’ browsing activity and outright hijacking of customers’ computers for use as a “botnet” in the theft of online currency (bitcoins).
In August 2013, one national bank and the Connecticut Attorney General’s Office reached a settlement under which the bank agreed to pay $55,000 and subject its online account access portal to a third-party security audit. The settlement resolved allegations that the bank had not done enough to ensure the security of personal information of those online banking customers who were affected by a 2011 breach. The bank denied any wrongdoing as part of the settlement. Of the 360,000 customers purportedly affected, 5,066 were Connecticut residents. The Connecticut and California Attorney General’s Offices conducted the investigation jointly, but no parallel settlement between the bank and California has been announced.
Striking a similar theme, in September 2013, the Vermont Attorney General’s Office announced a settlement with Natural Provisions Inc. over allegations that the store had moved too slowly in notifying affected customers and tightening the security on its payment card IT systems in the wake of a 2012 breach. The settlement involved a $15,000 civil penalty, plus another $15,000 in required spending by Natural Provisions to upgrade its IT security, as well as a requirement that Natural Provisions promptly implement a written information security program with certain specified components to prevent recurrence of the breach.
Beyond the breach report counts and publicized settlements, some states have announced larger resource commitments to privacy and data protection in recent months. The California Attorney General’s Office launched its privacy enforcement unit in mid-2012, and it included six assistant attorneys general plus support professionals by mid-2013. Statements made by Attorney General’s Office representatives in May 2013 (surrounding the attorney general’s defeat
at the hands of Delta Airlines) suggested that more enforcement actions would be coming soon, although none have yet occurred.
Maryland’s Attorney General followed California’s lead in January 2013, launching its own dedicated Internet Privacy Unit. In May, that office, joined by the Connecticut Attorney General’s Office, publicly sent a joint letter to LivingSocial, Inc., seeking answers to 15 follow-up questions about a cyber-attack-triggered breach the company had publicly disclosed a week earlier that affected up to 50 million customers. The joint investigation has been out of the news since May 2013, suggesting that it either remains ongoing or has been dropped.
The vast majority of reported data breaches in 2013 resulted in no settlement or enforcement action. Such “no action” resolutions typically occur quietly, but sometimes are made public. For example, in July 2013, the Missouri Attorney General’s Office publicly cleared Schnucks Inc. of any wrongdoing surrounding a data breach potentially affecting 2.4 million customers that the company had publicly disclosed in March. Notably, Schnucks disclosed the breach to the Attorney General’s Office early and worked cooperatively with the investigation, resulting in the Attorney General’s Office characterizing Schnucks as a “victim” of potential criminal wrongdoing by others instead of a bad actor itself.
As illustrated above, in 2013 the United States witnessed a growing number of breach reports leading to a small number of settlements plus an increasing amount of private putative class litigation, as in previous years. However, late in 2012, the California Attorney General’s Office bucked the “settle or walk away” pattern of non-FTC enforcement by filing suit against Delta Airlines in San Francisco Superior Court over the purported lack of privacy protections surrounding its Fly Delta mobile application—the first such suit the office had brought under the state’s nine-year-old online privacy law. In May 2013, the court dismissed the complaint in its entirety on grounds that the federal Airline Deregulation Act of 1978 pre-empted the state law as it applied to airlines. While obviously a blow to the California Attorney General’s Office’s new privacy enforcement unit, the decision did not address the merits of the underlying claims, leaving open the possibility of pursuing similar claims against companies without federal pre-emption defenses.
SPECIAL FOCUS ON ENFORCEMENT IN THE HEALTH CARE INDUSTRY
In contrast to the increasing number of breaches reported at the state level, Health Insurance Portability and Accountability Act of 1996 (HIPAA) breach cases have plateaued between 8,000 and 9,500 per year from 2008 through 2012, according to the U.S. Department of Health and Human Services Office for Civil Rights (OCR). That said, the September 2013 effective date of the HIPAA Omnibus Rule, with its tougher standards for risk of harm assessments, is likely to cause those figures to start rising again in the fourth quarter of 2013 and into 2014. OCR also has entered into several HIPAA-related resolution agreements since late 2012:
A December 2012 settlement with Hospice of North Idaho resolved OCR’s investigation into a laptop-theft-generated breach of the Security Rule that the covered entity had reported in early 2011. OCR alleged that its investigation revealed that the provider’s risk analyses and policies were insufficient to safeguard such electronic protected health information (ePHI). Terms included a $50,000 payment and a two-year corrective action plan, but no outside monitor. OCR highlighted the fact that this represented the first settlement triggered by a small (i.e., affecting fewer than 500 individuals) HIPAA breach.
The Massachusetts Attorney General’s Office rang in 2013 by announcing a $140,000 settlement with the former owners of a medical billing practice and four pathology groups the practice had served. The settlement resolved allegations that unredacted paper pathology medical records and billing information for more than 67,000 patients—including names, Social Security numbers and medical diagnoses—had been improperly disposed of at an unsecured waste transfer station and later found by a Boston Globe photographer. The attorney general claimed that this improper disposal violated both HIPAA and state law on security and proper disposal of confidential personal information.
In May 2013, OCR reached a settlement with Idaho State University (ISU) concerning a HIPAA Security Rule breach reported in mid-2011 affecting the ePHI of 17,500 individuals/patients at an ISU clinic. OCR alleged that ISU had left
the patients’ records unsecured by disabling its servers’ firewall protections, among other things. Settlement terms included a $400,000 payment and a two-year corrective action plan requiring annual written reporting back to OCR, but no outside monitor.
A June 2013 settlement with Shasta Regional Medical Center and 15 related entities followed a Privacy Rule investigation into impermissible disclosure of protected health information to the news media on multiple occasions, triggered by an early 2012 Los Angeles Times story. Terms included a $275,000 payment and a one-year corrective action plan, with required reporting to OCR but no outside monitor.
A July 2013 settlement with WellPoint, Inc., resolved allegations of insufficient administrative and technical safeguards on access to ePHI following a mid-2010 breach notification. The breach affected 612,402 individuals. OCR alleged that WellPoint had failed to adequately secure and control access to an online database. Settlement terms included a $1.7 million payment, but no corrective action plan or monitor.
An August 2013 settlement with Affinity Health Plan, Inc., resolved mid-2010 allegations that Affinity had failed to properly erase photocopier hard drives containing the protected health information of 344,579 individuals before returning the copiers to its leasing agent. Terms included a $1.2 million payment and a 120-day corrective action plan, without outside monitoring.
As mentioned previously, the HIPAA Omnibus Rule, which provides for enforcement actions against both covered entities and business associates, likely will increase the pace of such resolutions. However, in keeping with what we have seen from state attorneys general, OCR’s February 2011 imposition of $4.3 million in civil monetary penalties against Cignet Health for failing to provide patients with copies of their medical records and failing to cooperate with a resulting OCR subpoena remains the only time OCR has taken formal enforcement action for a HIPAA violation outside of a resolution agreement.
The non-FTC enforcement landscape of 2013 underscores the importance of best practices in the area of privacy and data protection:
Commit the time and resources before a breach occurs to ensure compliance with applicable state and federal law, as well as industry standard practices.
If a breach does occur, act quickly to gather basic information and make required notifications.
Engaging enforcement authorities openly and cooperatively does not mean one should avoid taking firm stands on legal and factual issues where such stands are justified. To the contrary, outside of the FTC, authorities have shown little willingness to proceed with disputed enforcement litigation, and California’s negative outcome in the Delta Airlines matter may enhance that reluctance in the near term.
U.S. Data Privacy Litigation Trends
Anthony A. Bongiorno, Bridget K. O’Connell and Julia B. Jacobson
ARBITRATION CLAUSES MAY LIMIT CLASS ACTIONS IN DATA PRIVACY LITIGATION
In two 2013 decisions, the Supreme Court of the United States showed continued support for arbitration as a dispute resolution mechanism by focusing on class action waivers in agreements to arbitrate. These two decisions provide helpful guidance for businesses seeking ways to avoid class action litigation.
In Oxford Health Plans LLC v. Sutter (June 10, 2013), the Supreme Court unanimously affirmed an arbitrator’s conclusion that an arbitration clause authorized class-wide arbitration, despite the lack of any clear language to that effect in the clause. This decision is significant because a typical arbitration clause does not expressly authorize class arbitrations. Following Oxford, if an arbitration clause is silent regarding class arbitration, an arbitrator may be able to interpret the clause to determine that the parties agreed to class arbitration. In American Express Co. v. Italian Colors Restaurant (June 20, 2013), the Supreme Court’s 5–3 decision held that a contractual waiver of class arbitration is enforceable under the Federal Arbitration Act (FAA), even if the cost
of proving an individual claim in arbitration exceeds the potential recovery and plaintiffs’ ability to assert claims to protect their interests is more limited.
Viewed together, these two decisions indicate that an express waiver of class arbitration in an otherwise valid customer agreement is likely enforceable. This is particularly good news in the data privacy area, which has seen a significant uptick in class action litigation under the Telephone Consumer Protection Act (TCPA), Wiretap Act, Stored Communications Act and other data-privacy-related laws. Many large companies already have added arbitration provisions with class action waivers to their customer agreements, and many more businesses likely will follow suit during 2014.
INSURANCE COMPANIES SEEK TO AVOID COVERAGE IN PRIVACY LITIGATION SETTLEMENTS
Insurance companies have had mixed results in trying to convince courts that they are not liable for amounts that their policyholders have agreed to pay to settle several data-privacy-related class action lawsuits in the past year. Insurance companies have been largely unsuccessful in denying coverage to their policyholders where the policyholder’s alleged violation is a recognized tort or clearly prohibited by statute.
For example, in Maxum Indemnity Co. and Security Ins. Co. of Hartford v. Eclipse Manufacturing Co., et al., a federal court in the Northern District of Illinois rejected the insurance companies’ argument that they were not liable for the settlement agreement their policyholder had reached with class action plaintiffs who, in the underlying litigation, accused the insured of sending blast faxes to businesses in violation of the TCPA. Because the Illinois Supreme Court had recently ruled that TCPA damages are compensatory, not punitive, the court in Maxum found that the settlement agreement regarding TCPA claims was squarely within the insurers’ scope of coverage.
Similarly, in Hartford Casualty Ins. Co. v. Corcino & Associates, et al., the U.S. District Court for the Central District of California ruled in favor of the defendant policyholder, finding that the plaintiff insurance company could not decline coverage for two class action litigations seeking $20 million from the policyholder as a result of a data breach. Plaintiffs in the underlying class actions allege that the policyholder posted sensitive medical information online in violation of the California Confidentiality of Medical Information Act (CMIA). Although the insurance policy had an exclusion barring coverage for injury arising out of statutory violations, the court ruled that the statutory rights exclusion did not apply because the rights at issue in the underlying lawsuits were not created by CMIA, but instead were encompassed by a common law right to privacy that had long been recognized by California courts.
The decisions of the past year do not uniformly benefit policyholders. Insurance companies have been successful in arguing that they do not have to provide coverage for data privacy class action settlements where the language of the policy excludes coverage or where the entity that settles with the plaintiffs is not clearly listed as a policyholder.
In MDC Acquisition Co., et al. v. Travelers Property & Casualty Co. of America, the U.S. Court of Appeals for the Sixth Circuit rejected the policyholders’ demands that Travelers cover a settlement payment of $6 million to plaintiffs in an underlying TCPA fax-blast class action. The court found that Travelers had amended the policyholders’ policies to exclude coverage for unsolicited communications when they renewed their policies and had provided adequate notice of the amendment. Thus, the policyholders could not turn to Travelers to pay for the settlement of alleged TCPA violations.
In Axis Surplus Insurance Co. v. St. Paul Fire & Marine Insurance Co., et al., the U.S. District Court for the Western District of Washington declined to reform an insurance policy issued to the founder of a venture capital firm to extend coverage to his company, which settled an underlying class action over automated marketing calls to consumers’ cell phones without their consent.
These cases emphasize that companies must closely review their insurance policies to determine whether coverage for data privacy claims exists and to ensure that the proper entities are listed as policyholders on any insurance policy. For more information, please see “Privacy and Cyber Liability Insurance” on page 27.
OUTCOME OF GENESCO INC. V. VISA U.S.A. INC., ET AL., WILL AFFECT LITIGATION AGAINST PAYMENT NETWORKS
Genesco, a sports apparel retailer, is fighting back against Visa over fines and reimbursement amounts paid to customers in the wake of a breach of Genesco customer data.
Like many retailers, Genesco accepts credit cards as a form of payment from customers. To process credit card payments, retailers have contracts with banks to process the transactions. These banks in turn have contracts with the payment networks whose cards the retailers accept, such as Visa. Between December 2009 and December 2010, hackers accessed Genesco’s computer network and lifted customer credit card information as Genesco transmitted the data to two banks that processed its Visa transactions. After the cyber-attack was discovered, Visa fined the banks $5,000 each, alleging that the Genesco transactions processed through the banks breached portions of the banks’ contracts with Visa, including the Payment Card Industry Data Security Standard (PCI-DSS). Visa also assessed the banks for more than $13 million that it reimbursed to cardholders as a result of fraudulent activity on their cards after the data breach.
Pursuant to its contracts with the banks, Genesco indemnified the banks for the PCI-DSS fines and reimbursement amounts. As the assignee and subrogee of the two banks, Genesco subsequently filed suit against Visa in the U.S. District Court for the Middle District of Tennessee to recover the fines and assessment amounts charged by Visa, alleging common law causes of action as well as statutory violations of the California Unfair Competition Act. In essence, Genesco claimed that not all of the cardholder accounts Visa claimed were compromised were actually affected, and that the fines and assessments Visa imposed lacked a factual basis. Genesco also claimed that the PCI-DSS fines Visa charged are penalties and thus unenforceable under its contracts with the banks, or alternatively unenforceable under the California Unfair Competition Act.
In May 2013, Visa filed a motion to dismiss Genesco’s California Unfair Competition Act and restitution claims. In July, the court denied Visa’s motion to dismiss, finding that Genesco’s allegations “create a controversy that allegedly impacts the operation of the Visa card payment system and implicates consumers, merchants and other banks” affected by the cyber-attack on Genesco’s network. Genesco’s allegation that Visa lacked factual evidence of harm before imposing fines and assessments was sufficient to allege an unfair or unlawful business practice in violation of the California Unfair Competition Act.
In August 2013, Genesco filed a motion for summary judgment, claiming that as a matter of law, the $5,000 PCI-DSS fines Visa levied against the banks were in breach of contract and/or violations of the California Unfair Competition Act. The court has not ruled on Genesco’s motion.
While the outcome of the Genesco litigation is not yet settled, the case demonstrates the possibility that retailers may be able to bring claims based on state unfair competition laws against payment networks. This opens up payment networks to a potentially new line of litigation directly from retailers and calls into question the ability of payment networks to unilaterally levy fines for alleged PCI-DSS non-compliance.
Video Privacy Protection Act: Ongoing Litigation and Legal Developments
Anthony A. Bongiorno and Matthew R. Turnell
The Video Privacy Protection Act (VPPA), 18 U.S.C. § 2710, has been heavily litigated recently as courts determine how to apply the 25-year-old law to modern methods of video delivery, such as online streaming. The VPPA was enacted in 1988 to protect individuals’ privacy in their selection of the videotapes that they rented or purchased. It prohibits the disclosure of information that identifies a person as having requested or obtained specific video materials or services, and also bars the provider from retaining that information for more than one year.
HULU LITIGATION CONTINUES
In 2012, the application of the VPPA to internet distribution was litigated in In re Hulu Privacy Litigation in the Northern District of California. Hulu was alleged to have violated the VPPA by providing personally identifiable information to its third-party vendors and to social networks. Hulu moved to dismiss, arguing that because it only provided video content online, it was not a “video tape service provider,” defined by the statute as someone engaged in the business of renting, selling or delivering “prerecorded video cassette tapes or similar audio visual materials.” The court disagreed, finding that Hulu’s online distribution fell within “similar audio visual materials.”
With the applicability of the VPPA to internet distribution now established, Hulu has moved for summary judgment on other grounds. The court’s decision will have implications for how tightly other online content providers must restrict distribution of their customers’ viewing selections. First, Hulu argued that its use of numerical user IDs, rather than subscribers’ real names, when disclosing information to Hulu’s third-party providers means that there was no VPPA violation. According to Hulu, it never discloses both the subscriber’s name and the videos that subscriber watched to the same third-party vendor. This argument will likely turn on whether the possibility that the recipient of such anonymized data could “reverse-engineer” the consumer’s actual identity is sufficient to create liability under the VPPA. Second, Hulu argued that its disclosures to social networks were the result of specific requests by the customer. Moreover, Hulu asserted that its postings on subscribers’ Facebook accounts were actually made to the customer, not to a third party, and were thus specifically permitted under the VPPA.
Finally, Hulu argued that the suit must fail because the plaintiffs sustained no actual injury. The VPPA, Hulu points out, provides for a damage award only for a person “aggrieved” by a violation of the statute. As none of the plaintiffs could identify any actual injury, Hulu argued that they could not recover under the VPPA. This last argument, if accepted by the court, has the potential to severely limit suits under the VPPA. In most cases it will be difficult for a customer to allege or prove any damages arising from the type of disclosures alleged in the Hulu case, especially on a class-wide basis.
VPPA’S CONSENT REQUIREMENT RELAXED
Prior to this year, the VPPA’s provision allowing disclosure following the customer’s consent was extremely narrow. Under the law, the provider was required to obtain “the informed, written consent of the consumer given at the time the disclosure is sought.” This restriction limited the flexibility of providers to share customers’ viewing histories, even where the customers wanted the provider to make that information available.
This year’s amendment to the VPPA, signed into law in January 2013, significantly relaxes the consent requirement. Under the new provision, a consumer may consent in advance to disclosure for a period of up to two years. The statute also prescribes precise requirements for how providers can obtain consent to disclose. The consent form must be distinct and separate from any other form. The consumer also must be given a “clear and conspicuous” opportunity to withdraw consent on a case-by-case basis or on an ongoing basis. The change in the law has allowed social media users to share their online viewing selections on an ongoing basis.
In light of these changes, fewer cases have been filed in recent months against companies under the VPPA as businesses take steps to obtain advance consent and otherwise meet the requirements of the amended law. However, there are existing state laws that do not have the relaxed consent requirement. As these and other states’ laws develop, and as new technologies emerge in the online video space, a broader swath of companies likely will become subject to these laws and the VPPA, and this area may well be ripe for creative plaintiffs’ counsel to file claims in 2014.
Electronic Communications Privacy Act Update: Wiretap Act and Cookies
Jason Crow and Heather Egan Sussman
Anyone who has shopped online for an item and who then has seen an advertisement for that very item appear in a banner advertisement likely has experienced what is known as “behavioral tracking” or “online tracking.” Online tracking is facilitated by the use of tracking technologies such as “cookies” that are downloaded to a user’s computer or mobile device with or without that person’s knowledge. These cookies then record and broadcast information about that user’s browsing habits to the company that placed the tracker, which can then lead to the delivery of targeted ads deemed to be relevant to that particular user.
Cookies have become a key part of the nearly $40 billion online advertising industry that supports the $260 billion online shopping industry and that helps to ensure the continued availability of the free internet. Not surprisingly, cookies also have become a focal point of plaintiff class action lawsuits in the United States. These lawsuits are commonly based on violations of the Electronic Communications Privacy Act (ECPA), which encompasses two well-known provisions, the Wiretap Act, 18 U.S.C. § 2511 et seq., and the Stored Communications Act, 18 U.S.C. § 2701 et seq. The ECPA and many analogous state statutes contain damages provisions that can mean substantial litigation exposure for companies that participate in the online tracking ecosystem.
WIRETAP ACT FUNDAMENTALS
The Wiretap Act prohibits the intentional interception, use or disclosure of wire and electronic communications unless a statutory exception applies. Class action plaintiffs have generally argued that cookies violate the Wiretap Act by effectively hacking into users’ machines and broadcasting the users’ browsing habits to a third party without the users’ consent. Web companies, on the other hand, have sought safe harbor under the Wiretap Act’s “ordinary course of business” exception and the user consent defense.
The Wiretap Act’s “ordinary course of business” exception permits the interception of communications by a “provider of wire or electronic communication service in the ordinary course of its business.” Web companies argue that targeted ads are a key component of keeping many web-based services, such as search engines and e-mail, free, and therefore fall within this “ordinary course of business” exception.
PLAINTIFFS CONTINUE TO ASSERT ECPA VIOLATIONS FOR ONLINE TRACKING
Cookie Placement Consumer Privacy Litigation
Relying on earlier decisions, the court noted that personal identifying information that is automatically generated by an electronic communication, such as records identifying users’ e-mail addresses, IP addresses, log-in times and screen names, is not “contents” of the electronic communication. Similarly, the court declined to find that uniform resource locators (URLs) are “contents” under the Wiretap Act. Although a URL might provide a description of the content of what a user is viewing—e.g., www.depression.com—the court determined that a URL does not concern the “substance, purport, or meaning” of an electronic communication as required for liability under the Wiretap Act.
Similarly, the court found that the plaintiffs failed to state a claim against the defendants under the Stored Communications Act (SCA). The SCA creates liability
for anyone who accesses a facility through which an electronic communication service is provided without authorization and obtains, alters or prevents access to an electronic communication while it is stored in that facility. The court declined to accept plaintiffs’ novel argument that their personal computers were “facilities” through which electronic communication services are provided and that defendants violated the SCA by placing cookies on their computers. The court determined that, even accepting that the defendants placed cookies in the random access memory of plaintiffs’ computers, the SCA claim failed because the computers were not “facilities” under the statute.
The “Ordinary Course of Business” Defense
In a September 2013 decision, the U.S. District Court for the Northern District of California denied a motion to dismiss a complaint filed against a popular e-mail service provider alleging that its e-mail service violated the federal Wiretap Act when it allegedly intercepted the contents of e-mails sent by and to its users. The plaintiffs alleged that the service provider scanned outgoing and incoming e-mails for keywords and then created user profiles and served targeted ads to the user based on those keywords. The court found that the Wiretap Act claim could proceed, rejecting the service provider’s argument that the Wiretap Act’s “ordinary course of business” and user consent defense applied to the company’s scanning of the contents of e-mails.
The service provider argued that the “ordinary course of business” exception offered protection because e-mail scanning and the resulting targeted ads facilitated the transmission of the communications at issue by keeping the e-mail service free. The court disagreed, stating that “the [ordinary course of business] exception offers protection from liability only where an electronic communication service provider’s interception facilitates the transmission of the communication at issue or is incidental to the transmission of such communication” (emphasis added). The court found that the service provider’s scanning of e-mail was “not an instrumental component of [the service provider’s] operation of a functioning email system,” and thus the “ordinary course of business” exception was not a basis for dismissing plaintiffs’ claims.
The service provider also argued that its terms of service and privacy policies put plaintiffs on notice that the company was scanning their e-mails, such that they consented to the practice. For example, the service provider disclosed to users that it could “filter” content and use “information [users] provide” to “display customized content and advertising.” The service provider also warned users that “advertisements may be targeted to the content of information stored on the Services.” The court found that the former disclosure did not “explicitly notify users that [the company] would intercept users’ emails for the purposes of creating user profiles or providing targeted marketing,” and thus users could not consent to such practices. The court held that the latter disclosure “demonstrates only that [the company] has the capacity to intercept communications, not that it will.”
ISPs Avoid Liability Under the Electronic Communications Privacy Act in Deep Packet Inspection Cases
Bridget K. O’Connell and Anthony A. Bongiorno
Two recent cases have denied claims by plaintiffs that internet service providers (ISPs) should be liable for allowing third parties to access ISP customer user information through so-called “deep packet inspection.” Deep packet inspection refers to the process of examining individual data packets within a broadband transmission in order to identify information about a particular user’s internet habits, such as websites visited, and deliver targeted advertising based on those habits.
In December 2012, the U.S. Court of Appeals for the Tenth Circuit affirmed the trial court’s grant of summary judgment in favor of the defendant in Kirch v. Embarq Management Co. The defendant, an ISP, had allowed
NebuAd, a third-party online advertising company, to collect and analyze certain customer information relayed through Embarq’s servers, using deep packet inspection to direct targeted ads to those customers. The plaintiffs claimed that Embarq had violated the ECPA by unlawfully intercepting their electronic communications. The court found that there was no “interception” under the statute, because Embarq’s access to the plaintiffs’ data was in the ordinary course of its business as an ISP—it had no greater or different access to the data because it allowed a third party to access the information. Importantly, the court also reaffirmed earlier decisions finding that there is no aider or abettor liability under the ECPA. Thus, an ISP would not be liable under the ECPA merely because it allowed a third-party entity access to user information through deep packet inspection.
In September 2013, a federal district court in Illinois reached the same conclusion in Valentine v. WideOpenWest Finance, LLC. WideOpenWest (WOW), an ISP in Illinois, contracted with NebuAd (the same third-party online advertising company as in Kirch) and allowed NebuAd to access WOW users’ internet communications so that NebuAd could analyze the data and deliver targeted ads to those customers. In response to plaintiffs’ motion for reconsideration, the court reaffirmed its dismissal of the plaintiffs’ ECPA violation claims with prejudice. Because the plaintiffs failed to allege that WOW ever acquired or accessed plaintiffs’ electronic communications, only that it allowed NebuAd to do so, the allegations were insufficient to state a claim against WOW.
ISPs that allow third parties to conduct deep packet inspection should not be subject to direct liability under the ECPA as long as the ISP’s access to user information is within the course of its normal business operations. However, ISPs may still be caught up in litigation regarding third-party use of deep packet inspection and can seek to curtail their exposure by seeking customer consent or “opt out” features before allowing third parties to intercept user data.
This Call May Be Recorded: Class Action Plaintiffs in California Continue to Target Businesses on Call Center Recording
Matthew Knowles and Anthony A. Bongiorno
It is a familiar warning: “this call may be monitored or recorded for quality assurance purposes.” Recent developments in California law make clear that this warning is an essential one for all business call centers, no matter where they are located. California is one of 12 “all party” (or “two party”) states in which it is illegal to record a call without the consent of all parties to the call. However, more than anywhere else, California’s broad data privacy laws have led aggressive class-action plaintiffs to seek substantial penalties against businesses that fail to properly warn their customers that their call center lines are recorded. Businesses are also subject to stiff penalties for violating the law: Section 637.2 of the California Penal Code establishes a civil remedy for violations of Sections 632 and 632.7, providing minimum statutory damages of $5,000 per call or treble actual damages, whichever is greater. By subpoenaing call center records, plaintiffs can easily amass large lists of potential class members and drive up potential damages to levels of great concern.
NEW CONSIDERATIONS FOR BUSINESSES
Businesses should be aware of several factors in order to avoid being an easy target for class action litigation. First, cell phones have changed the game. California has two closely related statutes relating to wiretapping. Section 632 of the California Penal Code prohibits intentional recording of a confidential telephone call without the consent of all parties. California courts have continued to interpret “confidential” quite broadly. In Mirkarimi v. Nevada Prop., the Southern District of California stated that confidentiality requires “nothing more than the existence of a reasonable expectation by one of the parties that no one is listening in or overhearing the conversation.” Section 632.7, however, is even broader: it prohibits intentional recording of any telephone call made over a cell phone or other wireless device, regardless of whether the call is confidential. Prior to the now-ubiquitous use of cell phones, businesses tried to avoid liability by arguing that certain customer service calls were not “confidential.” Now, however, plaintiffs can avoid this fact-intensive inquiry by simply pleading that their calls were made from cell phones.
Second, businesses may be exposed to liability for incoming calls no matter where their call centers are located. In the seminal case Kearney v. Salomon Smith Barney, the California Supreme Court ruled that the defendant could be held liable for recording calls made between customers in California and the company’s call center in Georgia. Recent cases such as Zephyr v. Saxon Mortgage Servs., Inc., have confirmed this principle, upholding liability for calls made from Texas to California.
Third, outgoing calls matter too. Sections 632 and 632.7 apply with equal force to outgoing calls that are recorded. Many companies struggle with the logistics of providing and documenting warning messages about recording with respect to outgoing calls. Likewise, outgoing calls with pre-recorded warning messages must be made in compliance with the TCPA, which places certain restrictions on calls placed using automatic dialers or containing pre-recorded messages. For more information, please see “New Rules for Telemarketing and Text Message Marketing Effective October 16, 2013” on page 11.
One proactive step that businesses can take is to protect themselves with a warning message. California courts have held that companies can avoid liability by including a pre-recorded message at the start of a call informing callers that the call will be recorded. It is essential to document when this warning was implemented. Likewise, some plaintiffs have raised claims alleging that callers can bypass the warning by pressing a number on the keypad before the message completes. Businesses should document that their warning messages cannot be bypassed.
Given the proliferation of customer service numbers for all sorts of businesses, it is no surprise that plaintiffs’ firms are aggressively investigating and pursuing wiretap claims under California law. By taking the steps discussed here, businesses can protect themselves from liability in California and elsewhere.
Data Breach Class Actions and the Harm Threshold: Plaintiffs Face Hurdles in Bringing Successful Data Breach Claims
Jason Crow, Heather Egan Sussman and Anthony A. Bongiorno
DATA BREACHES ARE INCREASING, AND INCREASINGLY VISIBLE
Large-scale data breaches continue to grab headlines in 2013. In April, hackers breached the daily-deal website LivingSocial, stealing 50 million customers’ names, e-mail addresses, dates of birth and salted passwords. Evernote, Yahoo and Facebook experienced data breaches at the hands of hackers resulting in the exposure of 50 million, 20 million and six million records, respectively. In July, federal prosecutors indicted five hackers in what they are calling the largest known financial data breach in history. The hackers were charged with breaching several major corporate networks, including 7-Eleven and Carrefour S.A. (a French multinational retailer), and stealing more than 160 million credit card numbers that resulted in hundreds of millions of dollars in losses. The Privacy Rights Clearinghouse estimated that in the first 10 months of 2013, businesses, including financial services, insurers and health care organizations, experienced 330 breaches. The Clearinghouse estimated that among government, nonprofit and educational institutions, there have been 80 breaches in 2013 involving more than 800,000 records.4
CLASS ACTION PLAINTIFFS STRUGGLE TO SHOW INJURY IN FACT
Despite the growing number of data breaches in the United States, data breach class action plaintiffs have had little success in the courtroom, in part because they continue to struggle to quantify damages and define “harm” sufficient to confer Article III standing and survive a motion to dismiss. Undaunted, creative plaintiffs continue to allege various theories of harm, including increased risk of identity theft, loss of time to monitor and fix credit, personal information as property and emotional distress.5 Federal and state courts have routinely found that these types of alleged harm are too speculative or self-inflicted to confer standing.
4 See Privacy Rights Clearinghouse, “Chronology of Data Breaches,” available at http://www.privacyrights.org/databreach.
Consistent with this trend, in February 2013 the Supreme Court in Clapper v. Amnesty International USA concluded that “actual injury” is required for a plaintiff to proceed in litigation. In Clapper, plaintiffs argued that an amendment to the Foreign Intelligence Surveillance Act of 1978 (FISA) permitting the government to intercept their foreign communications without probable cause was unconstitutional, and that taking measures to protect those communications from surveillance harmed them. The Supreme Court held that speculative injury is insufficient to create standing under Article III and further cautioned against standing based on self-inflicted injury. The Supreme Court stated that plaintiffs “cannot manufacture standing merely by inflicting harm on themselves based on their fears of hypothetical future harm that is not certainly impending,” noting that “[i]f the law were otherwise, an enterprising plaintiff would be able to secure a lower standard for Article III standing simply by making an expenditure based on a nonparanoid fear.”
Although Clapper was not a data breach case, it already has had significant ramifications in the data breach context. In September 2013, a federal district court in Illinois in In re Barnes & Noble Pin Pad Litigation dismissed a class action complaint arising from a credit card “skimming” attack (i.e., the theft of credit card information in an otherwise legitimate transaction) suffered by Barnes & Noble in 2012. Relying on the Supreme Court’s decision in Clapper, the court granted Barnes & Noble’s motion to dismiss, holding that the injuries asserted by plaintiffs—increased risk of identity theft, out-of-pocket expenses related to the purchase of identity theft protection—were insufficient to confer standing because plaintiffs failed to allege an “injury in fact” that is “certainly impending.”
CLASS ACTION PLAINTIFFS MUST OVERCOME INDIVIDUALIZED ISSUES TO BE CERTIFIED
Data breach class action plaintiffs likewise continue to struggle to receive class certification because of the
5 In addition to common law theories, plaintiffs have sued on state and federal privacy and consumer protection laws that include damages provisions. Federal statutes such as the Stored Communications Act, the Wiretap Act, the Telephone Consumer Protection Act and the Video Privacy Protection Act, as well as many state laws, include statutory damages for each violation.
presence of individualized damages issues. Emblematic of this trend is the March 2013 class certification decision in Anderson v. Hannaford Bros., in which a federal district court in Maine denied plaintiffs’ motion to certify a class, holding that proving mitigation damages—credit card replacement costs and identity theft monitoring—for members of the class required highly individualized determinations that could not be tried through proof common to the class as a whole.
Also in March 2013, the Supreme Court in Comcast Corp. v. Behrend continued its trend of requiring a more demanding standard for obtaining class certification. The Supreme Court reversed a grant of class certification in an antitrust class action because plaintiffs had failed to meet Fed. R. Civ. P. 23(b)(3)’s predominance requirement. The Supreme Court stated that a “rigorous analysis” of the plaintiffs’ damages model must be conducted and, after conducting such an analysis, found that plaintiffs’ damages model was inconsistent with their theory of antitrust liability and inadequate to establish damages on a class-wide basis. While commentators have speculated that Comcast Corp. will affect class certification in data breach litigation, its applicability in the data privacy context already has been questioned by at least one federal district court.6
CLASS ACTION PLAINTIFFS MAY SUCCEED IN FAVORABLE SETTLEMENTS FOR DATA BREACHES
Despite these standing and class certification hurdles, data breach plaintiffs had at least one success in 2013. In October, a federal district court in Florida granted preliminary approval of a $3 million class settlement in Resnick v. AvMed Inc. In 2009, AvMed suffered a data breach when two company laptops containing personal information for 1.2 million customers were stolen; only some of those customers suffered identity theft as a result. The class action settlement not only returns funds to those who suffered identity theft, but also requires AvMed to pay $10 to
6 In October 2013, a district court in Illinois granted class certification in Harris v. comScore, Inc., one of the largest data privacy class action certifications to date. Class action plaintiffs claimed that comScore, an online data research company, unlawfully collected data about their activities on the internet, analyzed that data and sold it to third parties. In a footnote, the court stated, “[t]he Supreme Court’s holding [in Comcast Corp.] came from its assumption . . . that Rule 23(b)(3) requires that damages must be measurable based on a common methodology applicable to the entire class in antitrust cases. That assumption, even assuming it is applicable to privacy class actions in some way, is merely dicta and does not bind this court.”
customers implicated in the data breach for each year they bought health insurance from the company, up to $30. The settlement also requires AvMed to take several steps to improve its data security and implement mandatory employee privacy-training programs. This settlement may serve as a template for future data breach class action plaintiffs, at least where there is some “actual injury,” such as identity theft, affecting a portion of the class.
BEST PRACTICES TO AVOID DATA BREACH LIABILITY
With the number and scope of data breaches on the rise, companies that collect personal information are vulnerable both to the hackers that infiltrate company networks and to the class action plaintiffs that seek remedies for these breaches. Such companies should take a proactive approach to their data security policies, practices and procedures. To mitigate risk in this area, companies should consider stepping up compliance efforts and ensuring they follow their own data security policies; reviewing their information governance policies; obtaining an appropriate insurance policy that will cover litigation expenses resulting from a data breach; ensuring a robust data-related vendor management program; and setting up a comprehensive data breach response plan.
Point-of-Sale Zip Code Collection Litigation Expands
Matthew R. Turnell and Anthony A. Bongiorno
The past year saw an expansion of class action litigation challenging retailers’ point-of-sale collection of customers’ ZIP codes. These suits are based on state laws originally enacted in the 1980s and 1990s that typically prohibit merchants from requesting that a customer provide personal identification information as a condition of accepting payment by credit card. In 2011, the California Supreme Court held in Pineda v. Williams-Sonoma Stores that ZIP codes constitute “personal identification information” under California’s Song-Beverly Credit Card Act. Since the Pineda decision, ZIP code litigation has spread across the United States, with litigation now ongoing in Massachusetts and the District of Columbia, and suits in several other states possible.
Earlier this year, Massachusetts’s highest court gave the green light to ZIP code collection suits in Tyler v. Michaels Stores. The plaintiff in that case brought suit in federal court just months after the Pineda decision, alleging that Michaels Stores’ collection of customers’ ZIP codes during credit card transactions violated Massachusetts law. The case reached the Massachusetts Supreme Judicial Court on questions certified from the federal district court. As in Pineda, the Massachusetts court held that ZIP codes are personal identification information and, therefore, that collection of ZIP codes during credit card transactions may violate the Massachusetts statute. In the wake of the Tyler decision, several putative class actions have been filed against Massachusetts retailers.
The most recent front in the ZIP code litigation is the District of Columbia. In June 2013, plaintiffs filed suit against Urban Outfitters, alleging that the company’s collection of ZIP codes during in-store credit card transactions violated D.C. law. Urban Outfitters has moved to dismiss on several grounds, including that the collection of ZIP codes is not prohibited by the D.C. statute. Several other states have similar prohibitions on the collection of customer information during credit card transactions, meaning that this litigation could spread further.
Even as ZIP code litigation expands geographically, courts are limiting the circumstances in which these claims may be brought. Earlier in 2013, a California state appeals court ruled that the collection of ZIP codes during pay-at-the-pump transactions did not violate California law. The decision, Flores v. Chevron U.S.A., held that such collection, which was aimed at deterring theft and fraud, fell within the statutory exception for information collection that is “required for a special purpose incidental but related to the individual credit card transaction.” (While the case was pending, the California legislature specifically amended the statute to exclude information collected at the gas pump to prevent fraud, theft or identity theft.) A federal district court in California recently denied class certification in a suit alleging unlawful point-of-sale ZIP code collection. In Gossoo v. Microsoft, Microsoft defeated class certification by arguing that its sales representatives did not follow a script during their interactions with customers. Because each sales interaction was different, Microsoft argued, the court would be unable to determine on a class-wide basis whether a reasonable consumer would have believed that providing a ZIP code was a condition of paying by credit card.
The recent decisions in California and Massachusetts and newly filed litigation in Washington, D.C., likely will drive a U.S.-wide expansion of ZIP code collection suits. In addition, the prevalence of retailers engaging in this type of information collection likely will provide plaintiffs’ lawyers with a target-rich environment in 2014.
U.S. Safe Harbor Program Developments
Ann I. Killilea and Heather Egan Sussman
One of the biggest pieces of news to rock the privacy and data security world came in May 2013, when the now-infamous former federal contractor Edward Snowden leaked to the press information about the existence and scope of certain surveillance activities of the U.S. National Security Agency (NSA). Since that time, data protection officials from Europe and beyond have seized those headlines to argue that the United States does not sufficiently protect the privacy of personal information, and some have called for suspension of the U.S./EU/Swiss Safe Harbor Program. When pressed, however, most data protection officials will acknowledge that many governments around the world operate similar surveillance programs.
And while legal scholars have written volumes documenting that the U.S. government has in place the best system of checks and balances to ensure privacy protections within the framework of its surveillance programs, these carefully researched pieces receive little or no attention, as the mainstream media seem to prefer to cover salacious spying scandals and hostile reactions from European data protection officials. These headlines have had a devastating impact on U.S. businesses. Forrester Research recently calculated that the fallout from the Snowden leaks could cost U.S. businesses more than $180 billion by 2016.
Many of the complaints about the Safe Harbor Program have been vague, however, and seem to be bolstered more by media hype than actual fact. For example, Federal Trade Commission (FTC) Commissioner Maureen Ohlhausen remarked during an October 2013 privacy conference that she had attended the 35th Annual Conference of Data Protection and Privacy Commissioners in Warsaw, Poland, in late September 2013. She said that while some representatives of the various Data Protection Authorities (DPAs) expressed their serious concern, none articulated exactly how U.S. surveillance activities undermine the effectiveness of the Safe Harbor Program. In contrast to these stated concerns, other DPA officials have made clear that they believe the U.S./EU/Swiss Safe Harbor Program remains viable and serves an important purpose. For example, at the same October conference during which Commissioner Ohlhausen delivered her remarks, Irish Data Protection Commissioner Billy F. Hawkes stated unequivocally that he supports the continuation of the U.S./EU/Swiss Safe Harbor Program.
This support makes sense given that the overall benefits conferred by the U.S./EU/Swiss Safe Harbor Program are extraordinary. For example, the Safe Harbor Program facilitates trade between Europe and the United States by removing barriers to the exchange of data between businesses operating on either side of the Atlantic. In addition, the Safe Harbor Program has had the net effect of improving the overall privacy
compliance posture of thousands of U.S. companies. While the media recently has reported the allegation that as many as 400 companies are acting in violation of their Safe Harbor promises without agency review, some commentators suggest that complaining EU data protection authorities are throwing stones from glass houses, given their own relatively limited enforcement activities in their home territories.
In all events, since the inception of the Safe Harbor Program in 2000, more than 4,000 companies have been certified, and 70 new applications are received by the U.S. Department of Commerce each month. That means that more than 4,000 participating businesses are self-certifying on an annual basis their compliance with seven privacy principles that mirror the European Directive. Businesses that fail to live up to these public promises become subject to FTC enforcement. In fact, the FTC has brought enforcement actions against and reached consent decrees with 10 companies, including technology giants such as Facebook, for alleged violation of their Safe Harbor promises. As a result, thousands of U.S. companies have developed corporate-wide privacy compliance and data management programs, many of which were motivated to do so for the first time because of the Safe Harbor Program.
Given these benefits, it seems highly unlikely that European regulators would eliminate the Safe Harbor Program altogether without putting an alternative program in its place.
In fact, on November 27, 2013, the European Commission reinforced its commitment to the Safe Harbor Program as a legitimate means of enabling cross-border data transfers from the European Union to the United States. Despite pressure from the European Parliament to suspend the program, the Commission announced that it would take no such action. Instead, the Commission made 13 recommendations intended to improve the Safe Harbor Program. The Commission advises that U.S. authorities should implement these recommendations by mid-2014 and states that, if implemented, such improvements will contribute to the restoration of trust in EU-U.S. data flows. For more information, see the Communications from the Commission to the European Parliament and the Council on the Functioning of the Safe Harbour from the Perspective of EU Citizens and Companies Established in the EU, and on Rebuilding Trust in EU-US Data Flows (each separately issued on November 27, 2013).
In the meantime, while governments continue to wrangle over one another’s mutual surveillance programs, U.S.-based corporations will continue to move their privacy programs forward by self-certifying under the Safe Harbor Program. As pointed out recently by Damon Greer, former director of the U.S./EU/Swiss Safe Harbor Program, this program remains legally binding on all Member States in the European Union and the three European Economic Area countries. No individual Member State may opt out of the agreement. For more information, please see “EU Privacy Safe Harbor Still Alive and Well, With Implications for Enterprise Risk Management.”
Trade Negotiations Potentially Affecting Privacy Regulation and Data Transfers
Jay L. Eizenstat and Heather Egan Sussman
The Transatlantic Trade and Investment Partnership (T-TIP) is a comprehensive bilateral trade negotiation between the United States and the European Union, launched in 2013 following intensive, year-long senior-level consultations between the Office of the U.S. Trade Representative and the Directorate General for Trade at the European Commission, named the U.S.-EU High-Level Working Group on Jobs and Growth. The High-Level Working Group was tasked with making recommendations to U.S. and EU trade policymakers on ways the United States and the European Union could enhance their trade relationship through increased trade and investment, reduce and eliminate regulatory barriers to trade, and resolve legacy trade issues. The High-Level Working Group’s final report recommended an investment partnership agreement as the most comprehensive vehicle to achieve these goals. In March 2013, President Obama formally notified Congress of the administration’s intention to enter into negotiations with the European Union, and in August 2013 the European Commission obtained a mandate from the EU Member States. The first round of negotiations was held in July 2013 in Washington, D.C.; the second round concluded in Brussels in November; and the third round will be held in Washington, D.C., in mid-December. Similar trade negotiations between the United States and Pacific-region countries, called the Trans-Pacific Partnership, also have been underway.
A core topic for negotiation has been cross-border data flows and the impact of different regulatory regimes on the issue of data privacy. The internet economy and the electronic, cross-border transmission of personal data have led trade policymakers to introduce into free trade agreements disciplines that seek to facilitate the unimpeded flow of inbound and outbound data over the internet. However, creating trade disciplines in free trade agreements intended to provide for unimpeded cross-border data flows (such as the Trans-Pacific Partnership and the T-TIP) can clash with national data privacy and data protection requirements intended to safeguard personal information and fundamental privacy rights. In the T-TIP negotiations, the United States faces a challenge, because the European Union considers individual privacy a “fundamental right” in EU treaties. In the Trans-Pacific Partnership negotiations, Australia and New Zealand have expressed similar concerns that provisions on cross-border data flows will encroach on their national privacy laws. Therefore, creating disciplines on cross-border data flows that facilitate efficient cross-border data flows without violating core privacy rights will be a major challenge for negotiators, private-sector stakeholders and privacy rights interests in the ongoing Trans-Pacific Partnership and T-TIP negotiations.
Looking ahead to 2014, T-TIP negotiating rounds likely will be held approximately every eight weeks, although specific negotiating groups may hold intercessional meetings at various points during the year if additional time is needed. The United States and the European Union hope to conclude the negotiations by the end of 2014, when the current European Commission expires, but that is an ambitious objective considering the size and complexity of the negotiations, and the inclusion of “cutting-edge” trade issues, such as digital trade, cross-border data flows and data privacy.
President Obama Issues Executive Order Entitled “Improving Critical Infrastructure Cybersecurity”
David D. Ransom and Ann Marie Turner
In February 2013, President Obama issued Executive Order 13636, Improving Critical Infrastructure Cybersecurity, which instructed the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) to “lead the development of a framework to reduce cyber risks to critical infrastructure.” After engaging with more than 3,000 industry, academic and government individuals and organizations, NIST released its Preliminary Cybersecurity Framework on October 22, 2013. A 45-day public comment period followed, along with a workshop on the Cybersecurity Framework at North Carolina State University on November 14–15. After collecting and reviewing feedback, NIST will issue its final Cybersecurity Framework in February 2014.
The Preliminary Cybersecurity Framework includes a collection of existing standards and best practices that have proven helpful in assisting organizations in managing their cybersecurity risks. It then provides a structure for utilizing these standards. Patrick Gallagher, director of NIST, stated that NIST aims to “turn today’s best practices into common and expected practices.” One of the Framework’s key objectives is to encourage businesses to assign cybersecurity risk the same level of priority as financial, safety and operational risk. In short, cybersecurity risk management is part of doing good business.
NIST has been careful to point out that the Preliminary Cybersecurity Framework does not provide threat proofing. The goal is to effectively manage cyber risks, not focus on eliminating them. As this issue is ever changing, NIST views the Framework as a living document that will evolve as technology improves and cybersecurity threats change.
Payment Card Industry Data Security Standards Update
On November 7, 2013, the PCI Security Standards Council, the standard-setting group for the payment
card industry, released its triennial updates to the Payment Card Industry Data Security Standards (PCI-DSS) and related Payment Application Data Security Standards. The PCI-DSS set forth payment card data security standards for prevention, detection and response to data security incidents. Known as Version 3, the updated PCI-DSS are intended to “help organizations focus on security, not compliance,” and “make payment security part of their business-as-usual activities.” This shift to a more holistic security-focused approach seems to mirror the Federal Trade Commission’s privacy-by-design initiative and likely results from reports about data breaches among merchants that were allegedly compliant with the PCI-DSS.
The PCI Security Standards Council’s summary of changes between Version 2 and Version 3 identifies nine “Evolving Requirements,” defined as changes made to ensure that the standards are up to date with emerging threats and changes in the market. Key changes include requiring password education for users, along with point-of-sale education and training; allowing payment card industry organizations to implement password strength and other “business as usual” security requirements appropriate for their business needs and risk-management strategy; and providing guidance on the outsourcing of responsibilities covered by the PCI-DSS.
Version 3 of the PCI-DSS is effective as of January 1, 2014, but merchants have until December 31, 2014—and in some cases until July 15, 2015—to comply.
Privacy and Cyber Liability Insurance
Joshua D. Rogaczewski and Julia B. Jacobson
A May 2013 report from the Ponemon Institute7 shows that cybercrime accounts for 41 percent of data breaches in the United States, with human error the cause of 33 percent and “system glitch” the cause of the remaining 26 percent. The report also found
7 Ponemon Inst., 2013 Cost of Data Breach: Global Analysis (May 2013), available at https://www4.symantec.com/mktginfo/whitepaper/053013_GL_NA_WP_Ponemon-2013-Cost-of-a-Data-Breach-Report_daiNA_cta72382.pdf.
that a data breach cost a U.S. company an average of $188 per record in 2012—the second highest cost per record (behind Germany) among the nine countries studied—and that U.S. companies had the highest total average data breach cost in 2012, at $5.4 million. A data breach also affects a company indirectly through the loss of reputational equity, customers and productivity.
With the frequency and cost of data breaches on the rise, U.S. companies processing large amounts of data increasingly look to insurance to help them manage the financial risks. Many companies, particularly those that have implemented comprehensive data security programs, continue to argue that their comprehensive general liability (CGL) and/or professional liability insurance policies cover data breaches. Some recent court decisions support this argument. For example, in October 2013 the U.S. District Court for the Central District of California upheld coverage under a hospital’s CGL policy for a data breach involving medical records of almost 20,000 patients in Hartford Casualty Insurance Company v. Corcino & Associates et al. During the past decade, however, the insurance industry has taken steps to eliminate or reduce coverage for data breaches in traditional CGL and professional liability policies. Even if a particular CGL policy does not specifically eliminate coverage for data-related losses, a company may need to expend further resources in litigating whether data breaches are covered.
Most (if not all) major carriers will write coverage for data breaches, either as standalone policies or as optional coverage parts. Whatever the format, coverage written specifically for data breaches—referred to by the Insurance Services Office generally as cyber liability insurance—is less likely to trigger a coverage dispute between a company and its insurer. In a recent case, First Bank of Delaware, Inc. v. Fidelity & Deposit Co.,8 the court found coverage under a specialty policy and rejected the carrier’s attempt to apply the policy’s fraud exclusion, because the carrier’s reading would have swallowed the coverage of a policy written to address data-privacy risks.
Since cyber liability insurance is a relatively new product, it does not employ consistent standard
8 See 2013 WL 5858794, at *9 (Del. Super. Ct. Oct. 30, 2013).
language, and policies can vary widely in conditions and coverage. Cyber liability insurance may include coverage for (i) liability arising from unauthorized access to and/or disclosure of data, data breach expenses (e.g., costs of investigating and providing notification of breaches, as well as post-event credit monitoring for affected parties), and programming errors and omissions that lead to unauthorized disclosure of data; (ii) viruses (e.g., actual loss of business income and/or business interruption costs resulting directly from a virus, and expenses incurred to replace or restore data lost or affected directly by a virus); (iii) regulatory fines and penalties; and (iv) public relations expenses related to negative publicity resulting directly from a data breach or virus.9
The lesson for companies is to carefully review the terms of a cyber liability insurance policy during the application process to help ensure that the policy meets business needs. For more information on cyber liability insurance, please see “U.S. Data Privacy Litigation Trends to Watch” on page 14.
Key U.S. State Law Developments
Matthew S. Smith, Heather Egan Sussman, Sabrina E. Dunlap and Julia B. Jacobson
CHANGES TO STATE BREACH NOTIFICATION LAWS
Seven U.S. states passed legislation updating their security breach notification laws in 2013: California, North Dakota, New York, Oregon, South Carolina, Texas and Vermont. Most of these updates were minor amendments, such as adding civil penalties for failure to notify or updating the required notification procedures.
Vermont expanded the coverage of its statute by making clear that financial institutions, which were previously exempt from the statute’s coverage, must comply with that state’s breach notification laws. Two states—North Dakota and California—made even more significant changes to their existing security breach notification laws.
The North Dakota breach notification law now includes in its list of data elements defining
9 See, e.g., http://www.iso.com/Policy-Programs/isos-cyber-liability-insurance-program.html.
personal information (i) medical information (such as an individual’s medical history, mental or physical condition, or medical treatment or diagnosis by a health care professional), and (ii) health insurance information (including an individual’s policy number or subscriber identification number and any unique identifier used by a health insurer).
California’s amended breach notification laws now require notice upon the unauthorized disclosure of certain information that would permit access to an online account. California is the first state to broaden the definition of personal information to include online account log-in or access data—specifically, an e-mail address or user name, together with the password or security question and answer used to access any online accounts, including e-mail accounts.
STATE LAWS PROHIBITING EMPLOYER ACCESS TO SOCIAL MEDIA ACCOUNTS
Given the increasing number of social media platforms, it is no surprise that in 2013 lawmakers continued to pass legislation that prohibits employers from requesting employees’ and applicants’ passwords to personal social media accounts (so-called “password protection” laws). Password protection legislation is pending or has been introduced in more than 30 states, and eight states (Arkansas, Colorado, Nevada, New Jersey, New Mexico, Oregon, Utah and Washington) passed such laws in 2013, bringing the total number of states with password protection laws to 12 (Illinois, Maryland, Michigan and California enacted password protection laws in 2012).
While the details of the password protection laws vary, the basic conduct prohibited by the laws is the same: they generally prohibit employers from asking for, or requiring employees or prospective employees to provide, usernames, passwords or other information necessary to gain access to personal social media accounts. Many of the password protection laws also prohibit employers from taking adverse employment action based on an employee’s refusal to comply with the employer’s request for social media account access (the exception to this general rule is New Mexico, where the law appears to cover only applicants and not current employees).
In addition, many of the laws provide for exceptions. For example, the password-protection laws in Nevada, Washington and Colorado do not prevent employers
from complying with state or federal laws. Utah’s law appears to contain one of the broadest exceptions, because it allows an employer to request or require disclosure of log-in information for an employer-provided device, or for employer-provided accounts or services used for a business purpose.
Navigating the state password-protection laws likely will become increasingly difficult as states continue to pass laws with slight variations related to employer access to employees’ and applicants’ social media accounts. Employers may find it helpful to establish and follow basic guidelines that could help ensure compliance with the laws. For example, employers may want to consider training managers and supervisors to generally avoid seeking employees’ or applicants’ social media account information, or have a process in place to verify that requests for social media information fall within a state law exception. Employers may also find it helpful to keep in mind that the password protection laws generally do not apply to publicly available social media content. Although there are other issues to consider when accessing publicly available social media content (such as whether it is accurate), access to publicly available content is not likely to violate any of the existing password protection laws.
CHANGES IN CALIFORNIA LAWS APPLY TO WEBSITES AND MOBILE APPLICATIONS
In 2013, California enacted several new privacy-related laws governing websites and mobile applications. The new laws apply not just to companies based or operating in California, but to any website or mobile application available to residents of California.
New Privacy Rights for Minors
On September 23, 2013, Governor Brown signed into law new Sections 22580 through 22582 of the California Business and Professions Code titled “Privacy Rights for California Minors in the Digital World.”10 This is the first “right to be forgotten” law of its kind in the United States. The law goes into effect January 1, 2015, and requires an operator of a website (including online services and applications, such as a social media site) or mobile application that is “directed to minors” to allow minors (defined as anyone younger than 18 years old residing in California) who are registered users the opportunity to un-post or remove (or request removal of) their posted online content. The operator also must provide minors with notice and “clear instructions” about how to remove their posted content. The operator is not, however, required to remove posted content in certain specific circumstances, such as when the content was posted by a third party.
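The statute’s eligibility conditions — registered user, under 18, California resident, and content the minor posted himself or herself (third-party posts are exempt) — can be sketched as a simple check. This is an illustrative assumption only, not a statement of how any real service implements the law; all type and function names are hypothetical.

```typescript
// Hypothetical sketch of the removal-right conditions described in
// Cal. Bus. & Prof. Code §§ 22580-22582 ("Privacy Rights for
// California Minors in the Digital World"). Names are illustrative.

interface Post {
  id: string;
  authorId: string; // who posted the content
}

interface User {
  id: string;
  isRegistered: boolean;
  age: number;
  residesInCalifornia: boolean;
}

// Does the statutory removal right appear to apply to this user and post?
// A covered minor is a registered user under 18 residing in California,
// and the right covers only content the minor posted (the operator is
// not required to remove content posted by a third party).
function removalRightApplies(user: User, post: Post): boolean {
  const isCoveredMinor =
    user.isRegistered && user.age < 18 && user.residesInCalifornia;
  const isOwnContent = post.authorId === user.id;
  return isCoveredMinor && isOwnContent;
}
```

Note that satisfying this check is only the threshold question; the law also requires notice and “clear instructions” on how removal works, which a real implementation would handle separately.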
This new law also prohibits website and mobile app operators from advertising to California minors certain products and services that minors cannot legally purchase, such as alcoholic beverages, firearms, ammunition, spray paint, tobacco products, fireworks, tanning services, lottery tickets, tattoos, drug paraphernalia, electronic cigarettes, “obscene matter” and lethal weapons. Operators also are prohibited from using, disclosing or compiling certain personal information about the minor for the purpose of marketing these products or services.
New Do-Not-Track Disclosure Requirements Effective January 1, 2014
On September 27, 2013, California Governor Brown signed into law amendments to the California Online Privacy Protection Act (CalOPPA), a 2004 law requiring all commercial digital service providers collecting personally identifiable information about California consumers to conspicuously post a privacy policy. Effective January 1, 2014, the amendments require operators to disclose how they respond to “do not track” signals or other mechanisms that provide consumers a choice regarding the collection of their personally identifiable information over time and across third-party sites,11 and to disclose whether third parties may collect such information when consumers use the operator’s service.12
10 See http://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=201320140SB568&search_keywords.
The Bill Analysis indicates that CalOPPA amendments are not intended to “prohibit third-party or any other form of online tracking,” but rather to “implement a uniform protocol for informing Internet users about tracking . . . and any options they may have to exercise choice . . . .”
Among other requirements, the Digital Advertising Alliance’s (DAA’s) Self-Regulatory Program requires website owners/operators (first parties) to provide “clear, meaningful and prominent” disclosure about data collection and use for advertising purposes, and to offer consumers a way to opt out of tracking, such as through the DAA’s consumer choice page.13 As noted in the Bill Analysis, while the DAA’s consumer choice mechanism enables consumers to opt out of receiving advertising based on online tracking data, it only works for companies that participate in the DAA’s program and “does not allow consumers not to be tracked.”
11 Cal Bus and Prof Code §22575(b)(5).
12 Cal Bus and Prof Code §22575(b)(6).
13 The consumer opt-out page is available at http://www.aboutads.info/consumers.
Relatedly, on October 14, 2013, the Better Business Bureau (BBB) issued a first-of-its-kind Compliance Warning, noting that a “significant minority of website operators” are omitting the “enhanced notice link” required by the DAA’s self-regulatory program when ad networks and other third parties are collecting data for interest-based advertising purposes but cannot provide their own notice on the website on which the data collection is occurring. The BBB operates the Online Interest-Based Advertising Accountability Program, through which it monitors businesses’ advertising practices and enforces the DAA’s self-regulatory program, even for companies that are not participating in it.
For more information on what companies should be doing in the wake of these California and BBB developments, see McDermott’s On the Subject “To Track or Not to Track.”
Canada’s Anti-Spam Legislation
Sabrina E. Dunlap and Heather Egan Sussman
In early 2013, Industry Canada published regulations for Canada’s Anti-Spam Legislation (unofficially known as CASL), which has yet to go into effect. The regulations, known as the Electronic Commerce Protection Regulations, define key CASL terms and concepts. CASL establishes a regulatory framework to protect electronic commerce in Canada and limits unsolicited commercial electronic mail (spam) by generally prohibiting the sending of commercial electronic messages without consent. The law also prohibits false or misleading commercial representations online; prohibits the collection of personal information via unlawful access to computer systems, and the unauthorized compiling or supplying of lists of electronic addresses; provides a private right of action for anyone affected by a prohibited act; provides for administrative monetary penalties on those who violate the law; and allows for the international sharing of information and evidence to pursue spammers outside of Canada.
The Electronic Commerce Protection Regulations will come into force at the same time as the law, once they receive final Governor in Council approval (according to the Canadian government’s website, a specific date for when the law will come into force will be set in the coming months).
Manitoba’s New Privacy Act
Sabrina E. Dunlap and Heather Egan Sussman
On September 13, 2013, Manitoba’s Personal Information Protection and Identity Theft Prevention Act (PIPITPA) received royal assent, the final step by which a bill becomes law. The PIPITPA is still awaiting proclamation and, as a result, is not yet in force. Once the PIPITPA is proclaimed, Manitoba will be the fourth province with private-sector privacy legislation (joining British Columbia, Quebec and Alberta).
The PIPITPA establishes rules for the collection, use and disclosure of personal information (broadly defined as “information about an identifiable individual”) by most organizations, including corporations and individuals acting in a commercial capacity, in the province. In addition, the PIPITPA requires that organizations notify an individual if his or her personal information is lost, accessed or disclosed without authorization.
Failure to comply with PIPITPA may result in fines of up to $100,000. Consumers also may bring a private right of action against an organization for failing to protect personal information in its custody or control, or for a failure to provide notice of unauthorized access to personal information.
Alberta’s Privacy Law Ruled Unconstitutional
Heather Egan Sussman
On November 15, 2013, the Supreme Court of Canada unanimously ruled in Alberta (Information and Privacy Commissioner) v. United Food and Commercial Workers Local 401 that Alberta’s Personal Information Protection Act (PIPA) is unconstitutional and invalid in its entirety. The decision has been suspended for 12 months, however, “to give the legislature time to decide how best to make the legislation constitutional.”
Mexico Enacts Privacy Notice Guidelines
Effie D. Silva
In April 2013, Mexico enacted the Privacy Notice Guidelines. The Guidelines were issued pursuant to the Federal Law on Protection of Personal Data held by Private Parties (2010) (Privacy Law), which provides a constitutional right both to protection of a citizen’s personal data and to access, correct, cancel and challenge the use of such information. In addition to notice and consent requirements, Mexico’s Privacy Law includes rules specific to cloud computing. Specifically, the rules allow a data controller to use cloud services where there are contractual conditions for processing and where the service provider meets certain other confidentiality and security requirements. The data breach notification provisions being added to Mexico’s data protection law are similar to the provisions in almost all U.S. state statutes; Brazil, Uruguay and some EU Member States also are contemplating the addition of data breach notification provisions. The Guidelines are a useful resource to companies seeking to ensure that their operating processes are in compliance with governmental regulations, including regulations that require companies to furnish extensive privacy notices and obtain consent before collecting personal data from a person either directly or electronically. Specifically, the Guidelines recommend that companies adopt a Safety Management System of Personal Data (SGSPD) and implement internationally recognized standards issued by the International Organization for Standardization and the Organization for Economic Cooperation and Development. The Guidelines outline the recommended process for the adoption of a SGSPD.
The past year has brought myriad regulatory updates across Europe on data privacy matters, including trans-border information flow, cookies and data anonymization. Simultaneously, Africa and the Middle East are witnessing a rapidly developing data privacy landscape, as countries seek to attract international business.
European Parliament Gives Data Protection Reform Package the Green Light
Keo Shaw and Rohan Massey
On January 25, 2012, the European Commission proposed a comprehensive reform of data protection rules in the European Union. The proposals are designed to update the principles set out in Data Protection Directive 95/46/EC in recognition of the dramatic changes in, and business opportunities offered by, the digital economy, with the twin aims of strengthening online privacy rights and boosting Europe’s digital economy.
Intense lobbying and detailed comments from national data protection authorities, the Article 29 Data Protection Working Party, the European Council and reports drafted by MEPs Jan Philipp Albrecht and Dimitrios Droutsas have fueled discussions regarding significant amendments to the proposed reform package over the last 20 months. Finally, on October 22, 2013, the European Commission’s proposals were supported by a clear majority (49 votes in favor, compared to one against and three abstentions) in the Committee on Civil Liberties, Justice and Home Affairs (LIBE) of the European Parliament, which consolidated 3,999 proposed amendments into just 104 compromise amendments. With this vote, LIBE has endorsed the European Commission’s reform package and advanced the legislative procedure to the next step.
CONFIRMATION OF THE CENTRAL PILLARS OF THE DATA PROTECTION REFORM PACKAGE
First Pillar: Uniform Law with Effective Sanctions
LIBE has confirmed that the new data protection regime should take the form of a regulation, not a directive. Each of the Member States implemented the Directive differently, resulting in patchwork protection across the European Union. The proposed replacement of the existing Directive with a directly effective regulation should help to harmonize the currently fragmented rules across all Member States.
Further, LIBE agreed that in order to achieve effective reform, national data protection authorities must have the power to impose real sanctions when the law is breached. Consequently, it has proposed strengthening the European Commission’s proposal by advocating that the maximum threshold for fines be set at 5 percent of the annual worldwide turnover of a company (up from the 2 percent proposed by the European Commission in the original draft).
Second Pillar: All Companies Must Comply in Order to Operate on the European Market
LIBE confirmed that non-European companies must adhere to the same rules and personal data protection standards as European companies when offering services to European consumers, thus ensuring consistency in protection afforded to consumers in the European Union. This principle was the subject of intense lobbying by non-European-based companies, but ultimately LIBE reasoned that there should be a level playing field for all organizations that wish to tap into the 500 million potential customers in Europe.
Third Pillar: The Right to Be Forgotten
LIBE endorsed the European Commission’s controversial proposals regarding deletion of personal data. Building on existing rights, individuals should gain greater control over their personal information by being able to demand that a data controller stop processing such data and remove it from its system. This is not an absolute right; a data controller can retain personal data if there is a legitimate reason to keep it. A fine balance must be struck to ensure that the right to be forgotten does not impinge upon the rights of freedom of expression and information.
In fact, LIBE approved changes in the compromise text that further reinforce the right to be forgotten by allowing data subjects to require the deletion of any links to, or copies of, that data from third parties in possession of their personal information. In addition, the compromise text introduces a right to erasure where an EU court or regulatory authority makes a final ruling that the data concerned must be deleted. According to Vice President Viviane Reding, the European Union’s justice commissioner, this strengthening of the right to be forgotten demonstrates that “excessive lobbying can be counter-productive.”
Fourth Pillar: One-Stop Shop
LIBE supported the European Commission’s simplification plans to have a “one-stop shop” enabling companies that operate in several European Union countries to deal with only a single national data protection authority. Similarly, customers will benefit by being able to make complaints against a company established in a Member State other than their own by dealing with the data protection authority in their home State, in their mother tongue.
The LIBE vote allows the rapporteurs (MEPs Jan Philipp Albrecht and Dimitrios Droutsas) to begin meaningful negotiations with the Council of the European Union. After trilogue, the consolidated text will be put to a vote in the plenary session of the European Parliament.
European Commission President José Manuel Barroso emphasized the importance of advancing the data protection reform proposals swiftly. In his September 27, 2013, letter to heads of state and government, he pushed for adoption of the proposed data protection reforms before the end of the current parliamentary term in 2014. This is an aggressive timeline, but the European authorities appear aligned in respect to the content and importance of the reforms, which may accelerate progression of the intensely debated proposals through the legislative process.
OECD Updates the 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
Catherine O’Connell and Rohan Massey
In 2010, the Organisation for Economic Cooperation and Development (OECD) marked the 30th anniversary of its 1980 Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (1980 Guidelines). The 1980 Guidelines contained the first international set of privacy principles and have proved very influential in data privacy regulation and policy. While the fundamental principles of the 1980 Guidelines remain unchanged, the OECD has recognized the pressing need to update its guidance in this area so that it satisfactorily accommodates revolutionary technological changes. Therefore, on September 9, 2013, the OECD published a revision to the 1980 Guidelines (Guidelines).
The Guidelines apply to personal data, both in the public and private sectors, that may pose a risk to privacy and individual liberties because of various factors, including its nature and the way in which it is processed. The Guidelines adopt a practical approach to data privacy, focusing on compliance and the ways in which effective implementation can be realized. In light of this focus, a key theme that runs through the Guidelines is making organizations accountable for their data protection and data privacy practices.
PRIVACY MANAGEMENT PROGRAMS
In recognizing the principle of accountability and its role in promoting organizational responsibility, the Guidelines develop the concept of privacy management programs. These will serve as the primary operational mechanism in delivering privacy protection, and should implement the Guidelines as regards “all personal data under [an organization’s] control,” incorporating all operations for which a data controller is accountable, irrespective of where or to whom data is transferred.
A key function of a privacy management program is that it incorporates effective safeguards for when agents of the data controller process personal data on its behalf or when the data controller’s responsibility is shared—for example, contractual provisions that require compliance with the data controller’s privacy responsibilities.
The Guidelines note that privacy management programs must be inherently flexible, adapting to the locations, volume and sensitivity of the controller’s operations. Further, regular updates and reviews of any privacy management program will ensure its relevance to the risk environment to which it relates.
It is important to note that, in addition to implementing the Guidelines, a privacy management program may need to incorporate other sources of data privacy regulation and policy, such as domestic law, international obligations, self-regulatory programs or contractual provisions.
DATA SECURITY BREACH NOTIFICATION
The Guidelines promote security safeguards to enhance protection against risks such as “loss or unauthorised access, destruction, use, modification or disclosure of data.” Data breaches often can be attributed to the data controller—for example, through lack of employee training and awareness.
The purpose of breach notification laws and regulations is to increase a data controller’s incentive to disclose breaches voluntarily and quickly, and to adopt appropriate safeguards. Further core principles of the Guidelines, such as accountability, individual participation and openness, will be enhanced by data breach notification and may assist in improving the evidence base for policy making.
When a security breach occurs, other points should be considered in addition to the data controller’s internal notification protocol, such as whether other entities should be notified and how to ensure proportionate responses so as not to create an undue burden on the data controller and enforcement authorities.
PRIVACY ENFORCEMENT AUTHORITIES
A significant change from the 1980 Guidelines and the OECD’s 2007 Recommendation on Cross-border Co-operation in the Enforcement of Laws Protecting Privacy is that the Guidelines explicitly require that privacy enforcement authorities be established and maintained. The Guidelines go further and incorporate a new definition of “laws protecting privacy,” which refers to “national laws or regulations, the enforcement of which has the effect of protecting personal data consistent with these Guidelines.”
Fundamental to the success of these authorities is that they can operate and make decisions on an “objective, impartial and consistent basis.” This is highlighted in a new provision in Part Five, “National Implementation,” which requires the privacy enforcement authorities to be free from instructions, bias or conflicts of interest when making decisions in connection with laws protecting privacy.
TRANS-BORDER FLOWS OF PERSONAL DATA
Part Four of the Guidelines is a consolidation of various mechanisms implemented by Member countries since the 1980 Guidelines, in connection with protecting individuals’ privacy in the context of trans-border data flows. As a result of the technological advances since the 1980 Guidelines, Member countries have needed to adapt to data being processed simultaneously in multiple locations, stored all over the world, re-combined instantaneously and moved across borders via individuals’ mobile devices.
There are two circumstances in which a Member country should refrain from restricting trans-border flows of personal data. First, preserved from the 1980 Guidelines, trans-border data flows should not be restricted between countries in which the Guidelines are substantially observed. Second, where sufficient safeguards exist to ensure compliance with the Guidelines, restrictions are actively discouraged. The Guidelines indicate that any restrictions imposed should be proportionate to the risks presented, bearing in mind the type of data and processing involved.
In implementing an effective and robust privacy regime, the OECD recognizes the need for a unified, coordinated approach at the national level, along with consistent levels of protection across governmental bodies. Further, the Guidelines emphasize intra-governmental coordination, so as to promote coherence between various levels of government, as part of a country’s national privacy strategy.
In developing this principle, the Guidelines suggest some ancillary measures, such as education and awareness-raising, with a particular emphasis on privacy literacy initiatives.
A central theme that underpins the Guidelines is the fundamental importance of consistency and cooperation in the implementation of an effective data privacy regime. To this end, the OECD Recommendation on Internet Policy Making, published in December 2011, and its earlier communiqué of June 2011 highlight this need at a global level, recognizing the importance of global governmental interoperability in this area.
In revising the 1980 Guidelines, the OECD has provided a timely update to the framework underpinning data privacy regimes in many parts of the world. The focus on privacy management programs is very much in line with the proposed EU position. There likely will be popular support in relation to the independence of national regulators, whether single or multiple entities. However, greater clarity on the mechanics for successful trans-border data flows may have been useful. It remains to be seen how these Guidelines will affect the development of the EU regime and regulatory safeguards in other jurisdictions that are also being reconsidered in light of exponential growth in data collection and transfer on a global basis, and the increasing risks this may signify to individuals.
Catherine O’Connell and Rohan Massey
In Opinion 15/2011, the Working Party outlined the main elements of valid consent: specific information, timing, active choice and freely given. Further, the Working Party highlighted that in order for website operators to obtain valid consent for cookies, all of these elements must be present.
Once users have been fully informed of the consequences of providing their consent, actions such as clicking on a link or ticking a box in, or close to, the relevant information will help to signify an active request to engage with the website and thus constitute valid consent. The main principle is that the website operator should be able to conclude unambiguously that it has obtained specific and informed consent. Note that it must be made clear to the user that the action will constitute consent and that cookies will be set as a result of this particular action.
Further, the opinion emphasizes that the mechanism by which the user provides his or her consent (e.g., a tick box) should be located in close proximity to the information explaining the consequences of consenting, and that such information should remain on the screen until consent has been provided. The opinion also notes that clicks for further information regarding cookies cannot constitute valid consent, nor can any absence of behavior.
REAL CHOICE – FREELY GIVEN CONSENT
The opinion makes clear that users must be able to freely choose between accepting all, some or no cookies, and must retain the possibility of amending their preferences at a later stage. The opinion notes that some Member States allow certain websites to restrict their availability to users who provide consent to cookies, although it is recommended that users be able to continue browsing without cookies being set, or receiving only those cookies for which consent has been obtained. The Working Party suggests that website operators refrain from offering users only the option to consent, without any choice as to all or some cookies. It is highly recommended that general access to websites not be made conditional upon obtaining consent for cookies; only certain limited content may be restricted on this basis.
The opinion further notes that processed data must be adequate, relevant and not excessive in relation to the purposes for which such data is collected and/or further processed (Article 6(1)(c) of the Data Protection Directive). Therefore, where cookies are not necessary for furthering the declared purpose, but only provide additional benefits for the website operator, users must be given a real choice as to the use of such cookies.
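The Working Party’s main elements (informed, specific, active, freely given, and revocable consent) can be sketched as a small consent-state model: no cookie category is preselected, a cookie may be set only after an explicit per-category choice, and preferences can be amended later. This is a minimal illustrative sketch under stated assumptions, not a regulator-endorsed implementation; all names and categories are hypothetical.

```typescript
// Hypothetical consent-state model reflecting the Working Party's
// guidance: absence of behavior is not consent, choices are granular,
// and preferences remain amendable. All names are illustrative.

type CookieCategory = "strictly-necessary" | "analytics" | "advertising";

interface ConsentState {
  // No category is preselected: nothing is set until the user acts.
  choices: Partial<Record<CookieCategory, boolean>>;
  timestamp?: number; // when the user last acted
}

function initialState(): ConsentState {
  return { choices: {} };
}

// Record an explicit, per-category choice made by an active user action
// (e.g., ticking a box next to the explanatory information).
function recordChoice(
  state: ConsentState,
  category: CookieCategory,
  accepted: boolean
): ConsentState {
  return {
    choices: { ...state.choices, [category]: accepted },
    timestamp: Date.now(),
  };
}

// A cookie may be set only for strictly necessary purposes or where the
// user has actively consented to that specific category.
function maySetCookie(state: ConsentState, category: CookieCategory): boolean {
  if (category === "strictly-necessary") return true;
  return state.choices[category] === true;
}

// Users must retain the possibility of amending their preferences later.
function withdrawConsent(
  state: ConsentState,
  category: CookieCategory
): ConsentState {
  return recordChoice(state, category, false);
}
```

The key design point mirrored here is that `maySetCookie` defaults to false for non-essential categories: merely continuing to browse, or never interacting with the banner, yields no consent.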
Article 29 Data Protection Working Party Publishes Opinion 03/2013 on Purpose Limitation
On April 2, 2013, the Article 29 Data Protection Working Party, an independent European advisory body on data protection and privacy, adopted and published Opinion 03/2013 on purpose limitation. The opinion analyzes the principle of purpose limitation and provides guidance on and examples of its practical application, as well as recommendations for future policy.
Article 6(1)(b) of EU Data Protection Directive 95/46/EC provides for the principle of “purpose limitation.” The purpose limitation principle states that personal data must be collected for “specified, explicit and legitimate purposes” (purpose specification) and not be “further processed in a way incompatible” with those purposes (compatible use). In this way, the principle aims to protect individuals’ personal data while recognizing the need for some flexibility with respect to data controllers.
Article 7 of the Data Protection Directive sets out a number of legal grounds by which personal data may be processed—e.g., where consent is given, where the processing is necessary for the performance of a contract to which the data subject is a party, or for the purposes of the legitimate interests pursued by the controller. Through the draft general Data Protection Regulation (COM(2012) 11 final), it has been proposed in Article 6(4) that incompatible further processing may be legitimized on the basis of one of the Article 7 legal grounds.
The opinion provides useful definitions and examples of what purpose specification means in practice. In particular, it defines its key terms as follows:
Specified: The purpose must be “sufficiently defined to enable the implementation of any necessary data protection safeguards and to delimit the scope of the processing operation.”
Explicit: The purpose must be “sufficiently unambiguous and clearly expressed” so as to leave “no difficulty in understanding.”
Legitimate: According to the Working Party, legitimacy is broad and extends beyond the Article 7 legal grounds to all applicable law and codes of conduct/ethics, where relevant.
The Working Party suggests that whether or not further processing is compatible should be assessed on a case-by-case basis, taking account of all relevant circumstances and the following key factors:
The relationship between the purposes for which the personal data have been collected and the purposes of further processing
The context in which the personal data have been collected and the reasonable expectations of the data subjects as to their further use
The nature of the personal data and the impact of the further processing on the data subjects
The safeguards adopted by the data controller to ensure fair processing and to prevent any undue impact on the data subjects
The Working Party has advocated that Article 6(4) of the Data Protection Regulation be removed, because it sees this provision as allowing further processing for incompatible purposes that may erode the purpose limitation principle. Therefore, the Working Party proposes that data controllers should not be able to further process data already held on the basis of one of the Article 7 legal grounds and, instead, may only further process data on the basis of one of the stricter Article 13 grounds, including in relation to matters of national security, criminal offenses, and important economic or financial interests of the Member State or the European Union.
The Working Party’s opinion provides useful clarification, consistency and increased certainty in the interpretation of the Data Protection Directive, which should be welcomed by data subjects and data controllers alike. The opinion recognizes the purpose limitation principle as one of the key data protection principles and seeks to strengthen its protection while recognizing that its application should not be overly rigid.
Many commentators have welcomed the proposal for the deletion of Article 6(4) of the Data Protection Regulation.
The Changing Face of the Use of Public Sector Information in Europe
Keo Shaw and Rohan Massey
The past year has heralded significant changes for the processing of public sector information (PSI) in the European Union. PSI is data produced, stored or collected by public sector bodies. On June 27, 2013, Directive 2013/37/EU (Amendment Directive) was published in the Official Journal, and the Article 29 Working Party published an opinion on open data and PSI reuse. The opinion was adopted on June 5, 2013, and the Amendment Directive revising Directive 2003/98/EC on the reuse of PSI (PSI Directive) entered into force on July 17, 2013. The PSI Directive and the opinion are part of the European Commission’s Digital Agenda and Open Data Strategy Package for Europe, which aims to encourage cross-border use of PSI by harmonizing relevant legislation across the European Union.
THE REVISED PSI DIRECTIVE
The PSI Directive originally introduced a set of measures to make it easier for commercial entities to obtain permission to reuse information held by government authorities and lowered the fees charged in connection with granting such access. However, recent studies performed on behalf of the European Commission showed that businesses and individuals were still unable to easily locate or access much of the vast amount of PSI held in the European Union. After 16 months of negotiating, the Amendment Directive was finalized in an attempt to unlock the “data goldmine” held by European governments and stimulate economic growth in the European Union. The amended PSI Directive, subject to specific exceptions, makes it mandatory for public sector bodies to allow reuse of all information held, for both commercial and non-commercial purposes, provided the information is publicly accessible under national law and reuse under the PSI Directive is in compliance with applicable data protection law.
The key changes set out in the Amendment Directive are as follows:
The scope of the PSI Directive is expanded to include libraries, museums and archives, for the first time creating rights to reuse public information in these institutions.
The rules on charging for reuse of PSI are tighter. Public sector bodies will, in general, only be able to charge the marginal cost for reproduction, provision and dissemination of the information and will be obliged to be more transparent about the charging rules used.
Availability of governmental data in machine-readable and open formats is encouraged, and new rules on digitization agreements are introduced.
The European authorities have made these amendments in order to benefit the internal market through greater transparency in respect of PSI and ultimately in the hope of stimulating innovation.
While the reuse of PSI may have its benefits, it is not without risk. To this end, the Working Party issued its opinion highlighting the importance of having a strong legal basis for making personal data available to the public, taking into account the relevant rules and principles of data protection laws. The UK Information Commissioner’s Office jointly drafted the opinion with the European Data Protection Supervisor and the Slovenian data protection authority. The opinion acts as guidance for Member States and public authorities implementing the Amendment Directive.
The opinion emphasizes that a balanced approach must be followed to ensure that the protection of personal data is not jeopardized by the desire to make PSI data available for reuse. In fact, the opinion points out that, under the terms of the amended PSI Directive, it is not always appropriate to make PSI containing personal data available. Frequently, statistical data derived from personal data should be made available instead. However, the Working Party appreciates that there may be situations where personal data may be considered available for reuse under the terms of the amended PSI Directive, but only where necessary, subject to additional legal, technical or organizational measures to protect the data subjects concerned.
On this basis, the Working Party opinion makes the following recommendations:
Personal data considerations should be addressed at the earliest opportunity when contemplating making PSI available, specifically by applying data protection by design and by default.
Public authorities should carry out a data protection impact assessment before any PSI containing personal data or anonymized datasets derived from personal data are made available for reuse. The Working Party stressed the importance of assessing the risk of re-identification when datasets are anonymized, suggesting re-identification tests as a good strategy.
The outcome of the data protection impact assessment should identify appropriate safeguards to minimize risks (for example, appropriate license terms, technical measures to prevent bulk data downloads or appropriate anonymization techniques) or lead to a determination that certain data should not be made available for reuse.
License terms for the reuse of PSI, featuring personal data or anonymized datasets derived from personal data, should include data protection provisions.
Where the data protection impact assessment indicates that an open license is insufficient to address data protection risks, personal data should not be made available under the amended PSI Directive. However, public sector bodies may exercise discretion in respect of reuse outside the remit of the amended PSI Directive in compliance with applicable data protection laws.
Where appropriate, personal data should be anonymized and license conditions should specifically prohibit data subject re-identification and reuse of personal data for purposes that may affect those individuals.
Member States should consider supporting the sharing of best practice methodology related to anonymization and open data among public bodies.
Member States have until July 18, 2015, to implement the Amendment Directive. For the United Kingdom, this is likely to involve updates to the Re-use of Public Sector Information Regulations 2005 and the reuse provisions in the Freedom of Information Act 2000. Publication of revised UK legislation to implement the Amendment Directive is expected in late 2014 or early 2015.
ICO Publishes “Bring Your Own Device” Guidance
Désirée Fields and Rohan Massey
On March 7, 2013, the Information Commissioner’s Office (ICO) published new guidance on “bring your own device” (BYOD), which explains the risks organizations must consider when allowing personal devices, such as smart phones, laptops and tablets, to be used to process work-related personal information.
The Data Protection Act 1998 (DPA) places obligations on organizations responsible for processing personal information. In particular, Principle 7 requires appropriate technical and organizational measures to be taken against unauthorized or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data.
Data protection risks occur when there are a large number of devices used to process personal information outside the employer’s direct control. According to a recent survey commissioned by the ICO and carried out by YouGov, many employers are at risk of failing to comply with the DPA. In particular, the survey identified that many employers’ approach to popular personal devices, such as laptops, tablets and smart phones, put personal information at risk. The survey indicated that nearly 50 percent of all UK adults use personal devices for work-related purposes, but only 30 percent of those who do so have been provided with guidance from their employers on data protection policies. The ICO guidance explains how allowing employees to use their own devices can be done safely, permitting the employing company to retain control of the personal information for which it is responsible, and ensuring compliance with the DPA.
The key recommendations from the ICO guidance include the following:
Employers should carry out detailed assessments of the types of data being processed, and the nature and risks involved with the different personal devices used by their employees.
Employers should implement a clear BYOD policy that will necessarily be unique to each employer and should be monitored for compliance regularly.
Employers should be clear as to what types of personal information may be processed on personal devices.
Personal devices should be password protected. Strong passwords should be used and controls put in place for the automatic deletion of all data if an incorrect password is entered several times consecutively. Employees should be made aware of which information will be deleted.
Encryption should be used to store data on the device securely.
Employers should exercise extreme caution in any use of public cloud-based sharing or back-up services.
Devices should have remote locate and wipe services in case of loss or theft.
Monitoring technology should remain proportionate and not excessive, in particular during personal use.
ICO URGES ORGANIZATIONS TO IMPLEMENT BYOD POLICIES
The reality of these issues is illustrated by a recent data breach committed by the Royal Veterinary College (RVC) involving the theft of a memory card containing passport images of six job applicants from a camera owned by an RVC staff member. Because the device was personally owned by the employee, the theft was not caught by RVC’s data protection policies, which did not account for the possibility of employees using their own devices in the workplace. In addition, the data protection training offered by RVC at the time was not adequate.
This particular data breach only affected a relatively small number of individuals, and the impact of the breach was not likely to cause substantial damage or distress to the affected persons. RVC undertook to (i) provide mandatory induction and annual refresher data protection training to all staff; (ii) record and monitor all such training; (iii) encrypt all portable media devices; (iv) introduce adequate physical security measures to prevent unauthorized access to personal data; and (v) introduce other appropriate security measures to protect personal data against unauthorized and unlawful processing, accidental loss, destruction and/or damage. In light of these measures, the ICO agreed not to exercise its powers to serve an Enforcement Notice under Section 40 of the DPA.
In the aftermath of RVC’s data breach, the ICO has urged all organizations to provide guidance and training pertaining to the use of personal devices for work purposes, and to ensure that they follow the ICO’s recommendations set out in the guidance by ensuring that their data protection policies reflect the way employees use personal devices for work.
While many employers have experienced the increased efficiency and other benefits that modern personal devices can bring to their workforce, the potential for data protection breaches must not be ignored, particularly when employees own the devices. Employees’ personal devices are, by their nature, outside the control of employers. This is a particularly dangerous situation when the devices are used to process personal information that is under the control of the employer.
By way of example, an employee’s personal tablet or other smart device is likely to include a number of different apps. Apps are able to collect large quantities of data from such devices and process these in order to provide new and innovative services to end users. However, severe data protection risks result from a lack of transparency and awareness of the types of processing an app may undertake. Further, there is often a lack of free and informed consent by users when downloading an app. Accordingly, there is a risk that an app downloaded to an employee’s smart device may obtain access to personal data owned by an employer. It is crucial for employers to consider these risks when developing policies and implementing controls to keep personal information secure. The ICO guidance should be welcomed by employers, because it offers clear insight into good practice and methods for dealing with such data protection risks.
Rules on Notification of Personal Data Breaches for Electronic Communication Service Providers Come into Force
Leigh J. Smith and Rohan Massey
Regulation 611/2013, on the measures applicable to the notification of personal data breaches under Directive 2002/58/EC, which places obligations on electronic communication service providers to notify national authorities and individuals of breaches of personal data, came into force on August 25, 2013.
The E-Privacy Directive (2002/58/EC), as amended by Directive 2009/136/EC, places certain obligations on the providers of publicly available electronic communications services, such as telecommunications companies and internet service providers. One such obligation is to notify the relevant national data protection authority when a personal data breach occurs.
This obligation was implemented in different ways across the European Union, leading to uncertainty, for example, as to the time period in which the service provider must make the notification and the information that is required to be notified. This was a particular problem in relation to cross-border breaches. In 2011, the European Commission conducted a public consultation that highlighted the need to harmonize the approach to these notifications.
Regulation 611/2013 aims to remedy this uncertainty by harmonizing the notification obligations on electronic communications services in the European Union. Article 2 requires that the service provider notify the competent national authority of the personal data breach no later than 24 hours after the detection of the breach. Detection is deemed to occur where the service provider has “acquired sufficient awareness” of a security incident to make a meaningful notification as required by the Regulation (Article 2(2)).
The Regulation provides an annex that sets out the content of the notification to the competent national authority. In addition to providing basic information regarding the incident and the data concerned, the service provider is required to set out the potential consequences and adverse effects on those affected by the breach, and to state what technical and organizational measures the service provider has taken to mitigate those effects.
For breaches with a cross-border element, the service provider is required to notify each competent authority if the breach in question may involve subscribers in other Member States, and to disclose which other national authorities have been notified. The Regulation contains a requirement that the competent national authority, on being notified of a cross-border breach that may affect individuals in another Member State, must inform the other relevant national authorities concerned (Article 2(5)). This requirement demonstrates the Regulation’s aim of harmonizing the approach to notification, particularly in order to deal more effectively with breaches of personal data by an electronic communications service provider that affect individuals across several Member States. Service providers should be mindful of this obligation on national authorities when considering the extent of any notifications under the Regulation.
Where the information set out in annex 1 to the Regulation is not available within the 24-hour notification period, service providers are still required to make an initial notification of the breach and follow up with full details as soon as possible (Article 2(3)). In any event, within three days of the initial notification, the service provider must provide all information on the breach that it possesses at that stage and provide reasoned justification for the late notification of any outstanding information.
The Regulation also seeks to harmonize notifications by service providers to the individuals “adversely affected” by the breach. Article 3(2) sets out a list of factors to be taken into consideration when assessing whether or not an individual is adversely affected by the breach, including the nature and content of the personal data concerned and the likely consequences of the breach, such as identity theft, physical harm or damage to reputation.
The Regulation does contain an exception to the requirement to notify breaches to individuals. Service providers are exempted if they can demonstrate to the competent national authority that technical measures were in place that rendered the data unintelligible (Article 4(1)). The Regulation invites the European Commission to prepare a list of appropriate technical measures for service providers to use in consultation with the national authorities and the Article 29 Working Party (Article 4(3)).
Because the Regulation has been in force for only a few months, it is difficult to assess whether the Commission’s aims have been achieved. Nevertheless, the need for harmonization is clear.
As stated in recital 19 to the Regulation, the Commission has proposed a harmonized obligation on all data controllers to notify personal data breaches in the draft regulation intended to replace the Data Protection Directive (95/46/EC). The obligations in the Regulation are described as consistent with that proposal. It will be interesting to see how the obligation in the Regulation works in practice, and whether this leads to the Commission revising its proposal.
ICO Publishes Subject Access Code
Désirée Fields and Rohan Massey
On August 8, 2013, the Information Commissioner’s Office (ICO) published new guidance in the form of a Subject Access Code of Practice to assist organizations in dealing with requests from individuals for personal data, often referred to as subject access requests.
Pursuant to Section 7 of the Data Protection Act 1998 (DPA), individuals have a right to request access to all personal data held about them by an organization by means of a subject access request. This right allows individuals to obtain important information ranging from credit history information to data included in their health care records, and extends to personal data held in e-mails, electronic documents and paper files, as well as personal data held in other forms, such as swipe card records. For information to be personal data, it must relate to a living individual and allow that individual to be identified from it.
Once a subject access request has been received, an organization has 40 days to provide the requested information. Individuals are entitled to be told whether any personal data has been processed, given a description of the personal data and the reasons for it being processed, and told whether it will be provided to any other organization or people. Individuals are entitled to be given a copy of the personal data and, where available, details of the source of the data. Individuals may also request details about the reasoning behind any automated decisions made about them, such as a computer-generated decision to grant or deny credit.
The ICO has noted that it has handled more than 6,000 complaints related to subject access requests during the last financial year, illustrating the need for organizations to improve their procedures in relation to such requests.
The Subject Access Code of Practice was issued under Section 51 of the DPA as part of the ICO’s duty to promote good practice. The Code does not have any legal force, and the ICO cannot take any action against organizations that fail to comply with the Code unless their conduct also breaches the provisions of the DPA. However, the Code provides practical guidance to organizations holding personal data. It covers issues such as the following:
Identifying subject access requests, including those made through social media
Finding and retrieving the relevant information, and the scope of any search to be undertaken
Dealing with subject access requests involving other people’s information
The form in which information must be supplied to the requester
Exempted categories of information
The Code includes as an appendix a “Subject Access Request Checklist” that is intended to assist staff who regularly deal with subject access requests by highlighting the issues to be considered in 10 simple steps:
1. Identify whether a request should be considered as a subject access request.
2. Ensure that you have enough information to be certain of the requester’s identity.
3. Promptly ask the requester for any details reasonably required to find the information requested.
4. Promptly ask the requester to pay any applicable fee.
5. Check whether you have the information the requester wants.
6. Do not make any changes to the records as a result of receiving the request, even where information is inaccurate or embarrassing.
7. Consider whether records contain information about other people.
8. Consider whether you are obliged to supply all the information or whether a relevant exemption applies.
9. Ensure that you explain any complex terms or codes so that the information provided can be understood.
10. Unless otherwise agreed with the requester, provide the information in a permanent form.
The Code is very detailed, provides practical examples and illustrations, and dedicates a whole section to special cases that have separate considerations, such as credit files, health records, information held about pupils by schools and information about examinations. It also explains the interaction between the subject access requests under the DPA and requests under the Freedom of Information Act 2000, and emphasizes how modern technologies and new business models, such as the use of social media by organizations, may affect subject access requests.
Notably, where organizations allow their staff to use personal devices, such as smart phones, laptops and tablets, to process work-related personal data, such personal data would be covered by a subject access request. The ICO issued separate guidance on “bring your own device” in 2013 that explains the risks organizations must consider when allowing personal devices to be used to process work-related personal information. For more information, please see “ICO Publishes ‘Bring Your Own Device’ Guidance” on page 40.
The ICO also announced its intention to carry out a “subject access request sweep” of websites to determine what type of information organizations in the public, private and third sectors are providing to anyone who may want to make a subject access request. The ICO intends to publish a report on this subject in 2014.
Although the Code is a non-binding set of best practice recommendations, organizations holding and processing personal data are advised to consider adjusting their subject access request policies and procedures to reflect the guidance contained within the Code. Because the Code was drafted with the provisions of the DPA in mind, compliance with its provisions should assure organizations that they have satisfied their legal obligations with respect to the handling of subject access requests.
UK Data Anonymization Code
The year 2013 saw significant developments in the European open data agenda, resulting in more and more anonymized data being released into the public domain. To assist data controllers, the UK Information Commissioner’s Office (ICO) published a helpful code of practice on managing data protection risks related to data anonymization. The code explains the issues surrounding the anonymization of personal data and the disclosure of such data once it has been anonymized.
Data protection law does not apply to data rendered anonymous in such a way that the data subject is no longer identifiable. Following the publication of a draft anonymization code of practice in 2012 and the subsequent review of feedback, the ICO published its final form of the code and launched its UK Anonymisation Network to promote good data practice. The code seeks to help organizations identify the issues that should be considered to ensure that the anonymization of personal data is effective, focusing on the legal tests required in the Data Protection Act 1998 (DPA). It provides an explanation of, and practical advice on, data anonymization methods and the related risks of publishing personal data. It also includes a range of case studies and examples to make the legal issues easier to understand.
DEFINITION OF TERMS AND RISK IDENTIFICATION
The code points out that the concept and definition of identification and anonymization are not straightforward. Individuals can be identified in various ways, and re-identification (the process of turning anonymized data back into personal data through the use of data matching or similar techniques) by third parties is a real possibility. Thus it is vital for an organization to undertake a thorough assessment of the risks of identification if it decides to disclose anonymized data.
ENSURING EFFECTIVENESS OF ANONYMIZATION
The ICO recommends the use of the “motivated intruder” test to assess the risk of re-identification. The motivated intruder is taken to be a competent person with access to publicly available resources, but without any prior knowledge, who wishes to identify the individual relating to the anonymized data. The test is designed to assess whether the motivated intruder would be successful. In practice, the test may involve carrying out a web search to discover whether or not a combination of elements, such as date of birth and postcode, can be used to reveal a particular individual’s identity; using social networking to see if it is possible to link anonymized data to a user’s profile; or using the electoral register and local library resources to attempt to associate anonymized data with an individual’s identity.
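The kind of linkage test described above can be sketched in a few lines of Python. The snippet below is purely illustrative and not drawn from the code itself: the datasets, field names and matching rule are hypothetical. It flags anonymized records that match exactly one entry in a publicly available register on a pair of quasi-identifiers, which is the signal a motivated intruder would exploit.

```python
# Illustrative "motivated intruder" linkage test: find anonymized records
# that match exactly one entry in a public dataset via quasi-identifiers
# (here, birth year and postcode district). All data is hypothetical.

def reidentifiable(anonymized, public, keys=("birth_year", "postcode")):
    """Return (anonymized record, public record) pairs with a unique match."""
    risky = []
    for record in anonymized:
        matches = [p for p in public
                   if all(p[k] == record[k] for k in keys)]
        if len(matches) == 1:  # a unique match re-identifies the individual
            risky.append((record, matches[0]))
    return risky

# A toy anonymized release (direct identifiers removed) ...
anonymized = [
    {"birth_year": 1975, "postcode": "BL1", "diagnosis": "asthma"},
    {"birth_year": 1975, "postcode": "BL2", "diagnosis": "diabetes"},
]
# ... and a toy public register an intruder could consult.
public = [
    {"name": "A. Smith", "birth_year": 1975, "postcode": "BL1"},
    {"name": "B. Jones", "birth_year": 1975, "postcode": "BL2"},
    {"name": "C. Brown", "birth_year": 1975, "postcode": "BL1"},
]

risky = reidentifiable(anonymized, public)
# The BL1 record matches two register entries and stays ambiguous; the BL2
# record matches only one, so it would be flagged as re-identifiable.
```

In practice the quasi-identifiers and the public sources consulted would be chosen for each release; the point of the test is that any record with a single unique match should either be generalized further or withheld.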
The code mentions that consent generally is not needed for lawful anonymization because it could be logistically burdensome or even impossible to obtain such consent.
The code states that organizations that anonymize personal data need an effective and comprehensive governance structure, with senior-level oversight of the arrangements that are put in place. It recommends that such a structure include the following:
A senior information risk owner with the technical and legal understanding to manage the process
Training, so that staff have a clear understanding of anonymization techniques, the risks involved and the ways of mitigating those risks
Procedures for identifying cases where achieving anonymization may be a challenge
A knowledge management system to identify and disseminate new guidance or case law that clarifies the legal framework surrounding anonymization
A procedure to facilitate the sharing of information on planned disclosures with organizations in the same sector or doing similar work in order to assess the risk of jigsaw identification
A privacy impact assessment
A transparent anonymization approach so that the public has easy access to clear information concerning why and how personal data is anonymized, and whether or not individuals have any control over the anonymization of their personal data
A review of the consequences of the anonymization program
A disaster recovery procedure should re-identification occur and lead to a compromise of the privacy of data subjects
TRUSTED THIRD PARTY
A trusted third party is an organization that can be used to convert personal data into anonymized data. The code highlights the value of using this kind of arrangement, particularly in the context of data anonymization on behalf of a number of organizations working together in a collaborative project. In such situations, the use of trusted-third-party arrangements means that the organizations involved never need to access each other’s personal data, greatly reducing the risk of violating data protection law.
While the code is undoubtedly a useful tool, it is worth remembering that it does not carry the force of law. Compliance with the code’s recommendations and advice on good practice is therefore not compulsory where the guidance goes beyond the requirements of the DPA.
ICO Publishes Guidance on Data Protection and Social Networking
Leigh J. Smith and Rohan Massey
The Information Commissioner’s Office (ICO) published guidance on when the Data Protection Act 1998 (DPA) applies to the use of social networking and online forums. The guidance is clear that when an organization posts personal data to, or collects personal data from, an online forum, the DPA applies. The guidance also sets out the steps the ICO expects an organization to take if it allows a third party to post personal data to its online forum.
The DPA applies a principle-based approach to the protection of personal data, which is defined as data from which a living individual can be identified. The DPA imposes obligations on the data controller. Given that “data controller” is defined broadly as “a person who determines the purposes for which and the manner in which any personal data are to be processed,” and that “processing” has an equally broad definition, social networking activities often fall within the scope of the DPA. An individual or organization that posts names or contact details on a social networking site is a data controller processing personal data under the DPA. The social network operator itself also may be considered a data controller.
The ICO issued guidance on the obligations the DPA imposes related to social networking, because the activities described above are commonplace and therefore impose obligations on a large number of individuals and organizations.
The guidance addresses what the ICO sees as the main issues raised by social networking. These are, broadly, personal use of social networking, use by organizations where personal data may be uploaded or downloaded by the organization, and the extent to which obligations are imposed on the operator of a social networking platform.
The ICO is of the view that personal use falls within the domestic purposes exemption found in Section 36 of the DPA. Blogging about family activities, for example, does not fall within the data protection principles. If a personal blog is used for commercial purposes—for example, a sole trader promoting its business—then the domestic purposes exemption is unlikely to apply. The ICO acknowledges that it is not always clear whether the use of social networks is for domestic or non-domestic purposes, and provides working examples for individuals and groups to consider.
The use of social networking by commercial organizations, in contrast, does not fall within Section 36, even if the information is posted by an individual on behalf of the organization. Uploading and downloading of personal data in such circumstances is governed by the DPA. The more complicated issue is when the provider of a social networking site or online forum is to be considered a data controller under the DPA. This is important because one of the data protection principles (Principle 4) requires that personal data be accurate and up-to-date. If the provider of a forum is a data controller, it is unclear how far the provider must go in monitoring and moderating the content of its forum to ensure compliance.
The ICO takes the view that in such circumstances the DPA imposes the obligation to take reasonable steps to make sure personal data posted by third parties and presented as fact rather than opinion is accurate. “Reasonable steps” are to be assessed on a case-by-case basis, but would not extend as far as pre-moderating every post. The ICO favors a proportionate approach and would look for the provider to have in place clear policies as to content and a mechanism for complaining about inaccurate posts.
The ICO’s guidance on social networking is welcome. In particular, the clarification on whether an individual can rely on the domestic processing exemption and the extent to which a provider of a forum must engage with third-party posters in order to comply with the DPA is helpful. Although social networking is a useful tool in engaging consumers, commercial organizations must ensure compliance with the DPA and also that their employees understand the organization’s obligations. The dynamic nature of social networking means that organizations should have clear policies and educate employees that may be engaged in social networking on behalf of the organization.
Waiting for the New Employee Data Protection Act
Paul Melot de Beauregard and Christian Gleich
Since 2010, Germany has waited for the enactment of a new employee data protection act to eradicate the existing legal uncertainties regarding the collection, processing and use of employees’ personal data. The widely differing opinions of the parties, trade unions and employers’ associations repeatedly have delayed the enactment.
At the beginning of 2013 there was reasonable hope that the governing parties finally would be able to realize the proposed legislation, but ultimately they delayed it once more for an indefinite period of time, presumably because of the federal elections in September 2013 and the contentious reactions to the draft legislation. Among these reactions, the question regarding the legitimacy of video surveillance at the workplace was a particularly difficult issue. The governing parties agreed to prohibit hidden video surveillance completely, but as a quid pro quo agreed to permit and even to extend the legitimacy of existing open video surveillance of employees. This decision in particular raised the apprehension of the trade unions.
Now, after the elections, it remains to be seen how the Employee Data Protection Act will be handled by the grand coalition that likely will rule the country in the coming four years. Given the participation of the Social Democratic Party, it must be assumed that the Employee Data Protection Act—if it is passed at all—will appear much more employee-friendly than the original bill. During the previous discussions about the original bill, the Social Democrats emphasized that they would take a stand for the data protection rights of employees. In addition, it remains uncertain whether there will be an enactment before the final European General Data Protection Regulation is adopted. It is possible that the governing parties will await that regulation in order to avoid the necessity of modifying the new Employee Data Protection Act shortly after its enactment to bring it in line with the provisions of the General Data Protection Regulation.
Guidelines on Marketing and Against Spam
Veronica Pinotti and Martino Sforza
On July 4, 2013, the Italian Data Protection Authority published its Guidelines on Marketing and Against Spam, which provide, among other things, guidance on unsolicited commercial offers, as well as on advertising through social networking sites (such as Facebook or Twitter) or messaging services (such as Skype or WhatsApp).
According to the guidelines, spamming (consisting of unsolicited e-mails and SMS) continues to be banned and exposes companies active in Italy to heavy fines (up to EUR 500,000). The guidelines also provide best practices on specific issues, such as commercial offers to companies’ own customers concerning goods similar to those that they have purchased (soft spam), which may be allowed in certain circumstances, and how to collect the consent of “followers” on social networking sites or messaging services. The guidelines also set out recommendations to companies commissioning marketing campaigns by contractors entrusted with contacting prospective customers.
Italian Data Protection Authority’s Short Guide on “Privacy: Working with Business – 10 Corporate Best Practices to Improve Your Business”
Veronica Pinotti and Martino Sforza
On May 27, 2013, the Italian Data Protection Authority published a short guide for companies doing business in Italy, titled “Privacy: Working with Business – 10 Corporate Best Practices to Improve Your Business.”
The guide provides practical recommendations on issues such as the allocation of responsibility between data controller and data processor, information arrangements and consent required depending on the data processing activities performed by the company, and the choice of the most appropriate security measures to minimize the risks of potential data privacy breach.
Data Privacy Developments in France
Jilali Maazouz, Myrtille Lapuelle and Ludovic Bergès
As in most other countries, data privacy law and “e-law” are constantly evolving in France. France’s data protection regulatory authority, the CNIL, oversees adherence to data privacy laws. In 2013, the CNIL had to deal with everything from the retention of customer data in a company’s network to video surveillance and privacy issues with personal files at work.
SALE OF CLIENT FILES NULLIFIED BY FRENCH SUPREME COMMERCIAL COURT FOR FAILURE TO COMPLY WITH CNIL REGISTRATION OBLIGATIONS
Although the CNIL provides administrative penalties, the issue of data privacy in France now has extended beyond being solely an administrative matter. In 2013, an unprecedented case before the French Supreme Commercial Court (Court of Cassation Chambre Commerciale) laid out the need to comply with French data privacy law. In this case, one e-marketing company attempted to sell client data (including customer names, addresses, dates of birth, etc.) to another company; however, the seller had not previously registered the customer data with the CNIL as required by French law. The court held that without proper registration, the data could not be sold or transferred, and therefore nullified the sale for failure to comply with the French regulations. With this decision, the Court of Cassation has opened a Pandora’s box.
PAYPAL ON THE CNIL’S RADAR
PayPal recently announced its decision to change its privacy rules (modified on October 18, 2013) and its conditions of use (to take effect on November 18, 2013). The change involves more detailed data collection and the exchange of personal information for marketing and fraud investigation purposes. Facebook and certain marketing companies now will have access to personal information. If PayPal customers reject these policy changes, PayPal will close those customers’ accounts. On October 23, 2013, the CNIL announced that it will conduct an analysis of the payment system.
MAIN CNIL SANCTIONS FROM 2013
The CNIL doled out sanctions in two other 2013 cases. The first involved video surveillance at work: an employer was sanctioned for “excessive data collection” after videotaping certain employees throughout the entire work day, despite the employer’s claim that the surveillance was for security purposes. The second involved improper data retention: BNP Paribas mistakenly maintained client information in files for unpaid debts when the clients had already paid off their credit. As a result, those clients’ credit scores were much lower than they otherwise should have been, which hurt the clients in their search for further loans for non-professional purposes. In the video surveillance case, the CNIL issued a EUR 10,000 fine for excessive data retention, while in the BNP Paribas case, the CNIL issued a warning for failing to cancel the clients’ unpaid credit in its system.
CNIL INVOLVEMENT IN INTERNATIONAL ISSUES
Certainly one of the largest issues of the year across Europe and in France involves the U.S. government’s National Security Agency and its PRISM program on data surveillance for fighting terrorism. On October 24, 2013, the CNIL stated its concern that a general, ambiguously defined and non-targeted surveillance program is a violation of privacy laws. The program is expansive and has exhibited unprecedented reach into ordinary citizens’ personal lives. The CNIL recommends a unified European response and emphasizes the need for Europe and the United States to agree on proper regulation of data exchange between countries in order to avoid a society of complete and unencumbered electronic surveillance.
As the leader of the European taskforce (data protection authorities from France, Germany, Italy, the Netherlands, Spain and the United Kingdom), the CNIL also has been deeply involved in the regulation of one major U.S. technology company’s new data privacy policies. On March 19, 2013, representatives of that U.S. company were invited, at their request, to meet with the taskforce led by the CNIL. Since this meeting, the company has not made any changes to its policy based on the taskforce’s recommendations. Consequently, all the authorities composing the taskforce launched enforcement actions on April 2, 2013, on the basis of the provisions laid out in their respective national legislation (investigations, inspections, etc.).
In June 2013, the CNIL ordered the company to comply with the French data protection law within three months and to take the following actions in particular:
Define specified and explicit purposes for data collection
Inform users with regard to the purposes of the processing implemented
Define retention periods for the personal data processed
Not proceed, without legal basis, with the potentially unlimited combination of users’ data
Fairly collect and process passive users’ data
Inform users and obtain their consent before storing cookies in their terminal
On the last day of the three-month time period given for response, the company contested the CNIL’s reasoning and, notably, the applicability of the French data protection law to the services used by residents in France. Therefore, the company has not implemented the requested changes. In this context, the chair of the CNIL has designated a rapporteur for the purpose of initiating a formal procedure for imposing sanctions, according to the provisions set forth in the French data protection law.
FORECAST FOR 2014
E-law continues to rapidly expand as a legal field, and no doubt increasing numbers of complaints will be filed and unprecedented issues raised as data storage becomes more complex and intertwined with professional and personal information. In the coming months, new developments are sure to emerge as French authorities aim to sanction the U.S. technology company described above for its new data privacy policies and as more information is released on the NSA program and PRISM surveillance. It should be noted that the CNIL has asked that the education and information of “digital citizens” be declared an issue of national importance in 2014, because companies and individuals increasingly are required to be digitally informed and responsible in this digital age.
Middle East & Africa
Emerging Data Protection Laws in the Middle East and Africa
Keo Shaw, Catherine O’Connell and Rohan Massey
The authorities of the European Union currently view all Middle Eastern countries as failing to provide an adequate environment for the safe hosting and protection of personal data. As new technologies are deployed and commerce becomes progressively global, African and Middle Eastern nations increasingly view data protection as an important element to attract foreign business. This outlook is resulting in a rapidly developing privacy landscape in these regions.
United Arab Emirates
Introduction of Federal Information Security Measures
Keo Shaw, Catherine O’Connell and Rohan Massey
In light of the introduction of Emirate-specific data security measures in Abu Dhabi and Dubai in recent years, data protection was formalized at the federal level with the implementation of Cabinet Resolution No. 21 of 2013 on October 29, 2013. The Resolution applies to the ministries, public corporations, institutions and public bodies affiliated with the federal government.
Rather than imposing strict obligations on government departments, the Resolution takes the form of an acceptable use policy, with the aim of raising awareness about information security. In addition, the Resolution implements standards for the use of IT systems and procedures that should be implemented by federal authorities in the context of establishing a federal legal framework for data protection.
Different sections of the Resolution address different aspects of the federal government’s infrastructure, including the use of different electronic communication methods. The Resolution also distinguishes between four types of confidential information, namely very confidential, confidential, restricted and public. Further, an obligation is placed on federal authorities to classify information within these four categories and to define users’ rights in respect of these categories accordingly.
The Resolution implements criminal liability under United Arab Emirates law for employees who are in breach, which could include both financial penalties and imprisonment, along with any disciplinary penalties that may apply within an employee’s federal authority.
Kazakhstan Enacts Privacy Law
Keo Shaw, Catherine O’Connell and Rohan Massey
On May 21, 2013, the Kazakhstan government adopted new data privacy legislation, “On Personal Data and Their Protection,” governing the use and collection of personal data, which took effect November 26, 2013.
Prior to the introduction of this unified legislation, data protection in Kazakhstan had been implemented through various legal acts, including the Civil Code, the Labor Code and the Banking Law, applicable to different areas. The new consolidated law applies to individuals, legal entities and state authorities collecting and processing personal data, whether in electronic form or otherwise, and works parallel to the existing regulatory framework.
The law introduces new concepts of personal data and the collection and processing of such data, in addition to definitions of database controller and database operator. Under the new law, database controllers and operators are required to state the purpose for which data is collected, and such use must adhere to this stated purpose. This terminology and these measures are reminiscent of the European approach to data protection. Further, prior to the collection and processing of personal data, database controllers and operators are obligated to obtain the consent of the data subject, subject to certain exceptions.
The law also addresses trans-border flows of personal data and introduces specific provisions to stipulate what data may be transferred to third parties and under what conditions. Transfers may occur outside of Kazakhstan without the owner’s permission where the recipient ensures personal data protection, although consent must be obtained where the transfer is outside the scope of the stated purpose of use.
Changes made to existing legislation to implement the new law include providing penalties for the improper collection and processing of personal data and failure to protect personal data under the Administrative and Criminal Codes, among others. Administrative offenses may lead to fines and/or the confiscation of objects or instruments of the offense in question, while criminal liability can result in fines, prohibition from holding certain positions, up to one year of community service and/or up to five years’ imprisonment.
South Africa
Protection of Personal Information Bill
Keo Shaw, Catherine O’Connell and Rohan Massey
After a decade-long process, South Africa’s Protection of Personal Information Bill (PoPI) was passed by its Parliament on August 20, 2013. PoPI represents South Africa’s first comprehensive data protection legislation. The new legislation provides protection not only for individuals, but also for “juristic persons,” including legal entities. After translation into Afrikaans, PoPI will be sent to the South African president for assent.
PoPI incorporates several data protection “conditions” and is in essence based on Directive 95/46/EC (Data Protection Directive) of the European Union. To this end, it is thought that the introduction of PoPI could help ensure a finding by the European Commission that South Africa’s privacy regime provides adequate protection consistent with EU standards. Such a finding of adequacy would ensure that the flow of personal data from the European Union to South Africa becomes less restrictive.
The eight conditions established by PoPI, which must be met in order for personal data to be transferred lawfully, are accountability, processing limitation, purpose specification, further processing limitation, information quality, openness, security safeguards and data subject participation. Similar to the EU Data Protection Directive, PoPI places further restrictions on data transfers outside of South Africa unless the recipient country has laws that provide a similar level of protection for the personal data.
In addition, PoPI implements requirements of notice and consent applicable to the data subject in terms of collecting and using personal data, and limits the retention period, in most cases, to no longer than is necessary to achieve the declared purpose for which the data was collected. Further, PoPI establishes a right on behalf of data subjects to access and correct their collected personal data.
PoPI also establishes South Africa’s data protection authority, an independent Information Protection Regulator, and makes data breach notifications to the affected individual and the Regulator compulsory. Individuals will be able to bring claims, or have the Regulator do so on their behalf, seeking injunctive relief and damages. PoPI also will affect companies directly, in that they will be required to appoint data protection officers to ensure compliance and liaise with the Regulator, and individuals will be given the right to demand that businesses employ reasonable data security safeguards.
The Regulator is given powers to carry out investigations, seek fines of up to ZAR 10 million (approximately $972,000/£604,000), and impose prison sentences of up to 10 years for the obstruction of the activities of the Regulator and up to 12 months for other violations of PoPI. Commentators have noted the uncertainty surrounding the extent to which PoPI will be enforced, however, noting that overly strict enforcement could have the negative effect of lessening South Africa’s appeal as an investment market. For example, it is unclear to what extent the Regulator will be adequately trained and resourced.
Regardless, the potential new exposure to fines, enforcement notices, infringement notices, imprisonment and civil actions for damages makes companies’ compliance with PoPI very important. PoPI provides a 12-month grace period for organizations to ensure their data protection policies adhere to the new legislation.
Morocco Accedes to Council of Europe Convention 108
Keo Shaw, Catherine O’Connell and Rohan Massey
On June 6, 2013, Morocco adopted a bill approving the Council of Europe’s Convention 108 for the Protection of Individuals with Regard to Automatic Processing of Personal Data. Convention 108 represents the Council of Europe’s first legally binding instrument governing data protection, and once it is ratified, Morocco will be the second country that is not a member of the Council of Europe to have acceded to the Convention (Uruguay being the first).
The Convention was adopted in 1981, and the Council of Europe is discussing updating it, particularly in light of recently proposed changes to the EU data protection framework. The Council of Europe is said to be encouraging Morocco to accede to the Additional Protocol, which provides for the establishment of national supervisory authorities and restricts trans-border data flows to countries that do not provide an adequate level of protection. Given that Morocco has already implemented a Data Protection Act, this accession indicates Morocco’s serious commitment to comprehensive data security standards.
Following Russia’s ratification of the Convention in May 2013, Morocco’s upcoming accession will bring the number of parties to the Convention to 47.
Tanzania
New Legislation Against Cybercrime
Keo Shaw, Catherine O’Connell and Rohan Massey
The Bank of Tanzania legal counsel announced at a media seminar in September 2013 that the bank has been working with the Tanzanian government and other stakeholders to draft three pieces of legislation addressing cybercrime. The legislation is thought to be in its final stages and will provide a framework for online financial transactions, for which there is no provision in the current penal code, and for the criminalization of computer- and network-related offenses.
In conjunction with the increase of the number of transactions that occur online, Tanzania has seen an increase in the amount of cybercrime being committed. Such cases have since proved difficult to investigate and prosecute because of the lack of legislation in this area. As a result, at the media seminar, the Bank of Tanzania’s legal counsel emphasized the significance of the new legislation and its importance in the fight against internet-based criminal activity.
China has begun to address issues surrounding data privacy with the publication of several new guidelines and provisions. Meanwhile, a number of other Asia-Pacific countries are looking to implement comprehensive data privacy regimes similar to those promulgated in Europe.
Asia’s Move Towards “European Style” Comprehensive Legislation
Keo Shaw and Heather Egan Sussman
Following is an update on recent legislation that mirrors similar regulations in Europe:
Singapore: The Personal Data Protection Act has been coming into effect in stages since January 2, 2013, with the main data protection rules coming into force July 2, 2014. Additionally, in February 2013, Singapore’s Personal Data Protection Commission issued two public consultation papers:
Advisory guidelines examining key concepts in the Personal Data Protection Act
Selected topics: analytics and research, anonymization, employment, use of national ID numbers and online activities
Malaysia: The Personal Data Protection Act 2010 is expected to come into force by the end of 2013 and is noteworthy for the heavy penalties that it introduces for non-compliant companies.
Hong Kong: In 2012, the Personal Data (Privacy) (Amendment) Ordinance was introduced. Most amendments came into force in October 2012, but a number of provisions, most notably relating to direct marketing and legal assistance, came into force April 1, 2013.
Vietnam: The New Labor Code (effective May 1, 2013) contains new provisions relating to the disclosure of employee and employer information.
Status of APEC Cross-Border Privacy Rules Effort
Ann I. Killilea
On August 21, 2013, IBM announced that it is the first company to achieve certification under the new Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR) system. In 2012, the United States and Mexico were APEC-approved as participating countries, and TRUSTe was approved as an accountability agent. On June 7, 2013, Japan applied to participate in the program. It appears that progress is underway in attracting participants (companies, countries, accountability agents and enforcement agencies) to the CBPR system.
These developments raise expectations that certified companies may be enabled to engage in cross-border data transfers without running afoul of national data transfer restriction laws. This is not the case. While movement is afoot, concrete benefits have yet to be articulated and realized.
Before an APEC country can become a participating member, all three of the following key structures must be approved:
National Laws. The country’s laws and regulations that apply to cross-border data transfers must be described and reviewed. The United States has been approved as a participating member country.
Accountability Agent. An accountability agent is a third-party organization that provides verification services related to data protection policies and practices for companies seeking CBPR certification. At least one APEC-recognized accountability agent within that country must be approved by APEC. TRUSTe has been certified as an accountability agent for the United States and is the only agent approved so far for the system.
Public Enforcement Agency. A governmental enforcement agency within that country must be approved as a final enforcement arm. The Federal Trade Commission has been designated as the enforcement agency for the United States.
Presently, only two of the 21 APEC member countries are actively participating.
The system is a voluntary, certification-based scheme that incorporates the APEC Privacy Framework.1 To participate in the system, a company in a participating country must develop and implement privacy policies and practices that are consistent with the APEC Framework and, in particular, the nine data privacy principles. A company must then submit its policies and practices, in the form of a completed APEC-recognized CBPR questionnaire, to a locally based, APEC-recognized accountability agent for assessment against the CBPR program requirements. If an accountability agent determines that an organization’s policies and practices are consistent with these requirements, the company is certified as CBPR-compliant and its details are added to the CBPR “white-list.” This then allows the company to represent to consumers that it complies with data privacy standards in the program requirements. It is hoped that in the future each APEC country will recognize these requirements as acceptable data privacy standards.
1 The Privacy Framework established nine core APEC privacy principles on which this new governance system is based: preventing harm, notice, use, collection limitation, choice, security safeguards, integrity, access and correction, and accountability.
Joshua Harris, chair of the APEC Cross-Border Joint Oversight Panel, charged with operationalizing the CBPR system, indicated recently that the benefits of certification are still to be defined. It is important to note that there is no overarching agreement or treaty binding the APEC countries. Consequently, the certification only indicates a company’s compliance with program requirements. It does not purport to certify that the company is compliant with domestic privacy regulation that applies above and beyond the program requirements. Domestic regulation in the APEC region takes many forms, including detailed statutes in Australia, Singapore and Hong Kong; comparatively high-level statutes in the Philippines and Malaysia; and absence of economy-wide privacy statutes in many other APEC jurisdictions. Compliance with domestic privacy regulation requires country-by-country review, even if a company is CBPR-certified.
Two benefits of certification have been noted, and IBM addressed both of them in its announcement:
Privacy Reputation. Certification for a company may yield a type of privacy seal of approval within the APEC region. IBM defined the certification benefits in this way: “This certification demonstrates IBM’s commitment to protecting individuals’ data privacy.”
Influence on the Future of Data Transfer Regulation. Currently, APEC and the European Commission are exploring the interoperability between APEC CBPRs and the European Binding Corporate Rules. As IBM stated in its certification announcement, “CBPR rules will become the foundation of a globally accepted system that enables data to be shared throughout different regions with strong and trustworthy privacy protections.”
Large multinational companies striving to become leading players in this arena may seek certification to highlight their best practices and to voice their views on the future of global data transfer regulation. By building a corporate privacy management program consistent with the APEC privacy principles, however, any U.S.-based multinational, certified or not, will benefit from the work and vision of this CBPR process.
Information Security Technology – Guidelines for Personal Information Protection Within Public and Commercial Services Information Systems
Alex An and Frank Schoneveld
The “Information Security Technology – Guidelines for Personal Information Protection Within Information System for Public and Commercial Services” (the Guidelines) came into effect on February 1, 2013. The Guidelines were issued as a voluntary national standard by the Standardization Administration.
The Guidelines are limited to the processing of personal information that involves the use of an information system. The Guidelines define personal information and clarify key points for collecting, processing, transferring and deleting personal information.
DEFINITION OF PERSONAL INFORMATION
In the Guidelines, personal information is defined as “computer data that may be processed by an information system, relevant to a certain natural person, and that may be used solely or together with other information to identify such natural person.”
The Guidelines divide personal information into “sensitive personal information” and “general personal information.” The former is defined as “information that would have an adverse impact on the subject if disclosed or altered,” and may include ID numbers, race, political viewpoints, religion, genetic profile, fingerprints, etc., while the latter is defined as “all personal information other than sensitive personal information.”
COLLECTION OF PERSONAL INFORMATION
According to the Guidelines, before personal information is collected, the personal information subject should be notified of the following:
The purpose of collection
The means of collection, the specific information collected and the time of retention
The scope of use of the collected personal information, including the scope of disclosure or provision of personal information to other organizations and institutions
Measures for protection of personal information
The name, address and contact information of the administrator of the personal information
Risks the user may encounter after providing personal information
The consequences if the user is unwilling to provide personal information
The channel through which a user may file a complaint
If personal information will be transmitted or entrusted to another organization or entity, the subject of the personal information must be notified of the purpose for such transmission or entrustment; the specific personal information and scope of use of the transmitted or entrusted personal information; and the name, address and contact information of the recipient of the entrusted personal information.
It should be noted that consent is implied for the collection of general personal information, but express consent must be obtained before collecting sensitive personal information.
PROCESSING PERSONAL INFORMATION
Personal information may only be processed according to the purposes, scope, methods and approaches previously notified to the personal information subject. The personal information subject has the right to obtain the status of any processing and require the administrator of the personal information to edit or supplement the information if it is misleading.
TRANSFER OF PERSONAL INFORMATION
Similar to the processing of personal information, the transfer of such information must follow the purposes and scope notified to the personal information subject. It might be noted that the Guidelines prohibit the transfer outside China of any personal information to an individual or entity without express consent, government permission, or explicit permission provided under relevant laws or regulations.
DELETION OF PERSONAL INFORMATION
The personal information subject has the right to require deletion of his or her personal information with appropriate reason. The personal information must be deleted as soon as the purpose for which the information is collected has been fulfilled, or the time of retention has expired. If the administrator of personal information becomes bankrupt or is dismissed, and it cannot fulfill the purpose for which the personal information is processed, it must delete the personal information.
The Guidelines demonstrate China’s self-regulatory efforts on the protection of personal information. While the Guidelines lack the force of law, public sector bodies and companies are encouraged to follow the Guidelines as a best practice. If there is not widespread adoption of these Guidelines, they might be introduced in binding legislation in the future.
Provisions on the Protection of Telecommunications and Internet Users’ Personal Information
Henry Chen, Samon Sun and Jared T. Nelson
On July 16, 2013, China’s Ministry of Industry and Information Technology (MIIT) published the Provisions on the Protection of Telecommunications and Internet Users’ Personal Information (Provisions), which came into effect on September 1, 2013. The Provisions’ primary purposes are to improve the protection of telecommunication and internet users’ personal information, and to improve the enforcement
of the Decision on Strengthening Protection of Online Information, which was adopted by China’s Standing Committee of the National People’s Congress on December 28, 2012.
SCOPE OF PROTECTION
According to the Provisions, users’ personal information is information that can be used alone or combined with other information to identify a user, and includes information that is collected through the provision of services by telecommunication business operators (TBOs) and internet information service providers (IISPs). Personal information includes a user’s name, date of birth, identification card number, address, telephone number, account name and password, as well as “meta information” about a user’s habits, including the time and location of the use of the services.
COLLECTION AND USE
The Provisions require TBOs and IISPs to comply with the principles of legitimacy, justification and necessity, and to be responsible for the security of personal information. Under the Provisions, TBOs and IISPs are required to take the following actions:
Create and publish their policies for the collection and use of personal information
Obtain users’ consent before collecting and using personal information
Explicitly state the purpose, methods and scope of the collection and use of the personal information
Limit the scope of collection to only that personal information that is necessary to provide services to users
Immediately cease the collection and use of personal information when users stop using the services, and provide supported channels for users to cancel their accounts
Not divulge, falsify, damage, sell or illegally supply users’ personal information
DUTY TO MONITOR THIRD-PARTY AGENTS
When commissioning agents to conduct sales, technological services or other services directly provided to users involving the collection and use of personal information, TBOs and IISPs must supervise and manage the agents’ work to ensure compliance with personal information protection.
THE SECURITY GUARANTEE SYSTEM
The Provisions clearly explain measures that TBOs and IISPs must take to prevent users’ personal information from being disclosed, damaged, falsified or lost. These measures include management systems, access supervision, storage media standards, information systems, operating records and other similar aspects. The Provisions also stipulate systems of self-inspection for personal information protection, and training for TBO and IISP employees.
THE SUPERVISION AND INSPECTION SYSTEM
The Provisions require the authorities that regulate the telecommunications industry to carry out supervision and inspection, and require TBOs and IISPs to cooperate with those actions. In addition, the Provisions entitle the authorities to examine the status of protection for users’ personal information when issuing a permit and carrying out annual inspection of TBOs. The Provisions also stipulate that authorities must record any violations in the social credit files and make such violations public information.
Violations of the Provisions may result in penalties, including administrative warnings, fines of up to RMB 30,000 and criminal liabilities. Some commentators believe that the fine amount is too low and that there may not be adequate incentives to encourage compliance by TBOs and IISPs. This level of fine, however, is the maximum amount allowed for a ministry-level regulation such as the Provisions.
As China places more emphasis on the protection of personal information, an increasing number of privacy-specific laws and regulations will be enacted. It is expected that more high-level laws will be issued to provide wider protection for personal information, not only within the area of telecommunications and the internet, but also in all aspects of daily life in China.
China’s New Consumer Protection Law Strengthens the Protection of Consumers’ Personal Information
Henry Chen, Samon Sun, Jared T. Nelson and Alex An
On October 25, 2013, China’s Standing Committee of the National People’s Congress passed the Decision
on Amendments to the Consumer Rights and Interests Protection Law and released the amended Consumer Rights and Interests Protection Law, which will come into effect on March 15, 2014. The Consumer Rights and Interests Protection Law came into effect at the beginning of 1994. Some provisions of the law have become outdated because of the significant changes in the Chinese consumer market and the economy overall. Strengthening the protection of consumers’ personal information is one of the highlights of the new Consumer Rights and Interests Protection Law.
PROTECTION OF CONSUMERS’ PERSONAL INFORMATION
The amended Article 14 of the new law explicitly provides that consumers have the right to require their personal information to be legally protected when purchasing or using goods, or receiving services.
Specific rules for protection of consumers’ personal information are provided under the newly added Article 29, including three key points:
Where business operators collect or use personal information of consumers, they must (i) abide by the principles of legitimacy, fairness and necessity; (ii) expressly inform the consumer of the purpose, method and scope of the collection and use; (iii) publish the business operator’s policy on collection and use; and (iv) not violate any laws, regulations or mutual agreements between the company and the consumers.
Business operators must keep information collected from consumers confidential and must not disclose, sell or illegally provide such information to others. Business operators must take technical and other necessary measures to secure consumers’ personal information and to prevent the disclosure or loss of that information. If the personal information has been or might be divulged or lost, then the business operator must immediately take remedial measures.
Business operators must not send commercial messages to a consumer without that consumer’s consent or request, or if a consumer has explicitly refused to receive such commercial messages.
The newly added Article 29 mainly echoes the provisions in the Decision on Strengthening Protection of Online Information, which was adopted by the Standing Committee of the National People’s Congress on December 28, 2012. There are important differences between these two laws,
however. The Decision on Strengthening Protection of Online Information primarily focuses on the protection of citizens’ digital personal information, regardless of whether such data concerns consumers or non-consumers, while the provisions in the new Consumer Rights and Interests Protection Law concentrate on the protection of consumers’ personal information, regardless of whether such information is online (digital) information or non-digital information.
The new law stipulates corresponding legal liabilities for infringing consumers’ rights involving personal information; these liabilities can be divided into civil and administrative liabilities.
According to the amended Article 50 of the new law, business operators that infringe consumers’ rights regarding protection of personal information will be ordered to cease the infringement, restore any damages to the consumers’ reputation, eliminate the bad effects of the violation, make apologies and compensate the victims.
According to the amended Article 56 of the new law, business operators that infringe consumers’ rights regarding protection of personal information may receive a variety of punishments, including a warning, confiscation of unlawful earnings, the imposition of a fine up to RMB 500,000 or up to 10 times the value of the unlawful earnings, or even the suspension or revocation of their business license.
The Chinese government has begun to address more comprehensively the protection of personal information, and similar provisions should be expected in future laws and regulations. These regulations are regarded by commentators as essential for increasing trust and accountability in the broader retail market, and are likely to help contribute to an increase in China’s consumption in the long run.
Data Privacy in South Korea
Paul J. Kim and Solyn Lee
On March 29, 2010, South Korea passed the Personal Information Protection Act (PIPA), a comprehensive law governing data privacy that became effective on September 30, 2011. South Korea also has industry-specific legislation on data privacy as follows:
The Act on the Promotion of Information and Communication Network Utilization and Information Protection, which applies to telecommunications businesses
The Use and Protection of Credit Information Act, which applies to banks and other persons handling credit information
The Act on Real Name Financial Transactions and Guarantee of Secrecy, which applies to information obtained by financial or financial services institutions
PIPA broadly restricts the collection and handling of any personal information by any person, company or government agency. “Personal information” means any information from which, by itself or combined with other information, an individual can be identified, whether from his or her name, national identification number, voice, sound, image or other attributes. Generally the individual’s informed consent will be required for any collection, use or disclosure of personal information.
However, the individual’s informed consent may not be sufficient to collect and use Resident Registration Numbers (RRNs), Korea’s national identification numbers. On July 30, 2013, the Korean Ministry of Security and Public Administration (MOSPA) announced an amendment to PIPA concerning collection and use of RRNs, which will come into force in August 2014. Companies are prohibited from collecting and using RRNs based solely on the consent of a data subject unless such collection and use is (i) specifically authorized by the applicable law, (ii) inevitable in light of imminent danger to the physical safety or property of a data subject or a
third party, or (iii) authorized by a separate MOSPA ordinance. Any data on RRNs previously collected and stored, if not authorized by the applicable law, should be destroyed within two years after the amendment becomes effective.
The reach of the data protection laws in South Korea is not limited to companies doing business in South Korea, and extraterritorial application is possible with respect to matters that affect South Korean data subjects. Under PIPA Article 17(c), when a company intends to provide a third party outside South Korea with personal information, the company must notify the data subject of certain information in advance, including, among other things, the name of the third party outside South Korea, the purposes of collection or use of the personal information, the specific types of personal information to be provided and the period of retention and use. The company also must obtain prior consent.
PIPA provides clear rules on video images: video recording and transmission devices may be installed and operated only to the extent necessary for crime prevention and other limited purposes.
Individuals may bring a claim against offending companies for damages arising from their breach of PIPA. Importantly, PIPA places an evidentiary burden on the company to prove that there was no negligence on its part in handling the personal information. Therefore, South Korean companies must ensure that they have the appropriate records in place to show full compliance with PIPA, including the amendments related to RRNs that will come into force in August 2014.
Developments in Australia
SIGNIFICANT CHANGES TO FEDERAL PRIVACY ACT
The Federal Parliament passed amendments to the Privacy Act in November 2012, but the changes do not go into effect until March 12, 2014, in order to give businesses, federal government agencies and other organizations sufficient time to prepare for the changes.
The amendments introduce 13 new Australian Privacy Principles (APPs) that are applicable to both business and government. These APPs replace both the Information Privacy Principles applicable to government agencies and the National Privacy Principles applicable only to businesses and other private organizations. Small businesses generally are exempt from the requirements of the Privacy Act.
The office of the Australian Information Commissioner has issued draft guidelines on the new APPs that cover, in some detail, the collection, processing, integrity and access to, and correction of personal information. Notably, the changes to the Privacy Act take the following actions:
Impose liability directly on businesses for breach of the APPs by their offshore data processing contractors
Give greater powers to the Australian Information Commissioner to assess privacy performance, accept undertakings and impose civil penalties of more than $1.5 million for breach
Change the rules concerning direct marketing, requiring greater consent from consumers
Require businesses to introduce and document a privacy compliance program that ensures compliance with the new APPs, as well as an adequate privacy-complaints-handling mechanism
Enable the Australian Information Commissioner to recognize external dispute resolution schemes to handle privacy-related complaints (guidelines on such external schemes are also published)
Make changes to credit-reporting laws allowing the reporting of credit and repayment history (with certain exceptions), and requiring a credit provider to be a member of an external dispute resolution scheme in order to participate in the credit reporting system
Introduce new Codes of Practice for APPs and credit reporting
PRIVACY BREACH ALERTS BILL
In May 2013, a new Federal Parliamentary bill was introduced to complement the Privacy Act amendments that go into effect in March 2014. This Privacy Breach Alerts Bill would require federal government agencies, businesses and private organizations to notify individuals affected by certain “serious data breaches.” Failure to comply with the notification obligation would be regarded as an interference with an individual’s privacy, thereby triggering the powers of the Information Commissioner, including the power to seek civil penalties. There is currently a voluntary guide issued by the Information Commissioner with advice on how to handle a data breach. The Privacy Breach Alerts Bill was largely supported by the then parliamentary opposition (now the government), so it is expected to become law in 2014 around the same time that the amendments to the Federal Privacy Act go into effect.
NEW SOUTH WALES: CCTV IN PUBLIC PLACES
In May 2013, following a court finding of breach of the Privacy and Personal Information Act by a local council using closed-circuit television (CCTV), local councils in New South Wales were granted an exemption from the Privacy and Personal Information Act to use CCTV in public places. Local government in New South Wales now can legally transmit live CCTV images of public places to the New South Wales police.
VICTORIA: NEW PRIVACY AND DATA PROTECTION COMMISSIONER
In December 2012, the Victorian government decided to merge the offices of the Commissioner for Law Enforcement Data Security and the Privacy Commissioner. The two offices were formally merged in 2013. The new Commissioner is tasked with developing a regulatory regime that provides security of personal information together with oversight of security standards for systems and processes that store, access and exchange personal information.
As more multinational companies expand into Latin America, the call for data protection laws is increasing. Read on for a country-specific primer of the most recently enacted data privacy laws in the region.
Data Privacy in Latin America
Effie D. Silva and Marcos Jiménez
Given the proliferation of multinational corporations in Latin America, data privacy is a growing cause for concern and the impetus for the enactment of data protection laws in Colombia, Costa Rica, Peru, Argentina, Chile and Uruguay. In addition, Brazil currently is considering whether to enact a data protection law.
Although every country within the European Union has adopted similar data protection laws based on the common EU Directive, Latin American countries lack such uniformity. Indeed, the data protection laws in Latin America vary in both regulation and procedure from country to country. For example, while most Latin American countries require registration with a data protection authority, many have differing security obligations. Therefore, it is imperative that companies conducting business in Latin America keep abreast of the country-specific laws, because the obligations differ significantly in each country.
Costa Rica’s Data Protection Agency of the People Implements Personal Data Regulations
Effie D. Silva
On September 5, 2011, Costa Rica enacted the Protection of the Person Concerning the Treatment of Personal Data (Law No. 8968). Modeled after the EU Data Protection Directive, it regulates almost every kind of personal data processing activity and requires express written consent for many of them. The law also established the Data Protection Agency of the People (Prodhab), which was given authority to issue sanctions for violations of the law. In March 2013, Prodhab issued and implemented regulations called the “Regulations of the Law of Protection of the Person in the Processing of His Personal Data.” The regulations require, among other procedures, data controllers to register their databases with Prodhab, notify Prodhab of any breaches, and notify subjects of any irregularities in the processing or storage of their data.
Supplemental Regulations to the Data Protection Act
Effie D. Silva
In late 2012, Colombia enacted the Data Protection Act (Law 1581) to regulate the protection of personal data and safeguard the constitutional right of privacy. On June 27, 2013, Regulations of the Data Protection Act (Decree 1377) were issued to supplement the Data Protection Act. The regulations both identified the information to be disclosed to data owners and implemented consent requirements and standards for cross-border transfers. For example, the regulations require data controllers to develop privacy policies that require the provision of the following information to the data owner: the name of the organization receiving the data, the purpose of the data collection, the person responsible for the data request and the rights of the data owner, including the procedure for the owner to lodge a complaint or revoke authorization for the transfer.
Peru Implements Law for Personal Data Protection
Effie D. Silva
On July 3, 2011, Peru published the Law for Personal Data Protection. The regulations implementing this law were issued in March 2013 (Reglamento de la Ley No. 29733, Ley de Proteccion de Datos Personales). The regulations include provisions that permit use of digital signatures to satisfy written consent, permit privacy policies to satisfy information notice requirements, require contractual provisions for cross-border transfers and place specific requirements on cloud computing service providers. In addition to the regulations, the Peruvian Congress amended the Criminal Code (Law No. 30076), making unauthorized or illegal trafficking of data a criminal offense (Article 207-D of the Criminal Code). A person who creates a log-in to or misuses a database on a natural or legal person, identified or identifiable, in order to market, trade, sell, promote, encourage or provide information of a personal, family, property, labor, financial or other similar nature, thereby causing injury, may be punished by imprisonment of three to five years.
Uruguay Ratifies Convention 108
Effie D. Silva
In April 2013, Uruguay became the first non-European nation to ratify Convention 108 of the Council of Europe (CoE) and its Additional Protocol, making it the 45th country to be party to the Convention. Convention 108 was the first legally binding instrument governing data protection, adopted by the CoE in 1981. It both established minimum data protection standards against abuses occurring from the collection and processing of personal data and regulated cross-border data transfers. The CoE is currently discussing proposals to modernize the Convention in light of the upcoming changes proposed by the draft EU Data Protection Regulation.
Brazil Moves Toward a Comprehensive Data Law
Effie D. Silva
Brazil is working on enacting a comprehensive data protection law modeled after both the European Data Protection Directive and the Canadian Data Protection Law (PIPEDA). The draft bill guarantees a list of citizens’ basic rights regarding their personal data, including, but not limited to, the right to access one’s data, the modification or deletion of inaccurate or wrong data, objection to its processing, compensation for its misuse, and participation in decision making. A proposed bill that would provide civil rights for internet users, called Marco Civil da Internet, has not yet been approved. Recently, there have been proposed amendments to the bill that would require internet companies to physically store data about Brazilians within Brazil. Furthermore, the Brazilian government announced that it is decommissioning Microsoft Outlook and will be activating its own secure e-mail system operated by the Federal Data Processing Service.