Introduction

In this issue, we cover hearings convened by the House Appropriations Subcommittee and the House Financial Services Committee's Task Forces on Financial Technology and Artificial Intelligence. We report that the National Institute of Standards and Technology (NIST) closed the comment period on its Privacy Framework, and we review the Federal Trade Commission's (FTC) workshop on "The Future of the COPPA Rule" and the letter that Senators sent prior to the workshop encouraging the FTC to protect children's privacy. In California, we examine Alastair Mactaggart's effort to place a privacy initiative on California's 2020 ballot and the draft California Consumer Privacy Act (CCPA) regulations announced by the California Attorney General. Across the pond, we discuss the outcome of Google's "Right to Be Forgotten" case in the European Union (EU) and the EU top court's ruling requiring active consent for cookie tracking. We also highlight discussions between EU and United States (U.S.) officials regarding the EU-U.S. Privacy Shield.

Heard on the Hill

House Appropriations Subcommittee Holds Hearing on FTC Oversight

On September 25, 2019, the House Committee on Appropriations (Committee) Subcommittee on Financial Services and General Government (Subcommittee) convened a hearing, "Federal Trade Commission: Protecting Consumers and Fostering Competition in the 21st Century." Federal Trade Commission (FTC) Chairman Joseph Simons and FTC Commissioner Rohit Chopra testified at the hearing. Witnesses and Subcommittee members discussed a wide range of privacy issues, from state privacy initiatives to potential federal privacy frameworks.

During the hearing, members and witnesses discussed funding of the FTC. Subcommittee Chairman Mike Quigley (D-IL) expressed support for the FTC, referring to it as "one of the most important" agencies that the Subcommittee funds. FTC Chairman Simons noted that the FTC would be able to increase privacy enforcement if it received the additional $40 million that H.R. 3351 would allocate to the agency. H.R. 3351 passed the House and was referred to the Senate Committee on Appropriations on June 27, 2019; no Senate hearings have been held on the bill. Subcommittee Ranking Member Tom Graves (R-GA) cautioned that the FTC must guard against overregulation, which he said would stifle innovation.

Subcommittee members and witnesses voiced a need for a federal privacy policy framework. FTC Chairman Simons and FTC Commissioner Chopra expressed support for a federal consumer privacy legislative framework. FTC Commissioner Chopra expressed support for solutions that would help ensure the protection of sensitive data from domestic and international actors. When discussing potential models for a federal consumer privacy legislative framework, Rep. Norma Torres (D-CA) asked the FTC Commissioners what could be learned from various state privacy laws and bills, such as the California Consumer Privacy Act (CCPA). In response, FTC Commissioner Chopra noted that state attorneys general have brought successful enforcement actions in the tech space. He also attributed the strength of the CCPA to its "zero tolerance" rules for privacy violations.

Members of the Subcommittee and witnesses expressed concerns about children's privacy and location privacy. When discussing the FTC's review of the Children's Online Privacy Protection Act (COPPA) Rule, FTC Commissioner Chopra noted that the FTC should not weaken COPPA. On location privacy, FTC Chairman Simons noted that location information is "particularly sensitive" information.

Among other topics, FTC Commissioners and Subcommittee members also discussed online platforms and antitrust enforcement, identity theft and fraud, and the FTC and Federal Communications Commission's efforts to combat robocalls.

House Financial Services' Task Force on Financial Technology Holds Hearing on Privacy During Real-Time Payments and Task Force on AI Holds Hearing on Financial Data

Over the past several weeks, the House Committee on Financial Services (Committee) Task Force on Financial Technology (Fintech Task Force) and the Committee's Task Force on Artificial Intelligence (AI Task Force) held hearings to explore how the use of a real-time payment system, FedNow, and the use of cloud services would affect consumer privacy. The two task forces, created together in May of this year, share a mandate to examine developments in FinTech and AI.

On September 26, 2019, the Committee's Fintech Task Force held a hearing on "The Future of Real-Time Payments." The hearing explored the Federal Reserve's plans to launch FedNow, a real-time payments system, in 2023 or 2024. The hearing included five witnesses from the Federal Reserve, financial institutions, and financial technology firms.

Fintech Task Force members and witnesses discussed the benefits that a real-time payments system would provide to consumers and the logistical hurdles the Federal Reserve will need to overcome when it implements the program. Ranking Member French Hill (R-AR) stated that FedNow must address privacy protections and provide a means to individually authenticate users. Harsh Sinha, Chief Technology Officer of TransferWise, explained that the Federal Reserve can learn from the mistakes of other countries that have implemented their own real-time payment systems and proactively implement security measures.

On October 18, 2019, the Committee's AI Task Force convened a hearing, "AI and the Evolution of Cloud Computing: Evaluating How Financial Data Is Stored, Protected, and Maintained by Cloud Providers." The purpose of the hearing was to assess the privacy risks that arise when financial institutions use cloud service providers to protect sensitive consumer information. The hearing included witnesses representing technology companies, financial institutions, and academic institutions.

During the hearing, Steve Grobman, Senior Vice President and Chief Technology Officer of McAfee, explained that financial institutions often adopt cloud computing because it helps to cut costs and streamline their IT management. He added that it allows smaller businesses to access advanced technology that would traditionally be available only to large organizations.

AI Task Force Chairman Bill Foster (D-IL) expressed his concern that the move toward cloud storage would increase operational risks, such as data breaches and ransomware attacks, absent regulatory guidelines. He noted that regulators will expect third-party cloud service providers to meet the same regulatory requirements as the financial institutions they serve. Paul Benda, Senior Vice President for Risk and Cybersecurity Policy at the American Bankers Association, welcomed additional clarity regarding the role of regulators in overseeing cloud service providers. Both AI Task Force members and witnesses suggested that, moving forward, a collaborative approach among banks, technology companies, and regulators would ensure that consumer data is protected while still creating room for cloud service providers to adapt to the changing security landscape.

Around the Agencies and Executive Branch

NIST Closes Comment Period on Privacy Framework

On October 24, 2019, the National Institute of Standards and Technology (NIST) closed the comment period on the preliminary draft of its privacy framework, Privacy Framework: A Tool for Improving Privacy through Enterprise Risk Management (Privacy Framework). Like NIST's Cybersecurity Framework, the Privacy Framework provides a voluntary, flexible, and broadly applicable method for considering and addressing privacy risk, both for organizations just starting a privacy program and for those with well-developed programs looking to identify and address gaps.

Also in line with the Cybersecurity Framework, the Privacy Framework is built on a three-part structure: "Core," "Profiles," and "Implementation Tiers." The Core contains five "functions" (Identify-P, Govern-P, Control-P, Communicate-P, and Protect-P), broken down by category and subcategory, designed to help organizations identify actions they can take to protect privacy and communicate internally about privacy risks.

The Profiles are a subset of the Core's functions, categories, and subcategories that an organization has prioritized for improvement, and they allow an organization to assess its "Current" Profile against a "Target" Profile. This helps an organization identify gaps and develop a risk treatment plan.

The Implementation Tiers provide a method for organizations to evaluate their risk management, from "Tier 1: Partial" (e.g., privacy training is ad hoc and not current with best practices) to "Tier 4: Adaptive" (e.g., privacy training is regular, specialized, and up to date for all personnel, and personnel at all levels understand their roles in upholding an organization's privacy values). The Privacy Framework encourages organizations to move to at least Tier 2, and notes that not all organizations may need to achieve Tier 3 or 4 in all areas.

Once the Privacy Framework is finalized, organizations could use it in conjunction with the Cybersecurity Framework to manage privacy and security risks together.

FTC Holds Workshop on "The Future of the COPPA Rule," and Senators Press FTC to Protect Children's Privacy Prior to FTC COPPA Workshop

On October 7, 2019, the Federal Trade Commission (FTC) held a workshop, "The Future of the COPPA Rule." The workshop related to the FTC's request for comment on the regulation implementing the Children's Online Privacy Protection Act (COPPA Rule). The workshop included remarks from FTC commissioners, presentations by academics and industry, and four panels, moderated by FTC attorneys and including participants from academia, industry, and consumer advocacy.

Opening remarks by FTC Commissioner Christine Wilson highlighted issues that the updates to the COPPA Rule seek to address, including education technology, platforms that host third-party content, and voice-activated devices. Commissioner Wilson also expressed support for federal privacy legislation. Commissioner Noah Phillips likewise gave remarks during the workshop; he noted the importance of advertising and advocated a risk-based approach to COPPA updates.

A pediatrician from a public research university presented on the increase in time spent by children on interactive platforms and devices and stressed that children are often attracted to content that is "more mature" than their age groups. An industry trade group representative's presentation stated that parents often do not use the privacy controls that are available to them and discussed challenges related to obtaining verifiable parental consent under the law. A computer scientist from a private research university gave the final presentation, which noted privacy concerns arising from children's increased use of mobile devices.

The four panels covered the following topics:

  • State of the World in Children's Privacy. Panelists highlighted concerns regarding the effectiveness of age gates, the California Consumer Privacy Act's intersection with COPPA, additional efforts related to children's privacy in the United Kingdom, and safe harbors for COPPA compliance.
  • Scope of the COPPA Rule. Panelists discussed challenges in determining whether an online service is child-directed, and unintended consequences that adults experience as a result of increased regulation.
  • Definitions, Exceptions, and Misconceptions. Panelists debated the costs and benefits associated with obtaining verifiable parental consent. They also highlighted misconceptions regarding COPPA, including about the scope of the law and its restrictions on data collection, the purpose of COPPA, and the extent to which parents are informed about privacy.
  • Uses and Misuses of Persistent Identifiers. Panelists addressed the value of different types of advertising for online services directed to children and the breadth of the "support for internal operations" exception under COPPA, which allows disclosure of persistent identifiers for limited purposes. Julia Tama, a partner in Venable's privacy group, participated in this panel.

Prior to the workshop, on October 4, 2019, Senators Edward Markey (D-MA), Richard Blumenthal (D-CT), Josh Hawley (R-MO), and Marsha Blackburn (R-TN) sent a letter to the FTC urging it not to weaken COPPA's protections. In the letter, the senators agreed that the COPPA Rule should be updated but cautioned against adding exceptions to the rule.

Shortly after the workshop, the FTC extended the deadline for filing comments on the COPPA Rule. The new deadline for comments on the COPPA Rule is December 9, 2019.

In the States

California Attorney General Releases Draft Regulations for CCPA

On June 28, 2018, then-California Governor Jerry Brown signed the California Consumer Privacy Act (CCPA), a data privacy law that gives California consumers new rights to access the personal information businesses maintain about them, delete such information, and opt out from the sale of such information, among other rights. The law is scheduled to go into effect on January 1, 2020. After a series of amendments to the CCPA were approved in 2018 and 2019, the California Office of the Attorney General (OAG) issued proposed regulations on October 10, 2019. The proposed regulations provide details about the requirements the CCPA places on businesses and set forth entirely new obligations for businesses. Topics addressed by the OAG in the proposed regulations include:

  • New Defined Terms. The proposed regulations present more than twenty new defined terms. Among these terms are "categories of sources" and "categories of third parties," which provide examples of types of entities that may be named in a privacy policy and a consumer access request disclosure as sources of personal information and third parties with whom personal information is shared. Additionally, the proposed regulations clarify that the term "household" means "a person or group of people occupying a single dwelling."
  • Required Notices. Businesses must provide consumers with a privacy policy and a notice of the right to opt out of the sale of personal information. Businesses that collect information directly from consumers also must provide a "notice at collection" before collecting personal information. Businesses that offer financial incentives or price or service differences in exchange for the retention or sale of consumers' personal information must provide a "notice of financial incentive." The proposed regulations stipulate the required content of these notices and the ways in which they must be presented to consumers.
  • Submitting and Responding to Requests. A business must provide at least two methods for consumers to submit CCPA access or deletion requests. A business must acknowledge a consumer request to know (i.e., an access request) within ten days of receiving the request. Furthermore, a business must act upon a request to opt out of the sale of personal information no later than fifteen days after receiving the request.
  • Verifying Consumer Requests. The proposed regulations require businesses to establish, document, and comply with a reasonable method for verifying consumer access and deletion requests and list factors a business must consider in determining such a reasonable method of identity verification. In addition, the proposed regulations address how a business must verify consumers who maintain password-protected accounts with the business and consumers who do not maintain such accounts.
  • Training and Record-Keeping. Businesses must train individuals responsible for handling inquiries about the business's privacy practices on the requirements of the CCPA. Businesses also must maintain records of consumer CCPA requests and how the business responded to those requests for at least two years.
  • Household Information. The proposed regulations describe how a business must respond to CCPA requests involving household information. The draft regulations list the conditions under which a business may provide aggregate household information and specific pieces of information for the household in response to an access request.
  • Minors. Before selling the personal information of a child under the age of thirteen, a business must obtain "affirmative authorization" to do so from the child's parent or guardian. Such authorization must be in addition to any verifiable parental consent required under the Children's Online Privacy Protection Act. The proposed regulations also note that businesses must establish, document, and comply with a reasonable process for allowing minors who are at least 13 and under 16 years of age to opt in to the sale of their personal information themselves. A business that is subject to these provisions must describe its process(es) for obtaining such authorizations in its privacy policy.

The OAG will hold public hearings to receive input on the content of the draft regulations on the following dates and in the following locations throughout California: December 2, 2019 in Sacramento; December 3, 2019 in Los Angeles; December 4, 2019 in San Francisco; and December 5, 2019 in Fresno. Interested parties may submit comments on the content of the proposed regulations to privacyregulations@doj.ca.gov by 5:00 p.m. PST on December 6, 2019.

New Effort to Place Privacy Initiative on California's 2020 Ballot

On September 25, 2019, Alastair Mactaggart, chair of Californians for Consumer Privacy, submitted a new ballot initiative, The California Privacy Rights and Enforcement Act (the Act), to the California Attorney General. Under California law, if Mr. Mactaggart is able to qualify the Act by obtaining the required number of signatures, currently set at five percent of the votes cast for governor in the most recent gubernatorial election, the Act would appear on the 2020 ballot in California for consideration by voters. Mr. Mactaggart sponsored the 2018 ballot initiative that led to the California legislature's adoption of the California Consumer Privacy Act (CCPA).

Mr. Mactaggart stated in a press release that he is proposing the new initiative for two reasons. First, he stated that "the world's largest companies" attempted to weaken the CCPA, and therefore it needed to be strengthened via a ballot initiative. Second, he stated that technology has evolved in ways that "exploit a consumer's data with potentially dangerous consequences." Mr. Mactaggart argued that the Act would strengthen the CCPA in the following ways:

  • Create new rights around and define sensitive personal information, including health and financial information, racial or ethnic origin, and precise geolocation information.
  • Increase the CCPA's fines for violating the law's requirements related to children under the age of 16.
  • Provide transparency for automated decision-making and profiling, including when personal information is used for making adverse decisions about consumers.
  • Create a new agency in California, the California Privacy Protection Agency, to enforce the law and provide guidance to industry and consumers.
  • Create new political advertising requirements to increase transparency about how personal information is used for such advertising.

Because the Act would be adopted as a ballot initiative, any future amendment would need to further the purpose of the Act itself. Mr. Mactaggart stated that the Act sets a simple majority of the legislature as the vote threshold for passing such an amendment.

Since the first submission of the ballot initiative to the Attorney General's Office, Mr. Mactaggart has updated the submission twice; the most recent update was accepted by the Attorney General on October 9, 2019. The California Secretary of State suggests that, depending on the method of obtaining signatures, proponents of initiatives should file signatures with county officials in late March or early April of 2020.

International

EU and U.S. Officials Discuss EU – U.S. Privacy Shield

Senior officials from the United States Government, the European Commission, and European Union (EU) data protection authorities came together in Washington, DC on September 12 and 13, 2019, to conduct the third annual joint review of the EU-U.S. Privacy Shield Framework. The Privacy Shield provides a method for companies to transfer personal data to the U.S. from the EU in a manner consistent with EU data protection laws. To join the Privacy Shield Framework, companies must annually self-certify compliance with seven Privacy Shield Principles (and sixteen Supplementary Principles) designed to ensure "adequate" protection for personal data. The Privacy Shield is reviewed on an annual basis to ensure that it continues to provide an adequate level of protection for personal data. The European Commission has published reports on the first and second annual reviews.

The U.S. Department of Commerce hosted the third annual joint review, which covered a wide range of issues relating to the Privacy Shield Framework, from administration and enforcement of the Framework to legal developments in the United States about commercial data protection and national security data access. In a joint press statement about the review, U.S. Secretary of Commerce Wilbur Ross and EU Commissioner for Justice, Consumers, and Gender Equality Věra Jourová explained that, since the Privacy Shield Framework became operational on August 1, 2016, more than 5,000 companies have made public and legally enforceable pledges to protect personal data transferred from the EU to the U.S. in accordance with the Privacy Shield Principles. Secretary Ross and Commissioner Jourová also noted that the EU and U.S. recently welcomed the appointment of several key U.S. officials with Privacy Shield responsibilities. The U.S. Senate confirmed the addition of two more members to the independent and bipartisan U.S. Privacy and Civil Liberties Oversight Board. The Senate also confirmed Keith Krach to be the Privacy Shield Ombudsperson at the State Department. In concluding their joint statement, Secretary Ross and Commissioner Jourová explained that, during the review, EU and U.S. officials emphasized the need for robust and credible enforcement of privacy rules to protect citizens and ensure trust in the digital economy.

The report of the third annual review will be published at a later date and will conclude this year's review process.

Google Wins "Right to Be Forgotten" Case in the EU

On September 24, 2019, the Court of Justice of the European Union (CJEU) ruled that upon a consumer's request for information about them to be removed from Google search results, Google is not obligated to "carry out a [global] de-referencing on all versions of its search engine." Accordingly, under European Union (EU) law, Google's de-listing of search results applies only within the EU's member states. In its ruling, the CJEU noted that while companies are obligated to uphold EU citizens' de-referencing requests for search results and webpages within the EU, companies are not required to apply de-referencing outside of the General Data Protection Regulation's (GDPR) territorial scope.

The case originated in March 2016, when the French Data Protection Authority issued a €100,000 fine to Google because of Google's "refusal" to apply EU citizens' de-referencing requests to domain name extensions outside of the EU. Under the EU's GDPR, consumers may request that companies remove information about them from Internet search results and webpages under certain circumstances. Google appealed the penalty to the French Council of State, which referred several questions to the CJEU for a preliminary ruling to determine whether the protection of personal data under EU law required a search engine operator to carry out de-referencing on all versions of its search engine when granting an individual's request. The CJEU considered the case under the former Data Protection Directive 95/46/EC (the Directive) and the GDPR, which superseded the Directive on May 25, 2018.

In its ruling, the CJEU stated that it has previously held that a search engine operator is obliged to remove links to web pages containing information relating to a person from the list of results displayed following a search made on the basis of that person's name. The CJEU emphasized that the right to the protection of personal data is not absolute and must be balanced against other fundamental rights. The CJEU stated that the Directive and the GDPR do not appear to have "struck such a balance as regards the scope of a de-referencing outside the EU, nor that it has chosen to confer a scope on the rights of individuals which go beyond the territory of the Member States."

The CJEU concluded that there is no obligation under EU law for a search engine operator to carry out de-referencing on all worldwide versions of its search engine as part of a data subject's request to be forgotten. However, EU law does require a search engine to carry out such a de-referencing on the versions of its search engine that correspond to the GDPR's territorial scope. Additionally, when necessary, the search engine operator must take additional measures to effectively prevent or, at the very least, seriously discourage internet users in the EU from gaining access to the list of results in question. Lastly, the CJEU noted that while EU law does not require search engine operators to carry out worldwide de-referencing, it also does not prohibit the practice.

EU's Top Court Issues Ruling Requiring Active Consent for Cookie Tracking

On October 1, 2019, the Court of Justice of the European Union (CJEU) issued a ruling on cookie consent requirements following a request for a preliminary ruling from the German Federal Court of Justice. The German court asked for clarification on (1) whether there is valid consent under the European Union General Data Protection Regulation (GDPR) "if the storage of information, or access to information already stored in the user's terminal equipment, is permitted by way of pre-checked checkbox which the user must deselect to refuse his or her consent," (2) whether it matters that the information stored or accessed constitutes personal data, and (3) what information the service provider has to provide to end users when using cookies.

The case was referred to the CJEU from the German court, where a consumer rights organization, the German Federation of Consumer Organizations (the Federation), had challenged the practices of the online gaming company Planet49. Planet49 organized an online promotional lottery in which website users were presented with two checkboxes. One box was unchecked and solicited consent to receive marketing materials. The second box was pre-checked and consented to the installation of advertising cookies on the website user's terminal equipment.

Under the GDPR, a data subject's consent must be "freely given, specific, informed, and unambiguous." The Court found that consent given through a pre-checked box does not meet these requirements: only active behavior by the data subject can constitute unambiguous consent, and a user might overlook a pre-ticked checkbox, leaving it ambiguous whether consent was actually given. The Court also found that consent must be linked directly to the processing of the data in question and cannot be inferred from wishes the data subject expressed for other purposes.

The Court went on to say that the obligation to obtain consent for cookies is not limited to personal data. The requirement concerns "the storing of information" or "the gaining access to information already stored in the terminal equipment of a subscriber or user." The Court also stated that, as part of the information given to a website user, the service provider must include (1) the duration of the operation of cookies and (2) whether third parties may have access to those cookies.