Holland & Knight Associate Marissa Serafino moderated a panel discussion on Oct. 19, 2020, entitled, "How Do We Create a Comprehensive and Lasting Privacy Law?" The presentation, which was hosted by the Boston Bar Association (BBA), included Commissioner Noah Joshua Phillips of the Federal Trade Commission (FTC); Peter M. Lefkowitz, Vice President and Chief Privacy & Digital Risk Officer at Citrix Systems; Danielle K. Citron, Professor of Law at Boston University School of Law; and Woodrow Hartzog, Professor of Law at Northeastern University School of Law.

The discussion framed the key points of consensus and the remaining points of friction in today's public policy debate about the future of privacy law and regulation. The conversation was timely in light of the renewed focus of Congress on privacy legislation and the immense amount of work going on "behind the scenes" on drafting provisions of proposed legislation.

The speakers put the current legislative effort in the context of the major privacy regulatory developments of the past few years, including:

  • the implementation of the General Data Protection Regulation (GDPR) in 2018
  • the July 2020 decision by the Court of Justice of the European Union invalidating the EU-U.S. Privacy Shield in Schrems II
  • the implementation of the California Consumer Privacy Act (CCPA) in 2020, and the inclusion of the California Privacy Rights Act (CPRA) on the 2020 ballot in the state
  • consideration of comprehensive privacy bills in other states, such as Washington, which would expand the patchwork of state-by-state privacy laws in the U.S.

The panelists first discussed the foundational parts of a federal privacy law – the model and the preconditions – which are often lost in high-level congressional debates focused on enforcement. Below are highlights of the panelists' comments on this question.

Commissioner Phillips

  • "The more important set of questions is what's in and what's out, what are the practices we want to bar and permit, what are the areas that we want to embrace consumer choice, why or why not."
  • "What's good … about GDPR is having buckets of conduct that are in … and then we focus that energy of choice on areas that are harder, right, and maybe if Congress so deems, we even ban certain conduct, but we limit the scope of where the decisions are made which hopefully allows them to be made with better thought, and allows enforcers to aim our tools at the conduct and at the cases that are able to really move the needle in those particular areas."

Citron

  • "We should…understand information privacy as implicating our civil rights. That without question our personal data, its collection and its use in sharing can make it impossible to get a job, keep a job, that so many of our life opportunities are on the line in ways that we don't appreciate nor do we protect." … "This helps us think through the kinds of commitments we might make substantively, but it also helps us understand the stakes."
  • "There may be certain types of data that we don't collect at all like narrow conduct prohibitions."
  • "My view is that we overcollect and underprotect, especially when it comes to intimate information."

Hartzog

  • "One of the reasons that I have been so skeptical of consent is that I think it takes choice with it, it's bringing it down, and it doesn't have to. One of the things that I would really like to see decoupled in our regulatory framework is to remove consent from choice, so that people have lots of choices and they are protected no matter what they choose."
  • "There's an alternative way of imposing affirmative duties on data processors and we've seen this in some of the legislation that has been introduced by Sen. (Maria) Cantwell's (D-Wash.) bill and Sen. (Brian) Schatz's (D-HI) bill have had duties of care and loyalty built in – some people refer to this as the information fiduciaries concept – and to me, that's a way to highlight choice while decoupling it from this notion of consent…"

Lefkowitz

  • "My fear in 2019 was we would have another Cambridge Analytica, somebody would take the bill on their desk, and they were going to pass it. Now I feel like with more time and some more thought and some more discussion and the opportunity to sink into some of these discussions, notions of what's really just OK that we should allow, what's really not OK and we should try to find a way not to allow, it allows us to expand beyond the 50-page privacy policy notice and choice discussion that unfortunately I felt like we're left with at the beginning of COVID."

Data Usage

Another topic of discussion was data usage, and specifically the concepts and key provisions from other privacy bills that the speakers felt Congress should incorporate in any federal privacy bill to enable acceptable uses of data. From the industry perspective, Lefkowitz explained the recent development of what he calls the "critical data industry," which includes banking, healthcare, research, cybersecurity, transactions, and the massive web of players necessary to enable the seamless sharing of an individual's data, for instance, from the hospital, to the doctor, and to the pharmacy. He applies this concept in his work at Citrix through a simple formula: Accountability + Transparency + Acceptability of Use (anticipated uses) = Trust. All speakers generally agreed that trust is an important component of any approach to privacy regulation.

Separately, Citron argued that any privacy law should apply broadly, if only to protect the most sensitive information – the kind with real implications for individuals' life opportunities. She also questioned the presumption that data collection benefits consumers, arguing that there is no evidence to support it. She explained the concept of "when everything reveals everything": society may soon be collecting so much innocuous data that, in the aggregate, it generates revealing, sensitive information. This is again why she supports a civil rights model and putting both use restrictions (such as legitimate use/anticipated uses) and collection restrictions on the table in the policy debate.

Hartzog added his wish list of concepts and provisions to incorporate in a federal privacy bill for data usage purposes, which includes:

  • GDPR's legitimate interest approach
  • relationship duties (i.e. the concepts of duties of care and loyalty)
  • corporate or structural elements, which are ways to use privacy and civil rights concerns to change the structure of organizations in the way that the GDPR does
  • substantive rules such as limiting data collection or banning things outright to address externalities, such as environmental and attention externalities
  • rules addressing manipulation, e.g., Sen. Roger Wicker's (R-MS) proposed bill regarding dark patterns
  • algorithmic accountability

Based on their responses, the panelists generally agreed on the importance of data use restrictions through a legitimate interest-type model.

Privacy and Data Security Issues Highlighted by COVID-19

In addition, the panelists offered perspectives on how COVID-19 has impacted privacy and data security, such as the increased security vulnerabilities exposed by the transition to remote work and education. Congress has not yet reached consensus about whether a federal privacy bill should include a data security section, though there is general agreement that a data security bill is critically needed. Commissioner Phillips said significant gaps, including the absence of a federal law, limit the FTC's authority to enhance and uphold data security standards. Fundamentally, he thinks there is a market problem: for most businesses, the cost of protecting data often is not worth it. As a result, data security is a looming liability. Lefkowitz said that most industry stakeholders would likely agree that a federal data security law is needed, but that any such law must be standards-based, a point with which Commissioner Phillips agreed.

Hartzog added that the breach notification and safeguards approaches capture only a small number of the actors that contribute to overall risk, yet usually only one company gets left holding the data "hot potato" after an incident. He argued that a legislative framework could broaden the understanding of what constitutes good data security. Citron added that including new penalties, and specifically strict liability, in a data security law would change the activity level and help address the ecosystem of risk. Lefkowitz countered that he does not support strict liability, in part because the industry that has developed around data theft is now so sophisticated that it is impossible for any business or institution to be perfect at security; despite its best efforts, a company can still be a victim. However, he agreed that in the absence of any law or standard, attention falls on a small subset of companies in a sometimes unfair way.

The conversation also turned to how COVID-19 has created the need to collect health data for contact tracing and telehealth. Citron agreed that healthcare is where data protection is most needed during the pandemic. She was encouraged by Sen. Mark Warner's (D-VA) COVID-19 privacy bill, which treats data as a civil rights issue by imposing use restrictions. She said the U.S. has not had a coherent, national policy on contact tracing, and that may be partly because there are no federally established standards. She concluded that transparent rules and regulations about data use restrictions would be helpful in emergency situations like COVID-19.

Commissioner Phillips agreed that the U.S. could use a much better system for regulating health data than the narrow Health Insurance Portability and Accountability Act of 1996 (HIPAA). Hartzog said Congress must be confident in any reforms, given how difficult it is to change policies once enacted. He thinks Congress should expand HIPAA and instill trust into it, as he believes it serves as a solid framework.