On July 7-11, 2014, a group of 25 privacy lawyers met in a historic building overlooking the Keizersgracht, one of Amsterdam’s most beautiful canals, and spent five days learning about U.S. privacy law, European data protection law, and the complex interactions between them. The setting was the Summer Course on Privacy Law and Policy, presented by the University of Amsterdam’s Institute for Information Law (IViR), one of the largest information law research centers in the world. Course faculty included leading practitioners, regulators and academics from both sides of the Atlantic. Course participants came from an even wider geographic area that included Hungary, Greece, Poland, the Netherlands, Hong Kong, Kyrgyzstan, Switzerland, the UK, Belgium and Canada. I was lucky enough to serve as a co-organizer of, and faculty member in, the course. In this post, I describe presentation highlights and identify some cross-cutting themes that emerged during the week.

Dr. Kristina Irion, Marie Curie Fellow at IViR (and the other course organizer), started the course with “An Update on European Data Protection Law and Policy.” The Summer Course does not try to cover every aspect of privacy law. Instead, it focuses on law and policy related to the Internet, electronic communications, and online and social media. In her presentation, Irion analyzed the latest European legal and policy developments in these areas. The most important such development is the proposed General Data Protection Regulation (GDPR) — a major reform proposal that several of the faculty presenters believe will become law in 2015.

The GDPR would replace the current patchwork of national data protection laws with a single, harmonized, directly enforceable Regulation. It would establish new substantive rights and principles, such as the Right to be Forgotten and the principle of Data Portability, and would create new obligations for organizations, including the implementation of privacy by design and privacy impact assessments. Dr. Irion identified and discussed other significant developments in EU data protection law, including the Court of Justice of the European Union’s (CJEU) holding in Scarlet v. SABAM that ISPs may not be required to filter content to protect copyright, and its landmark decision in Google v. AEPD that individuals may, in certain circumstances, require companies to remove online links to embarrassing information (the so-called Google Right to be Forgotten case).

In my “Update on U.S. Privacy Law and Policy” presentation, I first identified a number of weaknesses in the U.S. privacy law system. For example: sector-based privacy legislation leaves many important areas (e.g., websites, data brokers, mobile apps, social media) largely unregulated; today’s near-ubiquitous collection and sharing of data severely tests the limits of the notice and choice approach to privacy regulation; and U.S. privacy law is not considered “adequate” from the European perspective. Current developments in U.S. privacy law and policy can be understood as attempts to address these weaknesses. In the past year, the White House and members of Congress have proposed comprehensive privacy regulation, the U.S. Federal Trade Commission (FTC) has paid increasing attention to data brokers and mobile apps, and the U.S. Supreme Court has issued a major opinion extending Fourth Amendment protection to cell phones even when they are searched incident to an arrest.

These actions reduce the gaps that a sector-based legislative approach creates. In its most recent settlements, the FTC has required companies to implement comprehensive privacy programs and submit to third-party audits. Such requirements take the FTC beyond notice and choice and into the realm of privacy management. Another current development is the FTC’s increased enforcement of the U.S.-EU Safe Harbor Agreement, which signals a renewed interest in demonstrating that the U.S. system can be “adequate.” Identifying the weaknesses in the U.S. system, and viewing recent actions as attempts to shore them up, provides a framework for understanding current developments in U.S. privacy law and policy and where the field is headed.

Daniel Cooper, Partner and Head of the Global Privacy Practice at Covington & Burling LLP, presented a seminar titled “Doing Business Over the Internet: Legal and Policy Challenges.” Mr. Cooper explained that the 1995 Data Protection Directive is out of step with today’s technologies. The General Data Protection Regulation will bring the law more into line with technology. It will also produce a dramatic change in the practice of privacy law with many new challenges and issues. As Cooper sees it, the most significant legal issues for those who do business over the Internet arise from: online behavioral advertising; geolocation tracking; business use of social media; the use of facial recognition technology, particularly when combined with social media; and the processing of children’s data.

He discussed the latest Article 29 Working Party and data protection authority (DPA) opinions in these areas. These include the Working Party’s 2011 opinion on geolocation services, which made clear that tracking mobile devices or WiFi access points constitutes the processing of personal data, and the Hamburg DPA’s 2011 determination that Facebook should have obtained informed consent (rather than just opt-out consent) before introducing its facial recognition-based Tag Feature, and that it must delete its entire database of faces collected in Germany. Mr. Cooper also discussed the Google Right to be Forgotten case, highlighting the uncertainties that businesses will face as they try to implement the balancing test and determine whether they must take down particular content.

Sjoera Nas, Internet and Telecom Expert for the Dutch Data Protection Authority and a frequent contributor to the proceedings of the Article 29 Working Party, discussed developments in the Internet sector from the perspective of enforcement. Nas explained that the Dutch DPA has only 75 employees and must prioritize its enforcement actions accordingly. Recently, it has chosen to target its enforcement resources on seven Internet-related matters: Google (data from and about WiFi routers); TomTom (location data); packet inspection by mobile operators (four reports); TP Vision (Philips smart TV); WhatsApp (with the Canadian DPA); YD (online advertising agency); and Google’s new privacy policy (with five other DPAs). Nas explained the claims against each entity and the resolutions reached to date.

She then led the course participants in a simulated enforcement proceeding in which a third of the class represented an app developer, a third represented the DPA that had brought an enforcement action against the developer, and a third played the role of judge. One of the most important lessons to emerge was that the DPA must probe the facts that the company presents rather than accept them at face value.

Neil Richards, Professor of Law at Washington University School of Law (St. Louis, MO), grabbed everyone’s attention with a compelling description of how the digital economy threatens “intellectual privacy.” Richards defined intellectual privacy as “protection from surveillance or interference when we are engaged in the processes of generating ideas — thinking, reading and speaking with confidants.” He maintained that free speech and free thought are essential to a democracy, and that intellectual privacy is essential to free speech and free thought. He explained that business models ranging from Google’s scanning of Gmail, to Amazon’s tracking of which books and pages we read, to Facebook’s collection and retention of information about our social lives, have not been designed with the protection of intellectual privacy in mind.

Though many people clearly use the Internet to promote public discussion and civil society, the dominant trend is for the digital surveillance society to undermine the essential conditions of democracy: free speech and free thought. Richards ended with a call for a sensible information policy that would acknowledge the importance of, and seek to protect, intellectual privacy. Legal practitioners in government, industry and the nonprofit sector can expect to encounter these ideas in their work as more citizens become concerned about surveillance and its chilling effect on free speech and free thought. Richards’ book, Intellectual Privacy, will be published by Oxford University Press in January 2015.

Dr. Christopher Kuner, Senior Of Counsel at Wilson Sonsini Goodrich & Rosati and one of Europe’s leading data protection lawyers, presented a seminar titled “Data Protection Law and Compliance for Online Business.” Kuner explained that data protection law is of great relevance to business today because of the compliance and reputational risks it poses, as well as the opportunities for competitive advantage it creates for companies that become known as good stewards of their customers’ personal data.

Private sector data protection lawyers typically advise on business projects, coordinate multi-jurisdictional (including transatlantic) compliance efforts, interact with regulators, and occasionally engage in litigation. The lawyer must frequently interpret data protection law’s broad principles in order to apply them to specific situations, and must often work under conditions of legal uncertainty. Kuner noted that the Google Right to be Forgotten case creates a vague test for determining which content an intermediary must remove; companies will need to interpret this test and do their best to anticipate how regulators and courts will apply it. The data protection lawyer’s task is to chart an intelligent course through these uncertain waters.

Chris Hoofnagle, the Director of Information Privacy Programs at the University of California-Berkeley Center for Law and Technology, and Dr. Joris van Hoboken, Postdoctoral Research Fellow at New York University’s Information Law Institute, identified the points of tension between the EU and U.S. privacy law systems, and offered ideas about how to overcome them. Hoofnagle said that, to comprehend the U.S. system, it is vital to understand the FTC, the lead privacy regulator. The U.S. has no comprehensive consumer privacy statute, and the FTC is not a DPA. That said, the FTC does have considerable authority that includes the power to investigate, sue in court, and issue administrative complaints, rules and civil fines (where authorized by statute). The FTC can enforce against “unfair or deceptive acts or practices” and does not hesitate to use its powers. Most defendants settle and enter consent decrees.

The FTC has grounded most of its privacy complaints in its deceptiveness jurisdiction, but is increasingly using its unfairness authority. The main points of tension between the European and U.S. privacy law systems, van Hoboken said, include: omnibus vs. sectoral privacy legislation; European rules that limit the export of personal information to nations, such as the US, that do not have “adequate” privacy protections; the terms of the “safe harbor” that currently allows certain U.S. companies to process Europeans’ personal data; free speech as a right to be balanced against data protection vs. free speech as a trump card that can negate privacy protections; and the European obligation always to have a legal ground for processing personal information vs. the U.S. free market approach built on consumer and contract law. Those who would achieve interoperability between the two systems will have to address these key points of tension.

Ian Brown, Professor of Information Security and Privacy at the Oxford Internet Institute, spoke about the technologies of the future and the challenges they will pose for privacy law and policy. Brown described several different ways in which the Internet could evolve. Possibilities include optimistic scenarios in which Internet access is democratized, individuals gain increasing access to online education and services, the spread of Internet-connected appliances and objects (the so-called “Internet of things”) produces a more energy-efficient and productive economy, and government policies emphasize privacy and consumer protection and so increase user trust. More pessimistic scenarios include the possibility that mergers between ISPs, search engines, social media networks and entertainment conglomerates could result in a fragmented, sanitized Internet in which users are intensively profiled and subtly controlled.

The policy choices we make today will influence which of these scenarios more accurately describes the Internet of the future. Designing for privacy — including data minimization and enhanced, user-friendly notice and consent mechanisms — can increase the chances of a more optimistic outcome.