Those who think that privacy is dead these days might want to drop by the Walter Washington Convention Center in early March, for the International Association of Privacy Professionals’ Global Privacy Summit in Washington, D.C. IAPP’s membership, and attendance at its programs, have exploded in the last few years, as professionals have recognized the growing importance of rules and expectations about privacy — more specifically, about the collection, use and transfer of data.
A recent paper by Neil Richards, a law professor and leading privacy scholar, describes and debunks “Four Privacy Myths.” The myths — that privacy is dead, people don’t care about privacy, we shouldn’t have things to hide, and privacy rules are bad for business — are inhibiting rational discussion and problem-solving, Richards contends.
Richards believes that if we really confront and overcome these myths, we’ll be better able to pick the right path with respect to privacy law — which, he notes, concerns “the rules we have as a society for managing the collection, use and disclosure of personal information.”
Let’s look at each of Richards’ four myths, and his explanations of why they’re not true.
- “Privacy is Dead.” It’s a great sound bite, and a lot of big names in the tech world have endorsed this myth. But new digital technologies haven’t threatened much of what we consider to be private. “We still put locks on our houses, we still wear clothes, and we still use doors to keep the general public out of our bathroom and bedroom.” Similarly, we’re generally protected from unreasonable seizures, and we treat ordinary private communications confidentially.
What about digital data — isn’t privacy dead there? Not at all, Richards says, noting the vibrancy of the debate on data privacy: “if we think about privacy as the question of what rules should govern the use of personal information, then privacy has never been more alive.” He also notes that even while Facebook founder Mark Zuckerberg proclaims the end of the age of privacy, his company maintains tight confidentiality over its operations. If, as Richards states, “Privacy is the shorthand we have come to use to identify information rules,” it is certainly not dead.
- “People don’t care about privacy.” This myth seems particularly strong with young people, who are said to have a whole different attitude. But again, Richards points to evidence that all people want to have some control over how their personal information is collected and used. Young people just seem less concerned about privacy from their peers and more concerned about privacy from the perceived authority figures in their lives. For older people, the myth may be based on their failing to adjust all of their technological privacy settings — but that failure says more about the “bewildering” nature of those settings than about their real interests.
- “If you have nothing to hide, you have nothing to fear.” This myth, Richards says, “frames the question of privacy in ways that ignore the reasons why privacy matters.” Of course everyone wants to keep some aspects of their lives private. Intellectual activities in particular — what books we read, what movies we watch — need a degree of privacy in order to promote broad and free thinking. People who fear that their intellectual activities are being watched, Richards says, “will restrict themselves to the mainstream, the conventional, and the boring.” If we care about a vibrant public debate, he says, we must care about this intellectual privacy right, which benefits both individuals and society.
- “Privacy is bad for business.” This final myth, Richards notes, is often a last-resort argument by those skeptical of the need for new or clarified privacy rules for the digital age. The argument is that privacy gets in the way of technological innovation. But he argues that the contrary is true: businesses that succeed at using personal data depend on the trust of the people who supply that data, and that trust requires reliable rules and expectations about the proper limits on the collection, use and transfer of information.
Facebook, for example, was valued at $104 billion in its initial public offering, when “its only real assets were its users, their data, and their eyeballs as viewers of advertising.” Such valuations underscore “the importance of trust in the digital environment.” And some studies by other privacy scholars similarly suggest that the presence of privacy controls in computer interfaces “makes individuals more likely to share their personal information.”
We’ve faced “privacy panics” before, Richards says. When photography and other new tools shined the light of publicity on elites 120 years ago, privacy law began to develop to protect against intrusions and embarrassing disclosures. When data banks emerged in the 1960s, statutory protections were enacted, and courts began expanding privacy rights. The new digital environment simply presents another case of “new technologies and social practices threatening established social norms about how information could be used.”
With the help of the Future of Privacy Forum, the thousands of IAPP members, and policymakers willing to eschew simplistic privacy myths, new and appropriate norms will eventually be established to meet today’s “privacy panic.”