"A unique crime fueled by technology." That's the Illinois Supreme Court's description of the crime colloquially known as "revenge porn." But it’s also a phrase suited for many of today's new, troubling technology-based forms of mischief and mayhem. And the court’s handling of one crime fueled by technology may shed light on how other technology-based crimes may be addressed, and litigated.
"Revenge porn" is an inadequate shorthand for the crime at issue in Illinois v. Austin. As the court explained, the Illinois statute covered broader content than pornography and more motives than revenge. Essentially, it criminalized the intentional non-consensual dissemination of private sexual images. In the case at hand, a woman who had access to her former boyfriend's internet cloud archives publicized nude photos of the boyfriend's new girlfriend. The criminal charge alleged every element of a statutory violation: intimate and private photos, intentionally published by the defendant, with knowledge that the woman portrayed did not consent.
The court's decision addressed only the threshold issue of the constitutionality of the Illinois statute covering intentional non-consensual dissemination of private sexual images (but nonetheless referred to as the "revenge porn" act). A lower court had found the act to violate the First Amendment by penalizing speech based on content. The state Supreme Court reversed that ruling.
The Supreme Court decision is most interesting for how it deals with the uniqueness of the crime, due to its having been "fueled by technology." Initially the court had to address whether the circumstances were so extreme that a new categorical exception should be made to the First Amendment — that is, is non-consensual dissemination of private sexual images so bad that it should rank as outside First Amendment protection altogether, along with obscenity and child pornography? A case may certainly be made for this, but the court chose not to address it, perhaps recognizing that the Roberts Court has signaled it will likely never recognize a new categorical exception to the First Amendment.
Next the court was confronted with the simplistic but increasingly popular principle that any content-based restriction on speech is subject to the highest level of scrutiny — strict scrutiny. This was the trial court's reasoning; it set the highest constitutional standard and then, not surprisingly, found that the statute could not meet it.
The Supreme Court essentially held that while electronic communications consist of content, not every regulation of electronic communications suppresses protected discussion. (The same is true of traditional speech, where content-based regulations are permitted for purposes of consumer protection, prevention of fraud, and addressing other harmful communications.) Here, the court found the statute addressed personal privacy by regulating dissemination of a certain type of private information. It was not designed "to suppress discussion" of a topic. Additionally, the subject of the act (intimate photos) was private and "did not relate to any broad issue of interest to society at large." Accordingly, the court found that an intermediate First Amendment standard, not strict scrutiny, applied.
In giving weight to the legislature's desire to protect privacy, the court invoked the long history of privacy law: "the entire field of privacy law is based on the recognition that some types of information are more sensitive than others, the disclosure of which can and should be regulated."
In applying the intermediate scrutiny standard, the court looked to the government interest (personal privacy), the fit between the statute and the harm (very close in the court's opinion), and the possibility that the statute might chill legitimate speech (slim, in the court's opinion). It therefore held the statute constitutional (by a 5-2 vote).
The decision suggests a few things about cases on "crimes fueled by technology." First, the majority relied heavily on research and published articles about the breadth and nature of the problem of non-consensual dissemination of private sexual images. Research by litigants and amici into new technology-based forms of misconduct may be necessary to inform courts about the context and scope of these new problems created or aggravated by technology.
Second, even on an issue with a broad social consensus, such as non-consensual dissemination of private sexual images, the argument that any content-based restriction violates the First Amendment must be addressed. It may be more difficult to do so in cases in which the regulatory motive rests on something less well-established than the privacy interests that were central here.
Third, the law is as slow as technology is fast. Alarms have been sounded for years about non-consensual dissemination of private sexual images. The first state law was enacted in 2004. Illinois enacted its statute in 2014, and as of October 2019, 46 states had enacted some form of protection — many of them, undoubtedly, like Illinois, still fighting over their coverage and constitutionality. Many victims of crimes fueled by technology may find the law too slow to protect them.