For privacy advocates, it is universally accepted that encryption is a very good thing. After all, encrypted data is deemed a safe harbor under HIPAA and state breach-notification laws, providing an “out” from potential fines and penalties when a lost device containing sensitive health or other personal information was encrypted. In addition to encouraging encryption through these statutory safe harbors, federal and state regulators also wield the stick of enforcement actions, costing organizations millions of dollars when sensitive information is put at risk for lack of encryption.
So the issue would appear to be settled. Use encryption, and more encryption, and the government will be happy – right? Well, not quite. 2016 brings a battle between privacy advocates, who want more security, and U.S. government agencies, such as intelligence and law enforcement, which deem encryption a threat.
Why would the government consider encryption to be a problem? It all comes down to counterterrorism. In the wake of the Paris, California, and other ISIS-directed or ISIS-inspired attacks, the U.S. government is concerned that terrorists are able to evade detection by using encrypted communications.
The issue of encryption and terrorism was on the agenda at a recent Silicon Valley summit meeting, where senior Obama administration officials, including the U.S. attorney general and the NSA director, met with representatives from technology companies such as Apple, Facebook, and Google. According to reports, the White House told summit participants that, to avoid detection, terrorists are using encrypted forms of communication at various stages of attack plotting and execution. The White House expects terrorists will continue to use encrypted communications whose content law enforcement cannot obtain, even with court authorization, and offered to share additional information at classified briefings.
So based on terrorism fears, the government is now pushing technology companies to open a “backdoor” to otherwise-unbreakable encryption technology. A backdoor would enable technology companies to decrypt information on a device at the request of intelligence agencies and law enforcement. But as privacy advocates, including Apple CEO Tim Cook, point out, backdoors not only allow the “good guys” such as law enforcement in, but also leave an opening for the bad guys to gain access to private consumer information, including bank information and medical records.
In other words, once a backdoor is in place, it’s there for everybody, and it significantly weakens security and privacy. Security specialists are also very uncomfortable with the notion of the government keeping the decryption keys safe from hackers and criminals. Those experts contend that poking a hole in encryption could not only put consumers’ confidential data at risk, but also allow access to power grids and financial institutions.
Right now the U.S. government is seeking voluntary cooperation from technology companies and is not pushing for legislation on this issue. But it’s a different story at the state level. A California State Assembly member recently introduced legislation that would require any smartphone manufactured after January 1, 2017, and sold in California, to be capable of being decrypted and unlocked by its manufacturer. This follows on the heels of a similar bill introduced last year in New York.
Ironically, it’s the technology companies, which in the past have been hammered over privacy issues, that are now taking a strong stand for consumer privacy and encryption. Meanwhile, the government is seeking to weaken encryption technology in the name of safety over privacy. Will Americans have to choose between privacy and national security in 2016? According to Apple’s Tim Cook, “We’re America. We should have both.”