Encryption and cybersecurity have been in the news a lot recently, particularly following high-profile data breaches at TalkTalk, the NHS and Sony, as well as David Cameron's statement last year that terrorists should have no safe space to communicate online.

The law has a strange relationship with encryption. UK regulators criticise companies that suffer a data breach having failed to encrypt their customer data, but this approach sits uneasily alongside the political rhetoric around government access to communications (encryption being a significant barrier to that access) and the current plans contained in the draft Investigatory Powers Bill.

The proliferation of encryption

Encryption has never been the sole preserve of organisations protecting their customer data, or of those looking to cause trouble. It is used, and protects us, almost every time we make a purchase over the internet or check our bank balance online. Some websites, such as Facebook, now use encrypted browsing by default. It is also more common than ever for individuals to encrypt their own internet traffic, hiding their activities from their internet service provider and potentially from law enforcement agencies, via Virtual Private Networks and other services (all of which can be purchased using anonymous digital currencies).

Despite this, perhaps encryption doesn't go far enough.

The recent surge in the "internet of things" (think wearable technology and so-called smart homes), drones and self-driving cars also presents a challenge for encryption. Consumer drone manufacturers have been criticised for failing to implement adequate encryption, enabling a drone to be taken over by unknown individuals and potentially causing harm to anyone in the immediate area. Worryingly, similar criticism has been levelled at military drones. Nor is that the end of the story: recent developments in self-driving cars present a further problem – what if the controls were hackable? Reports are already surfacing that the navigation systems used by self-driving cars can be confused by a properly configured £20 micro-computer, and encryption will no doubt form part of the toolkit used to secure emerging technologies such as self-driving cars and delivery drones.

The UK regulator's position

The Information Commissioner's Office, the body responsible for the enforcement of data protection legislation in the UK, takes a dim view when it transpires that organisations have lost unencrypted data. Where hackers, or opportunists who steal a civil servant's laptop on the train, for example, acquire data which is encrypted, the information obtained is unintelligible unless the decryption "key" can be worked out or otherwise obtained. There appear to be no reported cases in which the theft of encrypted data ultimately resulted in the corresponding decrypted data being made available online (if you are aware of any examples, please comment below or contact me).

Such is the power of encryption that the ICO advises data controllers that, when hackers gain access to encrypted data, they do not need to tell their customers that a breach has taken place. Encryption can also be implemented at relatively low cost and, given the protection it offers, it is understandable that a failure to encrypt customer data is treated as an aggravating factor by the ICO when determining the appropriate sanction for an organisation which has suffered a data breach.

A change in the market

For encryption to be useful, individuals and organisations ultimately have to be able to decrypt, or make use of, the encrypted information. Generally speaking, "keys" are used to decrypt information, and much of the recent controversy focuses on a shift in the technology market around who holds these all-important keys.
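The role of the key can be illustrated with a deliberately simplified sketch. This toy XOR construction is not a real cipher (production systems use vetted algorithms such as AES), but it shows the essential point: the same data is gibberish without the key and readable with it.

```python
# Toy illustration only: a XOR "cipher" demonstrating the role of keys.
# Never use this construction for real encryption.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Repeat the key across the data and XOR byte-by-byte.
    # Applying the same operation twice with the same key restores the data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"Meet at noon"
key = secrets.token_bytes(16)           # the all-important key

ciphertext = xor_bytes(message, key)    # unintelligible without the key
recovered = xor_bytes(ciphertext, key)  # the same key reverses the operation

assert recovered == message
```

Whoever holds `key` can read the message; whoever does not sees only noise. The policy argument in this article is, in essence, an argument about who should be allowed to hold that variable.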

With this in mind, one area of particular interest is the recent shift in the role encryption plays in online messaging. Previously, sending an online message (for example via WhatsApp) involved either no encryption at all or encryption where the messaging operator held the keys. This potentially enabled the messaging operator, government security agencies and others connected to the same wireless network as you to intercept and eavesdrop on your messages.

From late 2014, however, WhatsApp implemented "end-to-end" encryption. This means that all messages sent over WhatsApp are now encrypted and, crucially, only the sender and receiver hold the keys needed to decrypt them. The result is that an unintelligible mess is transmitted over the internet until it reaches the app on your smartphone, at which point it is converted back into a readable message (possibly with an accompanying emoji, if you're lucky) on your screen.
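How can only the two endpoints hold the keys, when everything they exchange travels over an open network? The classic answer is a key-exchange scheme such as Diffie-Hellman, in which each party combines its own private number with the other's public number to arrive at the same shared secret – a secret that is never itself transmitted. (WhatsApp's deployment is based on the Signal protocol, which uses elliptic-curve variants of this idea; the sketch below uses toy-sized numbers purely for illustration, where real deployments use very large primes or elliptic curves.)

```python
# Hedged sketch of Diffie-Hellman key exchange with toy numbers.
# Real systems use 2048-bit+ primes or elliptic curves, not p = 23.
import secrets

p, g = 23, 5                        # public parameters (toy-sized)

a = secrets.randbelow(p - 2) + 2    # Alice's private key – never sent
b = secrets.randbelow(p - 2) + 2    # Bob's private key – never sent

A = pow(g, a, p)                    # public values, exchanged in the open
B = pow(g, b, p)

# Each side combines its own private key with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob   # identical secret, never transmitted
```

An eavesdropper who sees `p`, `g`, `A` and `B` still cannot feasibly compute the shared secret (for realistic key sizes), which is why neither the network operator nor the messaging provider can read the messages that follow.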

WhatsApp is not the only messaging provider implementing this kind of encryption and this approach can, in my view, now be described as a growing trend across the consumer technology market. 

Why does this matter?

The shift towards end-to-end encryption matters because it makes it significantly more difficult for messages to be intercepted by anyone, including the organisation providing the messaging service (unless that organisation manages the keys centrally) and government security agencies. Many messaging providers no longer hold the keys to decrypt individuals' messages, and essentially now provide a platform for the transmission of secure, encrypted data. It seems, then, that this shift in the technology market has precipitated vague government statements to the effect that encryption should be banned (statements which were subsequently retracted, leaving us with the less clear claim that there should be 'no safe space' for terrorists).

This discussion is all well and good – but more clarity is needed around exactly what technical measures we can expect to be implemented to achieve this 'no safe space' for terrorists. The government seems particularly aggrieved at the recent trend towards end-to-end encryption, in which only the users hold the keys, but this represents a natural evolution in data security. If politicians believe the correct balance requires communications operators to 'hold the keys', and in doing so to erode personal privacy to an extent, they should be bold enough to say so.