The British Prime Minister, David Cameron, has indicated that if his party wins the next general election, he will introduce legislation allowing UK intelligence agencies to eavesdrop on the communications of suspected terrorists - subject to the approval of the Home Secretary.
Speaking on 12 January, he said: 'In extremis, it has been possible to read someone's letter, to listen to someone's call, to mobile communications ... The question remains: are we going to allow a means of communications where it simply is not possible to do that? My answer to that question is: no, we must not. The first duty of any government is to keep our country and our people safe.'
The immediate impetus for his statement, of course, was the Paris attacks. But there's a wider context. UK security chiefs have been saying for some time that they believe modern communication mechanisms make it harder for them to do their jobs - here and here, for example. It also marks the revival of plans that were shelved earlier in the life of this parliament because of Liberal Democrat opposition.
It's unclear how this would be implemented. Perhaps the legislation will outlaw encryption altogether? Or maybe it will require vendors who use encryption in their applications to provide a way of 'seeing through' the encryption, i.e. a backdoor of some kind? In the 1990s, the NSA tried this with its Clipper Chip. The idea was to install the chip in physical devices (e.g. telephones) and use it to encrypt communications. Each device would be given a cryptographic key that the government would hold in escrow. If a law enforcement agency was deemed to have a legitimate right to intercept data, it would be given the key allowing it to do so. Clipper was unpopular with manufacturers, consumers and privacy groups and the idea fell by the wayside. Ironically, it was one of the factors behind the development of encryption technologies that David Cameron has identified as a problem.
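The escrow arrangement described above can be sketched in a few lines. This is a toy illustration only: the XOR 'cipher' stands in for a real algorithm such as AES, and the key names and the `escrow_database` dictionary are invented for the example, not part of the actual Clipper design.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for a real cipher: XOR each byte with the key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The device generates a key and encrypts its traffic with it.
device_key = os.urandom(16)
ciphertext = xor_bytes(b"meet at noon", device_key)

# Under escrow, a copy of the key is deposited with the authorities.
escrow_database = {"device-001": device_key}

# A warrant releases the escrowed key, which decrypts the traffic...
released = escrow_database["device-001"]
assert xor_bytes(ciphertext, released) == b"meet at noon"

# ...but anyone who steals the escrow database can decrypt it too.
stolen = escrow_database["device-001"]
assert xor_bytes(ciphertext, stolen) == b"meet at noon"
```

The last two lines are the crux of the objection raised below: the escrowed key works identically whether it was released under a warrant or stolen.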
I don't believe that application vendors would be very happy to provide their customers with a 'snoopable' form of encryption. If anything, in the light of growing privacy concerns over the last 12 months, vendors have been keen to take steps to secure their customers' communications - for example, the implementation of default encryption by Apple and Google last year.
It's also unclear how the government would enforce the use of only those applications that implement the 'approved' encryption algorithm. What if someone were to use an illegal form of encryption, for example by downloading an encryption application available in another country? I rather suspect that, in the wake of such legislation, cybercriminals and others would seek alternative ways of conducting their affairs that were 'out of reach'.
However, there's another side to this. If the backdoor used by government agencies to monitor encrypted traffic were to fall into the wrong hands, cybercriminals (or governments of other countries) would also be able to monitor such traffic - thereby undermining not only individual privacy, but corporate or national security.
There's an inherent tension between privacy and security. This isn't going to disappear, although the emphasis may shift in different directions depending on the geo-political situation and security context at any given time. David Cameron is clearly conscious of the fact that there's no way to restrict the use of encryption to honest, law-abiding citizens. However, at the same time, the government has made it clear that it wants organisations in the UK to protect themselves from cybercriminals and other would-be intruders. Encryption is a key part of this. No company can guarantee 100 per cent that its systems will not be breached. But the use of encryption can help ensure that such a breach doesn't result in the loss of sensitive information.
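A small sketch of that last point, using only the standard library. The stream cipher here (SHA-256 in counter mode) is an illustration, not a recommendation - a real deployment would use a vetted cipher such as AES-GCM - and the record contents are made up for the example.

```python
import hashlib
import os

def keystream_encrypt(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: SHA-256 over (key, counter) generates a
    keystream, which is XORed with the data. Encrypting twice with
    the same key recovers the original (XOR is its own inverse)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, stream))

# Records are stored encrypted; the key is kept elsewhere
# (for example, in a hardware security module).
key = os.urandom(32)
record = b"card=4111111111111111"
stored = keystream_encrypt(key, record)

# A breach that exfiltrates the stored data yields only ciphertext.
assert stored != record

# Only the key holder can recover the plaintext.
assert keystream_encrypt(key, stored) == record
```

The point is simply that what the intruder carries away is useless without the key - which is exactly the property a mandated backdoor would weaken.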
The government, it seems, is suffering from cognitive dissonance.