Before 2013, encryption was known to the mainstream primarily through its use in spy movies. That changed after whistleblower Edward Snowden revealed several of the National Security Agency's mass surveillance programs, and in the half-decade since, encryption has gone from something Tom Cruise had to overcome to save the world to a major selling point for nearly every tech product. This shift has repeatedly frustrated government agencies, and the Five Eyes intelligence alliance is once again pushing back against the rise of encryption.
Five Eyes is a partnership between the United States, Canada, the United Kingdom, Australia and New Zealand through which their respective intelligence agencies share information. (The U.S. shares data with other countries, like Germany, but those relationships aren't as formal as the ones between these partner countries.) The result is a massive surveillance network that encompasses most of the world and lets these countries assist each other with law enforcement investigations and matters of national security. Encryption makes this much harder.
That brings us to Five Eyes' latest call for encryption providers to offer ways for law enforcement and intelligence agencies to access user information. The alliance explained in the Statement of Principles on Access to Evidence and Encryption this week that the same encryption used to protect "personal, commercial and government information" is also used by "criminals, including child sex offenders, terrorists and organized crime groups to frustrate investigations and avoid detection and prosecution." They effectively argue that defending the former group enables the latter.
This argument has been made countless times before (the mention of child sex offenders and terrorists during a debate about encryption might as well be free spaces on the conversational Bingo card). Perhaps the most notable example in the U.S. came during the investigation into the San Bernardino shooting of 2015, when the FBI tried to force Apple to break the encryption on an iPhone used by one of the shooters. The company refused, saying that complying with the request would set a precedent and expose flaws in the iPhone's security that others could exploit.
There have been many other times when concerns have flared about governments forcing tech companies to put backdoors in their products. The argument is often the same: privacy advocates say encryption is required to allow people to live their lives without invasive government scrutiny, and law enforcement agencies say that offering secure products hinders their investigations. Thus far the privacy advocates have won, because it's simply impossible to guarantee a backdoor would only be used for lawful investigations; anyone could exploit the vulnerability.
But Five Eyes made the same appeal in its recent Statement. The alliance said:
"The increasing gap between the ability of law enforcement to lawfully access data and their ability to acquire and use the content of that data is a pressing international concern that requires urgent, sustained attention and informed discussion on the complexity of the issues and interests at stake. Otherwise, court decisions about legitimate access to data are increasingly rendered meaningless, threatening to undermine the systems of justice established in our democratic nations. ... Should governments continue to encounter impediments to lawful access to information necessary to aid the protection of the citizens of our countries, we may pursue technological, enforcement, legislative or other measures to achieve lawful access solutions."
It's been all too easy over the last few months to forget that many of the security features tech companies introduced (and started advertising) in the wake of the Snowden revelations are constantly scrutinized by government agencies that want access to as much data as possible. But the Five Eyes statement appears to be another declaration in the ongoing "Crypto War," and that could mean an increase in government efforts to force tech companies to put backdoors in their products.
Do they really think the bad guys will say "Oh, it's illegal to encrypt now... oh well... I'll just click send anyway," or "Oh, it's illegal to encrypt now. We'd better stop being bad guys"?
Just like firearms, the bad guys won't care whether it's illegal to obtain, own, or use. They'll use it anyway. Meanwhile, law-abiding citizens/subjects will suffer because of it.
The reality is that the bad guys just make an excellent and handy excuse to strip the rest of us of rights that are supposed to be natural/inalienable or unalienable (at least in the eyes of the founding fathers of the United States). Remember, the Bill of Rights was put in place to restrict government from infringing on what were, and are, inalienable rights... or, as they felt, God-given rights. (Yes, many were theists, not atheists or agnostics.)
I don't think that's a good comparison. The authorities here aren't saying it should be illegal for individuals to use encryption, but rather that companies should implement back doors in their hardware and software that allow them to easily get past the encryption. If implemented on the hardware level, this would in fact have a profound effect on everyone's ability to encrypt their information, regardless of how much they care about obeying the law. The only way to get around it would be to build your own CPUs, which is obviously not an easy task.
If the backdoor exists, whether known or not, they will be found and exploited. Would you want your credit card information on a server that is open to the world?
The other thing to remember is that people have been encrypting messages for millennia without the use of computers. A simple one-time pad is easy to use, and there is no backdoor to be exploited by someone trying to read your messages.
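As a minimal illustration of the idea (the digital equivalent of the pen-and-paper scheme, not anyone's production code), a one-time pad can be sketched in a few lines of Python: XOR each byte of the message with a matching byte of a random, single-use key. The function and variable names here are just for the example.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR the message with the key, byte by byte.

    For the scheme to be secure, the key must be truly random,
    at least as long as the message, and never reused.
    """
    assert len(key) >= len(plaintext), "key must cover the whole message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the identical XOR operation with the same key,
# since (p ^ k) ^ k == p.
otp_decrypt = otp_encrypt

message = b"meet at dawn"
key = secrets.token_bytes(len(message))  # fresh random key, used once
ciphertext = otp_encrypt(message, key)
recovered = otp_decrypt(ciphertext, key)
```

The security rests entirely on the key handling, not the algorithm: without the key there is nothing to exploit, which is exactly why there is no backdoor to mandate.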