Security Experts Dismantle Ray Ozzie’s Smartphone Backdoor Proposal

Former Microsoft CTO Ray Ozzie, together with the National Academy of Sciences, recently made a proposal for a smartphone backdoor that would “end the Crypto War.” Johns Hopkins cryptography professor Matthew Green laid out the multiple issues with the proposal and how it could put billions of smartphone users at risk. Other security experts have criticized the plan as well.

“Ending” The Crypto War

Ozzie, who also happens to have launched Microsoft’s “Azure” cloud services when he was the company’s CTO, recently came out with a proposal that would let law enforcement decrypt smartphones “safely.”

It should be noted that although Ozzie has decades of experience in software engineering and architecture, his expertise isn’t in cryptography. One of the main rules in cryptography is “never roll your own crypto,” advice that applies not just to software developers without any cryptography know-how, but also to the vast majority of cryptography experts.

The reason is that even top cryptography experts find it difficult to develop secure protocols without flaws. It’s also why cryptography standards typically have to pass years of vetting before being approved by standards bodies. Even after approval, most companies aren’t in a hurry to adopt new protocols until they’re more battle-tested.

Ozzie believes that the “Crypto War” is at an impasse, a framing that implies the only way to “end” the war would be to give governments the backdoor they’ve always wanted.

An alternative framing would be that the government has already lost the war because, as privacy activists and security experts have argued for decades, a backdoor in devices is either insecure or will be abused by the government (or both).

Ozzie’s Backdoor Proposal

Ozzie’s proposal is essentially a “key escrow” scheme, an approach tried before with the infamous (and ultimately insecure) Clipper Chip. The idea is that Apple and Android smartphone manufacturers would create a key pair: a public key stored on the phone and a private key stored in the manufacturer’s “vault.”

That key pair would then be used to generate a secret PIN for the device which, supposedly, only the manufacturer could use to unlock it. Government agencies could then ask manufacturers to unlock specific devices for them.

A “highly trusted” employee would have to be sent to the vault to unlock the device using the private key stored there. This operation may need to be performed many times a day, depending on how many requests law enforcement makes.

The decryption code would only work with that specific device, and supposedly a special chip inside the phone would then “blow itself up.” This would still allow law enforcement to get the contents of the device, but it would make the phone unusable afterwards. However, governments may ignore this last feature when writing policy, even though it’s a critical one, because they may not want every unlocked phone to end up unusable.
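For illustration, here is a minimal sketch in Python of how a key-escrow scheme of this general shape could work: the manufacturer keeps a vault key pair, every device ships with its unlock secret encrypted under the vault’s public key, and only someone holding the vault’s private key can recover that secret. The function names and the choice of RSA-OAEP (via the "cryptography" package) are assumptions made for this example; Ozzie’s actual design, including the self-destruct chip, isn’t modeled here.

    # Minimal key-escrow sketch (illustrative only, not Ozzie's actual design).
    # Requires: pip install cryptography
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # OAEP padding parameters used for both escrow (encrypt) and unlock (decrypt).
    OAEP = padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    )

    def make_vault_keypair():
        # Manufacturer generates the key pair; the private half never leaves the vault.
        private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
        return private_key, private_key.public_key()

    def provision_device(vault_public_key):
        # At manufacture time: create a random per-device unlock secret and keep
        # only its escrowed (encrypted) form alongside the device record.
        unlock_secret = os.urandom(32)
        escrowed_blob = vault_public_key.encrypt(unlock_secret, OAEP)
        return unlock_secret, escrowed_blob

    def vault_unlock(vault_private_key, escrowed_blob):
        # Inside the vault, on a lawful request: recover that one device's secret.
        return vault_private_key.decrypt(escrowed_blob, OAEP)

    if __name__ == "__main__":
        vault_priv, vault_pub = make_vault_keypair()
        device_secret, blob = provision_device(vault_pub)
        assert vault_unlock(vault_priv, blob) == device_secret
        print("Escrowed secret recovered; anyone holding the vault key can do the same.")

Even this toy version makes the experts’ objection concrete: the security of every provisioned device collapses to the secrecy of a single long-lived private key that must be used routinely yet never leak.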

Ozzie has filed for a patent on this solution, so presumably, if governments mandated that Apple and Android manufacturers adopt it, they’d all have to pay Ozzie a royalty for every device they sell with it for the next 20 years, until the patent expires.

Not All Manufacturers Will Have Apple’s Security

Ozzie’s proposal seems to rely on the assumption that Apple would keep this private key as secure as possible, just as it does with its other private keys for software signing, for HTTPS encryption of its services, and so on.

However, cryptographers, and even Apple, know that these protections are not as secure as everyone thinks they are. In fact, last year we learned that at least two companies, Cellebrite and Grayshift, already have products on the market that can unlock every iPhone.

That shows that even Apple’s strong security can be broken. Worse yet, it’s not possible to secure those devices now with just a software patch. According to Green, Ozzie’s solution would face the same problem.

And if that wasn’t bad enough, Grayshift’s servers were apparently hacked recently and the code for unlocking iPhones was stolen. The hackers are now demanding 2 Bitcoin from the company, threatening to make the code public otherwise.

This, again, shows that we can’t just assume Apple will forever be the only one able to use Ozzie’s solution to unlock phones. Eventually, hackers will find another way to unlock the phones, too, especially if they already know that a backdoor giving law enforcement access will be implemented.

Other smartphone manufacturers, which for the most part can’t even keep up with software updates, would likely be even more vulnerable to hacking, or may not be able to afford a highly secure vault for the private key at all.

Green said:

So let’s be clear. Ozzie’s proposal relies fundamentally on the ability of manufacturers to secure massive amounts of extremely valuable key material against the strongest and most resourceful attackers on the planet. And not just rich companies like Apple. If ever a single attacker gains access to that vault and is able to extract, at most, a few gigabytes of data (around the size of an iTunes movie), then the attackers will gain unencrypted access to every device in the world. Even better: if the attackers can do this surreptitiously, you’ll never know they did it.

Also, did I mention: you have to keep these keys secure forever.

Building A Backdoor Is Easy - Securing It Is Impossible

Security expert Robert Graham echoes Green’s concerns, arguing that vaults can’t scale across the industry and that the more people who access them, and the more often they are accessed, the less secure they’ll be.

This has been the main criticism against encryption backdoors from day one. It’s not that backdoor solutions can’t be built - they can be. It’s that securing them is impossible, according to Graham:

We are talking thousands of requests per day from 100,000 different law enforcement agencies around the world. We are unlikely to protect this against incompetence and mistakes. We are definitely unable to secure this against deliberate attack.

Ozzie Admits: This Isn’t 'The Answer'

Despite promoting his solution as a technical one that finally solves what cryptographers had previously said was unsolvable, Ozzie has admitted on Twitter that it comes with significant security trade-offs. He also said that it’s more of a policy solution than a technical one.

However, that ignores the fact that if the solution isn’t technically secure, it will put billions of users at risk worldwide. The encryption backdoor debate has always been primarily a technical one, because backdoors can’t be secured against attackers.

Also, even though Ozzie himself seems to admit that it’s not “the answer” to building a secure backdoor, law enforcement will likely use it as an excuse to legislate encryption backdoors on the grounds that a solution has finally been found. According to cryptography and security experts, that couldn’t be further from the truth.

Lucian Armasu
Lucian Armasu is a Contributing Writer for Tom's Hardware US. He covers software news and the issues surrounding privacy and security.
  • jackt
    I just can't believe that an open-source smartphone OS still doesn't exist.
  • jdog2pt0
    jackt said:
    I just can't believe that an open-source smartphone OS still doesn't exist.

    I suspect that there have been attempts, but the biggest problem is getting users to actually buy the phones and use them. The problem with that is, they won't unless there's a decent selection of applications available to complement the OS, and app developers won't waste their time developing apps for the platform unless there are enough users to justify the effort. See the problem? It's a catch-22 (no users, no apps made; no apps made, no users).
  • Olle P
    I find only one tiny flaw in Mr. Green's comments:
    ... you have to keep these keys secure forever.
    Forever is a very long time. I find it hard to believe that there would be any harm if someone 150 years from now got their hands on a few keys that could unlock devices produced now...

    Other than that I totally agree that all deliberate weaknesses are bad!