Apple And The FBI: Intended And Unintended Consequences Of An iPhone Backdoor
A federal judge ordered Apple to comply with the FBI’s request for technical assistance in recovering data from the San Bernardino gunman’s iPhone 5C. The FBI wants Apple to create custom, backdoored firmware for the iPhone 5C that disables certain security features, such as the escalating delays between PIN attempts and the auto-erase feature that wipes the device after 10 failed attempts. It then wants Apple to push that update to specific phones so the FBI can brute-force their PINs in minutes or hours. This sort of request has both intended (by the FBI) and unintended consequences.
Intended Consequences
A De Facto Two-Century-Old Backdoor Law
The FBI used the All Writs Act of 1789 to convince a federal judge to compel Apple to unlock an iPhone 5C. If the request stands, it could create a major precedent for how the FBI will deal with technology companies from now on.
“The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge,” said Apple in an open letter.
The FBI hasn’t succeeded in getting Congress to pass a law to backdoor encryption, but if it’s able to force companies to comply with its requests to unlock devices no matter what, then that would be just as good from the FBI’s perspective. It would appear that this is exactly why the FBI wants this case to become a precedent and why it’s fighting so hard against Apple on this.
Weak Security Everywhere
The FBI is “lucky” here, because the iPhone in question is an older model that lacks a “Secure Enclave” and some of the stronger encryption features. The encryption and security of the newer iPhones are supposedly protected by the Secure Enclave even against tampering by Apple itself. That would mean Apple couldn’t weaken the security of these devices even if it wanted to.
However, if the FBI gets its way, it could use this case as a precedent in similar cases in the future. If the judges presiding over those cases don’t understand the difference between the newer and older iPhones’ security, and why Apple could help the FBI unlock the older model but not the newer ones, then Apple could be forced to weaken the security of its iPhones in a more permanent way.
Right now, the FBI’s request supposedly affects only the iPhone 5C, because Apple can modify that model's security software with an update. If a judge orders Apple to unlock newer iPhones as well, then Apple would have to make it so that the Secure Enclave isn’t tamper-proof anymore. For this to work, the Secure Enclaves in all iPhones would have to be crippled from the factory.
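To make the tamper-resistance point concrete, here is a minimal sketch of why a Secure Enclave-style design resists off-device cracking: the encryption key is derived from the passcode entangled with a per-device hardware UID that software cannot read out. PBKDF2 stands in here for Apple's actual key-tangling function, which is not public; this is an illustration of the principle, not Apple's implementation.

```python
# Sketch: why cloning a phone's flash storage doesn't enable off-device
# brute force when the key is entangled with a hardware UID.
import hashlib
import os

# Burned into silicon at manufacture; never readable by software.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str) -> bytes:
    """Derive the data-encryption key from the passcode AND the device UID.

    Because the UID never leaves the hardware, every guess must run on
    the original device, at the enclave's own (deliberately slow) pace.
    PBKDF2 is a stand-in for the real, undisclosed tangling function.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 10_000)

# An attacker who copies the encrypted flash but guesses against the
# wrong UID derives a useless key, no matter how many PINs they try.
key_on_device = derive_key("1234")
key_off_device = hashlib.pbkdf2_hmac("sha256", b"1234", os.urandom(32), 10_000)
assert key_on_device != key_off_device
```

This is why the FBI's current request only works against the 5C: on Secure Enclave devices, removing the software rate limiter alone would not expose the key material.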
Unintended Consequences
Return Of The iPhone Thefts
Some have argued that removing the PIN rate limiter and the auto-erase feature is not the end of the world, but it would actually drastically weaken PIN security. If the FBI can brute-force a device’s PIN in minutes or hours, then so can others.
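The "minutes or hours" figure follows from simple arithmetic once the rate limiter and auto-erase are gone. This sketch assumes roughly 80 ms per guess, the hardware key-derivation delay cited for iPhone passcode attempts (in the comments below, among other places); the exact figure is an illustrative assumption.

```python
# Back-of-the-envelope worst-case brute-force times for iPhone PINs
# once the software rate limiter and 10-attempt auto-erase are disabled.
SECONDS_PER_GUESS = 0.08  # ~80 ms of key derivation per attempt (assumed)

def worst_case_seconds(pin_digits: int) -> float:
    """Seconds needed to try every possible PIN of the given length."""
    return (10 ** pin_digits) * SECONDS_PER_GUESS

print(f"4-digit PIN: {worst_case_seconds(4) / 60:.0f} minutes worst case")
print(f"6-digit PIN: {worst_case_seconds(6) / 3600:.1f} hours worst case")
```

A four-digit PIN falls in under 15 minutes and even a six-digit PIN in under a day, which is exactly the window the FBI describes.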
Apple added these PIN protections because, among other reasons, it didn’t want stolen iPhones to be easily cracked. Now, after arguing for so long that iPhone theft is a huge problem, the government wants Apple to reverse those protections and make iPhones just as vulnerable to theft as they were before.
There is, of course, the argument that Apple or the FBI would only use the custom firmware on a case by case basis. However, that ignores the fact that such firmware would work on any such device, and once the FBI has it, there’s no way for Apple or anyone else to ensure it’s not being abused.
“The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable,” added Apple in its letter.
This sort of power would quickly trickle down to other law enforcement agencies and even local police departments, just as cell tower simulators (Stingrays) did. Once Apple builds that capability, other governments are certain to demand it as well.
The U.S. government also has a rather poor track record of protecting itself from digital attacks, so if it were hacked, the attackers would gain access to the same software. In this respect, it’s no different from the earlier “golden key” encryption backdoor debate.
The End Of Secure Apple Pay
If Apple is forced to comply with such requests in the future (which it technically cannot do unless it makes the Secure Enclave tamperable), then anything else protected by a currently “tamper-proof” enclave, including fingerprint data and the credit card information behind Apple Pay, could potentially be stolen by malware-infected apps.
“The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe,” said Apple.
Civil Liberties Groups' Support
The ACLU said it would support Apple in this fight against the U.S. government, because it believes the Constitution protects the company against having to hack its own customers:
“This is an unprecedented, unwise, and unlawful move by the government. The Constitution does not permit the government to force companies to hack into their customers' devices. Apple is free to offer a phone that stores information securely, and it must remain so if consumers are to retain any control over their private data," said the ACLU.
The EFF also announced that it will file an amicus brief to support Apple's position.
Fight For The Future (FFTF), another civil liberties group, said it will organize nationwide protests outside of Apple’s stores with a simple message for the U.S. government: "Don’t Break Our Phones."
Lucian Armasu is a Contributing Writer for Tom's Hardware. You can follow him at @lucian_armasu.
g-unit1111 This is why I will never use a phone pay system, or buy a car that has an app to start it from your phone. I do stand with Apple and FFTF on this one.
LORD_ORION Remember when the US used to make fun of these commie totalitarian behaviors in cartoons? There would be a sweet little girl reading a letter from her pen pal, except the voice over was some burly hairy goon speaking in a Slavic accent.
TwoDigital I thought the FBI was supposed to be this super-smart group of cyber-criminal trackers. If they HAVE the phone, why can't they just read data right from the memory chip and then brute-force it in an environment that doesn't have a 10-chances-and-it-blows-up keycode? Also, if they google this, they can buy this: https://www.intego.com/mac-security-blog/iphone-pin-pass-code/ ... just saying.
DeadlyDays Probably because this isn't about cracking the phone; it is about the FBI, among other US agencies, establishing a precedent for companies to comply with requests to break/decrypt secured devices.
I would be incredibly surprised if they didn't have the technology to clone the device virtually and run dozens if not hundreds of instances to break into it brute-force style. Like it says in the article, if there is a precedent, then a judge unfamiliar with technology may misunderstand it, and the FBI and others could use it to force companies to do this in situations where it is much, much harder or nearly impossible to break in via brute force.
chicofehr I think what they want is for Apple to circumvent the 10-attempt limit, which in itself would reduce the value of encryption. Of course, like others said, cloning the memory chip itself and cracking it in a virtual environment would make more sense.
InvalidError
17519645 said: Also, if they google this, they can buy this: https://www.intego.com/mac-security-blog/iphone-pin-pass-code/ ... just saying.

When the FBI wants to access a device, they usually do not want to wait several hours or days to get in - that's assuming the data self-destruct does not get triggered in the meantime.
For devices with Secure Enclave backed encryption and equivalents, duplicating the eMMC does you no good, since the non-readable UID code hidden inside the Secure Enclave and used to generate encryption keys is not externally accessible - Apple claims the hardware lacks any ability for the software/firmware to read the UID back after it is written. That limits you to going through the Secure Enclave's 80ms key generation latency for each password guess, or directly brute-forcing the 128+ bit file/block encryption.
d_kuhn
TwoDigital said: I thought the FBI was supposed to be this super-smart group of cyber-criminal trackers. If they HAVE the phone, why can't they just read data right from the memory chip and then brute-force it in an environment that doesn't have a 10-chances-and-it-blows-up keycode? Also, if they google this, they can buy this: https://www.intego.com/mac-security-blog/iphone-pin-pass-code/ ... just saying.

I'm sure they can do just that on older phones... but they're also cheap and lazy... right now they have to REALLY want that data in order to pay what it would cost to disassemble the phone and manually extract the stored data. This law gives them a cheap way to do the same thing and makes it easy to distribute that capability to other agencies and use in less serious situations. I'm sure the goal of this move is not to get the data on that particular phone (I'd be surprised if they don't already have it) but rather to weaken phones to cheap/wide access from government agencies.
bloodroses75 What I don't get about this whole thing is that it is illegal to travel internationally between some countries with encrypted electronic devices. So, does this mean that every person that has an iPhone is breaking the law if they travel internationally with their phone?
https://www.princeton.edu/itsecurity/encryption/encryption-and-internatio/
(one of many links about the subject)
none12345 I rarely get to say kudos to Apple. But stick to your guns, Apple; don't cave to that BS.
Destroying personal liberty/security in the name of protecting you against terrorism means the terrorists have already won.
targetdrone
chicofehr said: I think what they want is for apple to circumvent the 10 attempts limit which in itself would reduce the value of encryption. of course like others said, cloning the memory chip itself and cracking it in a virtual environment would make more sense.

We are talking about a government agency here. Making sense is against the rules.