FBI Once Again Asks Apple To Unlock Encrypted iPhones

The New York Times reported that FBI General Counsel Dana Boente sent a letter to Apple asking the company to unlock the two encrypted iPhones of Second Lt. Mohammed Saeed Alshamrani of the Saudi Royal Air Force, who authorities believe shot three sailors at Naval Air Station Pensacola in December 2019.

This confrontation could turn into a new test case in which either Apple will have to unlock the devices or the FBI will have to retreat once again, as it did in two previous cases: in the San Bernardino case, the agency was forced to admit it had other ways to unlock the device, and in a New York case, the judge sided with Apple.

The FBI has confirmed the existence of the new letter to Apple. The agency had checked internally and with other intelligence agencies to see if there was a way to unlock the devices without Apple’s help, but the answer came back negative, according to someone familiar with the investigation.

Apple’s previous argument against the FBI’s criticism was that it can give the FBI “all the data in its possession” (meaning any messages and files that were backed up to iCloud, as well as metadata from related services, since those don’t use end-to-end encryption), but only the owner of the device can decrypt the data stored locally on the phone. As Alshamrani is now dead, that most likely means nobody will be able to unlock that data.

The FBI now claims it doesn’t want an encryption backdoor (at least for now), but instead simply wants Apple to open the devices for the agency, according to the NYT report. However, Apple has also said that the only way to unlock a device’s data would be to create a compromised version of its operating system, which would put all iOS devices at risk.

Apple recently renewed its pitch to protect user privacy to the best of its ability. The company has also been working on end-to-end encrypted cloud backups for a couple of years now, but so far it has kept that technology unused. It’s not clear whether that’s to avoid drawing more criticism from the U.S. or other governments.

Attorney General William P. Barr recently attacked Facebook’s plans to bring WhatsApp-style end-to-end encryption to its Messenger and Instagram chat services, so Apple may want to avoid similar attacks for now. So far, Facebook has ignored the government's threats and is said to be moving ahead with its encryption plans.

Lucian Armasu
Lucian Armasu is a Contributing Writer for Tom's Hardware US. He covers software news and the issues surrounding privacy and security.
  • Mr3dPHD
    I don't think that the government should be allowed to force a company to bend to their will, but I also think from an ethical standpoint that it's super messed up that Apple won't voluntarily comply. I'm a known Apple hater though so I may be biased, ha ha ha.
    Reply
  • InvalidError
    If what I read of Apple's Secure Enclave documentation is accurate, there is no way for Apple to unlock the devices even if it wanted to, unless it maintains its own database of secure enclave keys linked to each CPU by device serial number. Even then, the SE also uses locally generated keys that get blended with the SE key to generate file system, password and other keys, so even having the SE key might not help much (a rough sketch of that kind of key blending is below).
    Reply
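    Here's a rough Python sketch of the kind of key blending I mean. It is not Apple's actual scheme, just an illustration with made-up names (UID_KEY, derive_file_key) of why a vendor-held database of SE keys alone wouldn't be enough to rebuild the file keys:

    ```python
    # Hypothetical illustration, not Apple's real key hierarchy: a per-file key
    # "blended" from a hardware-fused secret, the user's passcode and a per-file salt.
    import hashlib
    import hmac
    import secrets

    UID_KEY = secrets.token_bytes(32)  # fused into the SoC at manufacture; never leaves the chip

    def derive_file_key(passcode: str, per_file_salt: bytes) -> bytes:
        # Slow passcode stretching (stand-in for the SE's hardware-entangled iterations).
        passcode_key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), per_file_salt, 200_000)
        # Blend the on-chip key with the passcode-derived key...
        blended = hmac.new(UID_KEY, passcode_key, hashlib.sha256).digest()
        # ...then mix in the locally generated per-file salt so every file gets its own key.
        return hmac.new(blended, per_file_salt, hashlib.sha256).digest()

    file_key = derive_file_key("123456", secrets.token_bytes(16))
    ```

    Even with UID_KEY in hand, you would still need the passcode and the locally generated salts to rebuild any file key, which is the point.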
  • Math Geek
    Mr3dPHD said:
    but I also think from an ethical standpoint that it's super messed up that Apple won't voluntarily comply.


    as has been the case for a long time now, apple does not have the ability to decrypt the data from the phone. even if they wanted to, they can not. that's the whole point of the encryption scheme. it does no good to encrypt something if everyone has the key to unlock it.

    that's what the gov's all over the world are arguing right now. that unless they have some sort of super key to get into everything, then they'll never be able to get into anything.

    and of course apple and every other tech company out there returns the argument that any backdoor or super key programmed into the software will make sure that anyone and everyone can get into anything, which again defeats the purpose of encryption in the first place.

    the real argument is, "do we as citizens have the right to privacy anymore or does the threat of a few criminals mean no one should ever have any privacy ever again just in case....?" i'm willing to let a few bad guys hide their activities here and there if it means i can keep the last little bit of privacy that exists in this world for the average person.
    Reply
  • jgraham11
    I totally agree with your statement about privacy. I don't agree with the idea that there are "bad guys" out there.
    That is a Disney mentality.
    The reality is that everyone is looking out for themselves and the people who are important to them in their culture.
    For example, Iran: do you believe they are bad people? All of them?

    Think about how the media portrays people who break the laws as "bad people".
    I jaywalked the other day, am I bad? If I'm not, where is the line?
    Bad decisions don't mean bad people, they're just bad decisions...
    Bad intentions don't mean bad people, as there are no laws for bad thoughts, yet...
    Reply
  • Math Geek
    "bad guys" is just a generic term. don't read too much into it like that :) this article is about the FBI asking to unlock a phone (which again they can't) from a foreign national who killed american servicemen in florida. not sure how you say that person is not a "bad guy". i'm pretty sure that is a bad thing done by a bad person and deserving of the title "bad guy"

    however, as i said just because criminals, terrorists, those who wish to skirt the law (do i need to define "bad guy" any further?) may use the tools to hide their activities, does not to me mean that no one should be allowed to use the tools. that's the basic argument the various governments are using in asking for backdoors and wanting to outright ban encryption in many cases.

    i applaud those who are building the encryption into the various apps here and there as well as the phones' file system in general. at a time when we have lost almost all our privacy in all aspects of our lives, it is nice to know there could still be a small space where amazon can't see what i am doing in order to target ads at me.
    Reply
  • Mr3dPHD
    I dunno, it doesn't seem all that complicated to me.

    If Apple created a unique 64-digit encrypted code tied to each individual phone, and only a small division of the company had access to those codes under very specific circumstances like these...perhaps the codes can only be used in conjunction with a similar secondary code (two-factor authentication) which is generated by a similarly small division of the FBI that Apple has no access to, then literally nobody's privacy is affected other than those who are convicted of committing a major crime which fits XYZ criteria. This means that the only way anyone is getting through this back door is if Apple and the FBI both agree that it's appropriate (see the rough sketch after this post).
    Reply
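    A hypothetical sketch of that two-party idea (made-up names, not anything Apple or the FBI actually operates): split each phone's unlock code into two shares so that neither side can recover it alone.

    ```python
    # Hypothetical two-party escrow sketch, not a real Apple/FBI mechanism.
    # XOR secret sharing: each share on its own is indistinguishable from random noise.
    import secrets

    def split_unlock_code(unlock_code: bytes) -> tuple[bytes, bytes]:
        apple_share = secrets.token_bytes(len(unlock_code))
        fbi_share = bytes(a ^ b for a, b in zip(unlock_code, apple_share))
        return apple_share, fbi_share

    def recombine(share_a: bytes, share_b: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    unlock_code = secrets.token_bytes(32)  # a per-device code (32 bytes = 64 hex digits)
    apple_share, fbi_share = split_unlock_code(unlock_code)
    assert recombine(apple_share, fbi_share) == unlock_code
    ```

    Either share alone reveals nothing about the code; only the two combined do. The catch is that the code and the shares have to exist somewhere in the first place.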
  • InvalidError
    Mr3dPHD said:
    This means that the only way anyone is getting through this back door is if Apple and the FBI both agree that it's appropriate.
    Nope: if the data exists, it is only a matter of time until a government decides to abuse its power to get at way more of it than it has legitimate uses for, or for it to leak by accident, sabotage or hacking. If you don't want the potential liabilities for negligence or any other applicable crime, your best option is to not have the data in the first place. If I had to design an SE-like system, I'd make the main key internally generated every time the device gets factory-reset, so even the manufacturer has no clue what it might be (something like the sketch below). The only way to get the keys out would be atomic-force microscopy to directly read the NVRAM cells after polishing the CPU substrate thin enough to read them.
    Reply
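    A toy model of what that internally generated main key could look like (hypothetical names, not how Apple's Secure Enclave is actually implemented): the key is created on-chip at factory reset and only derived keys ever leave, so there is nothing for the vendor to hand over.

    ```python
    # Toy sketch, not Apple's design: a secure element whose main key is generated
    # internally at factory reset and never exposed outside the chip.
    import hashlib
    import hmac
    import secrets

    class ToySecureElement:
        def __init__(self):
            self._main_key = None  # lives only in on-chip NVRAM; there is no export path

        def factory_reset(self):
            # Fresh random main key; anything derived from the old key is gone for good.
            self._main_key = secrets.token_bytes(32)

        def derive_storage_key(self, label: bytes) -> bytes:
            # Only keys derived from the main key ever cross this boundary.
            return hmac.new(self._main_key, label, hashlib.sha256).digest()

    se = ToySecureElement()
    se.factory_reset()
    fs_key = se.derive_storage_key(b"filesystem")
    ```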
  • Alabalcho
    In the same way as Chelsea Manning is held without a trial until she agrees to testify, how long will it be before Tim Cook lands there "until he cooperates"?
    Reply
  • InvalidError
    Alabalcho said:
    In the same way as Chelsea Manning is held without a trial until she agrees to testify, how long will it be before Tim Cook lands there "until he cooperates"?
    If the SE key is only known to the CPU it sits in, then there is no amount of jail time or torture that could coerce Cook into doing anything.
    Reply