Why Apple’s iPhones are so easy for the FBI to crack

Bet you didn’t know your smartphone spends most of its days in an insecure AFU (After First Unlock) state.

From Lily Hay Newman’s “How Law Enforcement Gets Around Your Smartphone’s Encryption” posted last week on Wired:

New research indicates governments already have methods and tools that, for better or worse, let them access locked smartphones thanks to weaknesses in the security schemes of Android and iOS…

“It just really shocked me, because I came into this project thinking that these phones are really protecting user data well,” says Johns Hopkins cryptographer Matthew Green, who oversaw the research…

The researchers assumed it would be extremely difficult for an attacker to unearth any of those keys and unlock some amount of data. But that’s not what they found.

“On iOS in particular, the infrastructure is in place for this hierarchical encryption that sounds really good,” says Maximilian Zinkus, a PhD student at Johns Hopkins who led the analysis of iOS. “But I was definitely surprised to see then how much of it is unused.” Zinkus says that the potential is there, but the operating systems don’t extend encryption protections as far as they could.

 When an iPhone has been off and boots up, all the data is in a state Apple calls “Complete Protection.” The user must unlock the device before anything else can really happen, and the device’s privacy protections are very high. You could still be forced to unlock your phone, of course, but existing forensic tools would have a difficult time pulling any readable data off it. Once you’ve unlocked your phone that first time after reboot, though, a lot of data moves into a different mode—Apple calls it “Protected Until First User Authentication,” but researchers often simply call it “After First Unlock.”

If you think about it, your phone is almost always in the AFU state. You probably don’t restart your smartphone for days or weeks at a time, and most people certainly don’t power it down after each use. (For most, that would mean hundreds of times a day.) So how effective is AFU security? That’s where the researchers started to have concerns.

My take: Fascinating. Scary. Thanks to friend-of-the-blog Jerry Doyle for the link.
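
For the developers in the audience: the "Complete Protection" and "After First Unlock" states in the excerpt correspond to iOS Data Protection classes that an app chooses per file (and, analogously, per Keychain item). Files typically default to the AFU-equivalent class, but an app can opt a file into Complete Protection so its key is discarded shortly after the device locks. A minimal Swift sketch of what opting in looks like (the function and file names are mine, not from the article or the research):

```swift
import Foundation

// Writes a file with Apple's "Complete Protection" class
// (NSFileProtectionComplete) instead of the default
// "Protected Until First User Authentication" class the article calls AFU.
// With Complete Protection, the per-file key is evicted shortly after the
// device locks, so the data stays unreadable until the next unlock.
func saveSecret(_ secret: Data, named name: String) throws -> URL {
    let docs = try FileManager.default.url(for: .documentDirectory,
                                           in: .userDomainMask,
                                           appropriateFor: nil,
                                           create: true)
    let fileURL = docs.appendingPathComponent(name)

    // .completeFileProtection maps to NSFileProtectionComplete.
    try secret.write(to: fileURL, options: [.atomic, .completeFileProtection])
    return fileURL
}
```

The Keychain has analogous accessibility options (e.g., kSecAttrAccessibleWhenUnlockedThisDeviceOnly versus kSecAttrAccessibleAfterFirstUnlock), which is part of what the researchers mean when they say the stronger machinery exists but much of it goes unused.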

11 Comments

  1. David Emery said:
    Remember, though, this requires physical access to your phone.

    (I bet Apple changes this Real Soon Now.)

    January 18, 2021
  2. Fred Stein said:
    Note to self: “reboot iPhone”.

    My Mac reboots every week. I assume that's the macOS default.

    January 18, 2021
    • Grady Campbell said:
      Am I understanding correctly? Your Mac reboots itself? I have not had that experience (except due to a power failure), and I only occasionally reboot either of my Macs at all. I don't think it's a macOS default unless there's some setting that schedules it.

      January 18, 2021
    • Roger Schutte said:
      Fred, weekly restarts are not a default in macOS – I've gone months between restarts. If you are okay with them occurring, then no worries. But if you want to stop them, the first place I'd check is whether automatic installs are enabled for new versions of the OS. Another place to look is whether someone set up a weekly schedule in System Preferences/Battery.

      January 18, 2021
  3. Gregg Thurman said:
    Reboot or wake? I have to unlock my iPhone several times a day after it goes to sleep. I'm wondering if the researcher/author of this piece understands the difference.

    January 18, 2021
  4. Jerry Doyle said:
    Question: Can anyone expound on why governments mount major calls for tech companies to provide "back door access" when it appears governments already have the tools they need to access data on phones? Are governments using the calls for political leverage and exploitation against big tech? This aspect of the issue puzzles me. Can anyone elucidate this head-scratcher for me?

    January 18, 2021
    • Thomas Nash said:
      The answer is that it is very easy to put the phone into Complete Protection mode, as I pointed out in another message here, and this information is well known. So the "authorities" don't like that, for good and bad reasons. And yes, I would think there is a large component of "political leverage and political exploitation against big tech" in their public unhappiness about this.

      It is clear that the government is not capable of properly protecting the keys to the "back door" it is asking for. So this is a policy question, not a technology one: balancing the insecurity for everyone created by all kinds of bad people who would likely get access to that "back door" against the government's legitimate goal of gathering intelligence on bad actors like terrorists.

      January 18, 2021
    • Brian Nakamoto said:
      I think one reason is that governments are trying to discourage companies from defaulting to more secure implementations of end-to-end encryption; e.g. where only the sender and receiver(s) can decrypt a message.

      The forensics company Elcomsoft has several good blog posts about what is possible with Apple and other devices/services – blog.elcomsoft.com. As with viruses, Apple's security isn't invincible, but they still do a better job than others.

      January 18, 2021
  5. Thomas Nash said:
    Two ways to make your iPhone Complete Protection secure:
    a. Push the side (right) button 5 times. The SOS screen appears, where you can summon help or power down by sliding a switch on the screen.
    b. In Settings/Face ID & Passcode, make sure Erase Data (at the bottom) is toggled on. Then 10 failed passcode entries will erase all data on the iPhone. (This is particularly easy and quick to accomplish if you are wearing a face mask: for example, just enter 111111 ten times, assuming that's not your passcode. 😉) Data can be recovered later from iCloud if you still have the phone.

    January 18, 2021
  6. Steven Philips said:
    @Jerry: Why the head-scratching? Companies are always trying to eliminate the problems that allow unauthorized data access, and then governments have to scramble to find new holes. With a "government-approved" back door, the cat-and-mouse game is eliminated: permanent and free access.

    January 18, 2021
    • Gregg Thurman said:
      I’m thinking of the “free” security hole research Apple and others get when governments are looking for a back door.

      January 18, 2021
