
It's not a backdoor, it's a front door. In cryptography, there's no way to make repeated attempts more computationally expensive. The lockout is just an extra feature Apple put on, which Apple could easily remove. If we're going to have 4- and 6-digit PINs, there is no way to stop a dedicated attacker from brute-forcing them. None.


You can't make crypto run slower on repeated attempts, but you can still make each attempt more expensive. For example: https://en.m.wikipedia.org/wiki/Pepper_(cryptography)
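
A sketch of the idea in Python (the constants here are illustrative):

    import hashlib, hmac, os

    # The pepper is a secret kept outside the password database
    # (e.g. in an HSM), so an attacker holding only the database
    # has to brute-force the pepper on top of every guess.
    PEPPER = os.urandom(32)  # illustrative; in practice a long-lived secret

    def hash_password(password: bytes, salt: bytes) -> bytes:
        peppered = hmac.new(PEPPER, password, hashlib.sha256).digest()
        return hashlib.pbkdf2_hmac("sha256", peppered, salt, 100_000)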


True. But Apple, with its focus on UX, can't reasonably afford more than ~200 ms to check a password; and even then the cost scales only linearly, so for concerned users the solution still comes down to a more complex password. Doubling the time it takes to hash a password has the same effect as adding one bit of entropy to it, which is easily beaten by adding a single character.
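
Back-of-the-envelope:

    import math

    # Doubling the hash cost buys one bit of effective entropy;
    # one extra random lowercase letter buys log2(26) ≈ 4.7 bits,
    # i.e. as much as making the hash ~26x slower.
    print(math.log2(26))   # 4.700439718141092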


If you consider caching of keys, there's no reason that the first login attempt after a cold boot couldn't take 1-2s. Each subsequent login would be roughly instant.
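
One hypothetical shape for that (verify stands in for however the OS checks the derived key against the encrypted store; all names here are made up):

    import hashlib, hmac

    _cache = None  # lives in RAM only, so it vanishes on power-off

    def unlock(passcode: bytes, salt: bytes, verify):
        # verify(key) -> bool is a hypothetical hook that checks
        # the derived key against the encrypted store.
        global _cache
        if _cache is not None:
            quick, key = _cache
            # Warm path: one cheap hash plus a constant-time compare.
            if hmac.compare_digest(hashlib.sha256(passcode).digest(), quick):
                return key
            return None
        # Cold-boot path: a deliberately slow KDF, tuned to take ~1-2 s.
        key = hashlib.pbkdf2_hmac("sha256", passcode, salt, 5_000_000)
        if verify(key):
            _cache = (hashlib.sha256(passcode).digest(), key)
            return key
        return None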


"there's no way to make repeated attempts more computationally expensive"

That's not true actually. For example, the industry standard for storing passwords on a server (bcrypt) is specifically designed to slow down password match attempts.
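
With the Python bcrypt package, for instance (the cost factor is illustrative):

    import bcrypt

    # The cost factor makes each hash take 2^12 rounds of work,
    # so every brute-force guess pays the same fixed price.
    hashed = bcrypt.hashpw(b"hunter2", bcrypt.gensalt(rounds=12))
    assert bcrypt.checkpw(b"hunter2", hashed)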


It is true. You're confusing making _repeated_ attempts progressively more expensive with making all attempts more expensive to start with.


Ah yes. You are right, I was confusing those two things. Thanks for the clarification!


Bcrypt isn't an industry standard.


  > no way to stop a dedicated attacker from brute-forcing it
Wipe after x incorrect attempts? You can't stop the attacker, but you can make the attack futile, surely.
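
Something like this sketch, assuming the counter lives somewhere an attacker can't reset it (which turns out to be the catch):

    MAX_ATTEMPTS = 10
    failures = 0  # only meaningful if stored where an attacker can't reset it

    def try_passcode(guess, check, wipe):
        # check and wipe are hypothetical hooks into the device.
        global failures
        if check(guess):
            failures = 0
            return True
        failures += 1
        if failures >= MAX_ATTEMPTS:
            wipe()  # destroy the key material; the data is gone for good
        return False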


Nope. If the attacker (Apple in this case) can replace the OS, they will just do so before the phone gets wiped—replacing the OS will remove that wipe feature.


Not if the check and the wipe are done in hardware, as Apple claims for devices newer than the one in question here.


Meh. Then the attacker can simply replace the hardware. Remember, our attacker model is Apple; non-cryptographic security measures mean very little to a company with such complete knowledge of the hardware and software involved.


Nope. On newer devices the key is derived from a random key fused into the SE during manufacturing, a key fused into the ARM CPU, and a key randomly generated on-device during setup (derived from accelerometer, gyro, and altitude data) and stored in the SE. The SE's JTAG interface is disabled in production firmware and it won't accept new firmware without the passcode.

You can't swap the SE or CPU around, nor can you run the attempts on a different device.
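
Schematically something like this sketch; the os.urandom values are stand-ins for the fused/stored secrets, and the real derivation is Apple's, not this:

    import hashlib, os

    # Hypothetical stand-ins for values fused into / stored in the silicon.
    SE_FUSED  = os.urandom(32)   # burned into the SE at manufacturing
    CPU_FUSED = os.urandom(32)   # per-CPU fused key
    DEV_RAND  = os.urandom(32)   # generated on-device during setup

    def derive_unlock_key(passcode: bytes) -> bytes:
        # None of these values ever leave the silicon, so the KDF
        # can only be evaluated on this one physical device.
        material = SE_FUSED + CPU_FUSED + DEV_RAND
        return hashlib.pbkdf2_hmac("sha256", passcode, material, 1_000_000)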


Can't you? Seems like the kind of problem you can attack with an electron microscope and perhaps some very high-precision laser cutting. In any case, I imagine that if you're willing to spend the resources, you could somehow read the on-chip memory and start cryptanalysing that.

Against a sufficiently capable adversary, tamper-resistance is never infallible, but good crypto can be.


    > Against a sufficiently capable adversary, tamper-
    > resistance is never infallible, but good crypto can be.
Nonsense, it all comes back to "sufficiently capable", every time.

To a sufficiently capable adversary, _all_ crypto is just "mere security by obscurity".

"Oh, mwa-haha, they obscured their password amongst these millions of possible combinations, thinking it gave them security - how quaint. Thankfully I'm sufficiently capable.", she'll say.


The point is that the key is stored there too (part of it burned into the silicon during production) and can't be read or changed.

Sure, if they wanted to they could implement a backdoor. But assuming they correctly created and shipped the secure enclave, it shouldn't be possible to circumvent it, even for Apple.


It's sounding like that's the problem. They left an opening for this sort of thing by allowing firmware updates to the secure enclave. That basically makes it a fight over whether the FBI can force Apple to use the required key to sign an update meeting the specifications the FBI directs.


Well, I read elsewhere in this thread that updating the firmware for the secure enclave wipes the private key contained within, which means you've effectively wiped the phone.


You need to unlock the phone before you can update the firmware on the secure enclave.


Secure Enclave is not really ‘hardware’; despite being isolated from the main OS and CPU, it is still software-based and accepts software updates signed by Apple.


If those software updates force it to erase its private key store, though, then it's functionally isolated from that attack vector. An updated enclave that no longer contains the data of interest has no value.


OK, if that part is updatable, then you do indeed have a backdoor.

In theory it should be possible to make it fixed (which Apple doesn't seem to have done).


Making it fixed just means you can't fix future bugs. The secure approach is to ensure that updates are only possible if the device is either unlocked or wiped completely.
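
As a sketch of that policy (the hooks here are invented, not Apple's actual interface):

    def accept_firmware_update(signature_valid, unlocked, wipe_key_store):
        # wipe_key_store is a hypothetical hook that destroys the keys.
        if not signature_valid:   # only properly signed images are considered
            return False
        if unlocked:              # the owner consented with the passcode
            return True
        wipe_key_store()          # locked device: taking the update costs the keys
        return True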


The Secure Enclave enforces time delays between attempts.
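
Roughly the published schedule (from Apple's iOS Security Guide), as a sketch:

    def delay_minutes(failed_attempts):
        # Roughly the escalation described in Apple's iOS Security Guide.
        if failed_attempts < 5:
            return 0
        if failed_attempts == 5:
            return 1
        if failed_attempts == 6:
            return 5
        if failed_attempts <= 8:
            return 15
        return 60   # 9+ failed attempts: one hour between tries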



