
I'm clearly in the minority here, but I don't really understand Apple's position, nor do I understand why everyone is rallying behind them.

Apple built hardware which was not particularly secure. The software defaults to a four-digit PIN. They attempt to mitigate this by adding an escalating interval between entries, and by optionally wiping the phone after too many failed tries, but this is not set in stone and those limits can be removed with a software update.
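The escalating-interval mitigation is doing nearly all the work for a four-digit PIN, which is why removing it in software matters so much. A quick back-of-the-envelope sketch in Python (the delay schedule here is made up for illustration, not Apple's actual one):

```python
# Illustrative delay schedule: after the Nth failed attempt, every
# subsequent attempt incurs at least this many seconds of delay.
# These numbers are assumptions for the sake of the estimate.
DELAYS = {5: 60, 6: 5 * 60, 9: 60 * 60}

def brute_force_seconds(pin_space=10_000, per_try=0.1):
    """Worst-case time to try every PIN, with a ratcheting delay."""
    total = 0.0
    delay = 0.0
    for attempt in range(1, pin_space + 1):
        delay = DELAYS.get(attempt, delay)  # delay ratchets up and stays
        total += per_try + delay
    return total

# Without delays: 10,000 tries at ~0.1 s each is under 20 minutes.
print(10_000 * 0.1 / 60, "minutes without delays")
# With this escalating schedule, the worst case is more than a year.
print(brute_force_seconds() / 86_400, "days with delays")
```

Strip the delays (and the wipe-after-ten-failures option) and a 10,000-entry keyspace falls in minutes, which is exactly the capability the FBI is asking for.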

The government is coming to Apple and saying, "You can remove these limits. Do that for us on this phone." Coming as a legitimate court order, I see no problem with this request. The government isn't even asking them to crack the phone, they just want Apple to remove the limits so the government can try to brute force it. They're even paying Apple for their trouble.

If Apple didn't want to be put in a position where the government can ask them to compromise their users' privacy, they should have built hardware which even they couldn't crack. And of course they did; starting with the A7 CPUs, the "secure enclave" system prevents even Apple from bypassing these limits. The phone in question just happens to predate that change.

If the government was demanding that Apple do the impossible, I'd be on their side. If the government was demanding that Apple stop manufacturing secure phones, I'd be on their side. But here, all they're asking is for a bit of help to crack an insecure system. They're doing this in the open, with a court order. What's the problem?



> The government isn't even asking them to crack the phone, they just want Apple to remove the limits so the government can try to brute force it. They're even paying Apple for their trouble.

Well, this exact thing isn't THAT big of a deal, but it's a slippery slope. If Apple agreed to this, then what else can the government ask them to do under the banner of "public safety"? And if Apple were to give the government an electronic way to brute force the touch codes, it would break the trust of every iPhone owner.


I don't see the slippery slope here. The government is asking Apple to do something that is both possible and reasonable. I see no slope to that from other typical court orders.

Giving the government a way to brute force PINs wouldn't break the trust of every iPhone owner, merely the owners of iPhones with pre-A7 CPUs. And great, if they trusted Apple on this their trust was misplaced. You can't trust companies not to unlock stuff when the government requests it with a legitimate court order. If you want Apple not to decrypt your data, the only way to ensure that is to make it so they can't.

Again, Apple has (so far as we know) made it so they can't, on newer hardware. But this phone that the FBI is trying to get into is older hardware and built such that Apple can get into it. If you're looking to point fingers, blame Apple for building not terribly secure hardware. But don't point fingers too hard, because they're doing it a lot better now.


The slope is that if they can order Apple to engineer one thing, they can order them to engineer another.

It is possible for Apple to weaken the secure enclave on all future iPhones. It would be reasonable to do so from the point of view of giving law enforcement a useful tool. Therefore, since Apple can be ordered to do engineering to make law enforcement easier, why should they not be ordered to do this?

That is the slippery slope.


> The slope is that if they can order Apple to engineer one thing, they can order them to engineer another.

How does that at all follow? Right now, a cop can lawfully order me to identify myself. Does that mean they can also lawfully order me to go to the nearest coffee shop dressed as Bozo the Clown and shout, "I am in love with the ghost of Princess Diana"?

I don't understand how complying with an order to use an existing security hole to break into someone's device somehow sets a precedent that the FBI can in the future go to Apple and set the parameters for how their products are designed.


Because the hole doesn't actually exist unless Apple engineers custom code for the FBI. If the FBI can force Apple to engineer code to create security holes for them, that establishes a precedent.

Explained better by someone else here: https://news.ycombinator.com/item?id=11120036


I would argue that the hole is the fact that Apple can even load new software that allows this attack. It already exists.

But I'm not sure that distinction is important. The other comment you linked to lays it out pretty nicely, and it doesn't rely on a hole existing or being created. It's ultimately just about compelling creation.

I wonder, what if the FBI just requested the relevant signing keys and source code? That seems like a much worse outcome, but at the same time less of a reach.


Why is that less of a reach?


Because it's just asking them to turn over information the authorities need for their investigation, which is a pretty normal sort of request. None of this troublesome asking them to build new software.


Fair point.


"Per US versus Apple in the San Bernardino matter, we request a warrant be issued for encryption bypass in this (separate) matter".


They address this at the end of the letter. They say it's "an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority." They go on to talk about what that precedent would mean. It's at the very bottom.


I don't buy it. The FBI is not trying to dictate how Apple builds their devices. They want Apple to take measures to unlock one device. How do they get from that to "[the government] would have the power to reach into anyone’s device to capture their data"?

Apple seems to be saying that if the FBI can ask Apple to install special software on one person's phone, then they can ask Apple to install special software on everyone's phone. But that's not how it works. The whole idea of requiring a court order is to only do this stuff on a limited scale when there's justification for it. It's like saying that the police shouldn't be able to have a warrant to search a suspect's house, because that means they could search everyone's house.


I disagree. This is all about setting precedents. Once one of the dominoes falls, it's much easier for the others to start falling as well.

Apple is saying the FBI is using this law to expand their power to mandate a backdoor in all devices. If this is successful, then the FBI can mandate that all secure hardware/software companies backdoor their products.

Do we roll over and let the FBI do this because "oh, this is just one case, it's fine" or do we set a boundary? If a child does something wrong, you scold them immediately. You don't wait for them to do it the 10th time. By then it's too late.


Apple says the FBI is using this law to mandate a backdoor. And that's exactly what I don't buy. The FBI's demand here is merely to use an existing security hole, which Apple created.

I simply don't see the leap from "this device is insecure, please unlock it for us" to "all devices you make must be insecure."


Your response feels very naive or short sighted. Or both.

If this goes through, you better believe that there will be court orders left and right, which can't be authentically argued against since Apple has already done it before.


Court orders for what, exactly?

If there are legitimate court orders for cracking the security on the phones of criminal suspects, I don't have a problem with that.

The problem would be if:

1. A court orders Apple to crack the security on a phone they cannot actually crack (presumably any A7+ phone), and imposes some punishment for failing to do the impossible.

2. A court orders Apple to modify the design of their phones to make sure they are always crackable.

Those would be huge problems. But I don't see how you get from here to there.


Is it possible to update a phone without the user accepting the change (with the phone locked)?

Do you want such a tool (the one that removes the security around updating) to exist, so that anyone with physical access can replace the OS with something else?


Apparently it is possible to update older phones like this. On newer ones, the OS in general is not an issue, since the security bits are handled by the secure enclave. I'm not sure what the software update policy is (it's only briefly mentioned in Apple's iOS security paper) but I'd wager that it's impossible to push an update to the secure enclave software without either unlocking it or erasing the crypto keys.

I don't understand what you're getting at with the "tool" question. Nobody's talking about building something that lets anyone with physical access replace the OS. The phone's secure boot system will still require updates to be signed by Apple. Apple can replace the OS with physical access. On older hardware, it seems they can do this without wiping the data. Do I want Apple to be able to do this? It doesn't matter what I want, the fact is that they can. Since they can, and since the FBI has a court order, I don't see what's wrong with requiring them to do so. If you don't want them doing this to your phone, buy a newer one with the more secure hardware.



