".. what this means is that even Apple can't break into an iPhone with a secure passphrase (10+ characters) and disabled Touch ID - which is hackable with a bit of effort to get your fingerprint."
That is not exactly true. They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.
From what I understand, what Tim is doing, and what I greatly admire, is trying to avoid a judicial requirement that they be able to do this on demand: the so-called "back door" requirement. He knows, as others do, that such a feature would be used by more than the intended audience, and for more than the intended uses, to the detriment of Apple's users.
What I really find amazing is that I was at a talk hosted by the East-West Institute where the Air Force General of the new cyber command (whose name escapes me) complained that "we" (Silicon Valley) had let the government down by not writing strong enough crypto to keep our adversaries out. I remarked that it was the ITAR regulations and the Commerce Department, at the behest of the NSA, that had tied our hands in that regard, and that with free rein we would have done, and could do, much better. Whit Diffie was there and made the same point with them. And now, here we are 10 years later, we "fixed" it, and now it's our fault that they can't break into this stuff? Guess what? Our adversaries can't either!
The right to privacy, and the right of companies to secure that right with technology for their customers, is a very important topic and deserves the attention. I am really glad that one of the most valuable companies in the world is drawing a bright line in the sand. So I really support Tim's position on this one.
> That is not exactly true. They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.
No, they can't. A quick update to recent hardware practices: modern SoCs like Apple's have something called "Secure Enclave Processor" that's on-die. This is the first thing to start when the chip is powered up, the thing that loads a cryptographically-signed bootloader, and the thing that gates a lot of IO with the outside world (like the NAND).
Hardware encryption on consumer hardware has existed for over a decade (look up Intel's TPM), and while it obviously hasn't taken hold in the more open Intel world, locked-down hardware platforms like Apple's top-to-bottom design have had much more liberty in implementing secure computing.
Furthermore, all debug/probing features can be disabled by fuses at the factory. The manufacturer can test the chip with those features on, and once verified, blow those fuses. No-one's JTAG-debugging that chip, not even Apple.
That said, Apple's focus on security and privacy has ramped up in recent years. If you want more security, get more recent hardware. The downside, of course, is that if even Apple can't hack the software... neither can you.
As I understand it, if the code running in the "secure enclave" (containing the private keys) is ever upgraded, the hardware side intentionally deletes the private keys as part of the upgrade, whether the upgraded code would want it to or not.
The reasoning there is far from conclusive. The argument is that the secure enclave has been updated in the past (to lengthen enforced delays) without wiping user keys.
However, without more information, this does not tell us whether it is possible in this case. The obvious implementation for a secure enclave resisting this sort of attack is to only allow key-preserving updates when already in unlocked state (which would be the case for any normal user update). All other cases should destroy the user keymat, even if the update is validly signed by Apple. This would be done by the hardware and/or previous firmware before it loaded the new firmware so you can't create an update that bypasses this step.
If this isn't how the secure enclave works now, I'll bet it will be in the next version (or update if possible).
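For concreteness, here is a minimal sketch of that "key-preserving updates only when unlocked" policy. The names are made up and the code is plain Python for readability; this is an illustration of the idea, not Apple's actual Secure Enclave implementation.

```python
# Hypothetical sketch of the "wipe unless unlocked" update policy discussed above.
class SecureEnclaveModel:
    def __init__(self):
        self.unlocked = False        # set True only after a correct passcode
        self.user_keys = b"wrapped"  # stand-in for the stored key material

    def apply_update(self, firmware: bytes, signature_valid: bool) -> None:
        if not signature_valid:
            raise ValueError("update not signed by the vendor")
        if not self.unlocked:
            # Even a validly signed update wipes the key material if the
            # enclave is locked, so a forced update can't preserve the keys.
            self.user_keys = None
        self._load(firmware)

    def _load(self, firmware: bytes) -> None:
        pass  # hand control to the new firmware image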
I bet if Apple is forced to comply with this order they will make sure that they will find a way to design the iPhone such that they physically can't comply with similar requests in the future.
> No, they can't. A quick update to recent hardware practices: modern SoCs like Apple's have something called "Secure Enclave Processor" that's on-die.
Yes, they can. The particular phone in question is from before the Secure Enclave Processor was introduced.
To play devil's advocate: If they can be compelled to produce de-novo firmware for the purpose of data extraction they could also be compelled to design the means necessary to extract the data from the secure enclave, e.g. by prying the chips open and putting them under a scanning tunneling microscope.
The packages on those chips are usually designed to be melded to the underlying structures enough that opening them destroys the keys they're attempting to hide.
Some people trot out the argument that it's OK for the government to compel Apple to deliver the backdoored firmware because the measures it would circumvent are not of a cryptographic/information-theoretical nature.
Then one could expand that argument by saying that compelling physical reverse-engineering is also OK because the devices are not built to be physically impossible (read: laws of nature) to pry open.
> They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.
When a passcode is entered, the SoC queries the Secure Enclave with the passcode. If the passcode is correct, the Secure Enclave responds with the decryption key for the flash storage.
The best Apple could do is sign a malicious update to the Secure Enclave firmware that either removes the time delays or dumps the keys. However, some people suspect the SE erases its secrets on firmware update, although this behavior isn't documented in Apple's security reports.
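For anyone unfamiliar with that flow, here is a heavily simplified model of the exchange. The names are hypothetical, and the real Secure Enclave derives keys from the passcode plus a hardware UID rather than storing and comparing the passcode itself.

```python
# Simplified, illustrative model of the unlock flow described above.
import time

class EnclaveModel:
    def __init__(self, passcode: str, storage_key: bytes):
        self._passcode = passcode        # stand-in; real hardware never stores it
        self._storage_key = storage_key
        self._failures = 0

    def request_storage_key(self, attempt: str):
        # Escalating delay enforced inside the enclave, independent of the OS.
        time.sleep(min(2 ** self._failures, 3600))
        if attempt != self._passcode:
            self._failures += 1
            return None
        self._failures = 0
        return self._storage_key
```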
> The best Apple could do is sign a malicious update to the Secure Enclave firmware that either removes the time delays or dumps the keys.
Dumping the Secure Enclave would not result in the keys necessary to read the files on the filesystem. Each file has a unique key, which is wrapped by a class key, and for some classes, the class key is wrapped by a key derived from the passcode. If you don't have the passcode, you can't unwrap any of the keys (Page 12 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf).
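As a rough illustration of that wrapping chain, here is a sketch using the Python `cryptography` package. It is illustrative only: the real iOS design also entangles the hardware UID and defines several protection classes.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

# Key derived from the user's passcode (UID entanglement omitted here).
kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                 salt=os.urandom(16), iterations=100_000)
passcode_key = kdf.derive(b"user-passcode")

class_key = os.urandom(32)   # protects one protection class
file_key = os.urandom(32)    # unique to a single file

# Each key is wrapped (encrypted) by the one above it in the hierarchy.
wrapped_class_key = aes_key_wrap(passcode_key, class_key)
wrapped_file_key = aes_key_wrap(class_key, file_key)

# Without the passcode you can't unwrap the class key, and therefore
# can't unwrap any file key either.
assert aes_key_unwrap(passcode_key, wrapped_class_key) == class_key
assert aes_key_unwrap(class_key, wrapped_file_key) == file_key
```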
Yes, if you could recover the SE key (or factory-burned SoC key for older phones), you could crack the passcode on your own cluster rather than being limited to what the phone hardware/software can do.
To my knowledge, all keys are still wrapped with the UID, and the UID is still a factory-burned SoC key (not accessible to any firmware). Possible to extract, but not easy to do at scale.
>>> I am really glad that one of the most valuable companies in the world is drawing a bright line in the sand. So I really support Tim's position on this one.
Tim's position today might not be Apple's position tomorrow. Apple is a large publicly traded company. They owe a duty only to shareholders. Fighting this fight will probably impact the bottom line. Tim's continued tenure may turn on the outcome.
Cooperation may see Apple hurt. The perception of cooperation was part of RIM's fall from grace. Non-cooperation may also cause issues. Through its various agencies, the US government is Apple's largest customer, as it is Microsoft's. Large contracts might be on the line should Apple not play ball. Either way, this order has probably wounded Apple.
I'm having trouble finding numbers, but I seriously doubt this. The reason that's true (or more likely true) for Microsoft is Windows. The US gov't has massive site licenses for Windows and most of MS's software portfolio. Apple is used where in the US government? Some cell phones? A few public affairs offices that convinced their purchasing officer to buy a Mac Pro for video editing? Maybe some labs that wanted a unixy OS and, again, convinced their purchasing officer to buy a Mac Pro?
The bulk of Apple's revenue comes from outside the US. Perhaps the US government is their largest single customer (I still hold this is a dubious claim), but it is not essential to their continued existence. They would do just fine without those sales.
My experience is primarily DoD, which is the bulk of the government's spending on this sort of thing. For enterprise spending, Apple doesn't get much love, outside a general trend towards iPhones for "company" phones, but even that's rare, with people using their own phones instead of being issued one a lot of the time (they can get their plans paid for or subsidized if the phone is a requirement of the job). Labs often end up outside a lot of the enterprise type procurements and so have a bit more leeway in that regard, but while they spend a lot of money, it's still a drop in the bucket for Apple.
Your characterization ("Maybe some labs that wanted a unixy OS and, again, convinced their purchasing officer to buy a Mac Pro?") is a little off -- where I work, MBP's for laptop replenishments are treated exactly the same way as Windows systems, you just tick a different button on the order form.
Tim Cook is probably more popular than Obama (and surely is WRT this issue.) Apple is about a thousand times more popular than the NSA and blessed with almost infinitely deep pockets and a very, very good marketing team.
Not to mention the fact that most of the people who use computers and phones don't even live in the USA.
Unfortunately, the NSA/FBI likely have far more money than Apple, and if Apple spends too much, their shareholders can demand that leadership backs down.
Tim Cook is responsible to his BoD and shareholders.
> Large contracts might be on the line should Apple not play ball.
That almost certainly isn't the case. It is doubtful whether any other government organization cares about how they handle this case. Heck, the FBI likely doesn't care as long as Apple doesn't do anything illegal.
Lots of places OUTSIDE the US will care, though. This is exactly the sort of $hite that is causing European companies and governments to avoid dependencies on US providers: there's no way to guarantee freedom from US government surveillance.
Historically, corporations were understood to be responsible to a complex web of constituencies, including employees, communities, society at large, suppliers and shareholders. But in the era of deregulation, the interests of shareholders began to trump all the others. How can we get corporations to recognize their responsibilities beyond this narrow focus? It begins in remembering that the philosophy of putting shareholder profits over all else is a matter of ideology which is not grounded in American law or tradition. In fact, it is no more than a dangerous fad.
The Myth of Profit Maximizing
“It is literally – literally – malfeasance for a corporation not to do everything it legally can to maximize its profits. That’s a corporation’s duty to its shareholders.”
Since this sentiment is so familiar, it may come as a surprise that it is factually incorrect: In reality, there is nothing in any U.S. statute, federal or state, that requires corporations to maximize their profits. More surprising still is that, in this instance, the untruth was not uttered as propaganda by a corporate lobbyist but presented as a fact of life by one of the leading lights of the Democratic Party’s progressive wing, Sen. Al Franken. Considering its source, Franken’s statement says less about the nature of a U.S. business corporation’s legal obligations – about which it simply misses the boat – than it does about the point to which laissez-faire ideology has wormed its way into the American mind.
>>In reality, there is nothing in any U.S. statute, federal or state, that requires corporations to maximize their profits.
Laws and statutes don't enforce contracts. But courts do. You are trumpeting a theory I've heard many times before. Its creators lack a basic understanding of contract law or corporate organization. Look up "shareholder derivative actions".
In a lot of consumer devices, JTAG is at least partially disabled (sometimes they can throw a fuse that only lets you do boundary scan for manufacturing).
I would not be surprised at all if Apple's internal 'backdoor' (if you can call it that) is just resetting the secure enclave, essentially erasing everything on the NAND. That'd be fine for refurb/manufacturing, desirable even, as that guarantees that full system wipes happen before a refurb goes to a new customer.
There are no more actual fuses, but JTAG access can be limited or completely disabled on most (or possibly all) modern ARM chips by setting a value in the non-volatile (flash or EEPROM) memory. Even the IC manufacturer (Freescale, ST, etc.) can be locked out completely.
Most ICs can be completely erased to remove the limitations on access, but this usually requires a 'mass erase', where the entire non-volatile memory is erased (taking any codes, passwords, and encryption keys with it).
source: I am an embedded software engineer who works with these settings in bootloaders and application software.
> That is not exactly true. They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.
Is this true? That would have to mean that either the passphrase is stored on the device or that the data is not encrypted at rest. Neither of these sounds likely, frankly.
It doesn't have to mean any of that. As long as the 10-mistakes limit is enforced in software, or in a separate chip that can be replaced without replacing the actual encryption key, it can be bypassed. Then it is a simple matter of brute-forcing the PIN code. Since these are usually 4 digits, there are only 10,000 possibilities, which is laughable against a brute-force attack.
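To put a number on "laughable", here is a toy sketch. The try_pin function is a hypothetical stand-in for whatever accepts or rejects a guess once the retry limit is out of the way.

```python
# Toy illustration: once the retry limit is enforced somewhere replaceable,
# a 4-digit PIN space is exhausted almost instantly.
def try_pin(pin: str) -> bool:
    return pin == "4821"    # the secret we're pretending to guess

for candidate in (f"{i:04d}" for i in range(10_000)):
    if try_pin(candidate):
        print("PIN found:", candidate)
        break
```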
They don't even have to do that. They wrote the OS, they have the signing key for OS updates. All they need to do is push an update to the device with a backdoor that allows reading off the unencrypted contents post-boot (possibly with the addition of a judicially compelled fingerprint scan or PIN brute force to get the encryption key out of whatever on-device escrow it's stored in).
The only way to secure the device against that would be to have the users manually memorize and key in a complete 128 bit encryption key, which they could then refuse to provide.
I think we tech folks were fooling ourselves with the idea that Apple had somehow delivered a "snoop-proof" device. They really didn't (no one can!) as long as they're subject to government or judicial control.
This is not really true. The secure enclave is a separate computer. It doesn't get software updates.
> possibly with the addition of a judicially compelled fingerprint scan or PIN brute force to get the encryption key out of whatever on-device escrow it's stored in
This is the whole problem. The keys are in the SE. You can't brute force the PIN because the SE rate-limits attempts (and that rate limiting cannot be overridden by an OS update because the SE is not run by the OS).
If you can get a fingerprint scan then all bets are obviously off, but then you don't need Apple at all.
And yet it would be less interesting to consider if the password were a "six-character alphanumeric passcode with lowercase letters and numbers", because even if the software rate-limiting were disabled with a rogue firmware update, the PBKDF2 (or similar) iteration count makes brute-forcing impractical.
> A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds. This means it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers
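The quoted figure checks out with back-of-the-envelope arithmetic, assuming 36 possible characters per position and 80 ms per guess:

```python
# Checking the quoted figure: 6 positions, 36 possible characters each
# (a-z plus 0-9), ~80 ms per attempt.
combinations = 36 ** 6                      # 2,176,782,336
seconds = combinations * 0.080              # 80 ms per guess
years = seconds / (365 * 24 * 3600)
print(f"{combinations:,} combinations, ~{years:.1f} years")   # ~5.5 years
```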
Is that really true? The enclave's firmware is in ROM and non-upgradeable? I'd always assumed it got a signed blob like everything else does. Obviously it's possible to design a system like that, I just haven't seen it reported anywhere that it actually works like that.
Edit just to be clear: the requirement really is that the firmware be stored in a ROM somewhere, probably on the SoC. That's a lot of die space (the code for a full crypto engine isn't small) to dedicate to this feature. Almost certainly what they did is put a bootstrap security engine in place that can validate external code loaded from storage. And if they did, that code can be swapped by the owner of the validation keys at will, breaking the security metaphor in question.
The key thing would be for it to lose all stored keys on update when the current passphrase has not been provided, and it sounds like that may not currently be the case.
Maybe in this case, Apple could comply, but a simple tweak would make it impossible in the future?
But "on update" isn't really the issue. If the code can be swapped underneath it, how does it know an "update" took place? Again you're in the situation where all of that process would have to be managed by hardware, when what is really happening is that the enclave is a computer running signed software that can be replaced.
Sure, but the signed firmware could be written to delete any stored keys when it accepts an update and the phone's passphrase has not been provided. That's assuming that it manages its own update process, has the firmware securely stored within its own die, etc. It's entirely possible ... but only Apple really knows.
I would not actually be shocked if they originally did wipe out stored info on firmware update, but had some issues with people updating their phone and losing everything, so they ifdef'd that particular bit out in the name of usability.
But... the firmware is stored in external storage. How does it even detect that an update was performed? You're saying that if the hardware had affirmative control over all the secure storage (e.g. the RPMB partitions on eMMC, etc...), then it could have been written to do this blanking.
Well, yeah. But then you'd have a system that couldn't be updated in the field without destroying the customer data (i.e. you'd have a secure boot implementation that couldn't receive bug fixes at all).
It's a chicken and egg problem. You're handling the problem of the "iPhone" not being a 100% snoop-and-tamper-proof device by positing the existence of an "interior" snoop-and-tamper-proof device. But that doesn't work, because it's turtles all the way down.
Ultimately you have to get to a situation where there is a piece of hardware (hardware on a single chip, even) making this determination in a way that has to be 100% right from the instant the devices go out the door. And that's not impossible, but it's asking too much, sorry. We're never going to get that.
It's certainly possible to design such a system with external firmware and still allow for secure updates.
The enclave would store (in secured storage) a hash of the last used firmware. Hardware would have a hash update capability, but this destroys all other stored information (i.e., keys) if used when the enclave is not currently in an unlocked state.
On boot, hardware verifies firmware signature as usual but also compares the firmware hash (already calculated for the signature check) to the stored value. If there is a mismatch, update the stored hash. Since the enclave is currently locked, the hardware clears the keys.
Since it's in hardware, you're correct that it would have to be 100% right, but that's quite feasible for a simple update mechanism (indeed, the most complicated bits are reused pieces from the signature check which already has this requirement).
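A sketch of that boot-time check, hypothetical and written in Python purely for readability; this is an illustration of the proposal above, not a claim about Apple's actual hardware.

```python
import hashlib

def secure_boot(firmware: bytes, state: dict, signature_ok: bool) -> None:
    if not signature_ok:
        raise ValueError("firmware signature invalid")
    fw_hash = hashlib.sha256(firmware).hexdigest()
    if fw_hash != state.get("last_firmware_hash"):
        # The firmware changed since the last accepted image. Accept the new
        # hash, but if the enclave is not currently unlocked, destroy the
        # stored key material before the new image ever runs.
        if not state.get("unlocked", False):
            state["user_keys"] = None
        state["last_firmware_hash"] = fw_hash
    run_firmware(firmware, state)

def run_firmware(firmware: bytes, state: dict) -> None:
    pass  # hand control to the (signature-verified) firmware image
```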
Have it store the firmware itself encrypted with the UID. It never leaves the secure enclave so only the secure enclave itself could "sign" updates. You could still allow for recovery by providing a method to reset the UID.
I'm not claiming it is in ROM or that it is not upgradeable if you are Apple and have physical access to the device. I'm not sure on that point. What I think must be the case is that Apple can't remotely upgrade the SE firmware as part of its iOS update mechanism. Although, to be perfectly honest, I have not seen this explicitly documented.
So... it sounds like you more or less agree with me. Apple can comply with this court order and open your secure device. We just differ as to whether they can do it over the air.
(FWIW: OTA firmware updates are routine in the industry. I've worked on such systems professionally, though not for Apple.)
The "bootstrap security engine" could store a hash of the blob in it's secure flash storage, and only update the hash if the user enters their pin, then reload the firmware blob. If the stored hash and blob hash ever don't match on boot, wipe the keys.
How can they simply "push an update"? I've never seen iOS auto-update without first prompting the user, which I'm assuming is a very intentional limitation.
"From what I understand Tim is doing, and I greatly admire, is trying to avoid a judicial requirement that they be able to do this on demand. The so called "back door" requirement, because he knows, as others do, that such a feature would be used by more than the intended audience, and for more than the intended uses, to the detriment of Apple's users."
To be fair - the only reason he's doing it is because it would cause a significant drop in sales for Apple devices. People overseas would immediately stop buying because "The American government is listening", and that's assuming countries like China and Russia wouldn't ban them outright.
This is the big thing American politicians are missing or glossing over in their campaigns to get re-elected: Forcing American companies to compromise their products will result in a significant loss of revenue overseas. Microsoft, Google, et al, have already reported it and foreign governments have already started banning goods/services (due to the Snowden revelations).
"To be fair - the only reason he's doing it is because it would cause a significant drop in sales for Apple devices."
That's not being fair at all. To say the only reason he is doing it is to protect iPhone sales doesn't speak to Tim's character. Of course he cares about sales, but he also cares about privacy.
Doing the right thing is very often (if not almost always) the most profitable thing. It's very difficult to make a business out of serving your customers poorly; if you disagree, give it a shot, and let me know how it turns out for you.
Serving your customers well is good for profit, but it is not the same thing as doing the right thing. Ad-based companies have customers and users. By serving their customers too well they easily screw over the users, with obnoxious and even unethical ads.
That seems like the classic mistake of believing that the larger market assigns importance to repairability. I haven't seen much evidence this is true.
Normally people on HN are much more skeptical about airy promises and assertions from corporate executives. I don't see what behavior on Tim Cook's part has indicated he's more to be trusted than anyone else.
Tim Cook was asked at a shareholder meeting to only do things that were profitable. He passionately rejected the idea, and named accessibility for the blind, environmental issues/climate change, and worker safety as areas where Apple invests because it's right, without considering the "bloody ROI". [1]
Compare to GE, which rolled over [2].
So I believe Mr Cook when he says his opposition to the FBI's request is rooted in a desire to do the right thing, and not the bottom line.
Just to sharpen your comment, here's the summary of that interaction with a shareholder:
[Cook] didn't stop there, however, as he looked directly at the NCPPR representative and said, "If you want me to do things only for ROI reasons, you should get out of this stock."
That's a very blunt statement that the immediate stock valuation is not Cook's only consideration.
Some people seem to have a hard time taking Cook at his word, but he's been quite consistent. This massive skepticism feels more like nostalgie de la boue than anything based in facts.
I think the only problem here is the strength with which it is worded. It would be fair to say one of the reasons he is doing it is the stock price, but to assert it is the only reason is to cast him as completely uncaring about security and privacy. Without evidence to back this up (in this case, evidence that he doesn't care about security and privacy), it's an attack on his character. It's entirely possible his reasons include both, and that the moral sentiment behind the message is entirely truthful.
We should be careful about plainly stating what someone else's motivations are when it contradicts their own story.
Edit: s/it's/his reasons include/ for clarification.
He's an older gay man, I would be shocked if this didn't influence greatly his opinion in this realm. He's lived through some times that weren't too friendly to "his kind" that were open.
We still live in a world where all people are not treated equally. Too many people do not feel free to practice their religion or express their opinion or love who they choose. A world in which that information can make a difference between life and death. If those of us in positions of responsibility fail to do everything in our power to protect the right of privacy, we risk something far more valuable than money. We risk our way of life.
Not more to be trusted than anyone else, perhaps. But the claim was, "the only reason he's doing it is because it would cause a significant drop in sales for Apple devices" (emphasis added). I can trust him no more than I trust anyone else, and still not be totally cynical as to his motives. They're mixed, they're not completely altruistic, but they're (probably) not completely mercenary, either.
Cook's responsibility is first and foremost to the stockholders, and secondarily to the customers. Decrypting the iPhone would seriously compromise the security of Apple's products, gravely damage the company's credibility, hurt sales, and drive the stock price down.
No CEO is going to take such a drastic step unless they are a craven, cowardly type who meekly obeys ask-for-the-sky demands from overbearing federal law enforcement types, and Cook surely did not rise to his current position by being a pushover.
That's not to say there won't be some kind of secret deal made behind closed doors, but secrets tend to get out. Apple would not be so foolish, I think. Yahoo? Microsoft? They just handed over the keys to their email to anyone who demanded it -- the Chinese government, the NSA -- but Apple has no history of this type of behavior. Surely Snowden would have revealed it if they had.
Not saying I necessarily agree (or disagree), but the premise is that right now iOS phones are the only ones that do security right, out of the box. They're worth the premium price for that feature. As soon as they no longer have that edge, there's no reason to choose them over any commodity Android device, so sales would drop as they lose their differentiating feature.
They'd go to Android. Apple only has a significant share of mobile users in the USA. In most other countries they are losing (globally they have something like 8% vs Android's 85% market share).
You're probably right, but it's great to explain this decision in economic terms. It'll sink through to people who think privacy is only for "good" actors, and those who don't like the government hurting businesses.
This is a good way to think about things skeptically, but to say it as though it is fact is misleading. Is it that unlikely that a business professional can frame his beliefs and knowledge in this domain in such a way to justify himself to the shareholders? Or are we all just too skeptical that people in power act against morality whenever easy or possible?
Let's suppose you can see in to Tim Cook's heart and this is true.
So?
I understand wanting to know people's motivations, from both the perspective of predicting future action and just because we're nosy monkeys. But frankly, what's in Cook's heart doesn't matter. Actions do. And to date, in my view, he's done pretty much exactly the right thing on this issue all along.
Maybe he's defending customer privacy because he believes the Lizard People have religious objections to invading until all humans have Freedom of Math. It simply doesn't matter.
Being the vocal advocate to stand up to the gov't IS risky. The wording is carefully crafted to prevent spin damage (pro-terrorist, anti-law enforcement).
> the only reason he's doing it is because it would cause a significant drop in sales for Apple devices. People overseas would immediately stop buying because "The American government is listening", and that's assuming countries like China and Russia wouldn't ban them outright.
That hasn't happened with other devices or earlier iPhones that aren't as secure.