I've worked with the FBI and the Secret Service investigating computer crime.
The Secret Service is extremely competent when it comes to computer forensics, and when they don't know what to do, they don't guess; they consult experts.
The FBI is the opposite in every way, mostly because of budget constraints and the subsequent lack of training. I hope that this is a good learning opportunity for them and a chance for them to increase their training budget in this area.
Also hearsay personal experience, but a friend of a friend had his house raided by the FBI and all his computer equipment impounded. When he got it all back, they told him his hard drives were empty or unformatted. He had them all formatted as ZFS...
I'm unsure as to how to read this; are you saying because they were ZFS, the FBI couldn't read them and because of their lack of expertise, assumed they were empty?
Or that they formatted them before returning them?
I read it as: his drives were formatted as ZFS and the FBI couldn't read them because of their incompetence (probably only thinking to try reading them with Windows, as FAT32 or NTFS).
Judging by how Windows, when presented with an unknown FS, acts as if the disk were unformatted and immediately offers to format it, my guess would be both...
(And by "unknown" I mean of course "anything that is not FAT* or NTFS"...)
Johnny Clubfingers 5-0 clicks OK and destroys "evidence" for being a total idiot... I take it he/she was hungover during those 2 hours of computer forensics at the academy. For crooks, this is good news; for protecting potential future victims, it isn't; and misapplied against innocents or in MPAA/RIAA enforcement, it's destructive and lowers LE credibility.
The moral of the story, for individuals, who should implicitly fear government overreach no matter who is in office: one has to back their shit up and make it SWAT-proof, even if that means running several Tahoe-LAFS boxes in countries like Switzerland, because running a server (physical or Linode) or just replicating data to a friend's server doesn't cut it and never did.
I wouldn't read too much into this. SOP for most police forces is a tiered approach to seized drives. They plug a drive into a Windows machine and see what's what. If that doesn't get immediate results, they have to hand it to the professional data recovery people. Even a cursory forensic examination of a drive starts at $1,000 and may yield nothing if it's encrypted. With a small pile of drives to look at, they probably didn't bother trying after spotting that it wasn't going to be plug-and-play.
Why? It's people following a forensics script/program that looks for Windows partitions. There aren't a huge number of people who understand this kind of thing, and most of them can work at places where they're paid much better than at the FBI.
The forensics programs (EnCase, FTK) have a lot of problems, but they don't assume the drive holds Windows partitions, though it's possible they don't deal with ZFS.
You would think that national-level stuff would have people take more care, but if you're dealing with state or local police, they will have someone who has taken an 8-hour EnCase or FTK class driving a GUI to gather evidence, and if the tool doesn't support something, there's effectively no evidence to gather.
Yeah, I once had to explain to an officer what a known_hosts file is so they could send MLAT requests and release me from jail without risking that I would wipe (the already automatically wiped) servers myself when I got out.
You're going to go through the trouble of getting a warrant for a raid and then apply such gross incompetence to the seized evidence? When was the last time you pulled a hard drive out of a functioning machine, found it to be empty, and didn't immediately think, "huh, that's odd" before throwing it in the trash?
I guess it depends on whether the hard drives were the target of the raid or just collateral from a "grab anything that may be useful" mindset. They could already have gathered enough evidence without needing to waste time/money on digital forensics. Hard to say without specifics.
The fact that they went out of their way to assure him that his own hard drives were empty reeks of manipulation.
Slightly off topic, but it really isn't hard to find partitions. Just run the various scanning software and that's it. I've done it at least twice after accidentally erasing a Windows partition (once with fdisk when I misread the output, and at that time I didn't even know what a partition was).
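Tangent: even a dumb magic-byte sniff beats declaring a drive "empty". A rough Python sketch, under some assumptions: the NTFS/FAT32 offsets are standard boot-sector fields, and the ZFS check just scans for the uberblock magic 0x00bab10c that appears in vdev labels. A real tool like TestDisk checks far more than this:

    import struct
    import sys

    ZFS_UBERBLOCK_MAGIC = 0x00bab10c  # "oo-ba-bloc", found in ZFS vdev labels

    def sniff(path: str) -> str:
        """Crude filesystem sniffing by magic bytes; real tools check far more."""
        with open(path, "rb") as f:
            head = f.read(512 * 1024)  # boot sector plus the first ZFS label region
        if head[3:7] == b"NTFS":          # OEM ID field in the NTFS boot sector
            return "NTFS"
        if head[82:87] == b"FAT32":       # FS type string in the FAT32 boot sector
            return "FAT32"
        magics = (struct.pack("<Q", ZFS_UBERBLOCK_MAGIC),
                  struct.pack(">Q", ZFS_UBERBLOCK_MAGIC))  # either byte order
        for off in range(0, len(head) - 8, 1024):          # uberblocks are 1K-aligned
            if head[off:off + 8] in magics:
                return "ZFS (found an uberblock magic)"
        return "unknown -- which is not the same thing as empty"

    if __name__ == "__main__":
        print(sniff(sys.argv[1]))  # e.g. a dd image of the seized disk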
> Also hearsay personal experience, but a friend of a friend had his house raided by the FBI and all his computer equipment impounded. When he got it all back, they told him his hard drives were empty or unformatted. He had them all formatted as ZFS...
I had imagined that computer forensics "experts" would make this mistake somewhere at some point, but I did not think I would actually hear that it had happened. Thanks for the affirmation, even if it is just hearsay.
It makes me curious about what goofs were made with the IRS hard drives that had "crashed" according to forensics "experts".
This is so true. It has been a number of years since I worked with either agency doing computer forensic work, but my experience was much the same. While the SS consistently made a concerted effort to investigate crimes and build a solid case, the FBI often seemed interested primarily in raising the public profile of the cases they worked to get more visibility in the press, often at the expense of (what appeared to me, at least) due process.
Whereas the SS had a number of excellent specialists who understood acquisition, the FBI seemed to be wearing clown shoes most of the time. I'm not surprised at all that they botched this case so badly.
It's clear as day that Apple is on the right side of this argument. It's not their job to bail out the FBI for yet another colossal screwup. Especially not when it damages their product so severely.
> The FBI is the opposite in every way, mostly because of budget constraints and the subsequent lack of training. I hope that this is a good learning opportunity for them and a chance for them to increase their training budget in this area.
Besides the legal precedents and other associated drama, I think this is one of Apple's major concerns, and one of the reasons they implemented the "we don't have the keys" approach to their encryption. If the FBI can always just call on Apple (or Google) to fix whatever mistakes they made, there is little motivation for training / getting better on this front, effectively making Apple the computer forensics arm of the government.
The request they submitted to Apple was clearly written by competent people. They knew exactly what they wanted to do and why, how Apple could help, and why only Apple could help.
I think part of that is that Apple is/was actively working with the FBI to find alternative solutions. I would bet that the engineers described what would need to happen, i.e. the new OS. As is often the case, the Apple engineers probably documented alternative solutions. The FBI took that "solution" and ran with what was described. It's the "well, Apple told us this is the only way to do this, but they won't do it for us" scenario.
> I would bet that the engineers described what would need to happen. ... The FBI took that "solution" and ran with what they described.
I think you absolutely nailed it!
For a high-profile investigation like this, Apple would have given the FBI access to the key developers in the security group. The developers are smart guys trying to be helpful. They are not thinking about Apple policy, or constitutional law, or the big picture of world liberty and privacy. They are tasked with finding the solution to a technical problem: How to get access to protected data.
What likely happened--exactly as you already suggested--is that the FBI asked the developers to explain how the security system could have been designed so as to permit easy government access in cases like this. The FBI was asking "hypothetically" of course. The developers happily gave a blueprint of how the system could have been designed.
The FBI now demands that blueprint be implemented.
Apple should have talked to the FBI through lawyers only.
> For a high-profile investigation like this, Apple would have given the FBI access to the key developers in the security group.
> Apple should have talked to the FBI through lawyers only.
You went from "would have" to "should have", turning your hypothesis into a certainty...
Why wouldn't the developers in the security group think about constitutional law? Have you ever seen an internet forum that talked about computer security regularly, yet did not talk about constitutional law regularly? If not, how would those developers have possibly avoided regular reminders about the 4th amendment?
They didn't have to avoid any reminders. They were most likely just asked "how" it could be done, not to do it. The law comes into effect now, where the FBI is trying to get the courts to order them to comply. Simply telling someone how to potentially do something illegal is not illegal itself, and really doesn't cross any boundaries in my opinion. A white hat hacker uses many of the same techniques that a black hat hacker uses, but in one instance it is legal and in the other it is illegal.
Well, that's the overhead of selling closed-source devices.
If you think about it, consulting vendors is probably a better use of taxpayer money than RE-ing every stupid crypto system on the market.
They contacted Apple, did their homework, and came up with specific and generally sane demands. They even went as far as suggesting the hacking be performed at Apple's site to ensure that the insecure firmware doesn't leak outside.
BTW, this last part looks very much like a response to concerns voiced by Apple, which means the official statements from both sides are just the tip of the iceberg.
Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware. Then the only thing they would need Apple for is signing it once it's complete. And if each user could sign their own firmware updates with a key based on their password or provide their own key then it's game over.
They've put themselves in a weird legal situation because they've made it so that they are the only ones who can actually write and sign the firmware the FBI is demanding. A judge would laugh them out of the courtroom if the FBI was technically capable of writing the firmware and demanded Apple's help because it was too hard.
> Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware. Then the only thing they would need Apple for is signing it once it's complete.
This is an example of a non-free software feature. Why are the keys baked in with no way to disable them? And "write your own firmware" doesn't solve this problem -- they could just pay a developer to do it at $X an hour. A better security model should've been used -- one where updates have to be confirmed (read: signed) by the user before they are applied.
> Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware.
No, not based on the interpretation of the All Writs Act that the FBI is attempting to use. As far as the FBI is concerned, they could force my grandma to write a backdoor if they deemed her the best person to do so. Given that she can't answer the phone most days it'd be a long wait, but I wouldn't put it past them.
Poor choice of words; I meant the general "closedness" of the platform - from undocumented design, through lack of source code, up to centralized code signing.
The only reasonable way for law enforcement to deal with even a single one of those factors is to request help from the device vendor.
There is one thing the FBI is very good at, and that's writing a compelling narrative. It's possible that there are highly competent people who know everything, but it's also possible there are moderately but not dazzlingly competent people who are really good at writing a story that feels complete and keeps one from asking questions outside the narrative.
Though, on second thought, I have to add that we don't know how many back-and-forth mail exchanges happened before they were able to come up with the officially published request.
Maybe they were just competent at working around excuses from Apple.
Exactly. Yet another reason to fight the court order. We should expect the FBI to be competent, be embarrassed when they aren't, and fix the problem. It's not a good state of affairs when a company is more trusted to do forensics.
Software update signing keys, which can't be disabled by the end user. This is what most people would consider "a flawed security model". Even UEFI lets you change the trusted boot keys.
Please enlighten me. Is this not exactly what the FBI is asking for? For Apple to flash a custom version of iOS that doesn't have the software rate-limiting and auto-wipe, which only someone with Apple's private key can do. A four-digit PIN is only secure in combination with those features. Having Apple's code-signing key is in fact "having the keys", except in the most pedantic literal sense.
Then they should apologize to Apple for tarnishing their name all over the media for "helping terrorists", to the point where the front-runner of a party, who has a chance of becoming the next president, called for a boycott of them because of it - when all along it was the FBI's dumb mistake that kept them from getting that data, and Apple has no responsibility to fix the FBI's stupidity. I think it's only fair.
That's assuming it wasn't all staged to take advantage of this situation to pass some backdoor law or set a precedent here, in which case, I don't expect the FBI to retract anything, because then their goal isn't to unlock this phone, but to set that precedent.
Just curious, and sorry if this is intrusive, but your blog is very up front about your employment history... reddit, netflix, paypal/ebay, sendmail...
At which job were you working so closely with the FBI and SS that you were able to ascertain their technical abilities?
In the mid-90s I had several encounters with the FBI while running an ISP in Oklahoma City. They asked me to write code to pull out specific entries in our massive amount of dialup logs and when I told them they had to pay me, they refused based on budget constraints.
I can well imagine jedberg has had to deal with them just being a systems admin for such large deployments as reddit and paypal.
Talk about a hell of a first day... We pretty much switched into full "get news about the bombing online in some form or fashion, however possible" mode for about 48 hours. This was before CNN had a huge online presence, so we had stuff going like a RealVideo stream of a webcam pointed at local TV news, a couple of people went down to take pictures, etc, etc...
Those early ISP days both sucked and were fun in their own special way. Trumpet Winsock for Win3.x users! Getting OS/2 online and older Macs! Trash-talking in the local Usenet groups!
Where you kept a copy of the Solaris operating system for all of your coworkers to peruse. We used to track you back then, as you were one of the larger pirates of our software. :P
What? That's a bit libelous, I'd appreciate an explanation.
If it happened at ioNET after September '96, I wasn't involved.
All of Texas.Net's systems ran legit copies of Solaris; a large amount of them were brand new and came with OS entitlements. I remember going to the post office one day to pick up our brand new copies of Solaris 2.6.
Are you talking about me running SUNHELP.ORG, which started in '97? That was (and is) a third-party user community and resource site, along with mailing lists. It did not provide Solaris downloads.
I happen to have a (personal) archive of Solaris releases, but it's not public access.
The closest thing to "piracy" I could ever be accused of was reposting design documents for an unreleased system that I found on Sun's own publicly-accessible website in 2000.
If anyone at Sun held me in bad regard, it was never mentioned to me by anyone, and I had a lot of contacts at the company.
- I was one of 250 people picked as an external pre-release beta tester / community liaison for the release of OpenSolaris.
- In 2005, Sun donated a fully loaded T1000 server (8 cores, 8G RAM) for use by me in running the site. I'd think that if I was a huge software pirate the company wouldn't encourage it by giving me expensive hardware for free.
This is the first time I've ever been accused of being a pirate! There's a first time for everything I guess.
To be perfectly honest: I vaguely recall the source of an older Solaris version being passed around the user/hobbyist community, but I wasn't the originator. I'm sure I downloaded it at one point, but I don't remember ever doing anything with it OR putting it up for coworkers to browse.
Like the (released and quickly discontinued) version of Solaris 2.6 for PPC that ran on certain RS/6000s, it was "out there" and easily obtainable by anyone in the hobbyist community if you asked enough people.
No idea what was done at TN after I left; I had issues with how management treated the technical staff and resigned in late '98.
Dunno if I want to be called "famous". I tried my best to do what I could for the Sun / hardware rescue communities, but I'm still a relative nobody in the grand scheme of things.
What's really weird is being called a "peer" by people who I looked up to when learning and just getting started. Massive case of impostor syndrome...
There are a ton of people out there who are more talented and have contributed more to UNIX / Linux, open source, and the hobbyist/maker community in general, and who deserve recognition and fame.
I'm just a fat old fart sysadmin who's had a good run and was lucky to be able to enjoy most of it. The best I can hope for is to be thought well of by others.
eBay and PayPal. The Secret Service doesn't like it when foreigners steal from Americans (their main job is to protect the currency; protecting the President is an add-on).
I've worked with vice-principals at elementary schools investigating porn on library machines who had a better grasp of forensics.
Really. I teach a course in a local forensics program and have helped a couple of local schools over the years. Any teenager with an iPhone could have explained that resetting the Apple ID was a bad idea. There are plenty of very intelligent kids out there who need jobs. For the FBI to act in such a manner is inexcusable. Ditch some of these hack cops and hire some proper technology experts.
Wonder if folks realize this is the work phone, not the personal one. The personal phone was destroyed by the terrorists. I doubt there's anything of value on the work phone.
But then again, obviously the FBI's long-term goal is to break into all the phones regardless of the circumstances.
Wired reports: "...the company's engineers had first suggested to the government that it take the phone to the suspect's apartment to connect it to the Wi-Fi there. But since reporters and members of the public had swarmed that crime scene shortly after the shootings occurred, it was likely that any Wi-Fi there had been disconnected" [1]
Am I the only one concerned that an iCloud backup translating into information disclosure is a major security weakness in Apple's platform?
Also, since Apple remembers old iCloud passwords to prevent reuse for a year, what stops them from setting it to the original value in their database? Even if there were information lost in their database when the password changed, surely they have backups, right?
Backups I can see being a problem. But just because you can prevent people from reusing duplicate passwords doesn't mean you can reset the account to one of those passwords. Just use a hash.
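A minimal sketch of the "just use a hash" idea (the PBKDF2 parameters and the Account class are illustrative, not any real account system): reuse can be refused by comparing against stored salted hashes, while nothing on file reveals the old plaintext.

    import hashlib
    import os

    def hash_pw(password: str, salt: bytes) -> bytes:
        # Illustrative parameters; real systems tune the iteration count.
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

    class Account:
        def __init__(self) -> None:
            self.history = []  # (salt, hash) pairs for past passwords

        def set_password(self, new: str) -> bool:
            # We can recognize an old password when it's offered again,
            # but nothing stored here tells us what that password was.
            if any(hash_pw(new, salt) == h for salt, h in self.history):
                return False  # reuse within the retention window: rejected
            salt = os.urandom(16)
            self.history.append((salt, hash_pw(new, salt)))
            return True

    acct = Account()
    assert acct.set_password("hunter2")        # accepted
    assert not acct.set_password("hunter2")    # rejected, plaintext never stored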
This is like the icing on the cake, if it turns out the FBI's failure to secure the apartment prevented them from reconnecting the iPhone to the wifi in order to get a fresh backup off of it.
Seems like that part is just bad reporting. I presume they mean the Comcast (or whatever provider it is) account is no longer active, not that a router has been simply unplugged (annoyingly, the average user/journalist doesn't understand the relationship between a copper connection, modem, and router/WiFi and that hasn't been helped by Comcast bundling them as a single device).
It would still be relatively trivial to restore the connection so it's not a valid reason for requiring the backdoor, but I think that was the writer's intent.
Also, even if they didn't restore the broadband at the guy's apartment, couldn't they just take the router and plug it in back at the station, since the login credentials have already been established?
Even the mention of the reporters swarming the apartment highlights the absurdity of the FBI's logic. How can they justify needing to scour every last digital crevice when they couldn't even be bothered to secure and scour the physical space they had access to?
That's exactly what I said in the comment that you replied to:
"Also, even if they didn't restore the broadband at the guy's apartment, couldn't they just take the router and plug it in back at the station, since the login credentials have already been established?"
However, the sibling comment below my original asserts that iOS won't re-connect to WiFi after booting until the pin has been entered (and thus no backup will occur). If that's true, and if the phone had been powered down at any point, then retrieving the router isn't a viable solution anyway.
Actually, iOS devices do not automatically connect to wifi until they are unlocked for the first time after a reboot - I'm not sure about the cellular networks.
They do connect to cellular networks, but will not do an automatic iCloud Backup unless they are a) on wifi and b) connected to power, so this method would not have worked.
But even if it was incompetence, in the February 19th Motion to Compel they stated that the owner of the phone tried to reset the password. Which seems a bit disingenuous if the FBI told them to do it.
Or is this the FBI carefully parsing words by saying the owner of the phone (i.e. the attacker's employer) tried to reset the password, without mentioning that said reset was done at the FBI's request?
If it is malice, the order for Apple to help should be reversed. If it is incompetence, then that bolsters the argument that any backdoor will lead to leaks and unauthorized access to the backdoor, causing more harm to everyone.
Indeed. The best way to not have information stolen is to not have it, and particularly not to leave it with someone proven to be incapable of securing it. http://www.softwar.net/stupid.html
> If it is incompetence, then that bolsters the argument that any backdoor will lead to leaks and unauthorized access to the backdoor, causing more harm to everyone.
The backdoor the FBI wants to exploit cannot be leaked. They are specifically requesting it be limited to the specific phone.
If it were possible to take an apple update, edit it, and then apply to other phones, they wouldn't need Apple's help.
If you want to talk about "enablers of authoritarians", consider that Apple has built a system in which they are the sole authority, that is opaque to all introspection by third parties, and against which Apple -- and only Apple -- must protect their users against all comers, for all time.
In addition to which, our security relies on Apple themselves never changing their business priorities and choosing to exploit their position of absolute authority.
The fact that they have that authority is why the government can compel them to do anything in the first place.
If you want to talk technical specifics, then no, your assessment is incorrect. The Apple ID password was changed, but that doesn't affect the on-device keying.
PINs are the weakest link in the iPhone crypto chain. Apple strengthened that link through non-cryptographic means: tamper-resistant key derivation software that runs either on the main CPU or, in later devices, on the secure enclave CPU.
That software enforces a limited number of retries, essentially strengthening the PIN. However, Apple also retained the ability to subvert the owner's lock on the device and install new key derivation code that does not include those security features; this applies to both the 5c and later devices with a secure enclave.
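A toy sketch of what that retry-limiting gate does (the escalating delays and the wipe-after-10 are roughly as Apple documents them; the KDF parameters and the salt here are stand-ins, since the real derivation is entangled with a hardware UID):

    import hashlib
    import time

    WIPE_THRESHOLD = 10  # erase the data keys after 10 failures, if enabled
    # Escalating delays (seconds), loosely modeled on iOS's documented behavior.
    DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}

    SALT = b"stand-in for the hardware UID"
    STORED = hashlib.pbkdf2_hmac("sha256", b"1234", SALT, 100_000)  # toy enrollment

    failed, wiped = 0, False

    def try_pin(pin: str) -> bool:
        """Software-enforced gate on PIN guesses; the check at issue here."""
        global failed, wiped
        if wiped:
            return False
        if failed in DELAYS:
            time.sleep(DELAYS[failed])  # forced wait before the next attempt
        guess = hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, 100_000)
        if guess == STORED:
            failed = 0
            return True
        failed += 1
        if failed >= WIPE_THRESHOLD:
            wiped = True  # stand-in for destroying the file encryption keys
        return False

Strip the delay and the wipe out of that gate -- which is what the requested firmware would do -- and the PIN is only as strong as its tiny search space.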
If Apple hadn't retained that backdoor, the FBI would have nothing to ask for. Apple has, however, and has consistently made themselves the sole authority and gatekeeper of these devices.
Thank you for explaining. I'm still not clear on what the FBI's major malfunction is.
How would relaxing the tamper-resistant key protection help here? One needs the PIN to reimage the device. Chicken & egg. Creating a one-off OS image can't help without first having the PIN.
And the goal is to get the data, not crack the phone. Why can't the FBI use the backups? And what do they hope to find that don't already know (by other means)?
Just sounds like CYA to me. The more I learn about this silliness, the less plausible the FBI's narrative becomes. The FBI screwed up and is now just finding a scapegoat.
---
Authoritarian already has a widely recognized definition.
> How would relaxing the tamper-resistant key protection help here? One needs the PIN to reimage the device. Chicken & egg. Creating a one-off OS image can't help without first having the PIN.
The weak link is that a PIN can be cracked very quickly, in hours or days. The search space just isn't very large.
The only thing preventing the FBI from doing so is the Apple-signed iOS code that erases data keys after too many unsuccessful retries.
So, if Apple uses their privileged backdoor to disable that check, the FBI can brute force the encryption key by trying as many PIN combinations as they like.
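Back-of-the-envelope, assuming the roughly 80ms-per-guess on-device key derivation figure from Apple's iOS security documentation (the exact number varies by device):

    PER_GUESS = 0.08  # seconds per on-device PIN derivation (approximate)

    for desc, space in [("4-digit PIN", 10**4),
                        ("6-digit PIN", 10**6),
                        ("6-char alphanumeric", 62**6)]:
        hours = space * PER_GUESS / 3600
        print(f"{desc:20} {space:>15,} guesses, worst case {hours:,.1f} hours")

    # 4-digit PIN: ~13 minutes worst case; 6-digit PIN: ~22 hours;
    # 6 alphanumeric chars: on the order of 140 years. Passcode strength,
    # not the retry counter, is what holds up once the counter is gone.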
In effect, this means Apple already has the cryptographic backdoor necessary to own any PIN-protected iPhone in the world.
That's small potatoes, though -- they can also install new software on locked devices, and push modified updates to applications distributed through the App Store. After all, apps are re-signed with Apple's signing key, discarding the original software authors' signatures.
When you factor in bitcode (in which Apple compiles the actual binaries server-side), application authors can't even verify that distributed binaries match what they uploaded, and the use of a relatively high-level bitcode allows Apple to much more easily patch/rewrite significant portions of the application.
In other words, Apple built a system in which they have almost absolute authority over every iPhone, and due to strict platform DRM, there's almost zero transparency into their use of it.
> Authoritarian already has a widely recognized definition.
"adj. Characterized by or favoring absolute obedience to authority, as against individual freedom: an authoritarian regime."
Can you install software on your iPhone that pre-empts Apple's authority over the device?
Can you install software without Apple's approval?
Can you prevent Apple from installing whatever software they like on your iPhone, including software that implements CALEA-compliant real-time surveillance?
The answer to all three is "no", and why I think this absolutely fits the "authoritarian" definition.
You can, of course, use a different vendor's phone. The situation there will be roughly the same. Eventually, if nothing else changes, we'll see CALEA expand to cover smart phones in the same way it expanded to cover internet traffic once the ISPs were sufficiently consolidated. The vendors' authority over the devices makes this easy.
Why can't it be leaked? The software needs to be created and installed on the device. Even if it stays entirely in Apple's hands, there's no guarantee it will never be leaked.
It wouldn't need editing. It's intended to disable the timeout when brute-forcing passcodes. It's incredibly dangerous software to even exist. And also insanely valuable. Even more incentive for someone to leak it, even at Apple.
The software would only disable those features on that specific device, which would be hard coded. Even if you moved the software to another device, it wouldn't work. Even if you had the source code, and modified to work on a different device or all devices, you wouldn't be able to do it unless Apple signed the modified software as well.
It's not clear to me, at least, that the phone identifier they would be hard-coding into the build is actually designed to be a cryptographically secure, unalterable identifier protected by the secure enclave. The 5C has no secure enclave anyway, so how is this ID secured?
If you can reprogram or electronically intercept and alter the ID as it is read by the firmware, the backdoor build could be run on any phone.
For example, if it is tied to the UDID, then UDID = SHA1(serial + ECID + wifiMac + bluetoothMac). Here's an article where Apple says the ECID is alterable through the BPP (baseband processor) [1], so it is perhaps exploitable by connecting to a BSE and hacking the BPP via LTE vulnerabilities. The serial number and the WiFi and Bluetooth MACs can all be altered as well. So I'm not convinced UDID-locked builds can't be worked around by a motivated adversary.
Heck, SHA1 collisions are now computationally feasible for less than $1 million! (Strictly speaking, matching one specific, fixed UDID is a second-preimage problem, which is still out of reach -- but with all four inputs alterable, you may not need to break SHA1 at all.)
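To make the binding concrete, here's the quoted formula in a few lines of Python (the identifier values and their encoding are made up; the real concatenation format isn't public as far as I know). The point is that the hash is only as trustworthy as its least-forgeable input:

    import hashlib

    def udid(serial: str, ecid: str, wifi_mac: str, bt_mac: str) -> str:
        # UDID per the formula quoted above:
        # SHA1(serial + ECID + wifiMac + bluetoothMac)
        return hashlib.sha1((serial + ecid + wifi_mac + bt_mac).encode()).hexdigest()

    # All four inputs are values the comment argues an adversary can alter, so
    # spoofing a target's UDID means spoofing these inputs, not breaking SHA1.
    print(udid("C39XXXXXXXXX", "0x000E4C8D0A1B",
               "a1:b2:c3:d4:e5:f6", "a1:b2:c3:d4:e5:f7"))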
It's shocking to me that people who ostensibly know something about software development are comfortable making statements like "it would never be able to leak because Apple can simply write a (bug-free, unexploitable, perfectly secure) set of checks locking it to the device"
It's shocking to me that people who ostensibly know something about software development take Apple at their word in this case.
Apple already has the backdoor.
Are you willing to guarantee that Apple will never lose control over their signing keys, giving whoever acquires them the ability to end-run the security of a locked device and install software that you, the device owner, are sandboxed from inspecting?
> Are you willing to guarantee that Apple will never lose control over their signing keys, giving whoever acquires them the ability to end-run the security of a locked device and install software that you, the device owner, are sandboxed from inspecting?
No. But this doesn't mean I don't think there's ALSO harm in the FBI or any government agency being able to demand companies build tools that expand the use and usability of that backdoor to parties beyond the company holding the key.
It sucks that Apple holds the One Ring when it comes to iOS security. It's incredibly dangerous if a government can require them to wield that ring for arbitrary purposes via a contortion of the All Writs Act. And it's just plain stupid for software professionals to base their opinions on a belief that anyone is capable of writing an unexploitable check for device identity.
The FBI or any other government agency has always been able to demand companies do exactly this.
It was the height of naivety to think otherwise; it's not like we lack historical examples of what happens when a small number of companies make themselves the linchpin of trust/security.
Prior to this event, I had no idea that this generation of programmers seriously thought they could centralize so much information and control into their own hands, and somehow keep it out of the government's hands when they eventually came knocking.
Even if Apple wins this argument, they'll have to keep winning every argument, every network intrusion, every espionage attempt, forever. This particular argument is pointless; the high-value-single-point-of-failure security model is fundamentally flawed.
So we should what? Throw up our hands and not protect anyone who needs a cell phone? Conclude that everyone who isn't running RMS's Lemote Yeeloong is fucked, and throw them to the wolves?
It seems obvious to me that we have to take the world as it is; yes, centralization of security is bad. Yes, we should fight to get away from this centralization of power in companies like Apple.
But as it stands now, it's incredibly important to support Apple's fight against this dramatic expansion of the All Writs Act's powers. The fight isn't "pointless", it's the exact opposite -- the security and privacy of hundreds of millions of people in the world, today, rests on the success of fights like these.
How much better it would be if we were all running Gnu/Debian Mobile on our OpenPhones is completely irrelevant. That's not the world we live in and better, open solutions are going to take years and decades to work toward.
We are never going to get to that world if Apple loses fights like these. We already have legislators working to make even the security offered by iOS today, for all its flawed dependence on Apple, illegal. Once these privacy and security rights are gone, that's the new normal, and open, truly securable phones won't even be legal to manufacture in the first place.
In addition to lobbying against the government in this case, we should loudly criticize Apple for putting this backdoor in their design!
Let's invert the use of the "ticking bomb" propaganda - say we've got a phone with data that can prevent an imminent attack. What person is going to say we shouldn't unlock the phone because it would set a bad precedent? I'm a steadfast believer in the idea that computing devices should be private extensions of one's mind, but I would still say it's stupid not to hack into such a device if it can be done!
If you want a device that guarantees individual privacy against the manufacturer/USG, it has to be designed for such. You can't build an insecure system and then expect an A for effort.
This is not a dramatic expansion of the AWA's power. This is exactly what the AWA is meant to do. The few carve-outs of the authority of the AWA go much farther -- that's exactly what CALEA is, for example, and it has been held up by the courts repeatedly.
Your references to the GPL nonsense are a false dichotomy; all we need is ownership rights over the electronics we buy, like we've had -- even on Apple platforms -- for decades.
Supporting Apple now just ensures that when Apple does fall -- whether it's an expansion of CALEA, or espionage, or just a shift in their business priorities -- we may never know about it, and the impact on privacy/security is likely to be much worse.
If Apple loses this battle, there's no real impact to the risk profile. Apple already had the backdoor. The only change is that we're forced to be honest about that backdoor's existence, and start thinking real hard about how to avoid this kind of centralization of power, and its inevitable use, going forward.
In this case, the corporation is the threat, and corporate statism is probably the worst possible end game.
The state isn't really the problem, though -- centralization of authority is. The DoJ issue is basically moot; this is a lawful request made under public scrutiny and judicial review.
Whereas Apple could just change their business priorities at any point, and never even has to tell us.
The risk footprint is rather limited at this point, that much I trust. Each step in the sequence required to arrive at a compromised version of iOS that behaves the way FBI wants, is a step that increases the risk footprint for Apple and everyone with an iOS device. We could argue the size of this expansion, but we can all agree it is non-zero. And by definition trust is lost in proportion to that increased risk footprint.
I think that's an unreasonable burden on any company, and on users too. This isn't just limited to Apple. Any company signing code is at risk of being asked to apply digital signatures to the code equivalent of malware, and to the free-speech equivalent of falsehood. No.
Your argument is no different than the arguments that claim crypto backdoors can be kept secure. The problem is the existence of the backdoor, not the processes or politics that ostensibly will prevent its abuse.
Apple could ship encrypted backdoored binaries under an NSL gag order tomorrow, might not even know it themselves, and we'd never notice because we can't even introspect the device. In a few years, the federal government could extend CALEA to cover Apple, and there'd be little we could do because we can't override Apple's control over the software.
The security model is flawed; it requires Apple to fight and win every argument, every battle, every espionage attempt, in our favor, forever. The longer we propagate this security myth that putting absolute trust in the hands of the few is a viable security model, the worse things will be when it fails.
In the meantime, complying with this legal request doesn't meaningfully move the risk needle. The risk already existed. All it does is force Apple to admit that they hold a backdoor -- something they obviously are loath to do, as noted by the US Attorney when she was forced to submit an additional court filing responding to Apple's public, calculated attempt to frame the public debate before even responding to the court.
I disagree with the characterization that there's already a backdoor. Just because there's something of a black box involved in Apple's source code, compiling, and signing process with which a backdoor could be injected, is not proof of a backdoor.
However, I agree that their security model has a weakness, which is that it requires them to keep fighting against sovereigns, not just the U.S. government, for all time. That's a problem; I'm sure they're coming to terms with what it means, as are other companies and even users and governments. Historically Apple has been a closed-hardware company, and it's difficult to imagine they'll shed that anytime soon; if that's true, there will always be something of a black box involved.
But they could still alter the OS and firmware to require an unlock code for OS or firmware updates, and to erase all keys on the phone first if one can't be provided. Short of unknown backdoors, that obviates the current government request that Apple change the software. A law could conceivably prevent them from shipping such an OS or firmware update. So the next step is making the user passcode stronger and its hash algorithm much more computationally expensive. Even if there's a backdoor in the future, getting into the equipment within a reasonable time frame would probably just be too expensive for friend or foe alike.
But if you're stuck on open hardware being the end goal, I'd probably agree with that, even though I think Apple will go to great lengths to avoid that.
That's a good point. The unique device ID is baked into the hardware, but it doesn't look like it can be read directly, so it might not even be possible to put logic based on that ID into the firmware anyway.
Even if the original software was built to work on only one device, if it's ever leaked it provides a roadmap for targeted attacks, which may apply similar workarounds through means other than the update path.
It could be leaked but if properly designed it just wouldn't work on other devices.
If you mean that Apple could leak it, that isn't a real additional risk: there is already the risk that Apple's update-signing key leaks, and if that key leaks, anyone can write the modified software.
Apple should just write the modified software to work only on that specific iPhone (by serial number).
The software already exists. You just have to lightly modify the existing software to turn off the security features. The problem is that we don't have Apple's key.
The problem is, Apple has to fight this now because once they've done this once, they're in a losing position when the government wants it done 800 more times.
Right now, Apple can argue undue burden. Someone needs to sit down, nop out a bunch of security measures in an older branch of iOS, add boot and installation tests that lock it down to a particular serial number in a way that isn't vulnerable to easy spoofing, test it all, and finally sign it.
If they do all this now, the second time the FBI shows up at the door Apple can't then start arguing undue burden. Any government lawyer could win the argument that Apple already did all the heavy lifting, and that merely changing the serial number checked for could obviously not constitute an undue burden on the company.
Once they've started down this road it's just a slow frog boil of "obviously not undue burden" small changes to "here's a list of 500,000 potential terrorists whose data we may need to access. Push an OTA update to them that has bypassable security"
>The problem is, Apple has to fight this now because once they've done this once, they're in a losing position when the government wants it done 800 more times.
Since each of those 800 times will come after a court has issued a legal warrant, that is actually good.
This is the technical aspect that I think isn't getting nearly enough attention. While I agree with Cook that this will set a precedent where governments, both the US and others, begin making these requests more frequently, it's not actually creating a master key in this case. Apple would have to comply in each individual case to apply this software to another device, assuming they code in the restriction that it's only applicable to this device.
This still leaves them able to remove this capability in the future, by requiring the passcode to update the firmware on both the hardware itself and the secure enclave.
Someone replied and then deleted (which is fine). IIRC they said something like "a/this court order isn't creating a master key." (my words, summarizing from memory).
Which is technically true. But consider each event a black box into which you throw a targeted phone, and it comes out unlocked. Whether that happens through unique effort in each case, a general capability developed and archived by Apple, or even an actual backdoor/master key developed by Apple out of exasperation at the expense of being legally compelled in each individual case, the effect is the same: the phone is reliably opened.
We are too incompetent to use the front door. Can you make a back door for us? We for sure won't mess anything up when working with it; what makes you think we would?
Sure, it's the simplest, but let's not pretend the FBI (or some people within it) doesn't have a history of doing sketchy shit. If it were malice, it wouldn't be the first time, and I doubt it would be the last. You know humans and all, lol.
The simplest possible explanation for them shutting themselves out has to be incompetence rather than malice, right?
I don't know why so many people feel better about incompetence than they do about malice. Malicious parties can at least be expected to act in their own interests, even if in reprehensible fashion. Incompetent parties might do anything at all, e.g. permanently damaging our communications infrastructure for a chance at short-term political advantage.
You people seriously want me to believe that it was the FBI's incompetence that led to the government throwing away its only get-in-free card for the most popular American phone, used to coordinate the only substantial ISIS-affiliated attack on US soil. Come on now, do you guys seriously think this was unintentional?
You don't put rookies on this and I'd seriously be surprised if the NSA wasn't involved in this matter personally.
The government wants a back door installed into all iPhones, period. I mean, how do you expect Apple to build a tool that can bypass the very security features the government is trying to deal with right now without inadvertently letting everybody and their mother know that there is some fatal flaw in the security layer of every modern iPhone and/or iTunes?
There's no magic way to fine-tune a tool like this, and if our spy agencies don't know that then god help us all. ISIS is probably gonna win. *rolls eyes*
I mean, jailbreaking is one thing. This is vault-busting, and once people know there's a bug and where to look, they will find it and exploit it.
And Apple's only remedy will be to patch the backdoor. Which is obviously what the gov is trying to prevent Apple from being able to do, by getting a precedent established in the courts that wags a finger at Apple saying "ah, ah, ah, you didn't say the magic word".
Please goddamnit!
The gov doesn't want to be Samuel L. Jackson anymore. They want to reverse the roles, and this case is the perfect cover. Just like the gov exploited the attacks on 9/11 to pass the Patriot Act. This is no different.
What's with this stupid 'terrorist's communication device of choice' meme? Oh wait, it's not a meme because no one but the FBI uses it, it's a propaganda ploy. Let's test this:
U.S. dollar is the criminal and terrorist currency of choice. We must therefore, of course, break the dollar.
iMessages for one. They're encrypted and aren't stored on Apple's servers, but are easily accessible (no password on the app) once the phone has been unlocked.
Why would you ever do this with the real device without thoroughly testing the circumstances with a stand-in first?
From a technical perspective, it seems very simple and easy to replicate before actually doing it and locking yourself out completely like they seem to have done.
Yes. The analytical chemist's credo: never test an irreplaceable sample using an untested procedure. Or as Norm Abram from This Old House put it, "Measure twice, cut once." Spares one that embarrassing moment when you say, "I cut it off twice and it's still too short." :)
Shit, they didn't even have to do that. A simple phone call to Apple saying "Hey, if we reset this password, how much are we going to fuck everything up?" would have sufficed.
There are separate passcodes for the phone and for iCloud. An iOS device will automatically sync a backup to iCloud once a day if iCloud backups are enabled provided the phone is connected to power and a wifi network. When you first enable iCloud, the phone requests your iCloud password and stores it. If you change your iCloud password, automatic syncing is disabled until you unlock the phone and enter your new iCloud password there.
The FBI cannot unlock the phone. By changing the iCloud password, they have made it so that the phone cannot even possibly sync without first being unlocked. Knowing the iCloud password is of no utility for them.
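In pseudo-Python, the auto-backup gate described above looks something like this (field names are invented; the conditions are per the parent comment):

    from dataclasses import dataclass

    @dataclass
    class PhoneState:
        backup_enabled: bool
        on_power: bool
        on_wifi: bool
        hours_since_backup: float
        icloud_credential_valid: bool  # killed by a server-side password reset

    def should_auto_backup(s: PhoneState) -> bool:
        # Daily, on power and wifi, and only while the stored iCloud
        # credential still works -- which is exactly what the reset broke.
        return (s.backup_enabled and s.on_power and s.on_wifi
                and s.hours_since_backup >= 24
                and s.icloud_credential_valid)

    # After the password reset, with no way to unlock the phone and enter
    # the new password, the last condition is stuck at False:
    print(should_auto_backup(PhoneState(True, True, True, 36, False)))  # False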
I would like to see Apple implement a new firmware signing scheme that requires the user to sign the firmware using a key that is generated on the device, protected by the passcode, and never backed up. Once initialised, the device would only accept updates signed with this key, and upstream updates would be verified against the Apple key before being signed with the local key.
This would eliminate this vector without drastically affecting the usability of the device. It would also need a way to fully reset the device, including removal of this signing key, to bring the device back to factory settings in case the device-specific signing key is lost.
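A minimal sketch of that dual-signature idea, using Ed25519 from the Python `cryptography` package (all structure here is hypothetical; in a real design the local key would live in, and never leave, secure hardware, unlocked by the passcode rather than sitting in application code):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    apple_key = Ed25519PrivateKey.generate()   # stand-in for Apple's signing key
    APPLE_PUB = apple_key.public_key()

    # Generated on-device at setup, wrapped by the passcode, never backed up.
    device_key = Ed25519PrivateKey.generate()
    DEVICE_PUB = device_key.public_key()

    firmware = b"...update image..."
    apple_sig = apple_key.sign(firmware)

    def accept_update(image: bytes, vendor_sig: bytes) -> bytes:
        # User-side step: check Apple's signature, then countersign locally.
        # Producing this countersignature would require the passcode.
        APPLE_PUB.verify(vendor_sig, image)    # raises InvalidSignature if bad
        return device_key.sign(image)

    def boot_check(image: bytes, local_sig: bytes) -> None:
        # Boot-time rule: only run images countersigned by this device's key.
        DEVICE_PUB.verify(local_sig, image)

    local_sig = accept_update(firmware, apple_sig)
    boot_check(firmware, local_sig)            # accepted: dual-signed
    try:
        boot_check(firmware, apple_sig)        # Apple's signature alone fails
    except InvalidSignature:
        print("rejected: no local countersignature")

Under this scheme, a locked phone with no passcode entry has no way to countersign a new image, so even a vendor-signed "unlock firmware" would be refused at boot.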
If Apple did write the firmware that the FBI wants and then signed it, would changing the device UUID hard-coded into the firmware not invalidate the signature? Is the concern that there are somehow other signing keys in the chain of trust, outside of Apple, that would make it a general exploit, or is the concern that there would be a much lower threshold for getting this sort of thing? Apple might have a point if it is the latter, but if it is the former, the security of the iPhone is already compromised.
Why can't the FBI just work directly on the phone's hard drive (remove the hard drive from the phone and connect it to another computer)? Why are they going through the iOS operating system?
The FBI wasn't asking for the device to be decrypted. They were asking for a new operating system to be installed that circumvented the "10 wrong tries and the phone is erased" behavior.
The half-solution security chip plus the device's software form a pseudo trusted computing module.
It does seem that if the FBI desoldered the security chip and subpoenaed the KDF algorithm, they could recover the raw encryption key using their own hardware. But perhaps I'm missing a detail.
I don't think Apple is in the key escrow business anymore; on OS X they had such an option, to show the user the DEK and optionally store it. I don't think the DEK or KEK are backed up at all in iCloud. If you forget your password, all options I see involve device erasure.
> Apple encrypts your iCloud data in storage, but they encrypt it with their own key, not with your passcode key, which means that they are able to decrypt it to comply with government requests.
It wouldn't add security anyway. In order to encrypt data, they first need to read it unencrypted. Thus, they can't advertise iCloud as "we can't see your pictures", because they can, in theory.
You must mean something else here. Data certainly may be encrypted twice. In that case it would need to be decrypted twice in order to be useful. Perhaps you meant to say that data must be "unencrypted" in order to perform some other operation, e.g. generating thumbnails.