Hacker News
Tim Cook on iMessage Security: It’s Encrypted, and We Don’t Have a Key (techcrunch.com)
146 points by Tangokat on Sept 16, 2014 | 107 comments



There is zero control over which public keys get handed to your phone to encrypt an iMessage with. For all we know, whenever you want to send a message to $USER, your phone gets a public key for $USER's iPhone, her iPad, and the NSA master key.

Tim Cook can state that they can't decrypt the message all he wants, but as long as there's no control over what public keys we encrypt the message with, the statement that Apple or the NSA can't read the messages is a half-truth at best.

Don't use iMessage for anything you wouldn't use email for. Assume every message you send over iMessage is public.

>If the government laid a subpoena to get iMessages, we can’t provide it. It’s encrypted and we don’t have a key. And so it’s sort of — the door is closed.

But the government can force (and probably has forced) Apple to have the phones send a copy of every message to some government server, encrypted with the government's public key. They don't need to subpoena messages; they already have them all.
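The trust problem described above can be sketched as a toy model: the client encrypts one copy per key the directory returns, so a key injected server-side is invisible to the sender. This is not real crypto; the "enc(...)" strings are stand-ins for encryption, and the "fbiPad" device name is hypothetical.

```python
# Toy model (NOT real crypto) of directory-driven key fan-out.
# The client trusts whatever public keys the directory hands back.

def directory_lookup(user, directory):
    """Return every public key registered for a user. The client has
    no way to audit or pin this list."""
    return directory[user]

def send_message(plaintext, recipient, directory):
    """Produce one 'encrypted' copy per recipient key (stubbed out)."""
    copies = {}
    for device, pubkey in directory_lookup(recipient, directory).items():
        copies[device] = f"enc({pubkey}, {plaintext!r})"  # stand-in
    return copies

directory = {
    "bob": {"iphone": "pk_iphone", "ipad": "pk_ipad"},
}
# A compromised directory silently registers one more "device":
directory["bob"]["fbiPad"] = "pk_lawful_intercept"

copies = send_message("hi bob", "bob", directory)
print(sorted(copies))  # the sender never sees which keys were used
```

The point of the sketch is only that nothing on the client side distinguishes a legitimate device key from an injected one.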


He didn't state that Apple or the NSA can't read your messages. He said that Apple can't read your messages. And since you are so concerned about half-truths: if you're going to criticise someone's statements, it would be nice if you addressed what they actually said.

It's entirely possible that the NSA has hacked Apple, or that an Apple employee has been subverted by the NSA and inserted a back door into the encryption system. Tim Cook wouldn't know about that, can't give assurances of that kind, and isn't trying to. All he can do is state what Apple as a company intends and can do acting according to its policies.

So yes, it's entirely possible Apple or the NSA has back-door keys to iMessage. Tim Cook is now publicly on record saying that Apple doesn't. That's not a 'half truth at best'. It's either true or it's a lie. I'm not telling you to believe him or not, but historically these things have a way of coming to light eventually, one way or another.


The half-truth is here:

If the government laid a subpoena to get iMessages, we can’t provide it. It’s encrypted and we don’t have a key.

It's encrypted and they don't have the key, but since the user does not have any control over the public keys being added, they could add a trusted public key and get it anyway. So they can actually provide messages if they really wanted to.

I don't believe they really want to. But the thing we should've learnt from last year's revelations is (1) that companies can be forced to do so anyway via secret courts; and (2) the NSA is willing to make a 'technical solution' otherwise.

So, Tim Cook is not being completely honest here. Setting aside what the hardware can do, the only way to trust such an application is if you had the source code of the app, the operating system, and the firmware blobs, plus some way to prove that everything was compiled from that public source without modification. Since that is not going to happen, iMessage should be considered as secure as unencrypted e-mail when it comes to governments. Of course, it does provide more protection than e-mail against less well-equipped actors.


They can't create trusted public keys after the fact; they'd have to already have them. So there's no half-truth. Either they currently create and store such public keys or they don't. Tim Cook is saying they don't. That statement can't be half true. It's either true or false.


They can't create trusted public keys after the fact,

Of course they can. They own the directory server that hands out keys.

http://blog.cryptographyengineering.com/2013/06/can-apple-re...

tl;dr: Apple can send you a public key of Bob's new device. Apple can pretend to send you a public key of Bob's new device. And since it's proprietary software, they can trigger a resend of your recent messages to Bob.

Moreover, if you use iCloud backups, they have the keys to your kingdom, since it fails the 'mud puddle test':

http://arstechnica.com/security/2013/06/can-apple-read-your-...


So this ends up being as "simple" as answering the question: Do you trust Apple? Given they control the operating system and all around it, having the directory server controlled by someone else (or distributed) doesn't solve the problem as they have access to anything they want in your device, meaning they don't need any keys to begin with.


I wouldn't trust any company that blatantly dodges the question of security with a half truth. It's clear he's playing word games for PR points.

You can reset your password and redownload all of your messages to a new device if you use iCloud backup. Cook is full of shit when he says that Apple doesn't have the capability. They own the system.

Even a dedicated civilian could reset your password, associate a device with your account, and receive all messages going forward. To state that Apple cannot is such a laughable claim that it becomes clear this is just a PR game, which calls into question how sincere he is in his feelings about privacy.


For the record, you cannot redownload old messages in iMessage; if you use iCloud Backup, a civilian could fetch messages from there, but if not, they're out of luck.


Fair enough, I was mistaken. Security is hard, but no one should get a pass playing games like this when it comes to security.

A civilian could still associate another key (device) with the account if they're dedicated, using the password or a password reset (not as stealthy), assuming 2FA is disabled. And Apple could surely do it stealthily, since they own the system.

They have the capability, and calling his statement a half-truth is too kind in that respect, because it's really a lie when he says "[Apple doesn't] have the capability."


>And since it's proprietary software, they can trigger a resend of your recent messages to Bob.

Apple controls iOS, so they could just release an update that disables crypto and leaks all the messages. Your point is irrelevant, since they can't trigger a resend without an update to iOS.


And you know this ability isn't already latent in iOS how?


They can't create them for old messages, but they could theoretically add them for messages going forward. So it could work like an old-fashioned wiretap, where you get the subpoena, install a bug and start listening -- but can't go back in time and listen.

How? iMessage encrypts and sends a copy of every message to every device registered to a recipient (iPads, iPhones, Macs, etc.), each of which has distinct public keys. It's actually a pretty ingenious aspect of iMessage, that it can support multiple devices without any private key exchange.

But..... those recipient public keys are provided by Apple. So in theory Apple could add itself or law enforcement as another "device" -- let's call it "fbiPad" -- completely bypassing all that security.

However, the "good" news is that whether Apple is actually doing that in a given instance could theoretically be detected by analyzing network traffic, since it would result in an additional copy being sent. Even though the traffic is encrypted, you could probably tell from the amount of data.
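That traffic-analysis idea can be put as a back-of-the-envelope calculation. The size constants below are made up purely for illustration; the only assumption carried over from the thread is that total upload size grows roughly linearly with the number of recipient devices.

```python
# Rough traffic-analysis sketch: one encrypted copy per recipient
# device means upload size scales with device count. The overhead
# figure is a hypothetical illustrative constant, not a measurement.

PER_COPY_OVERHEAD = 1200  # hypothetical bytes of per-copy overhead

def estimated_device_count(total_bytes, plaintext_len):
    """Estimate how many encrypted copies an upload contained."""
    per_copy = plaintext_len + PER_COPY_OVERHEAD
    return round(total_bytes / per_copy)

# Bob is known to own 2 devices; an upload sized for 3 copies would
# hint at an extra, unseen recipient key.
observed = 3 * (100 + PER_COPY_OVERHEAD)
print(estimated_device_count(observed, 100))  # → 3
```

In practice padding and variable overhead would blur the signal, but a persistent off-by-one in the copy count would be the tell.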


"Could" is an understatement. They are likely legally required to have this capability under CALEA.


True. But they can replace any program on any iPhone with an NSA/Justice system version, and give the replacement access to any keys stored on the iPhone.

There can never be any security in a closed system where a central party has all the control. That means Apple's system is a lost cause for user security.


Apple can silently slipstream applications onto a user's iPhone? That would mean dozens of employees being involved in a conspiracy with the US government. And over all the years none of them has leaked anything? Sounds far-fetched.

Also, better not use a phone at all, because Android and Windows Phone would have the same problem.


"Hey, where'd this U2 album come from???"

Of course Apple can "silently slipstream applications onto a user's iPhone". Whether they'd risk getting caught doing it (even for non-all-powerful law enforcement agencies) without a court order, or for anything less indefensible than a kiddie-porn investigation, is less clear. (And whether the NSA would even need Apple's help to do it themselves is questionable. I suspect whoever owns the baseband has as much access as you'd ever need, so now you're relying on AT&T/Comcast/Verizon/T-Mobile to put protecting your privacy above keeping the corporation on-side with powerful government agencies...)


Apple can remotely delete apps on iPhones; they've used it to delete apps that were removed from the App Store. Apple can force your phone to download the latest U2 album. Silently upgrading your apps is probably not outside their capability if they wanted to.


My phone certainly did not automatically download Songs of Innocence. Those users whose phones DID automatically download it must have enabled the "automatically add purchased content to this phone" feature.

According to my research, Apple has NEVER used the "kill switch", unlike Google: "Google also possesses a remote 'kill switch' for Android apps, but unlike Apple, it has made use of the feature before. In 2010 the Android security team deleted two apps created by a security researcher after they 'misrepresented their purpose in order to encourage user downloads.' Its kill switch is referred to by the company as the 'Remote Application Removal Feature.'" [1]

[1] http://www.businessinsider.com/brazil-orders-apple-to-use-ip...


The point is they have the capability, and because of your government's secret court system the general public may very well never find out whether or not the capability has been taken advantage of.

I can guarantee you that the "automatically add purchased content to this phone" check box does absolutely nothing to protect your phone from downloading and integrating data from Apple silently if they should choose to target you. And you would likely never know if they choose to target you.


iPhones by default are set to automatically download purchased content from the iTunes Store. It's a feature so purchases made on one device automatically appear on the others. All Apple did was "buy" the U2 album for everyone. Nothing magical about it.

To bring that up in the context of iMessage security shows either ignorance or stupidity.

And remote deletion is quite a bit different from silently upgrading apps for the purpose of spying.


So when do we find out the NSA has made its own little "purchase" for everyone?


They can issue an "update" to the system that adds that functionality; they have the trusted keys to sign apps.


And the source code to the compiler they use. And the source code to those compilers, ad infinitum.

And the masks for every IC inside the products, and the schematics of all the circuitry.

And then someone to verify that they are all the correct ones.


And someone to verify that the verifier is correct, and someone to verify her...

Ad absurdum...


Even if a new public key is added to the keybag, the NSA wouldn't get old messages, only messages from that point forward.

Still troublesome, and I wish there were some way to see what keys are being used and cross-verify them with the remote user, so that you would be notified if a new key were added.

But there is no way for Apple to retrieve old messages and send them to the NSA.
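The cross-verification wished for here could work like SSH or OTR key fingerprints: a short digest of the public key that both parties compare over a trusted channel. Nothing like this exists in iMessage; the sketch below just shows the general idea, with hypothetical key bytes.

```python
import hashlib

def fingerprint(pubkey_bytes):
    """Short, human-comparable digest of a public key, in the spirit
    of SSH/OTR fingerprints (iMessage exposes nothing of the kind)."""
    h = hashlib.sha256(pubkey_bytes).hexdigest()
    # First 16 hex chars, grouped for easy reading aloud.
    return ":".join(h[i:i + 4] for i in range(0, 16, 4))

# Both parties compute the fingerprint of what they each believe is
# Bob's iPhone key and compare, e.g. over a phone call:
alice_view = fingerprint(b"bob-iphone-public-key")
bob_view = fingerprint(b"bob-iphone-public-key")
print(alice_view == bob_view)
```

An injected key would produce a mismatched fingerprint, which is exactly why a transparent key directory (or at least visible fingerprints) would constrain the attack discussed in this thread.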


Euhm, no, it would have access to any message you can see from the phone, old or new. Plus, I don't think there's a per-message key at all. So they could have logged the messages, then installed a patched iMessage on your phone to send them the key.

That is assuming Apple's claim is true at all, and they need this in the first place.


>He said that Apple can't read your messages.

which is also a half-truth. They can't read messages encrypted with another phone's public key, but they can certainly read messages encrypted with their own public key, which they may or may not send to your phone in addition to the actual recipient's key.

Neither he nor I am saying that Apple does in fact read your messages (they probably don't), but saying that they can't read your messages is not correct. They certainly can, by sending your phone an additional public key.


Not really. He's pretty clear:

"If the government laid a subpoena to get iMessages, we can’t provide it. It’s encrypted and we don’t have a key."

There's no wiggle room there. He's not saying they don't have the user's key; he's saying categorically that they can't provide iMessage information. I don't understand why you think that can be read as "they can't get the information through mechanism X, but they can through mechanism Y".

That's not to say the NSA can't read them - possibly with Apple's help, possibly without - but he's really not in a position to talk about what the NSA can and can't do.

From a corporate PR perspective, he really doesn't have to say anything about this. He doesn't do many interviews; he could have declined this one, or declined to talk about certain topics as a condition of doing the interview. If Apple can indeed read your messages, the easiest thing would be to shut up and say nothing, so it would seem odd to go public with a lie when he has that option.


What is missing is that they can add another key for intercept. They really need to be able to do so to comply with the law.

It's well established that the police can intercept communications with a warrant. Many HN posters have an issue with that too, but that is just a hard truth with decades of legal precedent.

The issue, IMO, is the warrantless collection part. iMessage probably protects you against that, although three-letter agencies may record the messages and use other means to decrypt them later.


They could add that, but as things stand right now that second key doesn't exist; or at least, if it did, then Cook would be outright lying here. Obviously anything can change in the future, but based on what Cook is saying, either he is lying or Apple can't read your iMessages. I see no wiggle room that creates a middle ground or half-truth.

In terms of what the police and other agencies can request, they have been able to ask for stuff in the past because it was information you had. If someone produces a court order you have to comply with it, but there is no way of making you produce information you don't have.

As a result some companies are now setting stuff up so they don't have the sort of information which might be requested.

I may be wrong about this but I'm not aware of any law that says that a company can't do this. Certainly if there is then there are plenty of businesses aimed squarely at this sort of privacy and security which are going to have major issues given that it's basically their USP.

And if a company has to do it then the same law will apply to any individual producing open source software to do the same - any such law would be highly unlikely to be dependent on the product being created and distributed by an incorporated company.

Where there will be a problem (and this is the point Cook was making) is for companies where gathering that data is fundamental to their business. Apple can happily survive without being able to read or track iMessages. Facebook or Google have fewer options when it comes to not recording or storing stuff.


Or Cook could just reap some PR-points with a convincingly stated flat out lie. It's not like anyone will remember it in a couple of weeks anyway.


I disagree. If things go south (meaning they do what they claimed they couldn't, and then get caught) people will surely remember it, and in that case it would be negative PR-points, doubled. That doesn't mean he's not lying, it just makes it less likely.


What about iCloud backups?


>> "It's entirely possible that the NSA has hacked Apple, or that an Apple employee has been subverted by the NSA and inserted a back door into the encryption system."

"Entirely possible" is the understatement of the year. I think you mean "absolutely guaranteed." This is the raison d'être of signals-intelligence agencies. The alphabet agencies would be utterly failing at what they see as their job if they didn't have many, many plants inside Apple/Google/AT&T/large communication organizations.

And Tim Cook isn't stupid; he knows this. His claims might be technically true in a strictly literal sense, in that he doesn't have the key, but the claims are certainly misleading to the nontechnical public. He knows full well that iMessage is uber-compromised, probably in several different ways (legal wiretaps, technical intrusions, HumInt, etc).


"Let's not worry about it, it's probably all ok, you can't say anything unless you have 100% cast-iron proof"?

That's an excellent recipe for sleepwalking into this mess.


- "It's either true or it's a lie"

Or is it both at the same time?!


It's even worse than that. As of about a year ago, iMessage didn't do any certificate pinning: http://blog.quarkslab.com/imessage-privacy.html

Unless that's been fixed (I haven't come across any evidence one way or the other), you're not just worrying about Apple and the NSA: Your iMessages are vulnerable to anyone who can forge a certificate and MITM your connection to Apple's servers. I'd say that's a reasonably high bar except for one thing: I believe it covers the vast majority of corporate iPhones used on company-internal networks.


Certificate pinning usually (ex: Chrome) is implemented with an exception to allow a company administrator to install a new root CA cert on the device and MITM connections. Does iMessage not allow this?


This Chrome behavior surprised me, so I checked the source. Indeed, you are correct: pin checking is short-circuited whenever the chain does not terminate at a known public root, i.e. when the cert is signed by a locally installed root.

  if (!is_issued_by_known_root || !HasPublicKeyPins(host, sni_available)) {
    return true;
  }

https://code.google.com/p/chromium/codesearch#chromium/src/n...
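The decision logic quoted above can be restated as a simplified sketch (the parameter names mirror the Chromium source; the function itself is a paraphrase, not Chromium code): pins are only enforced when the chain ends at a publicly known root, so a locally installed enterprise root CA bypasses pinning entirely.

```python
# Simplified paraphrase of the quoted Chromium pin check: returning
# True means "pass" (the connection is allowed).

def pin_check_passes(is_issued_by_known_root, has_pins, chain_matches_pins):
    # Short-circuit: pinning is not enforced for locally installed
    # roots, or for hosts that have no pins configured at all.
    if not is_issued_by_known_root or not has_pins:
        return True
    return chain_matches_pins

# A corporate MITM proxy using a locally installed root sails through,
# even though its chain matches none of the site's pins:
print(pin_check_passes(is_issued_by_known_root=False,
                       has_pins=True, chain_matches_pins=False))  # → True
```

This is a deliberate design choice (it keeps enterprise TLS interception and debugging proxies working), but it is exactly why company-managed devices are the weak spot the parent comment describes.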


Well I'll be damned. Those sneaky bastards. Thanks for pointing this out.


The fact that Apple could in the future modify their system to permit them to read iMessages (e.g. by interposing themselves between sender and receiver using fake public keys) is not really a fair basis for alleging that Tim Cook or anyone else is telling "half-truths" in respect of what Apple is currently doing. He has said they are not reading them and that, at present, they cannot read them. I'm not aware of any basis for impugning his credibility in that respect.

But it remains the case that anyone seriously concerned about security should continue to guard against that possibility in the future.


"as long as there's no control over what public keys we encrypt the message with, the statement that Apple or the NSA can't read the messages is a half-truth at best."

This.

That said, it is reasonable to believe the way iMessage is architected would likely make mass surveillance harder in general, just like widespread use of SSL does, so it is not entirely useless either.

The bigger problem, IMO, is that they then upload the message database unencrypted (from the server's standpoint) to iCloud in the backup process, which is admittedly optional but effectively on for most users. That, of course, is easily readable, as the celebrity hack shows.


make mass surveillance harder in general

More specifically, it makes it so that Apple is not forced to conduct mass surveillance by giving up everything when they receive a legal wiretap order, in the vein of Lavabit.

If you see someone else running a message system that has no way for the cops to read it, that should be a sign that it's insecure -- not technologically, but architecturally.


I'm unclear as to what you mean by that last sentence. If someone is running a message system that is distributed, and the keys to encrypt and decrypt are stored locally rather than on the server, then why wouldn't it be secure? The message system may be something as simple as an addressing system.

Ex: email, which is run by any number of providers; if an email client is configured to use PGP and access is via POP/IMAP rather than webmail, it's still secure as far as we know. A message system that isn't email but likewise doesn't store keys on the server still provides no way for the cops to read it, except perhaps to see that some message was sent, not what the message was.


> If you see someone else running a message system that has no way for the cops to read it, that should be a sign that it's insecure -- not technologically, but architecturally.

What's that supposed to mean? What about OTR or TextSecure or PGP over email?


someone else running


Your presumption is that iMessage encryption is useless if the NSA can still read the messages. Speaking for myself, I don't care if the NSA can read my messages. My biggest concern is keeping my private data protected from for-profit corporations like Apple and Google.


Crypto is useless in the sense that the trust model is broken. The point of crypto is to trust math, not people. Apple can read iMessages too, but it's a safe bet that they don't.


I don't really disagree with the reasoning; I just disagree with how Apple alone is the recipient of such scrutiny. Same deal with the payments stuff. And the reason basically boils down to: they have fancy ads and their products look really nice, so they must be lying to us.


> Assume every message you send over iMessage to be public.

You should assume any message that you send over the internet to be public unless you're a crypto expert.

And if you want to go one step further then you should probably consider anything you type into a computer to be public.


Even if you could set your own keys, there's no guarantee their infrastructure doesn't leak them to NSA. And considering Snowden's revelations, it's more likely than not.


Even if you could provide a key to iOS, would you trust that it's actually using it as you'd expect it to?


"Our business is not based on having information about you. You’re not our product. Our products are these, and this watch, and Macs and so forth. And so we run a very different company. I think everyone has to ask, how do companies make their money? Follow the money. And if they’re making money mainly by collecting gobs of personal data, I think you have a right to be worried. And you should really understand what’s happening to that data. And companies I think should be very transparent about it."

Honestly, I am starting to see the wisdom behind consumers choosing companies with these kinds of business models. It's not that I don't trust the companies -- I guess it's that I accept that governments and laws transcend companies and their explicit arrangements with their consumers.


For as much as you may criticize Apple for many things, you can't really say they are not sincere when they say something like this. A company has to make profits, and profits come from a well-executed business plan. Apple's business plan is to sell hardware, and to create software and platforms that drive hardware sales. Collecting data for reasons unrelated to the functioning of a service makes no sense for their bottom line. Google makes its profit selling ads crafted from people's data, and every new user entering the Internet enters its realm; that is good for its bottom line. It may be an oversimplification, but I guess you can argue a lot and still come back to this.

Anything that has to do with NSA mass surveillance, though, shouldn't be transposed onto this discourse. Neither Tim Cook, nor Larry Page, nor any other CEO, individually and in their function, can possibly know everything about their company. Fifth columns, NSA rats, unconstitutional secret requests for information... They are American citizens running American organizations; they have to comply with the rules or they may compromise billions of shareholder dollars. But are they lying to us right now? I think they are not, because it would hurt their bottom lines more than ever. Is Google in a weaker position than Apple because of the very nature of its business model based on data collection? Absolutely yes.

The problem is ingrained in the system, though, and the only solution is to extirpate the cancer called the NSA. I doubt that will happen anytime soon. We all love our gadgets and services too much, and we don't really have time to wrap our brains around such big nerd problems.


Applications and software? Songs? Television? Movies? Health?

It seems a bit disingenuous to suggest that they aren't making any money from your habits and data.


No, it doesn’t, not really. Look at where their money is coming from.

You mentioned health. Apple has no plans at all to make money with that. It’s just an API that makes devices more useful for users.

Think critically about this, but please do it in an honest way.


Rubbish[1].

The idea that Apple is in the market to sell these things, and specifically, to sell more, isn't a leap of neurological fortitude.

To suggest that Apple isn't analysing your purchasing to augment sales in some capacity is akin to clinging to the fallacy that they avoid market research[2].

There is no myth of constancy here; Apple is not strictly a hardware vendor any longer. However it is spun, it would seem entirely plausible that they leverage past purchases and other such metrics of data against future purchases.

I apologize for not abiding by your definition of thinking “critically” in an “honest” way.

[1] http://9to5mac.com/2014/08/21/apple-talked-healthkit-with-in...

[2] Apple vs. Samsung debunked this myth entirely.


> You mentioned health. Apple has no plans at all to make money with that. It’s just an API that makes devices more useful for users.

So there's no 'Made For Iphone' program?


That seems quite unlikely, given how the API is structured (and how that market currently operates; please inform yourself before making claims).

Even if there were such a program for health hardware, I'm not sure how it follows that Apple would want to sell user data … that's just completely illogical.


"Our business is not based on having information about you."

That would be awesome if it were true, but Apple knows a lot about all of us, including our credit card numbers (iTunes/Apple Pay), our music, movie, and media purchases, contacts, files, photos ... the list goes on.

Cook can draw the line where he thinks it's appropriate, but for others that line falls somewhere much, much different.

Apple is simply on a spectrum of companies that store personal information about you. They aren't the worst, but they certainly aren't the best either.


Of course, there's nothing stopping Apple from selling both the product and the user...


But that reduces the value of the product. I also don't think it is in Apple's DNA to do that.

My concern is more that, given how bad they generally are at providing robust cloud applications [1], their security is equally bad.

[1] iTunes Match would always refuse to play some songs from my collection. Even up until I stopped using an iPhone last December, iMessages would often arrive in non-chronological order and sometimes not at all, etc.


They are publicly traded, so they publish their financials. Something would show up there. Even if they cook their books, something would show up in somebody else's books.

If you are paranoid, you may think of the options "Sell the product and give away the user" and "Everyone is in on this". I think those are far-fetched.


These statements again... Time to get some downvotes, I guess.

How do I know? How can I even know that YOUR own device for which YOU wrote software and YOU designed hardware (although it may be based on some standards, no one can guarantee it's unmodified) won't share my private information with YOU?

How can I know that you're not sending my private key encrypted with your server's public key (one simple example of many) to your side?

Am I supposed to take your word on it? No thanks.

As much as I'm against Apple and their policies, these statements make no sense coming from anyone (Google, Apple, whoever). Unless you design your own phone from scratch, you cannot be sure it is secure (and even then, every medium your phone uses is still insecure). It is simply not worth it. Nothing in the digital world is secure (it's only a matter of the attacker's determination and the resources available to them), and there is no point in saying otherwise.


I tend to agree; it's much easier to just assume all online communications are monitored than to try to escape the gaze of the five eyes (of Sauron).

Maybe this attitude means they have already won?


For some reason I think only diversity may help.

Thousands of different systems, built differently, are harder to milk for useful information if the user is smart. Centralization is the enemy of privacy (centralized = compromised).

Unless they have an AI that can analyze such systems' patterns at the speed of light and present solutions (which is not as unbelievable as self-conscious AI), but that's still more fiction than fact.

Really, the problem is not in tech, it's in people. We're not fixing this one anytime soon.


[deleted]


It's just.. this statement barely has any meaning to it then. Carefully crafted sentences, yes, but what difference does it make to the outcome, to the end user?

We most certainly know that iMessage (like any other service) is not secure, and now we are also left pondering whether he is telling the truth about something that doesn't even matter in the end :)

Maybe I'm being too critical or maybe I'm missing something.


Apple is a PRISM participant, as of Oct 2012 (one year after Steve Jobs died). I highly doubt any iMessage is more secure than a plain text SMS sent via any cell carrier's network. Apple probably offers a sexy interface for the Feds to read everything.

Reference: http://tctechcrunch2011.files.wordpress.com/2013/06/prism-sl...


Do any down-voters care to offer a comment? Is it that one contests that Apple is a PRISM participant?


I think you're being downvoted because you got a little too excited while bashing Apple. iMessage is most certainly more secure than plain text in most cases, save for the VERY exceptional case where the NSA might be watching IF they do have a key on your phone. Plus, anyone who's used Apple's partner portals knows they're just as capable of shitty design as everyone else.


"In most cases" isn't enough anymore these days. People are targeted by metadata alone, and killed too.


Thanks for the insight. Rather than excitement, it's actually disappointment: I hold Apple to a bit higher standard than other companies when it comes to privacy, so seeing them cave in to the PRISM scheme is kind of like being told that Santa doesn't exist.


I didn't downvote you, but the leaked PRISM docs don't show "participation". They show that the NSA was getting the data. If they were doing it by, e.g., inter-datacenter fiber tapping, then Apple wouldn't have "participated" or even known about it.


It's Encrypted, and We Don't Have a Key. But it's not open source so you don't get to check. ... and we control the key distribution. We also do control your device, so technically, we don't need the key.

Fixed that for him.


And we put music on your device without asking you.


Is it actually possible to not have the ability to decrypt messages remotely?

At first I thought that if only an iPhone held the encryption keys, and these were not on Apple's servers, this statement could be true ...

However, considering that iMessage can be set up on both a Mac and an iPhone via your Apple ID, it's more likely that this statement is just hyperbole for Apple's approach to privacy.


"When a user turns on iMessage, the device generates two pairs of keys for use with the service: an RSA 1280-bit key for encryption and an ECDSA 256-bit key for signing. For each key pair, the private keys are saved in the device’s keychain and the public keys are sent to Apple’s directory service"

"The user’s outgoing message is individually encrypted using AES-128 in CTR mode for each of the recipient’s devices, signed using the sender’s private key, and then dis- patched to the APNs for delivery."

Source: http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...

When you send an iMessage, you actually send a separate encrypted and signed copy for each recipient device. So, it is possible, but these are the lengths you have to go to.
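The per-device fan-out described in the quoted document can be sketched with a toy model. Everything here is a placeholder (a plain dict stands in for Apple's directory service, string reversal stands in for AES-CTR, and `toy_wrap`/`toy_sign` stand in for RSA key wrapping and ECDSA signing); it only illustrates the structure, not real cryptography:

```python
import os

# Toy stand-ins for the real primitives (RSA-1280 key wrapping,
# ECDSA-256 signing, AES-128-CTR). Not real cryptography.
def toy_wrap(public_key, session_key):
    return ("wrapped-for:" + public_key, session_key)

def toy_sign(private_key, payload):
    return {"signed_by": private_key, "payload": payload}

# Apple's directory service: maps an account to the public keys
# of every device registered to it.
directory = {"alice": ["alice-iphone-pub", "alice-ipad-pub"]}

def send_imessage(sender_priv, recipient, plaintext):
    """One encrypted, signed copy per recipient device."""
    envelopes = []
    for device_pub in directory[recipient]:
        session_key = os.urandom(16)      # fresh AES-128 key per copy
        ciphertext = plaintext[::-1]      # placeholder for AES-CTR encryption
        wrapped = toy_wrap(device_pub, session_key)
        envelopes.append(toy_sign(sender_priv, (wrapped, ciphertext)))
    return envelopes                      # handed to APNs for delivery

envelopes = send_imessage("bob-priv", "alice", "hi alice")
print(len(envelopes))  # 2: one copy for the iPhone, one for the iPad
```

The key point is that the number of copies is driven entirely by the directory's answer, which the sender never sees.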


But they could just inject a "fake" recipient device with their own public/private key and decrypt messages as they transit the system. They might not be able to decrypt messages you've sent in the past, but I can see no reason why they couldn't read messages as you send them if they wanted to (or were required by a wiretapping agency, for example).

I also recall a researcher showing a while ago that if you forgot your iCloud password, there was a way to get Apple to reset it and give you access to all your previously stored data. If they had no way to decrypt your data remotely, that should be impossible.


On the first point, you're correct. That's also why you get those extremely annoying modal dialogs each time a device/key pair is added to your iMessage account, because a device added without your knowledge could be used to eavesdrop on you.


You only get that dialog for the devices you add. The public key added by the NSA or Apple themselves does not trigger the dialog.

(Explanatory note, since the sarcasm in the comment might not have been obvious: I do not know whether such a facility exists in the service or not, so this might or might not be true.

But the fact that this "feature" could already exist, or could easily be added in the future - potentially even without a client update - leads me to my current opinion: iMessage is not secure, and all traffic is open to Apple, rogue employees at Apple, and whatever government Apple is cooperating with.)


Exactly, you can reset your password and re-download all of your old messages.


It is not impossible, so long as there is a linked device that has access to the data. Was there a precise accounting of the state of the researcher's iCloud account before the reset, and of everything done during it?


Apple previously said that in order to read your iMessages they would have to re-architect the entire system. Technically, they could in fact do that, so if they really wanted to, they could.

But the system is not currently designed to allow message interception. The details about how it operates can be found in the iOS Security PDF from February:

http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...


>But the system is not currently designed to allow message interception. The details about how it operates can be found in the iOS Security PDF from February

The system is currently designed to allow very easy message interception by just sending both Alice and Bob a fake public key of their respective communications partner.

You have zero control over what public keys your phone encrypts data with.
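The attack this describes is easy to sketch: because the client encrypts to whatever keys the directory returns, appending one extra "device" key silently gives its owner a copy of every message. This is the same toy model as before (placeholder crypto, hypothetical names), just with a compromised directory:

```python
# Toy model of the key-substitution attack: the client encrypts to every
# key the directory returns, and the user cannot audit that list.
directory = {"alice": ["alice-iphone-pub"]}

def send_imessage(recipient, plaintext):
    # One (toy) encrypted copy per key the directory hands back.
    return [{"to_key": k, "blob": plaintext[::-1]} for k in directory[recipient]]

# A compelled or compromised directory quietly appends an extra "device".
directory["alice"].append("eavesdropper-pub")

envelopes = send_imessage("alice", "meet at noon")
copies_for_eve = [e for e in envelopes if e["to_key"] == "eavesdropper-pub"]
print(len(copies_for_eve))  # 1: the eavesdropper now gets every new message
```

Note that "we don't have a key" remains literally true in this scenario - the eavesdropper has one, Apple doesn't.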


That isn't really definitive, because the NSA has required companies to change their software to capture keys before. There was an encrypted email company that was told it had to start recording the private keys used in its web client.


> we finally got an agreement from the administration to release how many times we had national security orders on Apple. And in a six month period, and we had to release a range, because they won’t let us say the exact number, it’s between zero and 250. That’s the lowest number you can quote. Zero to 250.

So does anyone else think this number might be suspicious for being so low? If the NSA had access to everything, wouldn't you expect the official requests for data to be low - not because Apple doesn't store data or because it's all encrypted anyway, as Cook implies, but because they wouldn't need to ask?


This statement is not enough... there could still be a third party (e.g. law enforcement) with a key or for more plausible deniability, a portion of a key (n bits out of N), and the statement would still be true. If they can do so truthfully, they should also state that nobody else other than the user has a key or any portion of a key, and that there are no keys or portions of keys in escrow where anybody else can conceivably get at them.
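The "portion of a key" idea is essentially two-party secret splitting: XOR the key with a random share, hand each party one half, and neither half alone reveals anything about the key. A minimal sketch (illustrative only, not any scheme Apple is known to use):

```python
import os

def split_key(key: bytes):
    """Split a key into two XOR shares; both are required to rebuild it."""
    share_a = os.urandom(len(key))
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = os.urandom(16)
a, b = split_key(key)
assert combine(a, b) == key  # both shares together recover the key
# Either share alone is statistically independent of the key, so a vendor
# holding only one share can truthfully say "we don't have the key".
```

Which is exactly why "we don't have a key" needs the stronger follow-up the parent suggests: nobody else holds a key or any portion of one.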


Honestly, I really don't care if they have the key or not. My concern is if they can read my messages. I am more worried about backdoors added to iMessage by Apple. This is known to happen often in Apple products, maybe for the purpose of Development or Support, but it is still there. That is exactly what I want to know. Because if Apple can use that backdoor, a patient hacker can too.


> This is known to happen often in Apple products, maybe for the purpose of Development or Support, but it is still there.

Could you elaborate on this? Are you talking about the development tools that require your phone to be cabled to a laptop?


This is just one example: http://www.networkworld.com/article/2456967/security0/apple-...

But if you google it, you'll find more.


What known backdoors?


Apparently Skype uses encryption, according to:

https://support.skype.com/en/faq/FA31/does-skype-use-encrypt...


He should instead just have said "Have you tried iMessage?? It's a disaster, we couldn't pull messages out of there even if we wanted to. Be thankful anything shows up on the screen at all!"


It's nice to see that he thinks user privacy is something valuable, and that this is an issue worth talking about. However, I think he's really only addressing one of the three ways that user data can be compromised when it's held by a company:

1. Selling or using it internally (e.g. Google's ad targeting)

2. Stolen by criminals (e.g. Apple's recent snafu)

3. Requisitioned by the government (e.g. Yahoo's daily fines for refusing to join PRISM)

Data that isn't collected in the first place can't be lost, but that isn't always possible (and it often defeats the entire purpose of the service).

I think he's mostly talking about the first one, that Apple hasn't built their business around harvesting user data to feed advertising or other systems. I think that's laudable. I would rather a company focus their energies on one product, rather than selling a byproduct of their real money maker. I also don't really like being the fuel for a free service -- I'd rather pay for what I use and have everything above board.

Security is where I think Apple has its biggest problem with user privacy. Relative to companies like Google, they just aren't very good at running Internet services. Despite operating several huge services (iTunes, a CDN that handles iOS updates, iCloud, etc.), they aren't an Internet company at heart. Google and others are leaps and bounds above them in this regard.

They can improve this with hiring and changing the culture in those groups, and I think they started this process a year or two ago. It will take some time, and they're never going to be the world's best at this stuff.

Tim Cook touches on the third point a little, and some people think he is being disingenuous in his description of how iMessage works. I think that it doesn't really matter. It's great that Apple has designed a system that places an emphasis on keeping user conversations private, but there is literally nothing that they can do about government interference if they want to continue operating as a legal entity.

I hope that they do everything in their power to curb government overreach. Ultimately the government will get what they want. Simply building a system designed to keep you from accessing the data that flows through it is not sufficient: you will be forced to subvert that system or face serious consequences.

I strongly believe that the only way people will ever have privacy from the government again is to decentralize the Internet and the services that run on it. I doubt that a political solution is possible now that Pandora's box has been opened, and mass interception seems likely to only become more widespread as companies have to answer to more governments around the world and the technology to broadly intercept traffic becomes more commonplace.


Erm... doesn't every mention of "encryption" come with the issue of "key management"?

And possibly a subset of key management: key exchange?


All these claims are worthless when you are not even an administrator on the device you use... (they have access to your messages before they get encrypted...)


Wasn't there a leaked FBI PowerPoint that very explicitly stated that they can't intercept iMessages? (and it was a point of frustration)


Jobs is rolling over in his grave; Cook is a shill. Jobs kept the Five Eyes out of Apple's shit until he croaked.


But the NSA do have the keys :p just kidding


The tech heroes of our age are becoming a kind of politician; this is very sad.


Does that mean it's just a ROT13 cipher? I mean, it's technically encrypted, but it also doesn't need a key...
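For what it's worth, ROT13 really is keyless - applying it twice gives back the plaintext, which is exactly why it isn't encryption in any meaningful sense:

```python
import codecs

msg = "Attack at dawn"
scrambled = codecs.encode(msg, "rot13")
print(scrambled)  # Nggnpx ng qnja
# Applying it again undoes it - no key involved.
assert codecs.encode(scrambled, "rot13") == msg
```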


Let's take this to the extreme: if you were a threat to the NSA, would you use an iPhone to send a message just because Tim Cook said it's encrypted?

YES - NO - I DON'T KNOW

ps. Let's wait for Apple to do a better job of keeping private the naked pictures people take for fun with an iPhone, and then we'll talk about iMessage encryption.


"we don't have a key, at this moment". Come on, Apple is still able to get your messages. Next time when you set up your iPhone, and if the messages are downloaded to your new device, you know they have access as well.


3 words: National Security Letter.

Take what Timmy says with a grain of salt until they show you the source code. Oh wait...

Apple fanboys: bring on the downvotes, but enjoy being surveilled.


I didn't downvote you, but I imagine it has less to do with what you said and more with how you said it. NSLs as a "boogeyman" are a poor substitute for actual evidence. So far, we know that these things are sent to US companies that are in the business of collecting and bartering data. While Apple has a corner of that market, it isn't its entirety.

So here they have snippets of our information, e.g. email, credit card, and home address for app purchases. However, it's entirely possible to use the phone normally without downloading a single app. That leaves the SIM provider holding the bag for your information, ready to be collected by the authorities. Effectively, Apple washes its hands of your data on phone calls, but there is no reason to presume lies regarding backdoors in iMessage.

Also, I'm not an "Apple fanboy". I'm just unconvinced that there is widespread eavesdropping at Apple, or that they would risk destroying such a massive advantage over Google's Hangouts platform.


In a post-Snowden world, when it comes to leaking your data, companies are guilty until proven innocent. That is the only sensible stance to take given what we have learned in the last year.


This.

The Director of National Intelligence, James Clapper, told Congress that the NSA doesn't collect intelligence on American citizens, and later got caught out. He committed perjury without any ramifications.

Why do people think that CEOs are telling the truth when they are all in the same boat paddling up shit river?




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: