Hacker News

The U.S. also attempted to force Apple to add a back door just a decade ago.

> Tim Cook, the C.E.O. of Apple, which has been ordered to help the F.B.I. get into the cell phone of the San Bernardino shooters, wrote in an angry open letter this week that "the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create." The second part of that formulation has rightly received a great deal of attention: Should a back door be built into devices that are used for encrypted communications?

https://www.newyorker.com/news/amy-davidson/a-dangerous-all-...



The US succeeded, according to American lawmakers: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

  Apple has since confirmed in a statement provided to Ars that the US federal government "prohibited" the company "from sharing any information," but now that Wyden has outed the feds, Apple has updated its transparency reporting and will "detail these kinds of requests" in a separate section on push notifications in its next report.
Apple's hidden at least one warrantless backdoor in their systems for the purpose of federal surveillance. I have no reason to believe the exploitation stops there.


Apple and Google had no choice but to comply with the National Security Letters demanding access to users' push notification data.

They also can't refuse to comply with warrants demanding any such unencrypted data that is stored on their servers.

That's not the same thing as adding a back door to allow access to encrypted user data that is stored on the user's device.

It's also different than storing encrypted user data on your server, when you have purposefully designed a system where you don't have access to the user's encryption key.

Encrypted user data backup is the feature that Apple disabled access to in the UK rather than comply with the order to insert a back door in the OS.
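The "we don't have the key" design described above can be sketched roughly like this: key derivation happens on the device, and only ciphertext plus a random salt ever reach the server. (This is a toy illustration - the XOR stream cipher here stands in for a real AEAD like AES-GCM, and a real system would use hardware-backed keys.)

```python
import hashlib, secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Derived on-device; the passphrase never leaves the user's hardware.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream(key: bytes, n: int) -> bytes:
    # Toy SHA-256 counter-mode keystream, for illustration only.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_backup(passphrase: str, plaintext: bytes) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(16)
    key = derive_key(passphrase, salt)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))
    return salt, ct  # the server only ever stores these two values

def decrypt_backup(passphrase: str, salt: bytes, ct: bytes) -> bytes:
    key = derive_key(passphrase, salt)
    return bytes(c ^ k for c, k in zip(ct, keystream(key, len(ct))))
```

A warrant served on a provider built this way yields only the salt and ciphertext - without the passphrase, which never left the device, there's nothing useful to hand over.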


To clarify: When you get an NSL, not only is it impossible to refuse and stay in business, it is also impossible to talk about it. That's the scary bit.


Certainly. At least with a normal warrant you can publicly speak out and notify the user(s) involved.

I would also point out that it was Senator Wyden who initially informed the public of how much the government was already spying on their unencrypted communications.

His record on civil liberties is excellent.


You'd better hope you're right. Nobody who can hold Apple accountable is auditing them. The lack of transparency is how we ended up on this slippery slope in the first place.

Good security models typically don't hinge on being lucky.


Nobody is auditing Google to prove that they aren't selling user data to third party data brokers.

Should we disbelieve them when they say they don't do so?


You need to think about what they don’t say with these matters.

He said Apple does not have and won’t create a backdoor. That was well crafted and means exactly what he said, any implicit meaning is an artifact of your brain.


I might postulate that while Rhubarb LTD absolutely doesn't hold and will never create a backdoor, Celery Inc does. Ignore the fact that Celery is staffed by some of Rhubarb's senior engineers working part time. Ignore the fact Celery are contracted to do security assessments so have access to all the source code, radio firmware and schematics...

I absolutely don't actually know anything about Apple, but I've seen some of the ways even small companies legally split themselves up to avoid tax or various forms of liability. Multiple phone numbers to the same phone, multiple domains and email providers to the same laptop. Multiple denials that you've ever heard of the other company let alone happen to share the same office space...

There's a massive difference between a truthful statement and an honest one; anyone that works with code should understand that.


[flagged]


I don't think anyone's surprised by that. Our emails have literally been used to target ads at us since like 2006. Cell phone carriers are happy to mine voicemail, call logs, SMS, etc. in the hopes of finding a revenue stream that doesn't involve them having to do irritating work like running fiber to cell phone towers.

This leaves contact mining as the odd one out, but given how many apps want to see your contacts, you know that those are being sold by at least one of those apps.

None of this stuff has ever been end-to-end encrypted, so there can't be any way people expect it to be private.


That's not a revenue stream at any cell phone carrier I've seen. They do what they are legally obliged to do, and while they do get paid for it, it's a fraction of the actual cost of providing the data. The state tends to drive a hell of a bargain. The service providers, such as Facebook, Google and Apple, though - that's entirely different.


Almost all telcos in the US sell data to brokers.

Like for example, when they got caught selling location data they were required to protect. [0]

[0] https://www.theregister.com/2024/04/29/fcc_telecom_fines/


Verizon silently opted everyone into sharing their browser history (based on DNS I guess) with advertisers: https://arstechnica.com/information-technology/2021/12/veriz...

I have to imagine that the other companies are doing this as well.


Warmer, think of the other vector, from the data broker point of view…


No-one is surprised by the existence of the security arms race.

It's the reason that Apple and Google recently started rebooting devices that haven't been unlocked in a while.
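The logic is simple: past some inactivity threshold, reboot so the device drops back to the Before First Unlock state, where the keys protecting user data are no longer resident in memory. A sketch (the 72-hour threshold is what researchers publicly reported for iOS 18.1, so treat it as an assumption):

```python
# Threshold is an assumption based on public reporting about iOS 18.1.
INACTIVITY_LIMIT_S = 72 * 60 * 60

def should_reboot(last_unlock_ts: float, now_ts: float) -> bool:
    # Past the limit, reboot back to Before First Unlock, where the class
    # keys protecting user data are no longer resident in memory.
    return (now_ts - last_unlock_ts) > INACTIVITY_LIMIT_S
```

The point being that forensic tools work far better against a device that has been unlocked since boot, so capping that window directly shrinks the attack surface.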


With physical access to the device or not?


Not



That's old patched spyware. I'm talking about something entirely different. No device install needed.


Emails and GSM calls yes, obviously. But e.g. Signal communications? You need a Pegasus-tier exploit for that, which means that unless you're high profile enough you should be safe.


You're 'not at liberty' to post wikipedia links? Or you have knowledge of programs wikipedia doesn't?


The latter. Wikipedia doesn't know everything. NDAs are enforceable.


Extraordinary claims require extraordinary evidence. If you really have access to secret information of that significance and you really are under an NDA that prohibits you from talking about it then why are you casually posting innuendo about it on HN?


To point out that your data isn't safe from law enforcement. Quite the contrary. I think everyone should be aware of the state we are in. And while I can't go into detail about how I know, I want others to be aware that anything on their devices is fair game, nowadays with or without a warrant. Three-letter agencies are operating with impunity, using this very tech.


Again - extraordinary claims require extraordinary evidence.

It's no secret that there are groups actively looking for new exploits and that sometimes vulnerabilities are discovered that become zero days. It's a good bet that police and security services take an active interest in those vulnerabilities when they are found.

But that's very different to claiming the police can easily unlock any device any time they want to and there is a range of private companies around who provide that service to them.


It's not extraordinary at all. Ron Wyden, a US Senator subject to special briefings, basically repeated the same thing when asked about federal backdoors:

  "As with all of the other information these companies store for or about their users, because Apple and Google deliver push notification data, they can be secretly compelled by governments to hand over this information," Wyden wrote.
https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...


Push notifications for e2e messaging apps carry an e2e-encrypted payload, which can't be decrypted unless Apple reads the private keys from those apps' sandboxes…
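That said, the payload isn't really what Wyden's letter was about - the routing metadata around it is necessarily visible to the relay, and that's what can be compelled. A hypothetical sketch (the field names and bundle id here are made up, not Apple's actual APNs schema):

```python
import secrets, time

def build_push(device_token: str, ciphertext: bytes) -> dict:
    # All names here are hypothetical, not Apple's actual APNs schema.
    return {
        # Visible to the relay, and therefore compellable:
        "device_token": device_token,          # which device
        "topic": "org.example.messenger",      # which app (made-up bundle id)
        "timestamp": int(time.time()),         # when
        # Opaque to the relay: the e2e-encrypted message body.
        "payload": ciphertext,
    }

push = build_push("a1b2c3d4", secrets.token_bytes(64))  # stand-in ciphertext
relay_visible = {k: v for k, v in push.items() if k != "payload"}
```

Who talked to whom, via which app, and when - that's a lot of signal even when every message body stays ciphertext.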


In the case of iMessage/iCloud, it seems like that already happens: https://s3.documentcloud.org/documents/21114562/jan-2021-fbi...


That document appears to be over four years old, predating the availability of Apple's Advanced Data Protection, which claims to provide proper E2EE for most iCloud backups. ADP was controversially the subject of a specific legal attack by the British government under the Investigatory Powers Act, and Apple withdrew the feature entirely from the UK market rather than compromise the security of its system - according to public reports, anyway. Before ADP, much of the data stored in iCloud backups was not fully end-to-end encrypted, and Apple itself did not claim otherwise.


Those apps generally distribute keys, and E2E is of no help unless you validate those keys out of band. Do you, really?

Then there are all the ways, both white and varying shades of gray, of installing software in the end devices. That's your primary threat right there.
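For reference, validating keys out of band usually means both parties deriving the same short fingerprint from the pair of public keys and comparing it over a separate channel (in person, over a phone call). A simplified analogue of the idea - not Signal's actual safety-number algorithm:

```python
import hashlib

def safety_code(my_key: bytes, their_key: bytes) -> str:
    # Sorting makes the code symmetric, so both parties compute it identically.
    digest = hashlib.sha256(b"".join(sorted([my_key, their_key]))).hexdigest()
    # Chunked for easy reading aloud over a phone call or in person.
    return " ".join(digest[i:i + 5] for i in range(0, 30, 5))
```

If a server substituted a man-in-the-middle key during distribution, the two sides would compute different codes and the swap would be caught - but only if someone actually compares them.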


I'm going to assume they are referring to cloud backups of said devices. Since those are stored on servers you don't control and aren't end-to-end encrypted, they can be accessed for "national security reasons".


There’s nothing extraordinary about the implications of what was said.


There is nothing extraordinary about a claim that multiple commercial organisations routinely and reliably defeat the security of modern devices on behalf of law enforcement - something that would clearly undermine numerous public claims about the security and privacy of those devices made by their manufacturer? You and I have very different ideas of what is extraordinary!


Multiple vendors advertise and sell devices and software to crack iPhones, they have for years. In the US, any decent size city or county sheriff has access to one. State level forensics labs probably have several types.

The manufacturer provides the means to bypass many of the cheaper tools, but few people use them.

There are more exotic tools that can bypass security controls. These are more niche and not generally available to law enforcement. There may be some crossover when counter-intelligence interfaces with law enforcement (i.e. FBI, DEA, RCMP, ICE, etc.).


> Multiple vendors advertise and sell devices and software to crack iPhones, they have for years.

Yes they do. Now name one that works consistently against a fully patched modern iPhone.


I like the term "exotic tools," as if they aren't utilizing GovCloud…


There are a lot of things that are publicly known but if he's signed an NDA he can't point at them or acknowledge their authenticity. Anyway Pegasus isn't even the correct ballpark lol.


Just about every confidentiality clause or NDA I've ever signed had a provision specifically excluding information independently in the public domain from its scope. I find it strange to the point of lacking credibility that someone working in a security-related field would have an NDA that required them to pretend to ignore even public domain information yet permitted them to post the kind of innuendo seen in this discussion.


Why should I disclose public domain knowledge when it’s public? The whole point was to point out there’s ways that aren’t public being used.

Believe it or not, I actually care about privacy. Innuendo is not my intent, no maliciousness here, only stating there are programs that have access to your data. Telegram/Signal/Encrypted or not. They don’t need access to your device. Only access to the Internet.


> The whole point was to point out there's ways that aren't public being used.

For which you have provided not a shred of evidence here beyond the same type of innuendo you've been posting all along - even while implying that some of this is public knowledge that you could therefore cite to establish at least some credibility.

Your claims in combination appear to require that the technical foundation on which almost all serious security on Apple devices is built must be fundamentally flawed and yet somehow this hasn't leaked. That's like saying someone found an efficient solution to the discrete logarithm problem and it's in widespread use among the intelligence community but no-one outside has realised. It's theoretically possible but the chance of something so big staying secret for very long is tiny.

As I said before - extraordinary claims require extraordinary evidence. Thank you for the discussion but there seems little reason to continue it unless you're able to provide some.


From 7 months ago which is already old information: https://www.dhs.gov/ai/use-case-inventory/ice

It’s not extraordinary if you’re in this space. This is but one of many such initiatives. A few have already been in the works for years.


Yeah, I agree but it could be his thought process.


Cellebrite, on the other hand...

Edit: And Magnet, and the internal capabilities of an acronymical agency or three...


Pretty sure no NDA ever says it's forbidden to discuss the subject of the agreement, but cute little internet innuendos whispered from behind a coy little fan are ok.


I mean Cellebrite has been a public name for a long time now, and LEO pays for that and similar devices which basically launder zero days and physical exploits to get your stuff.


Correct, they are one known actor...



