
> So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud? What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a "civil disturbance"? Or simply about your iPhone's possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, "a person of interest"?

What I don't get is what prevented these things from happening last month. Apple controls the hardware, the software, and the cloud services, so the point at which the scanning is done is mostly arbitrary from a process standpoint (I understand people believe there are huge differences philosophically). They could have already scanned our files, because they already have full control over the entire ecosystem. If they can be corrupted by authoritarian governments, shouldn't we assume they have already been corrupted? And if so, why did we trust them with full control of the ecosystem?




In previous years (take the San Bernardino shooter case, for instance), Apple argued in court that creating backdoors or reversible encryption was insecure, subject to exploitation by malicious actors, and therefore "unreasonably burdensome". They also argued that compelling them to write backdoors violated the First Amendment.

It was most likely a winning strategy, which is why the FBI actively avoided getting a ruling on it and found a workaround instead.

What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to use a FISA warrant and an NSL to mandate hits on anything. The argument that it has to be pre-iCloud-upload, subject to manual review, or gated on some arbitrary threshold is just the marketing to get the public to accept it.

All of that can easily be ordered bypassed. So it can become: scan, single hit for X, report.

I'll take the downvotes, but if anything, someone more conspiracy-minded could easily take this as a warrant canary. Given the backlash Apple has faced and ignored, it doesn't make much business sense for them not to back off unless they are:

A) betting on it being only a vocal minority that resorts to action (which is entirely possible, especially given the alternatives and the technical hurdles to getting to a suitable alternative), or

B) being pressured by governments now (also entirely possible given their history with the FBI and previous investigations).

[1] https://www.rpc.senate.gov/policy-papers/apple-and-the-san-b...

[2] https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...


> What Apple is creating here is an avenue for the FBI/NSA/alphabet agencies to use a FISA warrant and an NSL to mandate hits on anything. The argument that it has to be pre-iCloud-upload, subject to manual review, or gated on some arbitrary threshold is just the marketing to get the public to accept it.

Why would they make things even more complicated with limited access, since they can already access everything in the cloud? Let's leave out the argument about expanding the scan to the whole device; if that is what happens, people will really start discarding their phones.


Well, for one, scanning on-device lets them expand the amount of stuff they search for without any impact on their servers.

We can all assume they will eventually start scanning for more than just the photos about to be sent to iCloud. It can easily, and _silently_, be expanded to any file on the phone.


Every imaginable thing can already be done silently right now. iOS is not exactly an open-source system.


Except that right now they don't have a plausible reason to be scanning things, and any indication of something like that happening without prior expectation would be an even bigger deal than this is. Setting the expectation that this is acceptable is how you hide overstepping and abuse.

The fact that my neighbor physically can run out and attack me every time he sees me exit my house isn't a valid defense of him running out and verbally abusing and threatening me every time I leave, nor is it a valid excuse not to worry about it escalating to that.


We were talking about silent things here, so there would be no prior expectation of them. Silently expanding the scan to every file, for example, would still count as overstepping in much the same way for most people. That would be abuse, and there is usually zero tolerance for abuse. Apple has avoided any mark of it in the past pretty well.

But anyway, Photos, Spotlight, the Files app, and Siri are already scanning your files and extracting metadata. That metadata is even stored in iCloud to keep the sync process working. There are excuses to be made, if you want to make them.


> Except that right now they don't have a plausible reason to be scanning things, and any indication of something like that happening without prior expectation would be an even bigger deal than this is. Setting the expectation that this is acceptable is how you hide overstepping and abuse.

How do we know Apple isn't doing this right now? How do we know if and when they do? Are people keeping track of everything the phone sends back to Apple's servers? Is it even possible any more?

Considering Apple doesn't let you have full access to the device, the phone could do anything: encrypt the message and send it. The only way I know of to check would be monitoring the traffic off-device on the network all the time, which means only while on Wi-Fi. And that wouldn't give you content, only metadata, since by then it's encrypted and you don't have the key.


Because they don't have access to everything in the cloud. You don't have to use iCloud, or Siri, or Spotlight.

This was specifically addressed in the San Bernardino and other cases. Apple gave the FBI everything in the cloud; the FBI was looking for everything on the device.

What this change does is add a method, without an opt-out option, for them to scan for anything on the device, be it a string of text/keywords, or certain pictures of a place with certain metadata, etc.


This is just speculation. The current technical implementation limits the scan to images about to be uploaded to the cloud, which can be opted out of. If you don't trust that, you can't trust using their devices right now either.


That seems like a reach.

> The current technical implementation limits the scan to images about to be uploaded to the cloud, which can be opted out of.

That is conflating policy with a technical limitation. Their changes negate the technical discussion at this point.

Their POLICY is that it will only scan images about to be uploaded. They no longer have a *legal* argument for refusing to comply with government requests to scan any data on the device, since the framework is now included.

That is a big change in that regard. Whereas in the past there was a layer of trust that Apple would hold governments accountable and push back on behalf of a user's privacy (and there is a very tangible history there), this implementation creates a gaping hole in that argument.


Actually, it is not just POLICY. This scanning is built very deep into the iCloud upload process. They would need a huge revamp of the system to change that, and it seems intentional precisely because of this kind of speculation. So we are in the same discussion whether this is implemented or not.


None of the tech documents point to this being the case. In fact, many of the articles I have read say quite the opposite, including the peer-reviewed paper that outlined the dangers of such a program in its conclusions. [1][2]

Do you have any sources to the contrary?

[1] https://www.washingtonpost.com/opinions/2021/08/19/apple-csa...

[2] https://www.schneier.com/blog/archives/2021/08/more-on-apple...


Their threat model[1] states:

> This feature runs exclusively as part of the cloud storage pipeline for images being uploaded to iCloud Photos and cannot act on any other image content on the device. Accordingly, on devices and accounts where iCloud Photos is disabled, absolutely no images are perceptually hashed. There is therefore no comparison against the CSAM perceptual hash database, and no safety vouchers are generated, stored, or sent anywhere.

and

> Apple’s CSAM detection is a hybrid on-device/server pipeline. While the first phase of the NeuralHash matching process runs on device, its output – a set of safety vouchers – can only be interpreted by the second phase running on Apple’s iCloud Photos servers, and only if a given account exceeds the threshold of matches.

We should also take into account how the blinding of the hash works, from the CSAM paper[2]:

> However, the blinding step using the server-side secret is not possible on device because it is unknown to the device. The goal is to run the final step on the server and finish the process on server. This ensures the device doesn’t know the result of the match, but it can encode the result of the on-device match process before uploading to the server.

What this means is that the whole process is tied strictly to a specific endpoint on the server. To be able to match any other files from the device, those files would also have to be uploaded to the server (the PSI implementation forces it), and based on the pipeline description, uploading other files should not be possible. Even if it were, and they suddenly changed policy to scan every file on your device, those files would end up in the same iCloud as everything else: you would notice them, and you couldn't opt out of that with the current protocol. So they would have to modify the whole protocol to upload only the images actually meant to be synced while scanning all files (whose matches would then be impossible to compute on the server side, because of how the PSI protocol works). If they created some other endpoint for files that are not supposed to end up in iCloud, they would still need to store those files in the cloud anyway, because of the PSI protocol; otherwise, they have no way to detect matches.
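To make the shape of that argument concrete, here is a toy model in Python (all cryptography is replaced by trivial stand-ins and every name is invented for illustration; this is not Apple's actual protocol, just the data flow described above):

    import hashlib

    THRESHOLD = 30  # illustrative; Apple's stated threshold is of this order
    SERVER_SECRET = b"server-only-blinding-key"  # never present on the device
    KNOWN_BAD = [b"example-image-1", b"example-image-2"]  # placeholder bytes

    def perceptual_hash(image):
        # Stand-in for NeuralHash; a real perceptual hash survives re-encoding.
        return hashlib.sha256(image).digest()

    def blind(h):
        # The database shipped to devices is blinded with the server secret, so
        # a device can produce an encoding but cannot tell whether it matched.
        return hashlib.sha256(SERVER_SECRET + h).digest()

    blinded_db = {blind(perceptual_hash(img)) for img in KNOWN_BAD}  # server side

    def device_upload(photo):
        # Device side: the voucher is a field of the photo upload itself; there
        # is no separate channel carrying vouchers for files that never upload.
        return {"photo": photo, "voucher": perceptual_hash(photo)}

    def server_flag_account(uploads):
        # Server side: only here can vouchers be interpreted, and the account is
        # flagged only past the threshold (the real system additionally uses
        # threshold secret sharing, so sub-threshold matches reveal nothing).
        matches = sum(1 for u in uploads if blind(u["voucher"]) in blinded_db)
        return matches >= THRESHOLD

The toy leaks far more than the real protocol (the voucher here is a raw hash), but the structural point survives: a voucher only exists attached to an iCloud Photos upload, and only the server holds the secret needed to interpret it, so scanning files that are never uploaded produces nothing the server can match.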

It sounds like this is pretty far from being just a policy change away.

Many people have succumbed to populism because it benefits them, and it takes some knowledge and time to really understand the whole system, so I am not surprised that many keep saying it is just a policy change away. Either way, we either trust everything they say, or we can't trust a single feature they put on these devices.

[1]: https://www.apple.com/child-safety/pdf/Security_Threat_Model...

[2]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...


I just want to say thanks for the links and taking the time to explain it. I think it’s pretty logical. I see your viewpoint and I think I need to take some more time to consider my stance (again…).


It isn't a philosophical debate. It's about invading and controlling someone else's property. I can't shack up in your home and eat your food just because I feel like it. We're all doomed because digital natives have no concept of the boundary between something they own and something someone rents to them, or lets them use for free, in exchange for data mining.


Like I said, Apple controls the hardware, software, and services. They already control your property.


There's a substantial difference - both in theory and in practice - between Apple being capable of making your device do things you don't want it to do, and them actually doing it.

Saying that just because they could have, we ought to be okay with them actually doing it is nonsense. If you apply that line of thought to a non-updatable product, it becomes pretty clear.

Pick basically anything man-made around you - your shoes, your couch, whatever. That could have plenty of awful things in it. It could be spying on you, it could be poisoning you, whatever.

Just because the manufacturer could have done something terrible, doesn't mean we're okay with them actually doing it. The mere fact Apple can do these things after purchase doesn't make it any more acceptable for them to do so.


> Saying that just because they could have, we ought to be okay with them actually doing it is nonsense. If you apply that line of thought to a non-updatable product, it becomes pretty clear.

That is not the argument I made. The argument was that if they could be corrupted we should assume they are already corrupted. And since they already have control over the device an already corrupted Apple could have been spying on you already. It isn’t excusing their behavior. It is pointing out the naïveté of the current outrage.


Perhaps you are right that the outrage should have come much earlier. I myself have avoided Apple for a long time. However, it's better late than never. I'm glad people are finally waking up.

See also: https://stallman.org/apple.html and https://www.fsf.org/campaigns/apple.


The concept of ownership you are asserting is but one of many historical principles of ownership. There are, however, other concepts of ownership that conflict with the one you are asserting.

https://www.econtalk.org/michael-heller-and-james-salzman-on...

I don't think there is a good-faith argument that Apple is invading or controlling anything you own. All that's happening is that you agree to run the algorithm in exchange for using iCloud Photos. That's just a contract; a mutual, voluntary exchange.


Contract implies meeting of the minds. I'd like to see the process by which I can line out or alter the terms of the agreement please.

I'll wait.

This is the other thing "digital natives" don't get, nor want to: negotiation is normal. Ya know something else they don't get? Selling something with the damn manual, and enough system documentation to actually be able to sit down and learn something. Drives me nuts.


Of course they won't let you alter the terms of the agreement, but everyone always has the option (barring some sort of malicious state-enforced intervention) to walk away and divest themselves from platforms and ecosystems that don't respect their privacy or that otherwise act in ways they don't agree with. Or, if it's going to take a long time to get there, one ought to start thinking about how to keep that escape hatch available, if one's entire life's data depends on someone else continuing to provide access to "cloud" services under potentially arbitrary rules and changing conditions one may never know about or be able to audit.

If enough people stood up and went somewhere else, hitting these companies where it hurts (by no longer giving them money and personal data to mine), then maybe it would start to make a difference.


I agree with the premise of what the parent is saying. How is it legal that you can enter into a contract that gives Apple a perpetual right to your private (or your company's) data without it clearly requiring you to consent, with a witness or notary, in plain and understandable terms at the point of purchase?

If your data is located in a cloud you might make the argument that Apple is the owner of the system. But if Apple is truly accessing data from your personal device or “server” without explicit authorization, it violates a whole host of computer crime laws.


> But if Apple is truly accessing data from your personal device or “server” without explicit authorization

If you're uploading the photos to iCloud Photos, you clearly have given authorisation to Apple - in the sense of accepting the iCloud T&Cs and saying yes to "store my photos in iCloud Photos" - for them to access the data in those photos; otherwise they wouldn't be able to upload them.


Easy. EULAs are not contracts.


Among the "historical principles of ownership" are those of the communist countries, where individuals had the legal right to own only things belonging to a very short list and nothing else.

However, the USA has claimed for decades that such restrictions on the rights of ownership are an abomination.

Even if we accept that this is just a modification of a contract between the "owner" of a device and Apple, then, had Apple acted in good faith, they would have offered the following: if you do not agree to let Apple run programs on your own device for their benefit (something never mentioned when you "bought" the device), Apple fully refunds everything you paid for the device and other Apple services, so that you can get an alternative device.

As it is now, you either accept that Apple changes their "contract" at any time as they like, or you incur a serious financial loss if you refuse and want an alternative.

This certainly isn't a "mutual, voluntary exchange".


The problem is that the companies make alternatives illegal, obfuscate the legal terms, put themselves in the position of least resistance, and force you to opt in.


Apple is renting the phone to you for $1000 down and $0 a month (unless you are actually financing). Therefore, they are the landlord and, given notice, can change the property as they see fit.


This is demonstrably not true. If you rent a home and then burn it down, you are going to be held liable to the owner of the home. In the case of your phone, no one, including Apple, cares if you buy it and then immediately smash it on the ground and destroy it.

Apple controls the software that runs on it but there is nothing that stops you from modifying or hacking it to your heart's content if you are able to, just as they are not obligated to make that an easy task for you.


>Apple controls the software that runs on it but there is nothing that stops you from modifying or hacking it to your heart's content

Nothing except all of Apple's attempts to make that difficult, and a bad opsec decision. Oh, and let's not forget the series of lawsuits attempting to have jailbreaking declared illegal. Luckily they failed there, but if they could make modifying their software illegal, make no mistake: they would.

They don't own the hardware they sell you in the same way a landlord owns a home, because they have transferred all physical equity to the purchaser. However, Apple's model really stretches the definition of "ownership". Would you say you own Adobe Acrobat because you paid for it, or would you say you own a license to use it? Buying Apple means you own the hardware and license the software that makes that hardware anything other than a paperweight. It's not a very attractive idea. Kudos to their marketing department.


> Nothing except all of Apple's attempts to make that difficult and a bad op sec decision.

No one said it had to be easy or advisable.

> Would you say you own a adobe acrobat because you paid for it, or would you say you own a license to use it?

Any software I run that I didn't write myself is subject to whatever license the people who wrote it defined. Even the MIT License places requirements on you before you are allowed to use the software. Exceptions to these copyright protections have been made, and they extend to jailbreaking iOS devices, which requires modifying copyrighted code.

> Buying Apple means you own the hardware and license the software that makes that hardware be anything other than a paperweight.

All hardware is a paperweight without software.


> No one said it had to be easy or advisable.

Now you're making a pedantic point about a technicality instead of what's happening in real life.

> Any software I run that I didn't write myself is subject to whatever license the people who wrote it defined. Even the MIT License places requirements on you before you are allowed to use the software. Exceptions to these copyright protections have been made, and they extend to jailbreaking iOS devices, which requires modifying copyrighted code.

The MIT license doesn't require you to allow anyone else to scan your private data and doesn't allow the licensor to change the terms after you've already started using the software.

> All hardware is a paperweight without software.

If you buy a Dell and you don't like the Dell crapware, you can remove it and the device still works just as well (if not better). If it came with Microsoft Windows and you don't like the Windows license, you can install Linux or OpenBSD. The hardware is still useful even if you don't like the license for the software it came with.

If you don't like Apple's software licensing terms, your iPhone is a paperweight.


> Now you're making a pedantic point about a technicality instead of what's happening in real life.

In real life, people look for privilege-escalation exploits that let them install arbitrary software on iOS. This is what jailbreaking is.

> The MIT license doesn't require you to allow anyone else to scan your private data and doesn't allow the licensor to change the terms after you've already started using the software.

At what point did I ever state any of this, or even imply it? I am simply stating that licenses affect all the software we run and place restrictions from the creators of that software on its users. This has nothing to do with Apple surveilling its users with its new tech.

> If you buy a Dell and you don't like the Dell crapware, you can remove it and the device still works just as well (if not better). If it came with Microsoft Windows and you don't like the Windows license, you can install Linux or OpenBSD. The hardware is still useful even if you don't like the license for the software it came with.

Hypothetically it is possible to run whatever software you want on an iPhone, including installing another OS. In reality this translates to people jailbreaking devices. As has been mentioned, people are allowed to hack their own iPhones, and it's protected by DMCA exemptions.

But if you're going to accuse me of being pedantic about technicalities instead of real life, how about this: in real life, almost no one gives a shit about running arbitrary code on their devices; they just use them to get access to the applications readily available in official app stores.

> If you don't like Apple's software licensing terms, your iPhone is a paperweight.

You can dislike their software licensing terms and still use your iPhone. I dislike the things that Apple is proposing with regards to CSAM detection but that doesn't mean I can't use my phone.


> In real life, people look for privilege-escalation exploits that let them install arbitrary software on iOS. This is what jailbreaking is.

"Unjust imprisonment is fine because you can hire a black ops team to break you out."

So you jailbreak your iPhone. Then an iOS update comes out patching a security vulnerability. If you install it, it removes your jailbreak (or bricks your phone). If you don't, your device has an unpatched security vulnerability.

And at any given time there may not be a jailbreak for the current version of iOS.

This is not a reasonable state of affairs.

> At what point did I ever state any of this, or even imply it? I am simply stating that licenses affect all the software we run and place restrictions from the creators of that software on its users. This has nothing to do with Apple surveilling its users with its new tech.

The problem is that Apple is imposing license restrictions you don't want. Your response was that all licenses impose restrictions. That ignores the important distinction between restrictions you actually care about and restrictions that don't really affect you.

> Hypothetically it is possible to run whatever software you want on an iPhone, including installing another OS.

Hypothetically you can make your own iPhone out of sand and crude oil. In practice no third party operating systems for iPhones exist because Apple doesn't document their hardware and so there are no drivers for third party operating systems.

> In real life, almost no one gives a shit about running arbitrary code on their devices; they just use them to get access to the applications readily available in official app stores.

In real life, most people unjustly imprisoned by a government don't have the wherewithal to break out of prison. That doesn't mean they like being incarcerated, or having Apple scan their devices.

What it means is that they're structurally bound into a position where their true preferences can't be expressed. Which is the problem.

> You can dislike their software licensing terms and still use your iPhone.

Yes, exactly. But you can't refuse to accept their software licensing terms and still use your iPhone, which means that your choice is between having something imposed on you that you dislike, or your iPhone is a brick because you can't in practice use it under any other terms.


> All hardware is a paperweight without software.

You miss the point. I can buy an x86 machine and run Windows, or a FOSS OS, or any number of Unix clones, or hell, even write my own OS. From the outset I can say I own the hardware.

You can't say the same for Apple hardware. Even if the act of jailbreaking as a specific case is not considered illegal, you have to do many illegal things if you want to pwn an Apple device enough to run another OS on it.


> You miss the point. I can buy an x86 machine and run Windows, or a FOSS OS, or any number of Unix clones, or hell, even write my own OS. From the outset I can say I own the hardware.

You are more than welcome to use an x86 machine as your cell phone, but I don't think most people would choose to. If we're talking about comparable hardware, even M1 Macs allow you to run alternative operating systems [1], so this isn't a valid point.

You could write your own OS for a desktop computer, and it'd be a significantly easier process than doing so for an iPhone, which has a locked bootloader; but that doesn't mean you can't, just that it's tremendously difficult and a low-value proposition. Privilege escalation on a jailbroken iPhone is typically about as much as people want; why would they buy an iPhone over a device with an unlocked bootloader otherwise?

> You can't say the same for Apple hardware. Even if the act of jailbreaking as a specific case is not considered illegal, you have to do many illegal things if you want to pwn an Apple device enough to run another OS on it.

What laws do I have to break to pwn an Apple device enough to run another OS on it? Jailbreaking [2] is protected by a DMCA exception.

[1] https://asahilinux.org/

[2] https://en.wikipedia.org/wiki/Jailbreaking_(iOS)#United_Stat...


> you have to do many illegal things if you want to pwn an Apple device enough to run another OS on it.

Can’t think of a single one, elaborate please?


I think the OP meant that as an analogy: in order for you to use their software, you pay $1000 upfront for the hardware. So you can look at it as a one-time payment/rent to use their environment. Since you need to upgrade iPhones quite often, I guess renting isn't a bad analogy.

> but there is nothing that stops you from modifying or hacking it to your heart's content if you are able to.

Are you sure? I haven't read the terms, but that might be quite against their rules. Rules that you probably agree to by using their product, but I'm not a legal expert.


Their rules cover their continued services. When you buy an iPhone, you are free to use whatever tools you’d like to modify / hack / break / enhance / etc the device.

The terms govern your interaction w/ Apple. So, for example, if you crack open the case and try to re-wire the board, the terms say your warranty no longer applies. If you modify the software, they can ban you from interacting with their servers. And if you start offering to sell modified iPhones to other people, they can come after you for damaging their business.

That’s what owning the phone means. You can do what you want with the phone you bought, but they aren’t required to support your efforts or allow you to use the services they’re actively running.


You are allowed to modify the iOS software in this case because it is covered by an exemption to the DMCA.

https://en.wikipedia.org/wiki/Jailbreaking_(iOS)#United_Stat...


Good to know, thanks!


I agree we are all doomed, but I don't agree it has much to do with being a digital native or not. My boomer grandparents, my gen X parents, and my millennial self are all affected by this. And gen Z (the first generation of digital natives), and whatever comes after gen Z, is not to blame for it. Reducing it to a generational thing is silly.


I think the point was that the digital natives and the next generation of digital natives coming will not know any different and will thus tacitly accept it.


I think “we don’t have the machinery to do that” is an effective argument in the real world when someone asks you to do something. I’m not sure if it matters legally (lawyers sometimes use vague phrases like “reasonable effort”), but it definitely affects how strongly people will pressure you to do things, and how likely you are to acquiesce to that pressure.

The scope of the change Apple would need to make to scan your photos arbitrarily just got a lot smaller. The number of engineers who would need to be “in the know” to implement this change got smaller. The belief from governments that Apple has the option of doing this got stronger. The belief among Apple’s own management team that they can do this got stronger.


This is very well put.


Because that door hasn’t been opened yet. “Scan every photo on users’ devices” or “scan for non-CSAM” are much easier requests once they’ve already started scanning on-device.

It’s just how life and politics work.


The door has been opened for quite some time. What do you think Spotlight is? It scans and indexes all your data.

What's prevented the government from saying "hey, if you see Osama bin Laden in a Spotlight scan, you need to send us all that guy's data"?

The answer is, Apple can just say FU. And that's exactly what will happen here. In particular, the US DOJ needs to stay in Apple's good graces here and not be overly aggressive. If DOJ pulls any funny business, that's a pretty good reason for Apple to just say "OK, we're picking up our toys and going home. You get nothing now and we're turning on E2EE."


I'm not a security professional by any means, but this has been my line of thinking throughout this whole debate for quite a while. It's pretty silly considering what has been made public about the clandestine operations of the alphabet agencies; if you were paying attention to the right channels[1], there was good reason to believe the Fourth Amendment was a joke to the Feds long before Snowden's leaks, especially combined with the existence and complete opaqueness of the secret FISA court. It's kind of crazy to me that all these technologists, and especially those on *hacker*news, really believe you have any sort of privacy from the US government, which has demonstrated it can act with complete impunity in most parts of the world for decades. I say especially people here because they should know how just a handful of rogue actors in any given organization could subvert any veil of privacy. I'm no expert, but it makes complete sense to me that privacy in any large organization is a very delicate thing to maintain when your adversary is as sophisticated and belligerent as the US security and intelligence apparatus appears to be. Maybe I'm just not privy to something, but it seems like if the US national security apparatus wants to do something on our soil or our allies', it will find a way.

[1] https://www.pbs.org/wgbh/frontline/film/homefront/ - aired 15 May 2007 and covered the notorious AT&T room 641A


I’m sorry, but there is a world of difference between locally indexing files for local search and tagging files as contraband so that they can be reported to the government.


Technically speaking, no, there isn't. It's just a little bit of metadata you're sticking onto the already long tail of metadata they're collecting as they index.
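As a toy illustration of that claim (the record fields and the hash list are invented for the sketch; this is not what Spotlight actually stores):

    import hashlib, os, time

    FLAGGED_DIGESTS = {"0123abcd"}  # hypothetical list of flagged file digests

    def index_file(path):
        # A toy indexer record: most fields are ordinary search metadata...
        with open(path, "rb") as f:
            data = f.read()
        return {
            "path": path,
            "size": len(data),
            "mtime": os.path.getmtime(path),
            "indexed_at": time.time(),
            # ...a real indexer also extracts EXIF, OCR text, faces, etc.
            # The contested addition is just one more derived field:
            "flagged": hashlib.sha256(data).hexdigest() in FLAGGED_DIGESTS,
        }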


Most importantly, people are arguing that there is new technical risk, so a difference in intent or beneficiaries is not relevant to that argument.


Being technically similar is irrelevant. It’s completely different in principle, and that’s what matters.

A great many things are technically similar, but vastly different in principle. And those distinctions are important.


I’m not sure which side you’re arguing here.

The biggest concern about Apple’s system is that it’s very easy to add new items to a hash list. That is an argument about the technical similarity of scanning for CSAM and scanning for other things like classified documents (for example).

But there is a vast difference in principle. Pretty much everyone wants to stop child abuse. But many people—including major news organizations—believe citizens should sometimes have the opportunity to view classified documents.

Different categories of things to scan for will be different in principle, even if the technical approach is similar. This difference in principle is what Apple leans on when they say they will oppose any request to expand their system beyond CSAM.


The biggest concern about Apple’s system is that they are showing all governments, and everyone else, that it’s fine and good to scan for whatever is on my device and report me to the government if they see fit, despite years of prior refusals to implement backdoors or give the FBI access to someone’s device.

They essentially invalidated all those claims, and I can’t see how they’ll be able to argue back when the US or the Chinas of the world come to Apple saying it has to put more surveillance in its devices.


DOJ can pressure Visa, Mastercard and Amex to stop processing payments for Apple. Due to how the international payments systems work, that's a global sanction, even if Apple had no footprint in US.

And before you claim that's absurd and impossible, there is precedent for the US doing just that.[0]

EDIT: There is also an earlier precedent, UIGEA - https://en.wikipedia.org/wiki/Unlawful_Internet_Gambling_Enf...

0: https://www.cnet.com/tech/services-and-software/credit-card-... (yes, the blockade was lifted later but my point was that the nuclear option is available)


> DOJ can pressure Visa, Mastercard and Amex to stop processing payments for Apple.

I think they'll find there's a world of difference in public support for "people leaking classified documents" vs "the people who make you and your family's phones, tablets, laptops, and watches".


This entire argument is a non sequitur and comes up like clockwork every time this issue is discussed. It's the metaphorical equivalent of saying "well someone could've snuck in through the open window. Let's just assume they did and leave the doors open as well".

How about instead we push back against Apple further shifting the Overton window on how acceptable it is for companies to run intrusive services on hardware we own?


It’s not a non sequitur. The comment is engaging with a series of rhetorical questions that imagine a slippery slope by observing that very little has changed about the trust model between iPhone users and their devices. If you are convinced Apple is slipping, then it is worthwhile to be able to answer how their position today is different than it was last month. That is of course a different question than whether their position last month was acceptable, and maybe people are realizing it was not.

As a concrete example, if you think the proposal introduces new technical risks, then if Apple announces they made a mistake and will instead scan entirely on the server, you may be satisfied. However, I’d argue that since no new technical risk has been introduced, your conclusions should not change.

I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control has shifted the Overton window more than what was actually proposed. Politicians who are none the wiser probably believe that’s what Apple actually built, even though it’s not.


I disagree - it's a distraction from the larger issue at hand.

> I’d argue that the incorrect characterization of Apple’s announcement as scanning all the files on your phone with no control

That's a strawman - few, if any, are arguing that the system will read all of your files out of the gate.

>since no new technical risk has been introduced,

This assumption doesn't reflect reality. Introducing a brand-new system built specifically for client-side scanning absolutely adds technical risk, if nothing else by the sheer fact that it adds another attack vector to your phone. Not to mention that all it would take is a change in policy and a few trivial updates (a new event trigger, directory configs, etc.) for this system to indeed scan any file on your device.
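As a purely hypothetical illustration of how small those "trivial updates" could be if a scanner were driven by a scope config (none of these keys, triggers, or paths come from Apple's actual system):

    # v1: the announced behavior -- scan only what enters the iCloud upload queue.
    SCAN_SCOPE_V1 = {
        "trigger": "icloud_photos_upload",
        "paths": ["~/Media/iCloud Upload Queue"],
        "types": ["image"],
    }

    # v2: the feared expansion -- same engine, wider scope. As a diff, it's tiny.
    SCAN_SCOPE_V2 = {
        "trigger": "file_created",
        "paths": ["/"],
        "types": ["image", "video", "document"],
    }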


What is the larger issue? The entire chain starts off with a prediction of what will happen in the future. It sounds like the trend is the proposed larger issue.


> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...

Simple: Money.

Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".

No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.

No government is going to pony up the money to reimburse them to do it (not even getting into the PR optics).

That leaves it happening only if 1) they decide to do it themselves, or 2) government(s) legislate they must.

So far #2 hasn't happened. Politicians had no basis of reference to point to and say "Your competitor(s) are doing that, you should too".

But now that #1 occurred, it will normalize this nonsense and pave the way for #2.


> Their response to any such demands would be (and has been) "we don't have the capability to do what you're asking".

> No judge is going to burden them to spend their own dime to build a massive new feature like this and deploy it to every phone out there to comply with a demand arising from a prosecutor of an individual case.

Government does not care one bit how much it costs or whether it is even possible. They demand the data with an ultimatum: deliver it as requested by our deadline, or we send in our IT people to take it. Sorry (not sorry) if it takes your whole company down while we plug our own servers into your datacenter to take your data.


Doesn't work if the data of interest is not there for the taking. And a judge will not compel beyond what they consider reasonable. Having the feature already in place dramatically shifts the bar.


Their response to such demands has not been that they are technically incapable of doing what's requested. The demand from the FBI in the San Bernardino case was a very small change to passcode-retry constants, because the terrorist's device did not have a Secure Enclave.


Wasn’t it much more? The FBI demanded that Apple give them the signing key; they would do the rest themselves. Basically adding arbitrary code and unlocking any device at that time.

https://en.m.wikipedia.org/wiki/FBI–Apple_encryption_dispute


No. The reference you provide characterizes it correctly: the FBI wanted Apple to create and sign a one-off build of iOS. The specific request was to allow automated passcode input and remove the retry backoffs and the auto-erase constant.

This is how Apple described the request: “ The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.”

Or in their FAQ:

“Is it technically possible to do what the government has ordered?

Yes, it is certainly possible to create an entirely new operating system to undermine our security features as the government wants.”
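For context, the constants at issue look roughly like this sketch (values and names are illustrative, not Apple's actual implementation); the requested build would effectively have removed the delay and the erase limit, and allowed guesses to be submitted electronically:

    import time

    MAX_ATTEMPTS_BEFORE_ERASE = 10  # the auto-erase constant
    BACKOFF_SECONDS = [0, 0, 0, 0, 60, 300, 900, 3600]  # escalating retry delays

    def try_passcode(guess, attempt_no, check):
        if attempt_no >= MAX_ATTEMPTS_BEFORE_ERASE:
            raise RuntimeError("device erased")  # stand-in for the wipe
        delay = BACKOFF_SECONDS[min(attempt_no, len(BACKOFF_SECONDS) - 1)]
        time.sleep(delay)  # removing this delay is what enables brute force
        return check(guess)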


You are right. I was confusing it with something else, or it was just fake news.

Edit: it was speculation by the court, a plan B: https://www.theguardian.com/technology/2016/mar/11/fbi-could...


The politics of it is very different, and that's where the danger lies:

https://news.ycombinator.com/item?id=28239506

I think that quite a few engineers are too focused on the technical aspects of it, and specifically on all those "barriers to misuse" that Apple claims to have in place. But it'll be much easier to remove the barriers once the system as a whole is in place.


The reason we're focused on the state of it now is that we can switch at any time - especially if those barriers are shown to be ineffective or are removed at some point.


The real threat here is legal, not technical. Think mandatory on-device scanning as a condition of access to the hardware market.


That just ties back into "be afraid of what it could become", and isn't dependent on Apple making this system - Congress could have forced PhotoDNA to be shipped with phones since its inception in 2011[0].

0: https://doi.org/10.1016/S0262-4079(11)60791-4


Of course it's not dependent on it being Apple. It could just as well have been Google.

And the difference is that Apple gave them what they needed to mandate this while still claiming that they preserve privacy.


There is a fairly large difference, the first being that it would do massive damage to Apple's brand if they started scanning people's phones without permission.

But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can throw up their hands and say they had no choice.


Buried in the EULA, you give consent.


Why would Apple start shrugging now when they've been fighting the FBI in court?


One reason is that they weren't under antitrust scrutiny in 2016 when they fought the government in court.

Their incentives have changed - they now have the real looming threat of being broken up by governments, so it is now in their interest to comply with anything else governments ask them to do.


If you believe this is true, why didn’t Apple also launch in the EU where antitrust scrutiny and case law are both more strongly against Apple?


How do you think lawyers will be able to prove Apple is a monopoly when Android exists?


The mere existence of a competitor doesn't matter in competition law.

Other oil companies existed, but the government still broke up Standard Oil.

Other browsers existed, but the government still made Microsoft make certain APIs available to other browser makers.


This is a false equivalence. Standard Oil was at the top; Apple is clearly not. In the USA they may very well be, but I doubt the margin is very high. https://images.app.goo.gl/hLr1zAGs7Wq5TG46A A lawyer will say that Apple, though in the lead as a brand, is still second to Android in terms of market share, no matter how fragmented that market is.

This might be the same in other western countries as well. But overall Android would still have more devices running it than iOS.


Again, competition law does not focus on the existence of competitors (true monopolies exist only in high school microeconomics).

If a company has durable (even if not a literal monopoly) market power due to its anticompetitive behavior and exclusionary conduct, that is grounds for taking antitrust action.

It's hard to switch to a different smartphone due to anticompetitive actions taken by Apple, such as their app store monopoly and making it harder to distribute web apps - switching to Android means losing all the apps you've purchased.

It is a lot easier to switch to a different brand of kerosene fuel, so antitrust action against Apple is justified despite them having a lower market share than Standard Oil did.

I'd suggest reading a textbook on US antitrust law and the court decisions that have led us to where we are, because this is a large topic that goes well beyond market share.


It's doubtful the focus would be there; it'll be on how locked down their ecosystem is: the App Store, APIs, proprietary ports, etc.


> They could have already scanned our files because they already have full control over the entire ecosystem

They have been doing it for email since 2019: https://www.indiatoday.in/technology/news/story/apple-has-be...


> They could have already scanned our files because they already have full control over the entire ecosystem.

Apple barely submits any CSAM reports[0]:

> According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.

0: https://www.hackerfactor.com/blog/index.php?/archives/929-On...


At some point this will be proven, and we'll go back to regular digital cameras or even Polaroids.


Nothing except Apple saying you could trust them. People were stupid enough to accept that, and now even the trust is gone.


That's rational, but the point he's making is that this system obliterates the only defense we have had or could have against such activity: end-to-end encryption. This approach owns the endpoint.


…in the same way any existing feature of iOS that makes device data available to Apple (e.g. iCloud Backup) “owns” the endpoint, no? What’s to stop a malicious Apple from turning on iCloud Backup for all its users and hoovering up your Signal messages database and iCloud Keychain?


Nothing. iOS even defaults auto-update to on, so Apple could do this today without any interaction from you.


> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...

Yes, proprietary black-box hardware and software are poor from a user-privacy perspective. But if Apple began on-device scanning of content, I'd imagine someone would eventually notice the suspicious activity and investigate.

With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And there will be no way for anyone to independently verify that the scope of the content being scanned hasn't been secretly increased.

As for iCloud: if your content is not encrypted on the device in a manner where only you have the keys, any cloud storage is suspect for scanning and data mining. But on-device scanning is a backdoor around e2e encryption; even on-device encryption with keys only you control is thwarted.


> no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.

This seems like the easiest thing out of the lot to verify.

The way that this system is designed to work is that when uploading to iCloud Photos, images have a safety voucher attached to them.

If Apple secretly expanded this to scan more than just iCloud Photos, they would have to either a) upload all the extra photos, b) add a new mechanism to upload just the vouchers, or c) upload “fake” photos to iCloud Photos with the extra vouchers attached.

None of these seem particularly easy to disguise.

Your concern is completely understandable if you are starting from the premise that Apple is scanning photos and then uploading matches. I think that's how a lot of people assume this works, but it's not correct. Apple designed the system in a very different way, integrated into the iCloud upload process, and that design makes it difficult to expand the scope beyond iCloud Photos surreptitiously.

Could Apple build a system to secretly exfiltrate information from your phone? Of course. They could have done so since the first iPhone was released in 2007. But this design that they are actually using is an awful design if that’s what they wanted to do. All of their efforts on this seem to be pointed in the exact opposite direction.


How do you think Apple will increase the scope of what's scanned without every person with Ghidra skills noticing?


If the exchange with Apple is encrypted and interleaved with other traffic to iCloud, how would you know there weren't new classes of scanning being done?

I'll be very surprised if similar tech is not lobbied for as a backstop to catch DRM-free media files played on devices we "own".

And it seems far more probable than not that police will demand this capability be used to help address more crimes. The problem is that "crimes" can mean speaking out against an oppressive regime, or being targeted for having the wrong political views (think McCarthyism in the United States, or the US-backed murder of a million people in Indonesia for affiliating with the "wrong" political party). History is awash with political abuse of "out groups", perpetrated by everyone from tin-pot dictators to presidents and PMs of major world powers.

And it sets the precedent that e2e encryption is not an excuse for a provider not to hand private customer data to the authorities; a backdoor can be installed: "Just do what Apple did."





