Expanded Protections for Children – FAQ [pdf] (apple.com)
45 points by almostdigital on Aug 9, 2021 | 58 comments



> Can non-CSAM images be “injected” into the system to flag accounts for things other than CSAM?

Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes.

The problem is not that Apple can't add images to the database, but that outside organizations can inject arbitrary hashes into this new, constantly scanning system at the heart of iOS, iPadOS and macOS. Apple has no way to verify those hashes, or any hashes, before they end up in the database.

If the system detects any matches, only some overworked and underpaid content reviewer in Bangladesh stands between you and having your life destroyed by a SWAT team crashing through your front door at 3am and killing your barking dog. And who knows whether those foreign sweatshops are even trustworthy.


The biggest problem is "who governs the governors?"

What is the feedback mechanism when such a system is abused?

> Existing images of CSAM that have been acquired and validated by X organizations

Will Apple be held responsible? Punished? Will Apple or the X organizations publish the list of images and allow regular third-party validation? (I see security-related product companies do that.)

In this era it's hard, VERY hard, to entrust our private property to these giant tech companies. There is little to NO negative feedback for their misbehaviour. These companies need more regulation than individual citizens do.


> The biggest problem is "who governs the governors?"

Child protection is an exception to all standards of due process in the US and Europe. There is NO organization that can intervene when people are mistreated under these laws (the vast majority who fall victim to the "side effects" are, of course, children). Only generic "quality assurance" is done. There is no protection at all for individuals, whether children, parents, or third parties.

> Will Apple be considered as responsible? punished?

The EU court ruled just a few months ago that child protection authorities cannot be held responsible for the damage they cause, EVEN if it is shown that their actions were based on incomplete or wrong data.

> In this era it's hard, VERY hard to entrust our private properties to these giant tech companies.

Okay, well, if you think this is a serious problem, then let me tell you what else social services data is being used for. In Belgium, social services, including homeless shelters, enter data into your medical records that you can't see, erase, or ... Emergency departments will read this data and use it to avoid having to use the state insurance for non-insured persons, which is a polite way of saying "refusing care to homeless persons".


Thanks for your detailed reply!

My point remains the same: we need more feedback on the "governors / power-wielding side".


> If the system detects any matches

This part isn’t true. Unless a threshold of multiple matches is reached, Apple won’t have a complete key to decrypt anything.
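As far as I understand Apple's technical summary, that threshold is enforced with a secret-sharing scheme: each match voucher carries only a share of a per-account key, and fewer than the threshold number of shares reveal nothing. Here's a toy Shamir-style sketch in Python to illustrate the idea; the prime, threshold, and share counts are made up and this is not Apple's actual code or parameters:

    import random

    PRIME = 2**61 - 1  # field size chosen arbitrarily for this toy example

    def make_shares(secret, threshold, count):
        # Random polynomial of degree threshold-1 with the secret as constant term.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, count + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term (the secret).
        total = 0
        for xj, yj in shares:
            num, den = 1, 1
            for xm, _ in shares:
                if xm != xj:
                    num = (num * -xm) % PRIME
                    den = (den * (xj - xm)) % PRIME
            total = (total + yj * num * pow(den, -1, PRIME)) % PRIME
        return total

    key = random.randrange(PRIME)                   # stands in for the account key
    shares = make_shares(key, threshold=30, count=100)
    print(reconstruct(shares[:30]) == key)          # True: threshold reached
    print(reconstruct(shares[:29]) == key)          # False (overwhelmingly likely): below threshold

With 29 of 30 required shares, the interpolation yields a random-looking field element, so holding "one match" genuinely gives Apple nothing it can decrypt.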


I think this is being a bit pedantic. If they can slip one onto your device, they can slip in 100.


This is already a thing today. Most major cloud providers perform server-side scanning, so if a nefarious party can smuggle problematic photos into your cloud storage, you have the same problem.

To make it perfectly clear: I am absolutely against this scanning system, but I think we need to stick to high-quality arguments to successfully argue against it.


There is one question missing: if China asks Apple to flag users who have Winnie-the-Pooh images on their devices, or else leave the Chinese market, what will Apple choose?


Apple has caved to pressure from China before multiple times, and it would not be surprising for them to do it again.

Examples:

Apple to withhold a new privacy feature in China : https://www.aljazeera.com/economy/2021/6/8/apple-to-withhold...

iCloud Data Turned Over To Chinese Government Conflicts With Apple’s “Privacy First” Focus : https://www.cpomagazine.com/data-privacy/icloud-data-turned-...

Apple removes police-tracking app used in Hong Kong protests from its app store : https://www.cnbc.com/2019/10/10/apple-removes-police-trackin...


I mean, that same situation already applies right now, without this system being deployed. China can already mandate that Apple scan all iCloud or on-device photos for certain images, etc., if it wanted to.


They answer that question. The answer they provide is that they would leave the Chinese market.

You might not believe them but they are pretty clear on that point: “we will not accede to any government’s request”

This is the standard you have to hold Apple to. As I said, you might not believe them – but if you don’t trust their statements on some level then it’s game over anyway.


Their record on not accepting requests from governments is not that good: another comment here mentions a few cases from China, but Russia has had some success too https://9to5mac.com/2021/03/16/russia-pre-install-iphone-app...


Did they in that instance say beforehand in clear and unambiguous terms that they would never do anything like that?


>You might not believe them but they are pretty clear on that point: “we will not accede to any government’s request”

What makes this time different from the other times they acceded to government requests?

Reminds me of the old Apple ad: https://www.youtube.com/watch?v=PjmVN7mAMwc


Did they, in those instances, make an explicit statement to the contrary?


There are about 69 countries where it is illegal to be gay. What if those countries want to know whether you have a rainbow flag or a photo of two men kissing on your phone?


Offering that sort of service to China is the real reason this was developed.


There are apps, like WhatsApp, that allow you to save photos you receive to your camera roll instantly.

If somebody, or another compromised device, sends a large collection of CSAM to your device, it will be uploaded to iCloud, probably before you get a chance to remove it -- the equivalent of "swatting".

Besides the apps that you give permission to store photos in your Photos library, what about malware such as Pegasus, which we've seen again and again?

I wonder if we'll start hearing a year from now about journalists, political dissidents, or even candidates running for office going to jail for being in possession of CSAM. It would be much easier to take out your opponents when you know Apple will report it for you.

I guess all this does is disincentivize anyone who cares about their privacy from using iCloud Photos, which is sadly ironic since privacy is what Apple was going for.


> There are apps, like WhatsApp, that allow you to save photos you receive to your camera roll instantly.

Major cloud providers already scan photos for CSAM, so this feature does not change anything in that regard. If you use cloud photo storage, you can be targeted with this attack no matter whether you use an iPhone, an Android phone, or anything else, really.
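For reference, server-side scanning of this kind is conceptually simple: fingerprint each uploaded photo and compare it against a database of fingerprints of known images. A rough, hypothetical sketch, where perceptual_hash is a placeholder stand-in (real providers' hash functions and thresholds, e.g. PhotoDNA's, aren't public):

    import hashlib

    def perceptual_hash(image_bytes):
        # Placeholder only: a real system would use a perceptual hash that is
        # robust to resizing/re-encoding; this stand-in is NOT one.
        return int.from_bytes(hashlib.sha256(image_bytes).digest()[:8], "big")

    def hamming(a, b):
        # Number of differing bits between two integer fingerprints.
        return bin(a ^ b).count("1")

    def scan_upload(image_bytes, known_hashes, max_distance=4):
        # Flag the upload if its fingerprint is close to any known fingerprint.
        fingerprint = perceptual_hash(image_bytes)
        return any(hamming(fingerprint, h) <= max_distance for h in known_hashes)

The point being: the attack surface ("get flagged material into someone's synced library") exists wherever the scanning happens, client or server.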


Apple doesn’t.

Which is why they focused their research on differential privacy.


If you're a nation-state or a group that wants to sow discord and distrust in another nation's citizens and their neighbors and institutions, what better way to do so by framing people, some significant and some insignificant, with CSAM?


Is there some mechanism keeping that from being done through SMS or something equally direct?


Apple is not planning to scan SMS messages with this (that’s a completely separate feature, not related to CSAM fingerprints, and never involves law enforcement), and SMS photos are not automatically added to your iCloud Photo Library.


Feels like they messed up the comms on this in a quite un-Apple-like way.

My understanding (at a high level) is that their system is designed to improve user privacy: it means they don't need to be able to decrypt photos on iCloud (which is, if I understand correctly, how other cloud providers do this scanning, which they are required to do by law?), but can do it on the device instead. Without going into the upsides and downsides of either approach, I'm surprised they didn't manage to communicate more clearly in the initial messaging that this is a "privacy" feature and why they are taking this approach, and are instead left dealing with some quite negative press.


For me, my problem with it is nothing more than selfish. There’s literally no upside for me. Best case, my battery dies a little faster or my power bill goes up a small amount that I will never notice. Worst case, I get a SWAT team in my face because someone accidentally clicks “yes” instead of “no” when reviewing a false-positive.

What’s the upside to me, for this tech? There’s less of this content in the world? I don’t see it anyway, so for me, it doesn’t do anything except become a potential (low) risk liability.

I don’t see why I’m paying their power bill to do something they’re legally obligated to do.


Maybe if you somehow gain some power over this trillion-dollar company, it will start to care about your opinion, or the fact that you've been disadvantaged.


That assumption of a certain fairness, or give-and-take outcome, is only applicable in a democratic, free society.


> (which is, if I understand correctly, how other cloud providers do this scanning, which they are required to do by law?)

I don't think that is required by law - at least not in EU or US.

There was already a lot of online outrage in July when the EU passed a regulation that allowed such scanning for the next three years - I imagine requiring it would be much worse.


Ah, I didn't realise that; I assumed it was a legal requirement.


Apple is good at normal PR and terrible at crisis PR. This is crisis PR, but unlike with Bendgate and Antennagate, they should have seen this one coming, since they caused it.


Haha yeah, that is a really good point actually.


Indeed. The crucial thing is: if you disable iCloud Photos upload, the scanning feature is disabled.


For now.


Expectation: political rivals and enemies of powerful people will be taken out because c*ild pornography will be found on their phones. Pegasus can already monitor and exfiltrate every ounce of data right now; it won't be that hard to insert compromising images on an infected device.


This argument does not work however, since it also applies to "old-school" cloud CSAM detection that everyone is doing anyway. The big problem with Apple's approach instead is that they essentially include a configurable (and very fancy) spyware engine on your phone that could be easily extended to do more in the future.


> Why is Apple doing this now?

I find the answer to this question unconvincing.

If we think very selfishly from the company's perspective - Apple already had one of the most secure, private and trusted platforms. And they must have anticipated the backlash against the new feature. So I still don't get why a company like Apple would consider the marginal benefit from this to be worth the cost.


The benefit must be larger than publicly available information would hint.

Something like the PRISM program: https://www.theguardian.com/world/2013/aug/23/nsa-prism-cost...


Well, one very obvious benefit is that they gain leverage in discussions about their alleged position as a monopolist. They could argue that they have implemented a sophisticated and private CSAM detection engine, which is only possible if they continue to control the entire system.


>CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations.

How is Apple validating the datasets for non-US child safety organisations?


They can't. There is no possible future in which Apple would ever intentionally expose itself to CSAM in order to verify it. It's a complete catch-22.


Something I'd missed before: "By design, this feature only applies to photos that the user chooses to upload to iCloud Photos"

This is not about what people have on their own phones. This is about what people are uploading to iCloud, because Apple does not want CSAM on their servers!


They should have thought about that before going into the cloud business. I mean, the amount of nastiness you'll find online is beyond belief. Just ask any person at FB who is tasked with reviewing posts.


> Something I'd missed before

If you go back and read the earlier mega-threads, there were many people pointing this out and downvoted to oblivion. Hysteria is a helluva drug.

For anyone concerned about the hypothetical framing attack via WhatsApp auto-saving, you can selectively control which apps have access to your photo library (and thus iCloud) in settings.


This is met with disbelief because of one simple fact: There is nothing that limits Apple from scanning other files, other than “we pinkie swear we won’t.”

Corporations have a terrible record about breaking non-revenue-impacting pinkie swears. Breaking this one - especially when “what about the children” is involved - would have no meaningful impact on their revenue or share price, it could even go up.


If Apple didn't want CSAM on their servers, they'd implement it server-side on a worldwide basis, and not on the client-side for US clients.


They explicitly don't want to implement it server-side due to the privacy concerns of decrypting every image in iCloud Photos. Doing it on-device limits Apple's possession of decrypted photos to those likely to be CSAM.


Apple can already decrypt your photos in iCloud - that’s how you can log in at iCloud.com and see them.


Of course they can; the point is that they don't want to if they don't need to.


Apple obviously does not want CSAM on their servers.

As the document states, they do not want to scan all images server-side for privacy reasons. They just want to flag the positives while keeping privacy standards as high as possible for everyone else.


Implementing it server-side would require the content to be on their servers to scan; doing it client-side reduces that risk (or, as they claim, they don't want to do that for privacy reasons, or most likely a mix of both).


Everyone has already said everything that's wrong with it. Nevertheless, Apple can sugarcoat it as much as they like: there is no technical control (neither an actual nor a possible one) making this exclusively about targeting CSAM.


It's frustrating (though not at all surprising) to see Apple continue to be so tone-deaf. They clearly think "If only we could make people understand how it works, they wouldn't be so upset, in fact they'd thank us."

This is not the case - we do understand how it works, and we think it's a bad idea.


The question and answer I'm missing is:

Will Apple notify a user once an image has been (/erroneously) flagged and will be inspected by Apple employees?


An image being erroneously flagged will not have any effect. Until the threshold is reached, Apple is not able to know if or how many images have matched.

And even then, the voucher doesn’t include the key to decrypt the photo itself.

Apple has documentation that explains all of this.


But surely someone will at some point inspect the images…

So my question remains the same: do users get a notification once they are (/erroneously) flagged?


Yes. If you read Apple’s documentation about this, the user is notified immediately when their account is disabled, and they have the opportunity to appeal the decision.


I doubt it, since those photos will likely be close to porn, i.e. of naked people, and Apple doesn't want people to know that the naked pictures they took of their 22-year-old spouse are now floating around some random office in Bangladesh.


That’s not how this system works - a random nude photo is no more likely to match a hash than a photo of my pool brush.
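To put rough numbers on that: if the fingerprint behaves like a random bit string for unrelated images, the chance of an accidental near-match is astronomically small. A back-of-the-envelope calculation, where the 96-bit length, database size, and distance threshold are all my own assumptions rather than Apple's published parameters:

    from math import comb

    BITS = 96          # assumed fingerprint length
    N = 1_000_000      # assumed number of known fingerprints in the database
    d = 4              # assumed "close enough" threshold in bits

    near_ball = sum(comb(BITS, k) for k in range(d + 1))  # fingerprints within distance d of one target
    p_single = near_ball / 2**BITS                        # chance of landing near one particular target
    p_any = 1 - (1 - p_single) ** N                       # chance of landing near any of the N targets
    print(f"{p_any:.1e}")                                 # on the order of 1e-17 per unrelated photo

And that is per photo against the entire database, before the multi-match threshold even comes into play.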



