National Crime Agency response to Meta's rollout of end-to-end-encryption (nationalcrimeagency.gov.uk)
47 points by webmaven on Dec 8, 2023 | 91 comments


> “It is hugely disappointing that Meta is choosing to roll out end-to-end encryption on Facebook Messenger. They have an important responsibility to keep children safe on their platform and sadly, this will no longer be possible."

I think you mean... “It is hugely disappointing that Meta is stopping us from spying on their users."


I don't trust Meta or the UK government and its agencies, so it's quite possible for both sides to be technically correct while also utterly misleading.


The whole "think of the children" excuse wears after a while when you ponder - what about the adults, weaker encryption leaves them vulnerable to hackers/scammers..etc. But the whole aspect of weaker encryption is a risk to children has to be one of the most redactor-ad-absurdum things of government that has over-run its credibility.


> what about the adults, weaker encryption leaves them vulnerable to hackers/scammers

... And children!

Even worse, having the government monitor your communications _destroys your life_.[1]

These proposals by governments are effectively saying, "We're going to protect your children using this technique that completely fucks them up."

Child abuse is one of the most horrific things there is. It's second only to even more horrific things, like letting the government monitor everything all the children say.

[1] "Anything you say or do can and will be used against you in a court of law", as US residents should be well aware.


People also seem to be completely unaware of first-degree price discrimination. One's communication logs can be used against literally every human being on earth, whether or not one has "anything to hide".

There is nobody who is "safe" from a company jacking up the prices of every product you see because you mentioned getting your paycheck in a message. Even if you don't say it directly, it can be implied from your general behavior using the same messages.


Not even a day until "THINK OF THE CHILDREN"... probably not even a second. One could just write an auto-answer script by now: parse news headlines, and the moment something with end-to-end encryption pops up, respond with "Agency foo is very sad that <whoever> chose this step. <whoever> has an important responsibility to keep children safe and sadly, this will no longer be possible. We are so disappointed. So, so disappointed." ... and so on.


Nothing is black and white.

E2E encryption is a big headache for legitimate law enforcement. We've seen it with Encrochat.


Encrochat wasn't E2E, which is why the Dutch were able to get all up in their servers.


Exactly.

It was a big issue but they still managed to get in, with difficulty. Now, E2E encryption is much harder to crack, if possible at all. You probably need to compromise each specific phone/computer/app you're interested in.


Encrochat is a funny example - WhatsApp and Signal were widely available at the time and were entirely secure.

Encro was a home-brewed system that pretended to be more than it was, while the target market would have been much better off with an up-to-date stock Android or iPhone handset plus Signal and basic security awareness. The attack surface of a stock handset is certainly far smaller than that of whatever cobbled-together Android distro they were using, and if the messages are on the handset then they are always vulnerable to recovery.


> E2E encryption is a big headache for legitimate law enforcement.

So is the requirement to have a warrant before barging into someone's house at 0300 with guns and dogs.

So is the requirement to allow the accused to confer with an attorney.

So is the prohibition on using torture to extract confessions.

And?

The world doesn't exist for the convenience of "legitimate law enforcement", or shouldn't, anyway.


Your comment is obviously disingenuous.

You are comparing apples and oranges for dramatic purposes.

Of course police need a warrant e.g. to wiretap but that becomes moot if the wiretap is impossible for technical reasons. That is the whole issue here.


> Your comment is obviously disingenuous.

It is "obviously" nothing of the sort.


To elaborate:

You're allowed to have curtains, no matter how inconvenient that makes it for police who want to look through your windows.


Good.


These things must go through a judge. Full stop. Indiscriminate spying is not the way to go.


A friend of mine worked at Meta and said they already had a full-scale set of systems to mass-export data to governments based on certain indicators (e.g. keywords) in response to subpoenas. Perhaps that's obvious, but the mechanism for proper channels is there; I doubt end-to-end encryption blocks that.


> I doubt end-to-end encryption blocks that.

Isn't it supposed to?


I never saw this substantiated, but I heard that they had on-device systems that they used in emergencies for stuff that was particularly bad + easy to scan for. The time I heard of it being deployed was to prevent the spread of the Christchurch shooting video.


What's to stop them from having hooks in their app that can bundle up all the decrypted messages, re-encrypt them, and phone home? Certainly it wouldn't be default behavior, but it's possible and would allow them to answer warrants.


It's a good point. But someone would eventually see it in the dev tools and Meta's credibility would be shot forever.

It's an open question if that impacts their bottom line at all.


Agreed. I think the impact on their bottom line comes down to how it would affect their user base. My hunch is that, given the immensity of the user base, it wouldn't cause a significant enough exodus for Meta to care either way. But that's speculation; I'm not sure it can be backed up with evidence from past events.


A keyword search is simple enough it can be done client side, before any encryption (and/or after decryption).
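To make that concrete, here is a minimal sketch of what a client-side scan before encryption could look like. This is purely hypothetical; the watchlist, the report hook, and the function names are invented for illustration and are not based on anything Meta has published.

    # Hypothetical client-side scan: runs on the plaintext before the
    # message is encrypted and sent. Watchlist and report hook are
    # invented for illustration only.
    WATCHLIST = {"example_keyword_1", "example_keyword_2"}

    def matches_watchlist(plaintext: str) -> bool:
        """Return True if any word in the plaintext is on the watchlist."""
        words = {w.lower().strip(".,!?") for w in plaintext.split()}
        return not WATCHLIST.isdisjoint(words)

    def send_message(plaintext: str, encrypt, transport, report):
        # The E2E path is untouched; the scan happens on the plaintext.
        if matches_watchlist(plaintext):
            report(plaintext)  # hypothetical "phone home" hook
        transport(encrypt(plaintext))

The message itself still travels end-to-end encrypted; the only thing that changes is what the client does with the plaintext before encrypting.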


Oh, good point.


WhatsApp's wording is usually very specific about E2E applying when you send a message from you to your contact. It says nothing about, e.g., all of this also being dumped on FB servers in plaintext.

This wouldn't be an outright lie - it may well be e2ee but just also shared around.


That would be a lie from the perspective of a court since it very clearly does not fit their description of the service. In cases like that, courts tend to give leeway to the argument that they promised what a reasonable person would conclude from the product advertisement.

What would be more interesting would be if they had a way to signal their app to do a search locally and report a match. That would still be E2E but might be enough that they would be able to defend a lawsuit.


> That would be a lie from the perspective of a court since it very clearly does not fit their description of the service.

I don't know. They promise a particular aspect of privacy and presumably deliver on it. They don't promise absolute privacy - that's my whole point.

I think given the exact wording of what they promise, I can't reasonably expect that my conversations are private only between me and the recipient.


This is what they said and it very clearly excludes the scenario in question:

> The extra layer of security provided by end-to-end encryption means that the content of your messages and calls with friends and family are protected from the moment they leave your device to the moment they reach the receiver’s device. This means that nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us.


With what little FANG legalese I know, this IMHO covers all ranges of nasty behaviour.

But of course we won't know either way until it's tried in court.


> are protected from the moment they leave your device

Eh plenty of wiggle room here


Not if you don't chop the quote like a creationist. The next sentence rules out any sort of creative games like “we send it to the receiver which decrypts it and bounces a copy back”:

> This means that nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us.

If they were playing games and got sued, I'd be quite surprised if they didn't get laughed out of court for claiming that “nobody” actually meant “the FBI”.


Any disclosure of sensitive data from a company to a government or to a private party should require a judge's permission. Otherwise it is a crime. These things must be supervised by a judge.

Otherwise innocent people, dissidents or political rivals could be put in danger.

Not a good thing.


Surely if it is actually end-to-end, there's no data to send? I just wouldn't trust Meta to do that.


There are different definitions of “end” in “end-to-end”:

* One being that you communicate from your device to the servers securely; in that case, you can do keyword matching. If you try anything like that, a few engineers will insist Meta clarifies that this is not the intended meaning of the word; Alec Muffett will probably be the loudest.

* The other, device to device, would indeed prevent Meta from doing data collection, unless they apply the keyword detection on your device directly. But that also means the list of keywords would be accessible to any user (unless you use a Bloom filter, which makes the words harder to guess; a rough sketch follows below), and that’s not something the cops are fond of. You’d also have to let your own device send the message “Phone 1234 used the banned word 5678,” which criminals would learn to block.

So yeah, in principle, end-to-end would make it hard to do well. I‘d be really surprised if Meta got away with something that egregious, though: there are still people internally who know that government power can be abused and would not let people lie about how safe the tool is.
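On the Bloom-filter aside above: a rough sketch of how an on-device check could work without shipping the plaintext keyword list. The sizes, the SHA-256-based hashing, and the function names are assumptions made up for illustration, not a description of any real deployment.

    import hashlib

    # Hypothetical on-device filter: the app ships only a bit array, not
    # the keyword list, so inspecting the binary reveals set bits but not
    # the words that produced them (short of brute-force guessing).
    M_BITS = 1 << 20   # filter size in bits, arbitrary for this sketch
    K_HASHES = 5       # number of hash functions, arbitrary

    def _positions(word: str):
        for i in range(K_HASHES):
            digest = hashlib.sha256(f"{i}:{word}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % M_BITS

    def build_filter(keywords):
        bits = bytearray(M_BITS // 8)
        for word in keywords:
            for pos in _positions(word.lower()):
                bits[pos // 8] |= 1 << (pos % 8)
        return bits

    def maybe_flagged(word: str, bits) -> bool:
        # Bloom filters have false positives, so a match only means
        # the word *might* be on the list.
        return all(bits[p // 8] & (1 << (p % 8)) for p in _positions(word.lower()))

The false-positive rate is the usual trade-off: it hides the list better, but innocent words can occasionally trigger a match.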


I don't for a second believe the part about search warrants.

Judges often rubber stamp search warrants on the flimsiest of circumstances. There is no reason that won't happen here, especially since judges are humans and thus when you tell them you suspect someone of having child porn, most (but not all) will give you more leeway.


> Judges often rubber stamp search warrants on the flimsiest of circumstances.

I have personal experience with this. A warrant to search my apt was approved by a judge because I was suspected of credit card fraud. Their evidence? Packaging for a new external hard drive found in a recycle bin shared by 4 apts, that's it. No evidence of who put it there or how it was acquired, just a hard drive package.


I'm curious - if the memory isn't too painful - did the search leave much of a mess, and was there any "apology process" when, I assume, it turned out you weren't a criminal? Also, what's the locale?

I sometimes worry about being a "victim" of a legitimate legal process, in which I did nothing wrong, and yet have to endure legally-mandated procedures.


They took most of my computer hardware and did leave a bit of a mess but it wasn't too bad.

I never said I wasn't guilty, I was. But the evidence they cited to get the warrant was shockingly slim. Had I not pled guilty I would have tried challenging it.


Perhaps the reasoning for obtaining any warrants should be admissible in court (if not already) on behalf of the defendant?


That reasoning is usually subject to PII - Public Interest Immunity - exemptions, especially where it involves evidence obtained from sensitive techniques (wiretap, CHIS, other technical measures).


> This is equivalent to 1.3%-1.6% of the UK adult population.

Did they really just label 1.3-1.6% of the population as "perverts"?

I will go ahead and assume that this tilts disproportionately male over female, so... 3% of the adult male population in the UK are allegedly perverts? How about addressing THAT instead of carpet-bombing personal freedoms?


How do you address that, if that’s just how they are?



> As outlined in our 2023 National Strategic Assessment, we estimate that there are between 680,000-830,000 adults in the UK that pose some degree of sexual risk to children. This is equivalent to 1.3%-1.6% of the UK adult population

This can’t be accurate, can it? The article says they arrest 800 people a month, whereas the US only convicts 1,600 people a year. What causes these discrepancies?

https://www.ussc.gov/sites/default/files/pdf/research-and-pu...


It's a hot topic and I rarely want the Internet drama associated, but the UK LE already failed numerous times at keeping children safe, and they're probably trying to pick up the slack[0]. Searching "uk grooming gang scandal" will add some context. Hell, even "Prince Andrew scandal" would add a lot of context.

[0]https://www.statista.com/statistics/303514/child-cruelty-abu... (can't find a graph like that on sexual abuse cases but there's been constant growth too)


Arrests don’t always lead to a conviction (eg not enough evidence or the suspect might genuinely be innocent).


It's close to what I'd have guessed:

• It's a bit higher than the percentage indicating they have that interest in responses to sexual desire surveys, but such surveys are often underestimates.

• It's much lower than the percentage of women who report being victimised when they were minors, but one criminal can have many victims.

• It's really difficult to prove the offence in a court.

• A common argument for harsh punishments is that the legal system is supposed to have punishments severe enough to act as a deterrent (as in not just rehabilitation, compensating victims, or keeping dangerous people out of society). Quite a lot of other crimes, say possession of heroin which IIRC carries a 7 year prison sentence even without intent to supply, also have far more offenders than the entire UK prison population (3x in the case of heroin).


Looking at the referenced source which links to their methodology, there's about 75k child sex offenders. 9/10 child sex offenders that are caught aren't on the register. So perhaps the estimate is actually below the real value.

https://nationalcrimeagency.gov.uk/nsa-child-sexual-abuse


It's likely to be a massive under-estimate.

>whereas the US only convicts 1,600 people a year. What causes these discrepancies?

Difficulties detecting the offending and the enthusiasm for prosecution.


> if Meta continues to roll out end-to-end encryption as planned, it would result in the loss of the vast majority of reports (92% from Facebook …)

Wait: how does that prevent reports? The recipient presumably can see the message and share it. Are reports coming from messages that are read by a third party? I’m assuming Meta itself, as that’s the only party who can and would do it at scale.

That’s not the NCA telling what Meta should do; that sounds more like an internal Meta team that deals with that and has close contact with the NCA, unhappy they can’t use keyword search to flag bad things.


It is always about the children when it comes to surveillance of online comms.


This is the government that ignored reports of child sexual abuse for decades resulting in the sexual abuse of 1400-2000 children.

https://en.m.wikipedia.org/wiki/Rotherham_child_sexual_explo...

End to end encryption doesn’t actually harm children and may in fact help them by allowing whistleblowers and others to safely report government misdeeds.


From the Facebook Engineering post about secure storage on the messaging server, my take is that there is no need for a convoluted server-side protocol in a true end-to-end encrypted (E2EE) setup.

https://engineering.fb.com/wp-content/uploads/2023/12/TheLab...


I see no issue with this, how is the NCA going to try and ban maths?

The cat is already out of the bag with E2EE, there is no point trying to stop this.

NCA should focus on getting platforms to raise the age limit of their platforms and (dare I say) have some form of ID and consent system for children to use social parts of the internet / web.


There is a huge difference between the blanket use of E2EE in one of the most popular messengers on earth and a couple of nerds using "maths" on top of whatever protocol to send PGP-encrypted messages around. Many, if not most, criminals are neither smart nor tech-savvy. They're clearly pissed because Meta's step is actually impactful.

I appreciate Meta's decision but let's not pretend that there isn't a trade-off.


Please not ID. I was with you right up until that point.


Turns out they have a FB page and just posted something on the topic.

https://www.facebook.com/NCA/


I don’t have a Facebook account. Are you able to share what they responded with?


It’s a summary of the public statement:

> Today, Meta has chosen to rollout end-to-end-encryption on Facebook Messenger, which means they will no longer be able to keep children safe on their platform.

> Today, our role in protecting children from sexual abuse just got harder.

> For years, Meta has supported law enforcement by reporting instances of abuse to National Center for Missing & Exploited Children. The content in these reports enable us to progress investigations.

> Currently, the NCA and UK policing are arresting around 800 offenders and safeguarding 1,200 children every month.

> Unfortunately, this important work is now at risk. Meta’s design choices mean they will no longer be able to see offending on their platform.

> As the rollout continues, we estimate that UK law enforcement will lose 92% of the reports we currently get from Facebook.

> The onus should not be entirely on children to report abuse.

> Along with our partners in the UK and overseas, we will continue to do everything in our power to safeguard children and identify offenders.

> Read our full statement https://ow.ly/6fFt50Qgra3


I can't believe I'd ever say this one day, but, assuming they've actually done this properly (which remains to be seen): very well done, Facebook.


What about choosing not to encrypt chats involving children? Good compromise?


Probably the most reasonable take, but it’s unclear how you would tell who is a child and who is not.

I read your comment as a provocation to get the NCA to reveal that it’s not children they care about, which… I think some of them genuinely do, but encryption isn’t making a material difference.

Note: detecting who was underage on Facebook used to be my job, so I'm happy to explain functional, scalable solutions and their limits, but not in public.

Edit: I think encryption isn’t material because you can have a bad-thing detector on the device if you suspect a problem. Naturally, there are debates about how to implement that, too.


But, how will we protect the children's need for privacy if just anyone can read their communication?! Think of the children!

The children argument has no bounds.


I can't believe these folks can say "we need to prevent everyone in society from communicating privately to protect children" with a straight face.

What must their employees think to themselves? Seems Stalinist.


The employees who think this are the ones who have to watch videos of babies and toddlers being raped, so that tends to influence their opinions. My partner was one of those employees and she is, consequently, very much against anything that makes it easier for this to go undetected.


Yeah, if we could guarantee a purely benevolent system that will only ever use the data to catch or prevent pedos without ever widening the scope, it seems like an easy choice. I don't doubt the more awful shit you see, the easier it is to convince yourself such a system exists.

But the recent abruptness with which states like Texas began targeting women for receiving healthcare, the second they were able to, should give anyone pause. Especially given that a large population wants to reelect an insurrectionist publicly planning a dictatorship.


That reveals a major complication of the process that your partner was a part of: if you suspect content to be CSAM, you *legally must not* look at it, open it, etc. It’s a way to prevent bad people from abusing those positions, but at the same time, no one (innocent) who was told to do that wants that law to be changed.

It makes building detection systems harder, though.

Your intuition points to a typical PR pattern around Meta: employees disagree, and they resolve their frustration in the press. So many big tech scandals are an opportunity to remember the “two wolves inside of you” story.


She was an investigator with the NCA. She was obliged to look at it in order to progress the prosecutions.


Opinions based on trauma are rarely logical.


That doesn't make the employees stalinist, does it.

The issue is that CSAM is not some sort of nebulous, vague idea. It is pervasive and endemic in online communities and if you have any tool that can facilitate the sharing of images and communication with children by adults then it will, inevitably, be used by paedophiles. This isn't a vague conjecture, it is the truth of the matter.

FB Messenger is specifically a problem because of the ready access it provides adults to children.


> That doesn't make the employees stalinist, does it.

Yeah, it kinda does.


No, it kinda doesn't.


Advocating for universal surveillance of people regardless of whether they've committed any actual crimes?

Yeah, it kinda does.


Against "anything"? Really?

What about, say, grabbing random people off the streets and administering electric shocks to them while asking them if they've ever abused children?

What about requiring 24/7/365 cameras in everyone's bedroom to make sure they're not raping children in there?

The point here is that there are limits.


The tyranny caused by a ban on private communication is not worth the net reduction in harm to children.


Clearly the people who deal with it disagree.


One problem is that an unknowable subset of the people who claim to be anti-harm-to-children are actually pro-tyranny.

https://en.wikipedia.org/wiki/Bootleggers_and_Baptists


Does she believe that preventing the sharing of these videos will reduce or stop the number of acts of real rape?


This sort of imagery is often created for profit - either for the kudos of being the creator of first-generation material or for actual profit in the case of a number of asian abuse rings.

So yes, demand reduction will reduce actual instances of rape and sexual abuse - while there is an argument that nonces will nonce regardless, the encouragement and egging on by others is a significant driver of offending.

There is also the matter that mere possession of these images is also a criminal offence.


It is quite amazing to see the UK gov's PR focus so exclusively on child protection, with not one mention of terrorism. Given their failures to protect children when the evidence was right in front of them (the Jimmy Savile situation), they clearly cannot expect this messaging to be taken remotely seriously, especially given the way we are all supposed to forget about Epstein.

These people really have not internalized what it means to have all your communication able to be intercepted by AI and they won’t until someone uses AI to fake a whole load of crime being committed by them for which they are then punished.


If only there were other measures to combat terrorism or child exploitation than mass surveillance...


I am sympathetic to your position. However, this sort of low-value statement is, I think, part of the problem. You ask a question that you imply you know the answer to, and then you don't give the answers.

What other measures do you actually propose?


You're right; I didn't offer a solution. The point of my comment was how one-sided and predictable the framing around these measures has become. As you can see by some of the other comments, I seem to not be the only one thinking that.

Let me ask you the apparent counter question: do you not see the value of private communication? If you use Facebook Messenger personally, don't you feel that you have gained something by your messages being encrypted? If not, is your reply simply "I ain't got anything to hide"?


As I said I'm sympathetic. My question was more about effective strategy in countering the push against encryption than it was challenging the utility and value of private communication.


Now I got you. I guess "keep children safe" here means two things.

1. Protecting them from grooming
2. Stopping CSAM from being spread via Facebook Messenger

It's evident that E2E encryption likely doesn't make much of a difference regarding the second point, given that many messengers have already implemented E2E encryption.

So, assuming the communication of the NCA was perfectly sincere, the loss here would be mostly the inability to prevent grooming properly. In this case, I think the question we should be asking ourselves is whether children need to be on these platforms at all. We're seeing so many issues linked to the phone use of children that regardless of protecting them from sexual abuse and exploitation, they should probably just be using phones less.


This is I think a much more effective line of communication. I would also suggest that investing in creating communities where parents are much more involved in their children's activities is also a really effective mitigation. Much of our current communities seem to be trending in a direction where children are isolated and alienated making them much richer targets. Creating communities where the environment is much less target rich for those who would prey on them would I think be a good step. Also making it a community where children are more likely to report the predator would go a long way toward identifying and removing the predator from those communities.


I assume they mean "...more targeted surveillance based on reasonable suspicion".


>What other measures do you actually propose?

If the US really wanted to stop terrorism it could achieve that just by pulling all its troops out of the middle east. Islamists aren't just coming to the US to blow themselves up for the fun of it, they're attacking the US because it has soldiers and military bases occupying most of the middle east.


I'm sure that's part of it. But a clear eyed reading of their own religious beliefs and proclamations shows that they are also quite intent on converting the entire world over to their religion or else slaughtering. I'm not sure your suggestion is enough. How would you encourage the hard line believers (also most likely to become terrorists) to abandon that belief system?

I'll note that part of their complaint is also that we export our culture via economic methods. Would you also withdraw from all interaction with areas they currently control, including humanitarian aid and immigration? This would prevent people of that cultural heritage who live here from sending aid to those societies legally. And this only barely touches on just how complicated this all is. The discourse around much of this is full of motivated thinking on both sides and a massive amount of oversimplification of the problem.


You left out the part about combatting child exploitation. How will pulling troops from the ME achieve that?


The NCA are so out of touch. Anyone technical briefing these people should be ashamed.



