Bugs in Our Pockets? (lightbluetouchpaper.org)
85 points by etiam on Oct 15, 2021 | 21 comments



This is a blog post on a paper [1] by Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Vanessa Teague, and Carmela Troncoso.

To me, the gist is: There is no technical way to make client-side scanning psychologically trustworthy. There are many plausible policy threats.

So this boils down to: the decision lies with governments (some of which are bad) and users (who don't give a collective fuck), so governments will begin to decide the fate of these systems.

[1] https://arxiv.org/abs/2110.07450


> If device vendors are compelled to install remote surveillance, the demands will start to roll in. Who could possibly be so cold-hearted as to argue against the system being extended to search for missing children? Then President Xi will want to know who has photos of the Dalai Lama, or of men standing in front of tanks; and copyright lawyers will get court orders blocking whatever they claim infringes their clients’ rights. Our phones, which have grown into extensions of our intimate private space, will be ours no more; they will be private no more; and we will all be less secure.


> Then President Xi will want to know who has photos of the Dalai Lama, or of men standing in front of tanks; and copyright lawyers will get court orders blocking whatever they claim infringes their clients’ rights.

According to reporters in Xinjiang, this is already the case. Ethnic minorities in the region are subject to phone scans by the police that search for "dissident" material, like the Quran. Not only that, they have to run government apps on their phone that do the same thing, in addition to constantly spying on them.

Here's an article about the app that tourists to Xinjiang must install[1]:

> The app gathers personal data from phones, including text messages and contacts. It also checks whether devices are carrying pictures, videos, documents and audio files that match any of more than 73,000 items included on a list stored within the app’s code.

> Those items include Islamic State publications, recordings of jihadi anthems and images of executions. But they also include material without any connection to Islamic terrorism, an indication of China’s heavy-handed approach to stopping extremist violence. There are scanned pages from an Arabic dictionary, recorded recitations of Quran verses, a photo of the Dalai Lama and even a song by a Japanese band of the earsplitting heavy-metal style known as grindcore.

Xi's regime is quite literally looking for people who have pictures of the Dalai Lama.

[1] https://www.nytimes.com/2019/07/02/technology/china-xinjiang...


To play devil's advocate: they're just legitimately looking for terrorists.

The western countries also "just look for terrorists" when it comes to mass surveillance…


And why should we actually care? Why should we give up a right that is inherently valuable, only to receive further detriment to our life, liberty, and happiness?

Logically, empirically, historically, “terrorism” is a term bandied about by those who understand that fear is a weapon: one that can compel people to sacrifice a valuable, scarce resource in order to spare themselves an infinitesimally small possibility of death in a manner that horrifies them.


See also "The phone disaster":

... It took some time for 3G networks and social media to arrive, but when they did, in 2010, Uyghurs embraced the transformation to their world brought about by the internet. This process of becoming digital persons was dramatically accelerated by a new smartphone-specific app, built by the Chinese company Tencent, called WeChat. Since Facebook and Twitter had been blocked across the entire country in 2009, Uyghur internet users focused their online communication on that one app. In the space of only a couple of years, millions of Uyghurs had purchased a smart phone and were using the app every day to build networks of friends. They also discovered that using the voice memo function allowed them to have Uyghur conversations at least partially outside of the censorship capacities of Chinese state authorities....

https://restofworld.org/2021/china-smartphone-uyghurs/

(https://news.ycombinator.com/item?id=28848665)

Spoiler: It didn't last.


Problem is, this simply assumes that governments can't do that already. Does anyone think that if President Xi wants to know who has photos of the Dalai Lama, Apple is going to say no to any demand he makes? Apple will be more than happy to oblige; they will censor, remotely disable apps, and push silent updates with location tracking and the like. I am quite sure this has already happened.

This is actually why I dislike it when "libertarian" organizations are all too happy to let corporations have practically unlimited surveillance power, but when the democratically-elected government tries to do it on a smaller, more specific scale? Then no, that shall not pass! Government bad!

Well, I have some news. If your democratically-elected government is really ever going to turn Stasi/Fascist, Apple is not going to stand up to it, and they are going to be all too happy to share your data with them. Your government may right now still have some decency left, which means that they send a legal request to Apple instead of a secret letter from a three-letter-agency, but that is not going to be the case for a more evil one.

The solution surely involves preventing these corporations from having surveillance power, _in part through government regulation_, since citizens apparently do not understand the problem. A cultural change is also required, so that pervasive surveillance by corporations stops being seen as normal. Then, when the government wants to spy, it will have to roll its own good old-fashioned spy tech rather than just asking some corporation for the data.

This is yet another area where the market, "vote with your wallet" or "self-regulation" by the corporations themselves is not going to cut it. They need to be forced to avoid having access to this much data about their users.


I do agree that (seemingly paradoxically) privacy regulation is the key to preventing overbearing state and corporate spying.

The paper actually has a few nice things to say on this. Among other passages:

> Economics cannot be ignored. One way that democratic societies protect their citizens against the ever-present danger of government intrusion is by making search expensive. In the US, there are several mechanisms that do this, including the onerous process of applying for a wiretap warrant (which for criminal cases must be essentially a “last resort” investigative tool) and imposition of requirements such as “minimization” (law enforcement not listening or taping if the communication does not pertain to criminal activity). These raise the cost of wiretapping.

> By contrast, a general CSS system makes all material cheaply accessible to government agents. It eliminates the requirement of physical access to the devices. It can be configured to scan any file on every device.

I think the economics-based approach is very valuable here, especially because it is comparable across forms of government.
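The economics point is easy to see in code. As a rough illustration (my own sketch, not the paper's design; I assume exact SHA-256 matching for simplicity, whereas real CSS proposals such as Apple's use perceptual hashes so that re-encoded or lightly edited images still match), a generic client-side scanner is essentially a loop over the filesystem plus a database lookup:

```python
import hashlib
from pathlib import Path

# Hypothetical database of flagged content, stored as SHA-256 digests.
# For demo purposes this contains the digest of the bytes b"foo".
FLAGGED_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_device(root: Path) -> list[Path]:
    """Return every file under `root` whose digest is in the flagged set."""
    matches = []
    for path in root.rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in FLAGGED_HASHES:
                matches.append(path)
    return matches
```

The "configuration" the paper warns about is just the contents of `FLAGGED_HASHES`: swapping a CSAM list for a list of banned political images requires no change to the code shipped on the device, only a database update.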


It's not about whether governments can do it, it's how easily it's accomplished.

If it's as simple as the stroke of a pen, it can and will be easily and frequently abused. If it's technically possible but requires favours from many individuals, unusual behaviour, and lots of manpower, it'll rarely be considered and barely ever accomplished.

A turnkey service, provided by an arm's-length organization like a corporation, is an almost irresistible lure. Good value prop for Apple, too; this'll generate favours for them, at least at first, when governments want the system used for this and that.


> The solution for sure involves preventing these corporations from having surveillance power, in part by using government regulation

Well, certainly it is necessary to prevent those corporations from having surveillance power, because you're right that it's irresistible for governments to abuse it; but I suggest you find a strategy that doesn't rely on your opponent surrendering while he is in a superior strategic position.

(While I disagree with your comment, I think it adds to the discussion, so I deplore the vandals who are downvoting it as if it were spam.)

Historically, the motto has been, "Cypherpunks write code." Today, that's not enough; we need hardware.


> "We did not set out to praise Apple’s proposal, but we ended up concluding that it was probably about the best that could be done. Even so, it did not come close to providing a system that a rational person might consider trustworthy."

I think there needs to be more pressure on the EU to stop this law, especially considering that, according to the linked article, they are already talking about expanding it to other material. Terrorism is supposedly already next on the roadmap. After that? Fraud? Tax evasion? Pirated media? The databases for the latter already exist, since social media platforms are already required to block such content.

I think they should just accept that they can't always intercept everything and use targeted police work to make up for it.

It also won't stop the worst offenders: those making the actual content. At that point the material is not yet in any detection database, because it's new; by the time it is, the abuse is (sadly) already long done. So it's not a particularly effective way of preventing abuse either. It will help catch the consumers of this stuff, but as the war on drugs has shown, that's not an effective method, as it only raises the price and thus the profits for the criminals.

What would work for that would be some machine learning algorithm that evaluates any content while you're shooting it on your phone and phones the police so they can come running. And for it to be forbidden to ever have your phone offline in order to circumvent this. That would probably be effective. It would also eliminate any semblance of privacy.


> What would work for that would be some machine learning algorithm that evaluates any content while you're shooting it on your phone and phones the police so they can come running. And for it to be forbidden to ever have your phone offline in order to circumvent this.

Why stop there?

Just implant that AI into people's brains. (We will soon have the tech for that.)

Also, you wouldn't need to call the cops. Just "deactivate" the offending individual via the brain chip whenever a law is broken. I guess this would create a crime-free paradise on earth in a blink.

Who could argue against a paradise? Don't we all want this?

/s

PS: Sci-fi movie references intended.


"We did not set out to praise Apple’s proposal, but we ended up concluding that it was probably about the best that could be done. Even so, it did not come close to providing a system that a rational person might consider trustworthy."

The best that could be done would be to not do it at all.


It's crucial to distinguish between "client-side scanning" that reports to authorities and "on-device context"[1] that might use similar client-side/on-device technology to identify a piece of content—but does not either censor or report to authorities.

Different use case, same underlying technology. The former is often very problematic, while IMHO the latter is almost universally helpful (where it is applicable; e.g. not for CSAM, but yes for misinfo).

[1] https://aviv.medium.com/client-side-context-a-defense-agains...


Fwiw the original paper was submitted 11hrs earlier:

https://news.ycombinator.com/item?id=28873435


It is very overbearing to have this on what is essentially a cloud storage service. Especially when they also make it a lot more difficult for third-party cloud storage services to function effectively (being able to run in the background automatically when charging, etc).

Now, if they wanted to implement this solely for "shared albums", that's a different conversation.


In "Ma Bell" times, they explicitly owned the instruments by which one connected to the phone network, and leased them too you for a monthly fee.

We're coming back to that, but now we don't know who "owns" which rights to which use of our device and there's little hope of, say, legal redress where no "one company" is responsible.


We truly don't own our devices.


Just depends on what devices you buy. The options where you do own them are getting better and better.

https://frame.work/

https://puri.sm/

https://pine64.com/

https://system76.com/

etc.


The offered options when it comes to phones seem legit. But this framework thing does not, imho.

It's the exact same proprietary tech you get everywhere else. (You actually can't have an open system when running mainstream hardware).

And when it comes to the design of that HW, it looks more or less exactly like some of the more recent Dell Inspiron laptops. If I didn't know better, I would guess it's just based on some run-of-the-mill Chinese design.

These proprietary expansion cards also don't look useful, for the same reasons the "brick phones" didn't work out.

All in all it looks fishy. Attracting easy money with some hype topic but not delivering on that for real.

I would be happy to see Raptor Computing Systems¹ build some laptops. Sadly there aren't any proper CPUs for that purpose… But theirs is currently the only desktop-grade HW you can really trust. Anything else doesn't run without custom closed-source firmware in the CPUs (Intel, AMD, and ARM are all guilty; RISC-V doesn't have the needed level of performance yet).

¹ https://www.raptorcs.com/


Yeah, been eyeing Framework laptops for a while.



