If your hardware contains unflashable firmware with a back door granting direct memory access, then there is no encryption you can trust to perform on the device itself. The baseband processor in your mobile phone is one example: it is a binary blob, signed and locked down so that it cannot be modified to violate FCC regulations and disrupt networks.
Such firmware could be mandated for manufacturers without outlawing encryption directly, yet it would make encryption useless all the same.
> Such firmware could be mandated for manufacturers without outlawing encryption directly, yet it would make encryption useless all the same.
So the obvious first response to this is that it doesn't actually work. Have you seen the security of these vendors? Apple takes it more seriously than most because they're using it to maintain control over the App Store, and yet people still jailbreak iPhones. Mandate it by law on vendors who don't even want to do it and it will be completely broken in two days. And broken not just against the user. Let's not forget the situation with wifi routers -- "only the manufacturer can issue updates" quickly turns into "security updates are not available from anyone anymore", with the catastrophic nightmare that follows directly from that.
But let's pretend we're uninformed pedestrians who don't know that for a minute. How is this idea not even more outrageous than banning encryption to begin with?
It is politically more palatable and "sounds" less outrageous to the public than outlawing encryption, which means it is more likely to make it into law and gain support.
It is not possible (not computable) for any software to reliably recognize whether an encryption algorithm is being performed. Encryption can always be hidden.
Yes. And you can obfuscate it. You can even encode it in packet timing, which is very hard to detect. It's also very inefficient, on the order of 1%, but that's still enough bandwidth for text when riding on HD video. See True Names by Vernor Vinge.
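Roughly what that looks like, as a minimal Python sketch (the address, gap lengths, and per-packet bit rate here are made up for illustration, not taken from anywhere):

    import socket
    import time

    # Hypothetical timing channel: one bit per packet, carried by the gap
    # between otherwise boring UDP packets. The payload itself says nothing.
    SHORT_GAP = 0.02   # seconds between packets -> bit 0
    LONG_GAP  = 0.06   # seconds between packets -> bit 1

    def send_covert(message: bytes, addr=("203.0.113.5", 9999)):
        """Leak `message` one bit at a time through packet spacing."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        bits = "".join(f"{byte:08b}" for byte in message)
        for bit in bits:
            sock.sendto(b"keepalive", addr)   # looks like ordinary traffic
            time.sleep(LONG_GAP if bit == "1" else SHORT_GAP)
        sock.close()

    send_covert(b"hi")

At a handful of bits per second riding on a stream pushing megabits per second, the overhead is a tiny fraction of the traffic, yet that's plenty for short text messages.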
> Consider that, as it is now, encrypted data on the Internet traverses numerous untrusted devices
And those untrusted devices leak considerable amounts of that data! You know that it doesn't matter how good the encryption is if one of the computers in the chain is full of malware.
For your OTP example: I know what the ciphertext is. I slurped that. I don't know the key or the plaintext until you decrypt it, at which point I know both, because your computer is compromised and I have access to its memory.
My point is that compartmentalization allows secure communication through untrusted devices. It won't be convenient, but it's doable. There is no "computer". There are local networks of suitably isolated devices.
The device that decrypts can't send anything to the Internet, because it's behind receive-only optoisolators. The device that encrypts can't receive anything from the Internet, because it's behind send-only optoisolators. All intervening information processing may occur in your head. Or there may be other devices that are totally air-gapped, with all data transfer through single use flash storage. If you're using entirely untrusted devices, you move all crypto to such air-gapped devices.
It does help if these devices can be trusted, but that's not essential. You could, for example, do encryption manually with one-time pads. Or use that cipher done with a deck of playing cards (Schneier's Solitaire).
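For concreteness, a one-time pad is just XOR with a truly random key at least as long as the message, which is why it's feasible by hand or on a tiny air-gapped gadget. A minimal Python sketch (the key handling here is illustrative only; a real pad must be generated offline, exchanged out of band, used once, and destroyed):

    import secrets

    def otp_xor(data: bytes, pad: bytes) -> bytes:
        """XOR each byte with the corresponding pad byte; the same operation encrypts and decrypts."""
        assert len(pad) >= len(data), "pad must be at least as long as the message"
        return bytes(b ^ k for b, k in zip(data, pad))

    pad = secrets.token_bytes(32)                 # generated on the air-gapped side
    ciphertext = otp_xor(b"meet at dawn", pad)    # only this touches the network
    print(otp_xor(ciphertext, pad))               # b'meet at dawn'

Only the ciphertext ever needs to pass through the networked, untrusted device; the pad and the plaintext can stay entirely on the isolated side.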
Maybe you claim that no trustable devices will be available. But that's unlikely. Consider how easy it is to obtain Afghani heroin in NYC. Also, if I were targeted by American adversaries, I could arguably trust devices backdoored by the Russians, or the North Koreans, etc. And vice versa.
Mostly because I believe the device can be more cheaply made if it does not have to run a full-featured OS such as Android. No browser, no color screen, just like my old Nokia candy bar phone; no GPU at all would be required.