Google's threat model for post-quantum cryptography (bughunters.google.com)
255 points by yuedongze on March 11, 2024 | 73 comments



Stateless tokens come with independent security concerns, and moving towards stateful tokens is prudent just to ensure more robust systems. [...] Our main recommendation is to use stateful tokens where possible, given their additional security benefits.

This is smart. PQC schemes often add too much overhead for interoperable cookie sizes. Instead of trying to cram a PQC signature into a cookie, just stop using the stateless cookie designs that require asymmetric signatures.
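A minimal sketch of what the stateful approach looks like, in Python with an illustrative in-memory store (a real deployment would use a shared database or cache such as Redis):

    import secrets

    SESSIONS = {}  # stand-in for a shared session store (e.g. Redis)

    def issue_token(user_id: str) -> str:
        # No claims, no signature: just a high-entropy random handle.
        # The cookie stays tiny no matter which signature scheme
        # (classical or post-quantum) the rest of the stack uses.
        token = secrets.token_urlsafe(32)
        SESSIONS[token] = {"user_id": user_id}
        return token

    def validate_token(token: str):
        # A server-side lookup replaces signature verification, and
        # revocation is just a delete.
        return SESSIONS.get(token)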

I'm not sure I buy the Global Risk Institute chart. I get that they need to motivate adoption, but practical cryptanalytic work with quantum computers seems unpromising right now.


Even if the cookie size limit were 8192 bytes instead of 4096, I think it's insane to include a several-KB signature on every request over the network (literally multiple packets' worth). It's not a big deal if the payload is a large media file, but JWTs are most commonly used in the kind of API requests where the body is often empty or something tiny like {"success": true}. It'd be a shame to undo all the wins from things like smaller TLS handshakes using ECDSA (more of a pleasant side effect, but still).


The header dictionary for http2 makes large headers like cookies less of an issue: https://blog.cloudflare.com/hpack-the-silent-killer-feature-...

Which means it's only sent once per connection.


This is good for browsers which do a ton of requests to the same resource. It's less useful for service-oriented architecture where services would more often talk to other services through an LB, often contacting different nodes.


Are you regularly using stateless, asymmetrically-encrypted tokens during service-to-service communication? That strikes me as a rather odd architecture decision, but maybe there's something I'm missing.


We are doing this and it works pretty great across a bunch of microservices.

A user makes a call into a gateway, which talks to the identity service to get a JWT with info about who the user is. The gateway then passes the call on to the microservice being referenced, which gets the user's basic access info with no extra calls; better still, it can call any other internal service and continue to pass along the token.
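Roughly, in Python, a sketch assuming the third-party PyJWT and cryptography packages (the claims and service roles are illustrative):

    import jwt  # pip install "pyjwt[crypto]"
    from cryptography.hazmat.primitives.asymmetric import rsa

    # Identity service: signs the token once with its private key.
    identity_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    def mint_token(user_id: str) -> str:
        return jwt.encode({"sub": user_id, "scope": "orders:read"},
                          identity_key, algorithm="RS256")

    # Any downstream microservice: verifies with the public key only, so it
    # needs no call back to the identity service and can forward the token on.
    def read_claims(token: str) -> dict:
        return jwt.decode(token, identity_key.public_key(), algorithms=["RS256"])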


Istio with JWT seems pretty common.


True, but I think doing something just because a lot of other people are doing it doesn’t make it good or correct.


Well, the question was about how often is it used, not how well it serves the purpose. So I'd say pretty often.

Having a data store accessible to every service that needs authentication and authorization (and ideally every service needs at least authentication) is non-trivial in a distributed enterprise environment. To have a stateful token, you need distributed, scalable storage, unless you have only one instance. You need to be able to connect to it, hence distribute and manage database credentials. Anything symmetric requires more sophisticated secret management.


Is there any insight into why there is no PQC signature scheme with short signatures?

Is it just us failing to find one, or does something more fundamental prevent a quantum-resistant short signature?


Keep in mind it took us quite a while to move to short (ECC-based) public key schemes for classical crypto, although some of that was for non-technical reasons.


There is! It just takes milliseconds to verify and has questionable security


SQIsign is relatively small and fast to verify, but it is pretty slow to sign.


Well, SQIsign signatures are about the same size (204 bytes) as RSA-2048 signatures (256 bytes). So OK, but most people who care about size on the wire have moved to EC signatures, which are much smaller (64 bytes). And “fast to verify” is not really true: 50 ms is not fast; both RSA and Ed25519 signature verification take <0.1 ms.
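A quick way to sanity-check the Ed25519 end of that comparison, assuming the Python cryptography package:

    import time
    from cryptography.hazmat.primitives.asymmetric import ed25519

    key = ed25519.Ed25519PrivateKey.generate()
    sig = key.sign(b"session-cookie-payload")
    print(len(sig))  # 64 bytes

    t0 = time.perf_counter()
    key.public_key().verify(sig, b"session-cookie-payload")  # raises if invalid
    print((time.perf_counter() - t0) * 1000, "ms")  # well under 1 ms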


I’m confused because there are such things. What you want here is authentication, and there are hashing modes which provide that.


They specifically asked about PQC signatures, not MACs.


Well, there are PQC (public key) signature schemes based on hash functions, but they all have large signatures (in the multi-kB range) and other drawbacks (eg being stateful).

Moving to a MAC is also not a crazy idea in a lot of deployments. You can horizontally scale an internal MAC validation service. I believe Facebook use MAC-based tokens internally (CATs).


There are hashing modes that achieve both integrity and authentication. It usually involves keying the hash with a server-held secret (e.g. HMAC); only someone who knows the secret can produce a valid tag, thereby providing both authentication and data integrity.

Cookies are signed by the server, not the client, and only the server needs to check the signature, so there really isn’t any reason to use much more expensive asymmetric cryptography. And post-quantum security is just a bonus.

In the context of cookies, authenticated hashing IS a PQC signature.
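A minimal sketch in Python of what that looks like for cookies, using HMAC from the standard library rather than a raw secret-prefix hash (which would be open to length extension); names are illustrative:

    import hmac, hashlib, secrets

    SERVER_KEY = secrets.token_bytes(32)  # known only to the server(s)

    def sign_cookie(value: bytes) -> bytes:
        # 32-byte tag, versus kilobytes for current PQC signatures.
        tag = hmac.new(SERVER_KEY, value, hashlib.sha256).hexdigest()
        return value + b"." + tag.encode()

    def verify_cookie(cookie: bytes) -> bytes | None:
        value, _, tag = cookie.rpartition(b".")
        expected = hmac.new(SERVER_KEY, value, hashlib.sha256).hexdigest().encode()
        # Constant-time comparison to avoid timing side channels.
        return value if hmac.compare_digest(expected, tag) else None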


The kind of math problems that are hard for quantum computers to break tend to produce kilobyte+ outputs at our target security levels.


>I'm not sure I buy the Global Risk Institute chart. I get that they need to motivate adoption, but practical cryptanalytic work with quantum computers seems unpromising right now.

Yeah; that doesn't look right to me either. Perhaps "cryptographically relevant" has some unusual meaning (parity with conventional computing?). Otherwise, the consensus view is that there's, pessimistically, a 4% probability of a total break of all public key cryptography within 5 years.

That's practically "abandon all hope right now" territory rather than "motivate adoption" territory.


Who came to this consensus? I'm not aware that the quantum machines have been able to do much beyond, say, factoring 21.


I think they were factoring 21 a decade or so ago. Now the record is at least 261,980,999,226,229, although my understanding is that this number was somewhat cherry-picked and the algorithm they used (not Shor's) worked well on this 48-bit number but wouldn't scale as well as Shor's algorithm to sizes that are cryptographically meaningful.


Do you have a reference for that achievement?


They mention they are working on their own QC. In corporate logic, using a third party projection to make a statement about where you are headed seems absolutely legit to me.


Are there privacy concerns with stateful tokens?


No more than there would be with a stateless one; they're both used to authenticate a request, after all.


A counter-point that perhaps everyone is taking PQ a bit too seriously [1].

Personally, it seems reasonable to at least spend some effort preparing for it, given the rather long lead time required to develop, study and stress the constructions needed. It might be a long time (if ever) before cryptographically relevant quantum computers show up, but if they do, we'll be glad we had a decade or two to get ready. The alternative of scrambling at the last minute while everything gets cracked seems unenviable.

[1] https://www.cs.auckland.ac.nz/~pgut001/pubs/heffalump_crypto...


It also depends on what your personal requirements for forward secrecy are. As a major platform operator Google should aim somewhere towards the top of that distribution.


Why does Google need long-term forward secrecy? They may encrypt my sessions, but most of their data is the huge index of the public web, which is, by definition, public. I suppose they dabble in things like health records, but it seems like most of what they store and forward is public.


Google operates among other things an office suite, a cloud platform, the largest repository of location services data in the world and an IoT health/lifestyle devices company.


And do they encrypt any of those details aside from while the data is in transit? My impression is that they don't.


I'm pretty sure they encrypt anything remotely sensitive at rest. A quick Google found sources for this for Cloud [1] and Drive [2].

[1] https://cloud.google.com/docs/security/encryption/default-en...

[2] https://support.google.com/drive/answer/10375054?hl=en-GB


Any big company has numerous data access restrictions. Some of them are obligatory and external, or required for certification. Even basic HDD/SSD decommissioning and transfer between projects strongly implies that old data was not stored as clear text.


The risk is non-linear advances.

That's what everybody said about AI; then sudden advances around 2013 led to the situation today.


A non-linear advance probably wouldn't involve Shor's algorithm, much as what happened with AI didn't come from the existing algorithms. What we call post-quantum cryptography is really post-Shor cryptography.


The difference is that we know Shor's algorithm will work once you solve certain engineering problems. As I understand it even the theoretical side of AI was always shooting in the dark.


I don't think it's really fair to call these engineering problems. Creating a working quantum computer is at the bleeding edge of experimental physics. Sure, the theoretical physics behind a working QC is somewhat understood, but there are no successful experiments in the area.

Even the theory is somewhat debatable, as creating a working QC pushes close to the boundaries of the measurement problem.


> It might be a long time (if ever) before cryptographically relevant quantum computers show up

Don’t be so sure. From the article, Google seems to believe quantum computers will arrive in the next 10 years.


Certainly three engineers in Google want to project that claim. I'd be interested to know if there's been a recent study of expert consensus on it.


They're cryptographers (or, at least, two of them have cryptography doctorates and are well known in the field). Cryptographers in general are pretty warm to the idea of imminent quantum supremacy, as it's a motivator for a lot of interesting research. By which I mean: if you surveyed academic cryptographers outside of Google, you might get a pretty comparable answer.


"in as soon as a decade".

They think it's possible. I doubt anyone there thinks it's certain or even has a high probability.


Besides encrypting your user data at rest using these post-quantum cryptography algorithms, what can be done from a design point of view to make attacks as hard as possible and deter attackers?

Would it make sense to segregate different types of data into separate databases rather than separate tables?

“Name DB” “Account DB” “Address DB”

An attacker would need advanced knowledge of the app backend to know they have to snag both the account DB and the address DB. Otherwise, the decrypted data is useless with only one DB.

Drawbacks of course include “performance degradation”, “increasingly complex app”.

Ideally, if you did not need to store sensitive data or collect other user info such as address or zip code, then not asking for it at all would be optimal. Maybe regulation is needed here.


Attackers owning quantum computers presumably already have a much better “Address DB” than yours.

As for regular assessment of your secure infrastructure, you can read the latest “Algorithms, Key Size and Protocols Report”:

https://www.ecrypt.eu.org/csa/documents/D5.4-FinalAlgKeySize...

and practical part of ENISA report:

https://www.enisa.europa.eu/publications/post-quantum-crypto...


>Besides encrypting your user data at rest using these post-quantum cryptography algorithms

If you're encrypting data at rest, just keep using an appropriate mode of AES (or ChaCha20 or whatever+) and exercising good key management practices. Quantum computing is largely irrelevant to conventional symmetric ciphers.

+ for tightly constrained values of 'whatever'
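For example, a minimal AES-256-GCM at-rest sketch in Python, assuming the cryptography package; key storage and rotation (the actual hard part) is hand-waved here:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # keep/wrap this in a KMS, not in code
    aead = AESGCM(key)

    def seal(plaintext: bytes, aad: bytes = b"") -> bytes:
        nonce = os.urandom(12)                 # never reuse a nonce with the same key
        return nonce + aead.encrypt(nonce, plaintext, aad)

    def open_blob(blob: bytes, aad: bytes = b"") -> bytes:
        nonce, ct = blob[:12], blob[12:]
        return aead.decrypt(nonce, ct, aad)    # raises InvalidTag if tampered with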


My understanding is that QC will break AES-128 and reduce the security margin of AES-256.


QC is worth one bit of reduction in security strength for AES.


No, it’s worth a halving of the number of bits, so 256 becomes effectively 128-bit encryption. Idk though if the solution could be as simple as creating “AES-512”.
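Back-of-the-envelope version of that halving: Grover search over a k-bit keyspace takes on the order of 2^(k/2) quantum evaluations of the cipher.

    for k in (128, 256, 512):
        print(f"AES-{k}: ~2^{k // 2} Grover iterations")
    # AES-128: ~2^64, AES-256: ~2^128, a hypothetical "AES-512": ~2^256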


If I were an adversary possessing a (quantum) code-breaking device who can break into your infra that uses (pre-quantum) encryption for authorization, I would probably attach a debugger to your worker process, find out about all your databases, and maybe hijack a live connection to one.

If I could get access to nothing but one database, and every table / segment of it were encrypted differently, it would likely take as much effort to break into each of them as if they were "physically separate" databases.


This doesn't make sense:

- I think encryption at rest would probably just use symmetric encryption, which is unaffected by Shor's algorithm.

- How does the attacker get into your network in this scenario? Stuff encrypted at rest is by definition not flying around the network.

- What sort of scenario would be involved where the attacker knows how to get one DB but not the others? The hard part is getting the first DB.

> Drawbacks of course include “performance degradation”, “increasingly complex app”.

On the contrary, sharding is a super common performance technique.


Minor nit: symmetric encryption is “impacted” by QC, see Grover’s algorithm. The speedup isn't as large, though.


It is also trivial to work around by just doubling the key length. But even without that, the speedup is small enough that we will be well into the quantum age before this matters. The article talks about this a bit.


Typically once you get access you just try to dump everything and figure it out later. Splitting data into separate stores just convolutes your day-to-day for minimal inconvenience to a possibly non-existent adversary.

The only real protection is encryption.


> There are several alternatives to simply replacing classical signatures with quantum-safe signatures, which could address the performance issues when it comes to PKI. We are currently looking to experiment in this space to gather data for more solid recommendations, which we will share in a future blog post.

Does anyone know what those alternatives might be? Some way to collapse a chain of signatures into one? Long-term symmetric session keys? Neither of those sound like good ideas but I'm grasping at straws.



The threat estimate for a quantum computer that breaks cryptography should be based on currently available data and the understanding that only Shor's algorithm is known to provide exponential speedup for factorisation.

Let’s give IBM credit for attempting to factor the number 35 in 2022, although they failed there [1]. Before that, the successful factorisation of 21 happened in 2012 [2] and the first factorisation of 15 in 2001 [3].

Now we have three points. The trend is that the factorised number grows by about 10 every ten years. Thus, to get a quantum computer that can break RSA-2048, we shall wait on the order of 2^2048 years.

[1]: https://arxiv.org/pdf/2103.13855v1.pdf

[2]: https://www.nature.com/articles/nphoton.2012.259

[3]: https://www.nature.com/articles/414883a


I think they must not be planning on the trend being linear. Maybe we’re at the linear-looking beginning of a sigmoid.

It seems like a reasonable bet on their part, in the sense that Google has a lot of money to play with. Even if it is unlikely that quantum computing takes off, being hit by quantum attacks would be pretty bad for them, so maybe they see it as insurance against an unlikely but catastrophic event.


I understand that it is enjoyable to rework all protocols involving public key cryptography and propose alternatives, and I support that. It can be an excellent catalyst for coming up with new cryptographic primitives and a better understanding of existing ones, and Google can certainly afford such activities.

The issue is the emphasis on the importance of an inevitable threat. Organisations start to justify implementing QC-resistant algorithms on the basis that others are doing so, and in that way they convince themselves that the threat is real. After that, we would end up with government agencies issuing requirements that only QC-safe algorithms are acceptable for security, in the process killing useful cryptographic primitives like ElGamal cryptosystems, homomorphic encryption, blind signatures and a class of zero-knowledge proofs.

At the moment, there are only three points from which one can extrapolate further advancements. Shor's algorithm requires exponential suppression of errors with the number of qubits. This is why, although we have hundred-qubit QCs, the most significant factorisations have been done with at most five qubits. Therefore, even with “exponential progress” in advancing QCs, the advancement in factoring integers would be linear. Faster progress requires miracles. QC error correction would be no panacea either, as it would require repeated application, an unreasonable number of qubits, and increased runtime.

The miracle might as well be that we figure out a way to factor numbers on classical computers in polynomial time, or break lattice-based crypto on classical computers.


From [1]:

> We implemented the algorithm on IBM quantum processors using only 5 qubits

Last December, a team based out of Harvard demonstrated the ability to scale up to 48 logical qubits: https://arxiv.org/abs/2312.03982

It has been shown that, to factor an integer with n bits, Shor's algorithm requires ~2n logical qubits: https://arxiv.org/pdf/quant-ph/0205095.pdf
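Putting those two figures side by side (a rough reading of the numbers cited above; the ~2n estimate is from the linked paper, the 48 logical qubits from the Harvard demo):

    n = 2048                      # RSA-2048 modulus size in bits
    needed = 2 * n                # ~2n logical qubits for Shor's algorithm
    demonstrated = 48             # Harvard, December 2023
    print(needed, demonstrated)   # 4096 vs 48 (and see the quality caveat below)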


Running Shor's algorithm requires essentially error-free logical qubits. Not a single logical qubit of that quality has ever been demonstrated. The Harvard results are impressive, but their logical qubits are worse than some physical qubits.


It is not enough to place qubits on a single chip or on a grid. We already know how to do that. The hard part is keeping them isolated while allowing arbitrary control of their interference.

The core issue that makes Shor's algorithm hard is that it requires exponential suppression of errors with the number of qubits to produce meaningful results. Therefore we don’t actually see many results here, as error rates have not yet reached the thresholds needed to do it with more qubits. Error correction would not be a panacea either, because it would necessitate repeated application to reach the threshold needed to run Shor's algorithm, which would require an unreasonable number of physical qubits.


Just from reading this it's a bit unclear to me why they recommend only SPHINCS+ for "Firmware Signatures", while for "Software Signatures" they recommend Dilithium3+hybrid or SPHINCS+.

Is SPHINCS+ more lightweight?


SPHINCS+ is actually slower (which does not matter much, as firmware signature generation and verification are not time-critical), but it is the most conservative choice available (it avoids relying solely on the security of lattices for signatures), which is very valuable for firmware that cannot be easily updated if a security issue arises with lattice-based signatures. It also offers the smallest public key size, which may help for devices with limited storage.


I’ve wondered if the institutional buy-up of BTC via ETFs and corporate and governmental balance sheets has adequately considered this risk. Could a bad actor steal BTC by cracking its elliptic-curve keys on a quantum computer? There could be a catch-22: doing so would zero the value of BTC, so if they didn’t immediately transfer to a quantum-resistant asset, their heist would be worthless. Still, they could profit by shorting, or it could be in the interest of holders of other assets to just zero BTC.


This may be a naive question but why not go back to Vernam? Storage is cheap.


In addition to the usual arguments about the impracticality of the one-time pad, there is also the fact that it is symmetric cryptography (the one-time pad is a shared secret).

Symmetric algorithms (AES-256 in particular) are generally considered to be quantum resistant. Here is what is written about it in the article:

> Symmetric cryptography, using a single secret key to encrypt and authenticate data: In our current understanding, symmetric cryptography is not impacted by quantum computers for all practical purposes. Grover's algorithm could be used as an attack here, but is currently considered infeasible for even medium-term quantum computers. (See "Reassessing Grover's Algorithm, 2017")

The algorithms most affected by quantum cryptanalysis are public key (asymmetric) algorithms, because we already have effective quantum algorithms against them (Shor's in particular). We are not even close to having such a computer, though.


excellent explanation.


OTP is deeply impractical.

How do you distribute your pads? (hint: it's a chicken-and-egg problem, you need a secure channel)

How do you ensure they're only ever used once?

In its most obvious instantiation, you get no authentication. So now you have to build an authentication mechanism on top. There's simply no point, when we have better solutions already.

And to top it all off, symmetric ciphers aren't threatened by quantum computing in the first place.


One-time pads just trade an encryption problem for a secure key distribution problem, the latter being much, much harder.


How do you solve the key issue with Vernam? It requires a secret key of the same length as the plaintext.
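A minimal XOR one-time-pad sketch in Python, just to make the key-length requirement concrete:

    import secrets

    message = b"attack at dawn"
    pad = secrets.token_bytes(len(message))  # pad must be as long as the message
    ciphertext = bytes(m ^ p for m, p in zip(message, pad))
    recovered = bytes(c ^ p for c, p in zip(ciphertext, pad))
    assert recovered == message
    # The pad must also be truly random, pre-shared over a secure channel, and
    # never reused; that is exactly the distribution problem described above.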


How close or far do people speculate we are from Q-Day?


There's a chart (referenced from [0]) in TFA.

[0]: https://globalriskinstitute.org/publication/2023-quantum-thr...


Yes, there is some truth to that


In short, all encrypted data transiting the internet will get decrypted once quantum computing is there. As if we didn't already have enough threats to worry about...



