> factoring numbers is deterministically achievable in sub polynomial time
Am I missing something? I thought that the whole point of using the difficulty of integer factorization as the strength of things like RSA was that it's not known to be solvable in polynomial time. "P==NP" is one of the unsolved problems of computer science.
I think that the state of the art is that we can factor in sub-exponential time.
You may not mean it this way, but for any reader who might be mistaken:
P == NP => integer factorization can be done in polynomial time with a classical computer, but not vice versa.
In fact, many people believe P != NP && integer factorization can be done in polynomial time, since FACTOR lies in this awkward space in the intersection of NP and co-NP. (Most other problems in this space are also in P, but we think P != NP, so...)
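To unpack why FACTOR sits in NP ∩ co-NP: the complete prime factorization of N is a witness that settles the decision question ("does N have a prime factor <= k?") in either direction, and checking that witness takes polynomial time, since primality testing is in P (AKS). A minimal sketch of the idea, using sympy.isprime as a stand-in for a polynomial-time primality test (the function names are mine, purely illustrative):

    from sympy import isprime

    def verify_factorization(n, factors):
        # A witness is valid iff every listed factor is prime
        # and their product reconstructs n exactly.
        product = 1
        for p in factors:
            if not isprime(p):
                return False
            product *= p
        return product == n

    def has_factor_below(n, k, factors):
        # Given a verified witness, the decision problem
        # "does n have a prime factor <= k?" is settled either way.
        assert verify_factorization(n, factors)
        return any(p <= k for p in factors)

    # 15 = 3 * 5: one witness certifies both a "yes" and a "no" instance.
    print(has_factor_below(15, 3, [3, 5]))  # True
    print(has_factor_below(15, 2, [3, 5]))  # False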
Now of course, if you put quantum computers into the mix, Shor's algorithm can break both RSA and ECDSA in polynomial time.
By "things like RSA", you pretty just much mean "RSA". A lot of progress was made on integer factorization over the last 35 years, so that key sizes we'd have thought totally reasonable in 1995 are totally inadequate today. That is not similarly true of non-factoring systems, and so it's legit to point out the relative weakness of IFP crypto compared to other approaches.
> key sizes we'd have thought totally reasonable in 1995 are totally inadequate today.
Not really. The last big jump in complexity was in 1990, when the ninth Fermat number, 2^512+1, became the first 512-bit number factored by NFS (in its special-number variant, since F9 is far from a random modulus) [1]. What has progressed the most, by far, has been the available computational power.
The last big theoretical jump in complexity was NFS, but is your argument that implementations of NFS haven't much improved, apart from the raw computational power of the same sort available to general-purpose computers (i.e., non-vectorized instruction sets)?
You know much more than I do about this stuff, but I think I'm prepared to challenge you on this. :)
NFS implementations have certainly improved, and certain sub-algorithms as well, e.g. lattice sieving and polynomial selection. But these things don't significantly tilt the exponent you expect in a factorization. As in, you expected around 2^80 "operations" for RSA-1024 back in 1995, and that's still roughly what you expect today.
The practical NFS improvements since then might bring that down to 2^78 or even 2^72 (and that's a lot of money saved), but when you're thinking of which key size to use, it doesn't matter nearly as much as the fact that 2^80 computation is reasonably practical today but was unthinkable in 1995.
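For a rough sense of where that 2^80 figure comes from, here's a back-of-the-envelope sketch (mine, not from the thread) evaluating the heuristic GNFS cost L_N[1/3, c] = exp((c + o(1)) * (ln N)^(1/3) * (ln ln N)^(2/3)) with c = (64/9)^(1/3). Dropping the o(1) term makes it overshoot the practical ~2^80 estimate for RSA-1024 a bit, but the shape of the curve is the point:

    import math

    # Heuristic GNFS cost in bits (log2 of the operation count)
    # for a modulus N ~ 2^bits; the o(1) term is dropped.
    def gnfs_cost_bits(bits):
        c = (64 / 9) ** (1 / 3)
        ln_n = bits * math.log(2)
        return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

    for bits in (512, 1024, 2048):
        print("RSA-%d: ~2^%.0f" % (bits, gnfs_cost_bits(bits)))
    # RSA-512: ~2^64, RSA-1024: ~2^87, RSA-2048: ~2^117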
I take your point. ECC probably wouldn't cost Nintendo more to implement, but would provide a system that probably isn't being attacked as successfully, hopefully giving their system more time on the market before it's cracked.
It didn't make a huge difference here, I should be clear. I don't like RSA, and my dislike for it comes almost entirely from something orthogonal to key sizes --- public key systems provide one or more of three primitives: direct encryption, signature verification, and key exchange; RSA provides all three, and the direct encryption primitive is a giant foot cannon.
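To make the "foot cannon" concrete, here's a toy sketch of textbook RSA's malleability (the classic schoolbook parameters p=61, q=53, no padding; real deployments must use a padding scheme like OAEP): anyone can transform a ciphertext into a valid encryption of a related message without touching the private key.

    p, q, e = 61, 53, 17
    n = p * q                          # 3233
    d = pow(e, -1, (p - 1) * (q - 1))  # 2753, the private exponent

    m = 65
    c = pow(m, e, n)                   # ciphertext of 65

    # Attacker multiplies by Enc(2) mod n -- no key material needed:
    c_forged = (c * pow(2, e, n)) % n

    print(pow(c, d, n))         # 65, the original message
    print(pow(c_forged, d, n))  # 130 == 2 * 65, forged without the key

That multiplicative structure is exactly the kind of thing that makes unpadded "direct encryption" a liability rather than a feature.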
If the original commenter's point was that RSA directly harmed the 3DS in some fundamental way, that would be a legit thing to push back on, I think.