This is very different from saying P=NP. It would instead be to say that the integer factorization or discrete log problems in the specific parameters used by cryptosystems simply aren't as hard as we once thought they were.
RSA-128 is trivially breakable. Being able to break RSA-128 isn't a demonstration that P=NP, either.
For RSA to be completely broken, the attacker must be able to retrieve the private key at any practical key length.
Some of what people are writing seems to imply not just attacks against specific parameters but some theoretical breakthrough that would render the scheme completely unusable (e.g., factoring in time polynomial in the key length).
People have been moving to longer keys over time, so simply being able to attack a given key length isn't the same as saying the entire scheme is broken. Today a 768-bit key is still very difficult to attack. Is RSA with 64kbit keys likely to be broken within 5 years?
I guess in one sense our ability to work with much larger RSA keys has progressed faster than the theoretical advances in factoring. Until someone has a practical and provably strong cryptosystem, we're always at some risk. ECC could also be broken, so why should I feel more comfortable using ECC than RSA with 64kbit keys?
RSA with 65536-bit keys is quite secure, but it's hugely impractical. Try it in OpenSSL. Key generation alone will take you several hours on a fast, current-generation machine.
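If you want to feel the scaling yourself, something like this (assuming an OpenSSL build with the genpkey command) times key generation at a couple of sizes -- extend the loop toward 65536 at your own peril:

```shell
# Time RSA key generation at increasing modulus sizes. 65536-bit generation
# can take hours on current hardware, so this loop stops well short of that.
for bits in 2048 4096; do
  echo "=== ${bits}-bit ==="
  time openssl genpkey -algorithm RSA \
    -pkeyopt rsa_keygen_bits:"$bits" -out /dev/null 2>/dev/null
done
```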
You only need to generate your key pair once. I just generated a 16kbit key in 10 minutes, so why is this impractical? The cost of encrypt/decrypt is a modular exponentiation, which isn't that bad. My point is that as long as there is some number of bits at which RSA is both secure and practical to use, RSA isn't dead. A lot of the discussion seems to imply RSA will be dead soon as a result of recent work. Maybe it will or maybe it won't, but the claims seem a bit overblown. I agree that we should design for the possibility of RSA (or ECC) falling, within reason, but RSA could have fallen 5 years ago and it could fall tomorrow (and so could ECC).
Well, now you're moving the goalposts. You originally said 64kbit, not 16kbit. Key generation is roughly O(n^4) in the bit length: if you quadruple the key length, the time goes up by a factor of 4^4 = 256. Key generation time is not irrelevant -- waiting ten hours before your SSH server can come online is no one's idea of a good user experience.
At the 64kbit-level (again, BIG BIG difference from 16kbit), even the cost of encryption/decryption alone is easily going to cripple a high-traffic server.
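A rough sketch of that scaling, assuming key generation is O(n^4) and private-key exponentiation is O(n^3) in the modulus bit length (those exponents are the assumption here, not a measurement; real implementations vary):

```python
# Back-of-the-envelope relative costs for RSA operations at large key sizes.

def relative_cost(bits, base_bits=2048, exponent=4):
    """Cost of an operation at `bits` relative to the same op at `base_bits`,
    assuming the operation scales as bit_length ** exponent."""
    return (bits / base_bits) ** exponent

for bits in (2048, 16384, 65536):
    keygen = relative_cost(bits, exponent=4)   # assumed O(n^4)
    decrypt = relative_cost(bits, exponent=3)  # assumed O(n^3)
    print(f"{bits:>6}-bit: keygen ~{keygen:,.0f}x, decrypt ~{decrypt:,.0f}x vs 2048-bit")
```

Under these assumptions, going from 16kbit to 64kbit multiplies key generation time by 4^4 = 256 and each decryption by 4^3 = 64, which is why the two sizes are in very different practicality regimes.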
I apologize for moving the goalposts (slightly). I was using ssh-keygen, and 16kbit was the maximum RSA key length it would accept. Running the numbers: if we assume an L(1/4) attack, then 4kbit keys would be slightly stronger than 1kbit keys are today, and 8kbit keys very slightly stronger than 2kbit keys are today. Please verify this; it's possible I made a mistake. Under L(1/4), an 8kbit key would be ~5*10^10 times harder to factor than a 1024-bit key is today. I think I now have the goalpost where it should be for a meaningful discussion? Another interesting question is why Google chose to move to 2kbit RSA rather than to ECC.
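For what it's worth, here's roughly how one would run those numbers -- a sketch that plugs bit lengths into the L-notation formula with a guessed constant c, since the true constant and the o(1) term for any hypothetical L(1/4) factoring attack are unknown:

```python
import math

def L(n_bits, alpha, c=1.0):
    """Heuristic L-notation cost L_n[alpha, c] = exp(c * (ln n)^alpha * (ln ln n)^(1-alpha))
    for an n_bits-bit modulus. The constant c and the o(1) term are unknown,
    so only ratios under one fixed guess of c are even loosely meaningful."""
    ln_n = n_bits * math.log(2)
    return math.exp(c * ln_n**alpha * math.log(ln_n)**(1 - alpha))

# How much harder would an 8192-bit key be than a 1024-bit key under a
# hypothetical L(1/4) attack, for a few guessed constants?
for c in (1.0, 1.5, 2.0):
    ratio = L(8192, 0.25, c) / L(1024, 0.25, c)
    print(f"c={c}: ~10^{math.log10(ratio):.1f}x harder")
```

The output swings over many orders of magnitude as c changes, which is exactly the objection raised below: without the constant, any specific number attached to L(1/4) is a guess.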
Attaching specific numbers to an expression like L(1/4) is meaningless, because L(1/4) is an asymptotic expression. Asymptotic expressions contain implied constants, whose values are not explicit. There's also no particular reason to expect the L(1/4) attack to apply to RSA. For one, I explained in another reply that we currently do not know any direct relationship between RSA and discrete log. In addition, Joux's latest result is not even L(1/4), it's L(epsilon) for any fixed epsilon > 0 (and in this context, the implied constant cannot be ignored, otherwise you get nonsense values).
You're right, and I understand that. I was assuming some hypothetical attack equivalent in form to GNFS. That assumption has no theoretical basis; it's just a mechanism to probe the ideas and get some intuition. I'm using the same line of argument as the original presentation, where they show a progression and draw conclusions about RSA keys. I find that line of reasoning odd, but even under it the results don't make sense to me.
No competent engineer in the world would give a green light to 768 bit RSA keys.
This article was based on a presentation I was marginally involved with, and I'm telling you that the assertions the talk was making were very much key length dependent. An event that forces a migration from 1024 bit RSA keys would be painful and dramatic.
It also bugged me that different classes of attacks were grouped together to build a "case" against RSA.
Looking at the slides again:
"There is a small but real chance that both RSA and non-ECC DH will soon become unusable"
"key sizes may have to up to 16kbit" - "wildly impractical" (WHY???)
L(1/4) isn't quite enough to make RSA unusable with large enough keys. I ran the numbers for 64kbit, but you don't need to go that large.
It's not clear that the linear progression presented is real. Some math problems see no progress for a long time, and some see big progress rapidly. It seems to me like this is finding a pattern where there isn't necessarily one.
To conclude: RSA may fall tomorrow. ECC may fall tomorrow. AES may fall tomorrow. I agree with the principle that we need some agility built into our cryptosystems (though if one falls we're still in trouble). Maybe we need to combine cryptosystems so that one of them breaking won't take us down. But added complexity and change create attack opportunities of their own (at least current implementations are battle-hardened).
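One way to sketch that combining idea: establish shared secrets under two independent schemes (say, an RSA exchange and an ECDH exchange) and derive the session key from both, so an attacker has to break both schemes to recover it. A minimal illustration with hypothetical names -- a real design would use a vetted KDF construction such as HKDF:

```python
import hashlib
import hmac

def combine_secrets(secret_a: bytes, secret_b: bytes,
                    context: bytes = b"hybrid-kem-v1") -> bytes:
    """Derive one session key from two independently established secrets.
    The output stays secret as long as EITHER input does, so breaking just
    one of the two underlying cryptosystems is not enough.
    (Illustrative sketch only, not a production construction.)"""
    return hmac.new(context, secret_a + secret_b, hashlib.sha256).digest()

# Hypothetical usage: one secret from RSA key transport, one from ECDH.
session_key = combine_secrets(b"\x01" * 32, b"\x02" * 32)
print(session_key.hex())
```

The concatenate-then-KDF pattern is the usual shape of such combiners; the cost is the complexity of running two key exchanges, which is the trade-off noted above.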
I was just stating the fact that factoring 768 bit keys is still difficult (though doable given enough time/compute power). I'm not implying you should use 768 bit keys.
I saw the slides and read various articles that followed and none of them seemed to be specifically concerned about 1024 bit RSA keys. Can you point me to where in the presentation slides that is emphasized as the main concern?
It's one thing to say we should be concerned that 1024 bit keys can be compromised, and quite another to say DH/RSA is dead. I wasn't at the presentation, so I don't know what was presented.