Break a dozen secret keys, get a million more for free (cr.yp.to)
110 points by tptacek on Nov 20, 2015 | hide | past | favorite | 16 comments


What are the preconditions for the AES batch attack? 2^40 users encrypting the same message? Similar message? Any message?


> 2^40 users encrypting the same message?

How about probably the single-most-encrypted block in history: 'GET http://www.g'?


It would actually be something like:

  GET / HTTP/1.1
  ?
where the ? is the first letter of the most common first request header. A bit less reliable for a given target (due to needing to guess the header), but a bit more effective in general (due to not actually caring about the domain).


The ? is the 17th byte, so it's unnecessary.


You're right, I forgot that the HTTP spec calls for Windows line endings. That means the header doesn't even matter.
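A quick sanity check of the byte counting above (a minimal sketch; the request line is the one quoted in this thread):

```python
# With CRLF ("Windows") line endings, the request line
# "GET / HTTP/1.1\r\n" is exactly 16 bytes, i.e. exactly one
# AES block -- so the first header byte lands in block two.
request_line = b"GET / HTTP/1.1\r\n"
print(len(request_line))  # prints 16
```

So the first ciphertext block of such a request is fixed regardless of which header comes next.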


Unless 1/n-1 record splitting is used, which is common with TLS 1.0. Fortunately the HTTP spec uses Windows line endings.


HTTP/1.1 does allow absolute URIs, though they probably aren't used as much.


They're really only used for connecting to proxies.


GET / HTTP/1.1\r\n

Usually the URI in an HTTP/1.1 request is relative.


Yes, a single block encrypted under 2^40 distinct keys (you can generalize this to a few popular blocks, instead of just the same block encrypted over and over). Each key you guess during brute force has a 2^40/2^128 chance of being correct. After 2^88 AES operations you're likely to recover at least one key. After 2^100 you've recovered, on average, 4096 out of the 2^40 keys.
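The arithmetic checks out exactly (a back-of-the-envelope sketch of the numbers in this comment, using exact fractions):

```python
from fractions import Fraction

KEYS = 2**40       # distinct target keys, all encrypting the same block
KEYSPACE = 2**128  # AES-128 keyspace

# Chance that one random key guess matches *some* target key.
p_hit = Fraction(KEYS, KEYSPACE)  # = 2^-88

# Expected number of target keys recovered after n guesses is n * p_hit.
print(2**88 * p_hit)   # prints 1    -> ~one key after 2^88 operations
print(2**100 * p_hit)  # prints 4096 -> 2^12 keys after 2^100 operations
```

That is the whole batch advantage: the per-guess success probability scales linearly with the number of targets.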


It sounds to me like this would be applicable to something like an encrypted storage service where every object has a different key.


I wonder, BTW, if similar attacks have been tried against 40-bit export suites, for example.


The problem is that not many people are likely to be using these suites.


I know, but the computing power to do so was there even a decade ago I think.


"Real-world clients often support non-elliptic "DHE" along with, or instead of, elliptic "ECDHE". It would be interesting to trace the reasons for this: for example, were people worried about ECC patents?"

DHE was in the original SSLv3 spec, but not commonly used before NSS added support for DHE_RSA in the early 2000s I think. ECDHE did not come until much later.


What are the incentives for the authors of the linked papers? Why would they round up an old, incorrect, estimate of RSA factoring? Is this just engineers engaging in "optimization" and trying to edge out more performance?



