> It was generally thought that government would not keep security vulnerabilities hidden
Was that what people thought? Were there vulnerability reports in open-source software that were coming from the NSA or thought to be coming from the NSA? Surely everyone knew that the NSA was capable of finding exploits in software, and I would think that it would be hard to keep secret whether or not they're being reported.
> That used to be a tin-foil hat idea just a few months ago, and we know better now.
It's well-known that the NSA pushed to have DES limited to 56-bit keys. There were suspicions about Dual_EC_DRBG long before there were any leaks from Snowden. In the 90s, they pushed the Clipper chip, in which they'd engineered a back door. I think that everyone understood that the NSA had somewhat of an interest in weaker cryptography. That's why the cryptographic standardization processes happened in the open, and when constants were needed, they were taken from the digits of pi or some other "nothing up my sleeve" sequence.
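To make the "constants taken from the digits of pi" point concrete, here's a minimal sketch of the same "nothing up my sleeve" idea as used in a later standard (SHA-256 is just an illustration here, not something from this thread): its initial hash values are defined as the first 32 bits of the fractional parts of the square roots of the first eight primes, so anyone can recompute them and check that no specially chosen value was hidden in them.

```python
from math import isqrt

# SHA-256's initial hash values (H0..H7 in FIPS 180-4) are the first 32 bits
# of the fractional parts of the square roots of the first eight primes.
# Computed here with exact integer arithmetic: floor(frac(sqrt(p)) * 2**32).
for p in [2, 3, 5, 7, 11, 13, 17, 19]:
    h = isqrt(p << 64) - isqrt(p) * (1 << 32)
    print(f"sqrt({p:2d}) -> 0x{h:08x}")

# The first line prints 0x6a09e667, which is exactly the published H0.
```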
There's a video from the RSA conference in 2011 with Dickie George, who was the director for Information Assurance at NSA when DES was being reviewed. He claims that the agreement between NIST and NSA was that (1) NSA would only change things if they could find a specific problem with the cipher, and (2) NSA promised that DES would have security equal to its key size. The implication is then that they decided that 56 bits was how secure it was, and then picked that as the key size.
You can believe him or not, but I don't see any particular reason not to.
Thanks for the video. I ended up watching the whole thing. My interpretation is a little different from yours: I think George is saying that there's no point in having a key longer than 56 bits given that the goal is 56 bits of security, but he's vague about where the requirement for 56 bits of security came from. In any case, the video certainly supports my larger point that the idea that NSA would sabotage a crypto standard was mainstream within crypto circles, even in the '70s.
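To put "56 bits of security" in perspective, here's a back-of-the-envelope sketch of what exhausting a 56-bit keyspace costs; the search rate is an assumed figure for illustration, not something from the talk.

```python
# Back-of-the-envelope brute-force cost for a 56-bit key.
# The keys-per-second rate is an assumption for illustration only.
keyspace = 2 ** 56                       # about 7.2e16 candidate keys
rate = 10 ** 9                           # assume one billion keys per second
seconds = keyspace / rate
years = seconds / (60 * 60 * 24 * 365)
print(f"{keyspace:.2e} keys -> ~{years:.1f} years at {rate:.0e} keys/s on one machine")

# Every additional key bit doubles the work: a 64-bit key would already cost
# 256 times as much to search exhaustively.
```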
> It was generally thought that government would not keep security vulnerabilities hidden
It depends on which system they find it on. According to this talk, https://www.youtube.com/watch?v=E4Zx5rQFk4U , if a vulnerability is found on a secure system, it is immediately classified; for them to be able to report it, they have to re-find and document the vulnerability on a non-secure system.