"If you find yourself typing the letters A-E-S into your code, you're doing it wrong". Obviously moreso if you're typing D-E-S.
We give our clients a very simple recommendation when it comes to encrypting things:
* If you're encrypting data in motion, rely on SSL.
* If you're encrypting data at rest, rely on PGP/GPG.
There are plenty of libraries that will GPG a blob for you, and you can assume GPG got all the details right. That would have been the right call here (as opposed to figuring out CBC and --- importantly, for someone who is still fetishizing "salts" in 2009 --- how to safely set an IV).
As an obvious counterexample, asymmetric encryption (GPG et al.) is very slow, which means that if I need to encrypt a lot of data at rest, it sometimes makes sense to use a symmetric cipher.
Security (even just the subdomain of encryption) isn't that easy - there is no one-size-fits-all solution.
GPG is a program. It isn't an algorithm. Consider reading up on it before you propose alternatives, which are likely to be broken.
Like I posted upthread, there's a laundry list of things that program is going to give you besides picking a better algorithm than "Triple DES" and a better block cipher mode. But listing them is just begging for a bunch of people to propose wack-ass alternative solutions that other people will feel obliged to waste time knocking down.
The worst thing about this is that he had to actively decide to use CipherMode.ECB. It's not like it's some kind of dumb default, hidden from view. A human being had to type that string of characters in there. Did he look at the doc and just pick one at random? Perhaps CipherMode.ECB is the first hit IntelliSense gives you?
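For anyone wondering what picking ECB actually costs you: identical plaintext blocks encrypt to identical ciphertext blocks, so structure in the data leaks straight through. Here's a toy sketch in Python. The "block cipher" is just a deterministic keyed stand-in, not a real cipher (ECB's weakness only depends on the block function being deterministic), and the CBC variant is included purely to contrast the outputs.

```python
# Toy demonstration of why ECB mode leaks structure: repeated plaintext
# blocks produce repeated ciphertext blocks, no matter how strong the
# underlying block cipher is.
import hashlib

BLOCK = 8

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # NOT a real cipher -- a deterministic keyed function standing in
    # for one, which is all ECB's weakness depends on.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Each block is encrypted independently: same input, same output.
    return b"".join(
        toy_block_encrypt(key, plaintext[i:i + BLOCK])
        for i in range(0, len(plaintext), BLOCK)
    )

def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    # Each block is XORed with the previous ciphertext block first,
    # so repeats in the plaintext don't show up in the ciphertext.
    out, prev = [], iv
    for i in range(0, len(plaintext), BLOCK):
        mixed = bytes(a ^ b for a, b in zip(plaintext[i:i + BLOCK], prev))
        prev = toy_block_encrypt(key, mixed)
        out.append(prev)
    return b"".join(out)

key = b"secret"
msg = b"ATTACK!!" * 4  # four identical 8-byte blocks

ecb = ecb_encrypt(key, msg)
cbc = cbc_encrypt(key, b"\x00" * BLOCK, msg)

# Under ECB, every repeated block is visibly repeated in the ciphertext:
assert ecb[:BLOCK] == ecb[BLOCK:2 * BLOCK]
# Under CBC, the chaining hides the repetition:
assert cbc[:BLOCK] != cbc[BLOCK:2 * BLOCK]
```

This is the same effect behind the famous "ECB penguin" image: encrypt a bitmap block-by-block with ECB and you can still see the picture.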
Probably because you don't know how to code and you write a bunch of articles and annoy everyone with your pseudo-skills by clogging up every internet news aggregator with your blog spam.
Give the guy a break; he clearly does know how to code, and he writes well-crafted blog entries in which he humbly explains some of the mistakes he's made for the benefit of others. If he's being voted up on news aggregators then a lot of people must appreciate what he's doing. Why is that so bad?
I'm a fan of his and I think a lot of what he writes about is pretty poor. I don't mind, because he is a good writer and he doesn't pretend to be an expert about most of what he writes about. I do mind that he thinks it's not important a lot of the time. I can't fault his record though: he's successful in his field, and his latest ventures have been really wonderful. I disagree with his views (I don't know C well but I understand why you might want to learn it, Jeff!) and I don't think we should see every article of his here, but he's hardly the worst pundit out there.
It's annoying because it's been going on for a long time now, and he doesn't seem to have taken a break to really bring himself up to speed, despite having been shown, repeatedly, that he needs to.
Comments like this are why I cannot wait to be able to mod down. Back on topic...
I found this article interesting even though I've never needed to use encryption explicitly in .NET. I did study crypto in Java, but only briefly whilst in college and we barely touched on ECB. I probably would have made the same mistake and never been the wiser.
It was 2. The exact bug that exposed the data wasn't really the point, since he assumes that everyone will eventually make some similarly awful mistake; the point was that you should design things in such a way that when you discover your boneheaded mistake(s), the damage is less than it could be. I agree, though, that his habit of using particularly boneheaded mistakes as examples dilutes his main point, since a significant part of his intended audience will think, "Wow, I'd never make a mistake like that, so I don't have to pay attention to mitigating my mistakes like he does."
In our case, we were using this Encrypt() method to experiment with storing some state data in web pages related to the login process. We thought it was secure, because the data was encrypted. Sure it's encrypted! It says Encrypt() right there in the method name, right?
!!!!!!!
There are two cases when you might want to send sensitive data to the end user: You may wish to send them something and have them send it back in an unmanipulated block, or you may wish to have them manipulate it in some way and send it back to you.
In the first case the best practice is to store the data on the server side and send a pointer (key) to the data. The key is inherently meaningless and sparse relative to its space, and is therefore difficult to attack in any meaningful way.
The second case would rely on the web server sending a piece of (javascript) code to the browser to manipulate the crypto data. If you can see the encrypted text and the plaintext javascript routine, you're a trivial hack away from rewriting the javascript to have it change the encrypted text. This moves the problem from a crypto problem to a javascript coding exercise. The proper solution in this case is to use a form of crypto that is outside the realm of the HTTP session (read: SSL).
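The first approach above (store the data server-side, send only an opaque key) is simple enough to sketch in a few lines. This is an illustrative sketch, not anyone's production code; the `SessionStore` name and the dict-backed storage are assumptions standing in for whatever real backing store you'd use.

```python
# Minimal sketch of "send a pointer, not the data": the client only ever
# holds an unguessable random token; the sensitive state never leaves
# the server.
import secrets

class SessionStore:
    def __init__(self):
        # In-memory dict as a stand-in for a real server-side store.
        self._data = {}

    def put(self, value) -> str:
        # 128 bits of randomness: the token space is so sparse relative
        # to the number of live tokens that guessing one is infeasible.
        token = secrets.token_urlsafe(16)
        self._data[token] = value
        return token

    def get(self, token):
        return self._data.get(token)

store = SessionStore()
token = store.put({"user_id": 42, "login_step": 2})

# The client only ever sees the meaningless token...
assert store.get(token) == {"user_id": 42, "login_step": 2}
# ...and a forged token retrieves nothing.
assert store.get("made-up-token") is None
```

Note there's no crypto in it at all, which is exactly the point: the token carries no information to tamper with.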
On how a trivial implementation error (one we're familiar with already) in Debian's OpenSSL means that any message signed with Debian OpenSSL DSA reveals private keys. That's a micro-error; Debian and OpenSSL may have gotten almost everything else right, but fucked up one tiny detail, and now exposing the signatures of certain messages leaks your private key.
This isn't crypto-geek chauvinism. If crypto isn't a big part of what you do in your day-to-day, you're just not going to get this stuff right. That may be the point Jeff is actually trying to make (and the reason he isn't offering a neat solution in his article), but look at the comments on how to "do it right", and you can see that isn't the message that's getting transmitted.
There's a much bigger flaw in Atwood's cryptosystem than has been discussed here --- forget CBC --- but I'm not going to post it, because it will just result in 20 comments about how easy that is to fix, and here's 15 crazy heuristics to do it, so nyah!
I don't know why you'd debate trivia when his key derivation function is just MD5(keystream), and even that is optional (why-oh-why would you have a "bool useHashing"?).
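For contrast with MD5-as-KDF, here's roughly what a real password-based key derivation looks like, sketched with the standard library's PBKDF2 (the function names and parameter choices here are mine, not from the post being discussed): a random salt per derivation, and an iteration count chosen to deliberately burn CPU so brute force is expensive.

```python
# A sketch of proper key derivation, as a contrast to MD5(passphrase):
# PBKDF2-HMAC-SHA256 with a per-key random salt and a high iteration
# count. Iteration count is illustrative; tune it to your hardware.
import hashlib
import os

def derive_key(passphrase, salt=None, iterations=200_000):
    if salt is None:
        salt = os.urandom(16)  # fresh random salt for each new key
    key = hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode(), salt, iterations, dklen=32
    )
    return key, salt

key1, salt = derive_key("hunter2")
# Derivation is repeatable given the same salt...
key2, _ = derive_key("hunter2", salt=salt)
assert key1 == key2
# ...but a fresh salt yields an unrelated key from the same passphrase.
key3, _ = derive_key("hunter2")
assert key1 != key3
```

Nothing here is optional, which is the other half of the critique: a KDF behind a `bool useHashing` flag that can be switched off isn't a KDF at all.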
In this case, the "salt" is completely irrelevant, because there's no precomputed dictionary you can build for this function.
But I still dispute that a distinction needs to be drawn between the word "salt" and "nonce".
Personally, I was curious. Why doesn't a distinction need to be made between a random salt and the same salt used every time? Don't they have dramatically different results in terms of security?
And yeah, his choice of MD5 is an abomination, but that wasn't something that I thought was even nonobvious.
If you're trying to crack a specific password, there's no difference: you still need to hash (password + salt) for all passwords. If you're trying to crack all passwords out of a database, however, you need to hash (password + salt) for all salt. In the latter case, increasing the number of distinct salts in use increases the time-complexity of cracking, without doing anything to the time-complexity of authentication (though requiring O(n) space).
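The time-complexity argument above can be made concrete with a toy cracking loop. This sketch (my construction, using plain salted SHA-256 purely to count hash operations, not as a recommended password hash) shows the attacker's work is one hash per (dictionary word, distinct salt) pair, so per-user salts multiply the cost of cracking the whole database by the number of users.

```python
# Demonstrates the salt time-complexity argument: a dictionary attack
# costs one hash per (candidate, distinct salt) pair, so per-user random
# salts multiply the attacker's work by the user count.
import hashlib
import os

def hash_pw(password: bytes, salt: bytes) -> bytes:
    # Salted SHA-256 as a stand-in password hash (counting operations
    # only -- use a real slow KDF like PBKDF2/bcrypt in practice).
    return hashlib.sha256(salt + password).digest()

dictionary = [b"123456", b"password", b"letmein", b"hunter2"]
users = {f"user{i}": b"hunter2" for i in range(10)}

def crack(db):
    # db maps name -> (salt, stored_hash). One dictionary pass is
    # needed per *distinct* salt in the database.
    hashes_computed = 0
    cracked = {}
    for salt in {s for s, _ in db.values()}:
        for guess in dictionary:
            hashes_computed += 1
            h = hash_pw(guess, salt)
            for name, (s, stored) in db.items():
                if s == salt and stored == h:
                    cracked[name] = guess
    return cracked, hashes_computed

# Same salt for everyone: one pass over the dictionary cracks all users.
shared = os.urandom(8)
db_shared = {n: (shared, hash_pw(pw, shared)) for n, pw in users.items()}
_, work_shared = crack(db_shared)

# Per-user salts: the attacker repeats the dictionary for every salt.
db_salted = {n: (s := os.urandom(8), hash_pw(pw, s)) for n, pw in users.items()}
_, work_salted = crack(db_salted)

assert work_shared == len(dictionary)                 # 4 hashes
assert work_salted == len(dictionary) * len(users)    # 40 hashes
```

Authentication, meanwhile, always costs one hash regardless of how many distinct salts exist, which is the asymmetry the comment is describing.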