I can't speak for Ruby, but in both Java and Python I've never seen a bug this embarrassingly bad. I mean, this bug simply would not have happened in a language with a working boolean type, or with exceptions (I trust the ZeroMQ guy's judgement about as far as I can throw him).
> Bogus objects — properly tagged words with invalid addresses that pointed at uninitialized memory or into the middle of an object of a different type — which would cause the GC to corrupt memory would be left in registers or on the stack. These sorts of problems were everywhere in the microcode.
Actually, C is an excellent choice for security-relevant systems software, because the issues with developing in C are well understood and can easily be mitigated by following 30 years' worth of best-practice patterns and using the correct development tools.
The issue is that developers are not using the tools or following the best practices, because they think they know better than 30 years' worth of experience, or they get caught up in bikeshedding about ideology, licenses, and which line the curly braces go on.
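Concretely, "the correct development tools" starts with the compiler's own diagnostics plus the sanitizers. A minimal sketch (the flags are the usual GCC/Clang ones; treat the exact set as a starting point, and the snippet is just an illustration):

    /* buggy.c -- a one-byte heap overflow that AddressSanitizer flags at
       runtime. Build and run with:
           cc -Wall -Wextra -fsanitize=address,undefined -g buggy.c && ./a.out */
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *buf = malloc(8);
        strcpy(buf, "12345678");   /* writes 9 bytes (incl. '\0') into 8 */
        free(buf);
        return 0;
    }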
"Actually C is an excellent choice for security relevant systems software because the issues for developing in C are well understood and can easily be mitigated by following 30 years worth of best practice patterns and using the correct development tools."
Never mind the copious undefined behavior, the fact that C programmers sometimes struggle to figure out what a valid C expression actually does, the fact that C programmers have to choose between code bloat and using "goto" for finalization, the fact that there are no standard error-handling constructs, the fact that strings are null-terminated, the lack of a standardized way to determine array lengths at runtime, etc., etc., etc. Even something as simple as this:
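Here's a minimal sketch of the array-length gotcha (assuming a typical 64-bit platform; the snippet is illustrative):

    #include <stdio.h>

    /* The parameter silently decays to a pointer, so sizeof measures
       the pointer, not the array -- the length is lost at the call. */
    void print_len(int arr[10]) {
        printf("inside:  %zu\n", sizeof(arr) / sizeof(arr[0]));  /* 2, not 10 */
    }

    int main(void) {
        int arr[10] = {0};
        printf("outside: %zu\n", sizeof(arr) / sizeof(arr[0]));  /* 10 */
        print_len(arr);
        return 0;
    }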
Another good read (it probably does not reflect how you want to write C code, and the rule about dynamic allocation is probably extreme if you are not writing code to fly spaceships, but I think it is good to read regardless): http://lars-lab.jpl.nasa.gov/JPL_Coding_Standard_C.pdf
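To give a flavor of that rule, here is a sketch of the no-heap-after-initialization style it mandates (the pool size and names are invented for illustration):

    #include <stddef.h>

    #define MAX_MSGS 64

    struct msg { int id; char payload[256]; };

    /* Everything is sized at compile time; there is no malloc/free at
       runtime, so no fragmentation, no leaks, no out-of-memory surprises. */
    static struct msg msg_pool[MAX_MSGS];
    static size_t msgs_used = 0;

    /* Hand out slots from the static pool; NULL when exhausted. */
    static struct msg *msg_alloc(void) {
        return (msgs_used < MAX_MSGS) ? &msg_pool[msgs_used++] : NULL;
    }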
Preferably something generated from a formal proof tool. It wouldn't be perfect (nothing is), but the mistakes would be less stupid. Which is a big jump!
There's a bit of research into this, but I don't know of anything that's ready to use.
In a few years, Go. Its native crypto libraries are impressive already, though not thoroughly vetted.
There's nothing wrong with C, though, as long as it's carefully written. Its biggest asset is that every popular language can link to it.
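As a sketch of what that linkability buys you (the file and symbol names are invented for illustration):

    /* add.c -- a tiny C "library" with a plain ABI that virtually every
       language's FFI (Python ctypes, Java's FFM, Go's cgo, ...) can call.
       Build as a shared object:  cc -shared -fPIC -o libadd.so add.c */
    int add(int a, int b) {
        return a + b;
    }

From Python, for instance, ctypes.CDLL("./libadd.so").add(2, 3) calls straight into it with no wrapper code at all.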