
At least some types of safety-critical software require binary analysis, which is expensive but addresses both this and the more banal compiler bugs.



Binary analysis only addresses this if you manage to analyze everything involved in the system: OS kernel, utilities, device drivers, bootloader, ROM BIOS, etc. And then you have to not let any untrusted tool touch any of those components.

After all, analyzing the binary file only protects you if you can guarantee that the same bits get loaded into memory and executed unmodified.

Thompson's technique could easily have been extended so that the code that loaded executable pages would look for a binary signature of the login code and modify it at load time. If you do the same thing for the binary signature of the file-loading code, you get the exact same thing Thompson described, but at the level of binary machine code going from disk to memory, instead of at the level of code going from source to executable.
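A hedged sketch in C of what that load-time variant might look like. The function name, signature bytes, and patch bytes are all invented placeholders; a real attack would match the actual machine code of the login routine and splice in a working backdoor:

    #include <stddef.h>
    #include <string.h>

    /* Invented placeholder byte patterns standing in for "the compiled
     * login code" and "the backdoored replacement". */
    static const unsigned char login_sig[] = { 0x55, 0x48, 0x89, 0xe5 };
    static const unsigned char backdoor[]  = { 0x90, 0x90, 0x90, 0x90 };

    /* Hypothetically called on every executable page as it is read from
     * disk.  The file on disk stays clean, so analyzing the binary on
     * disk reveals nothing; only the in-memory image gets modified. */
    void patch_page(unsigned char *page, size_t len)
    {
        for (size_t i = 0; i + sizeof login_sig <= len; i++) {
            if (memcmp(page + i, login_sig, sizeof login_sig) == 0) {
                memcpy(page + i, backdoor, sizeof backdoor);
                return;
            }
        }
    }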

However many levels you analyze, in theory the adversary could have gone one level further in their attack, unless you build everything yourself (including the CPU, presumably).


Yeah. Even after a full analysis, if you only analyze the differences in each later version after that, you could still easily miss a Black Sunday attack. http://www.codinghorror.com/blog/2008/05/revisiting-the-blac...


Unless, of course, your hardware was backdoored when it was made (or do you trust all hardware manufacturers, including those in China under potential pressure from their government...?). See for example King et al., "Designing and implementing malicious hardware"; an online version appears to be here: http://www.usenix.org/event/leet08/tech/full_papers/king/kin...


The binary file reader could have been compromised.


In real life, this is a non-issue. Even if one piece of software is compromised, an attacker cannot compromise all the parts of the system so that they align perfectly.

For example, suppose someone compromised your C compiler so that it adds a secret login when it compiles the UNIX sources, and also re-inserts the same feature when it compiles the compiler's own code.
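A rough, invented sketch of what that pattern match might look like inside the compromised compiler; the function names and matched strings are made up, and Thompson's real version recognized the actual UNIX and compiler sources:

    #include <string.h>

    void emit_normal_code(const char *src);
    void emit_login_with_backdoor(const char *src);
    void emit_compiler_with_both_hacks(const char *src);

    /* The compromised compile step: it peeks at the raw source text and
     * special-cases the two programs it cares about. */
    void compile(const char *src)
    {
        if (strstr(src, "login(") != NULL) {          /* compiling login.c? */
            emit_login_with_backdoor(src);
        } else if (strstr(src, "compile(") != NULL) { /* compiling the compiler? */
            emit_compiler_with_both_hacks(src);       /* re-insert both hacks */
        } else {
            emit_normal_code(src);
        }
    }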

You can modify the compiler code so it accepts a Modified C Language (MCL). You can then write a transforming program that converts C into MCL, and compile your new compiler code with the compromised compiler. This will give you an MCL compiler. It will still contain the backdoor logic, but that logic will now be irrelevant: any pattern matching is useless against your new language, since the attacker cannot know in advance which transformations you applied to it.

After that you can take the compiler source, apply the MCL transformation, compile the result with your new compiler, and you will get a "clean" version of the C compiler.
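A minimal sketch of the kind of C-to-MCL pass being described, assuming the "new language" is just a private renaming of keywords; the renamings below are arbitrary examples, and a real pass would also have to leave strings and comments alone:

    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    /* Private, arbitrary renamings the attacker cannot have predicted;
     * the modified compiler is taught to accept the new spellings. */
    static const char *table[][2] = {
        { "int",    "mclnum"  },
        { "while",  "mclloop" },
        { "return", "mclgive" },
    };

    int main(void)
    {
        char word[256];
        size_t n = 0;
        int c;

        while ((c = getchar()) != EOF) {
            if (isalnum(c) || c == '_') {
                if (n + 1 < sizeof word)
                    word[n++] = (char)c;
                continue;
            }
            if (n > 0) {                  /* end of an identifier or keyword */
                word[n] = '\0';
                const char *out = word;
                for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
                    if (strcmp(word, table[i][0]) == 0)
                        out = table[i][1];
                fputs(out, stdout);
                n = 0;
            }
            putchar(c);
        }
        if (n > 0) {                      /* flush a trailing word */
            word[n] = '\0';
            fputs(word, stdout);
        }
        return 0;
    }

The point is only that whatever text patterns the compromised binary carries were written against C source, and after a private renaming like this they never match.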


"You can then write a transforming program that converts C into MCL. Then you will compile your new compiler code with the compromised compiler. This will give you MCL compiler. It will still contain the backdoor logic. However, that logic will be deemed irrelevant."

If your transformations preserved behavior, wouldn't the backdoor still be there?

What does it matter which language the compiler's source is written in, as long as the behavior remains the same?


The original scheme recognized a particular sequence of code; recognizing a particular behavior no matter how it's implemented is, in general, equivalent to the halting problem, I think.


Write it yourself?


...on an operating system you wrote yourself (or fully understand), on a chip you fabricated from a design you thoroughly understand



Better link: http://www1.idc.ac.il/tecs/

But yes, awesome book. :)


But who trusts you or me, anyways?


The compiler could've been compromised too, as Ken Thompson described.





