
It could add a much-needed impetus for the semiconductor industry to consider security from a much broader perspective.

Today we can't know or analyse the security properties of the silicon that essentially runs the entire world. We simply have to trust the producers, and they have proven unreliable at best. They compete on performance and features, and the result is not that different from the era before cars had safety regulations (and testing), except that so far bad processor security hasn't killed as many people, as far as we know.

Open IP is one way to help address this. There are certainly other avenues, but security seems to be best handled in the open.



If this is true, how come the Spectre/Meltdown discoveries came from external researchers? Black-box testing is essential in security, and the majority of bugs are found with fuzzers and other black-box methods. Just because you have visibility into a product does not mean you are going to spot the bugs, and even less that you will be able to identify the security-related ones.

To give a really simple example, flying is one of the safest activities for humans, yet all of the airplane producers have "closed" development models. It just proves that safety has nothing to do with open or closed development methods. Another angle: do you have the documentation for any bridge you use? I don't think so. Yet bridges are extremely safe to use.

I think people are trying to echo the mantra that open source is safer and that testing is the only way to produce reliable products, when we know for a fact that other engineering disciplines do not rely on open development and testing, yet produce much safer products.
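
To make the black-box point concrete, here is a minimal sketch of a dumb fuzzer in Python: it treats the target as an opaque function, feeds it random inputs, and records anything that blows up. (parse_record is a made-up stand-in with a planted bug, not any real parser.)

    import random

    def parse_record(data: bytes) -> int:
        # Hypothetical target with a planted bug; the fuzzer treats it as opaque.
        if data and data[0] == 0x7F:
            raise ValueError("unhandled record type")
        return len(data)

    def fuzz(target, iterations=100_000, max_len=16, seed=0):
        # Pure black-box loop: no knowledge of the target's internals,
        # just random inputs and crash observation.
        rng = random.Random(seed)
        crashes = []
        for _ in range(iterations):
            data = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
            try:
                target(data)
            except Exception as exc:  # any unexpected exception counts as a finding
                crashes.append((data, exc))
        return crashes

    if __name__ == "__main__":
        found = fuzz(parse_record)
        print(f"{len(found)} crashing inputs" if found else "no crashes found")
        if found:
            print("example:", found[0][0])

No source access is needed anywhere in that loop; in practice you would point a real fuzzer (AFL, libFuzzer, etc.) at a real binary, but the principle is the same.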



