
There have been serious bugs in widely used open source security code: Heartbleed in OpenSSL, Shellshock in Bash, etc.

There are state actors that will pay to find such bugs, or to plant them (and the Underhanded C Contest shows how hard deliberately planted bugs are to find).
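For a sense of how subtle these bugs are: Heartbleed came down to trusting an attacker-supplied length field. Here's a simplified sketch of that class of bug in C (hypothetical code, not the actual OpenSSL handler; the function and names are illustrative):

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical heartbeat handler: echo back a payload whose length
       the client declares in the first two bytes of the request. */
    unsigned char *handle_heartbeat(const unsigned char *req, size_t req_len,
                                    size_t *resp_len)
    {
        size_t claimed = ((size_t)req[0] << 8) | req[1]; /* attacker-controlled */

        /* The missing one-line fix:
           if (claimed + 2 > req_len) return NULL; */

        unsigned char *out = malloc(claimed);
        if (out == NULL)
            return NULL;
        /* BUG: copies 'claimed' bytes even if the request was shorter,
           reading past req and leaking adjacent heap memory back to the peer. */
        memcpy(out, req + 2, claimed);
        *resp_len = claimed;
        return out;
    }

The whole defect is the absence of one bounds check, sitting in plain sight in reviewed, open code.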

Open source is not an ironclad guarantee.




Closed source is an ironclad guarantee that the source cannot be audited independently.


And yet companies hire out security audits on a regular basis, frequently as a part of their contracts.

Even Windows has source available, if you pay for the pleasure.


How much more secure does a theoretical audit that never actually happens make you?


I don't believe claims of perfection were made. Merely claims of a standard of excellence.


And I wouldn't say standard of excellence--just the possibility of excellence. With closed source you can't have even that. You pretty much have to assume the worst.


Honest question: how do you feel about Google's security history?


I suspect they do a good job at a technical level, but they have interests in advertising that are at odds with my privacy interests. I also have no way to know if they are doing Dark Nefarious Things.


I totally agree with you.

From your initial comment, I got the impression that you were claiming that closed source software could not be excellent, meaning secure.

By the way, I'm a big GPL/free software nut. However, I understand the place of closed source software, and under the right circumstances, it can indeed be excellent.


This is where I've come around to appreciating the FSF's moral argument for Free software a bit more than the instrumental-utility argument of the Open Source movement.

Open Source can be bad, in terms of quality. Closed source can be good, in terms of quality.

Security is an interesting case where I don't believe that you can be trustworthy and closed. Could the code be good? Yes. Can I validate in any meaningful sense that it doesn't violate my expectations? No.

Of course it's possible to have obfuscated malicious behavior in Free/Open Source software. But there is at least the possibility of discovery of such defects. With closed source, there isn't.
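For example, a deliberately planted flaw can look like an ordinary typo. Here's a hypothetical sketch in the spirit of the attempted 2003 Linux kernel backdoor (not any real project's code; names and flag values are made up):

    #include <stdio.h>

    struct session { int uid; };

    /* Looks like a routine validity check on a rare flag combination.
       The single '=' where '==' belongs silently sets uid to 0 (root). */
    static int check_access(struct session *s, int flags)
    {
        if ((flags == 0x60) && (s->uid = 0))
            return -1;                  /* "reject" path: never taken, 0 is false */
        return s->uid == 0 ? 1 : 0;     /* uid 0 gets full access */
    }

    int main(void)
    {
        struct session s = { .uid = 1000 };
        check_access(&s, 0x60);         /* ordinary-looking call with magic flags */
        printf("uid after check: %d\n", s.uid);  /* prints 0: silently escalated */
        return 0;
    }

The real 2003 attempt was noticed because the modified CVS mirror no longer matched the canonical repository history; that kind of comparison is exactly what closed source takes away.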


We'll have to diverge a little, then, but not too much.

In some cases, such as the security-sensitive code written at Google, there are far more eyes on the code than there are on all too much of the critical, security-sensitive open source code.

In my mind, it's a matter of alignment of interests.

For the cases of 'run of the mill' security questions, such as buffer overflows, password leaks and the like, Google, Facebook and I have fully aligned interests. None of us want those things in anybody's code.

Things get harder for other security questions, such as data collection, and cooperation with surveillance, legal and otherwise.

In the latter case, state surveillance, Google (and other like entities) have interests that are mostly aligned with mine, but not entirely. They're pushing back on warrantless, Patriot Act-type crap, while efficiently complying with traditional directed warrant disclosures. (As far as we know!)

Fully open source and free software will almost always have full alignment with my interests, and so is better in that regard.

As far as code-level bugs go, I think the general rule is that closed source code at most companies is pretty poor, with a few notable exceptions where things are a heck of a lot better.

Finally: Wow, I just reminded myself that this thread is about a new Amazon product. Crazy. (:


But at least you have a development history with open source. If there's a bug/vulnerability, its origins can be investigated.





