Hacker News

I'm willing to give you the benefit of the doubt and assume you were just unaware of how things are supposed to be done (reporting exploits to the vendors privately and waiting for the fix before going public), but man, you did a fantastically dangerous thing even if it was unintentional.

I'd never condone beating up on somebody on the internet, but I dearly hope you've learned a valuable lesson here. You've put lots of people in danger of being exploited. It's not about whether or not you'd do anything malicious with it, it's about all the other people who now can because Google doesn't have a fix out there yet.



This is the misconception I can't stand: holding individuals responsible for a company's defective product. I thoroughly disagree with the idea that it's his fault people are vulnerable.

So-called "responsible disclosure" is just a marketing-spin term. Disclosing bugs privately is a favour, not a responsibility. All it does is reduce the risk created by bad software decisions. It doesn't solve anything.

How about the free market instead? If you run a multi-billion dollar company that can be hurt by issues like this, then it's on you to make it more profitable to disclose issues privately. If you can't or refuse to do that, then you're exposing your company and your customers to risk. Enough with the shunning and the "responsibility" of individuals who expose bugs.


I sympathize with THIS position. It’s the same blame-shifting crap as when “identity theft” becomes your fault, even though any cashier or clerk can “steal your identity”.

What this marketing spin does is give cover to those who design badly secured systems.

http://www.youtube.com/watch?v=CS9ptA3Ya9E

Also similar is the “jaywalking” idea, invented by car manufacturers to make the default right of way go to cars!

http://amp.charlotteobserver.com/opinion/op-ed/article650322...


[flagged]


Google? When I brought a serious issue to them in 2012 (https://dejanseo.com.au/hijack/), Google never fixed it:

In summary, I can take any of your (or anyone else's) content, pass more PageRank to it than the original page has, and then I become the original page. Not only that, but all your inbound links now count towards my site, and I can see your links in the Search Console for my domain.

This is something link graph theory refers to as "link inversion" and is very harmful to smaller publishers.
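To make the "link inversion" claim concrete, here's a toy power-iteration PageRank over an invented link graph (my own illustration, not Google's actual algorithm or dejanseo's exploit): once the copy attracts more inbound links than the original, it simply outranks it.

```javascript
// Toy PageRank via power iteration. Illustration only: the graph, damping
// factor, and iteration count are invented, not anything Google publishes.
function pagerank(links, damping = 0.85, iters = 50) {
  const pages = Object.keys(links);
  const n = pages.length;
  let rank = Object.fromEntries(pages.map(p => [p, 1 / n]));
  for (let it = 0; it < iters; it++) {
    const next = Object.fromEntries(pages.map(p => [p, (1 - damping) / n]));
    for (const p of pages) {
      const outs = links[p];
      // Dangling pages spread their rank evenly across all pages.
      const targets = outs.length ? outs : pages;
      const share = rank[p] / targets.length;
      for (const t of targets) next[t] += damping * share;
    }
    rank = next;
  }
  return rank;
}

// "original" has one inbound link; the "copy" of its content has three.
const graph = {
  original: [],
  copy: [],
  a: ["original"],
  b: ["copy"],
  c: ["copy"],
  d: ["copy"],
};
const r = pagerank(graph);
console.log(r.copy > r.original); // the better-linked copy outranks the original
```

The point of the sketch is just the mechanism: rank follows inbound links, so whoever accumulates more of them "becomes" the original as far as the graph is concerned.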


I can't speak to that particular exploit, but no matter what you always go to the vendor privately first. Period. If they are uncooperative you can then go public. Not before.


I'm not sure how to respond to your comment (for the record I didn't downvote you). The free market point was obvious to me, but I'll elaborate.

When he chose to expose this bug, either he wasn't aware of an alternative (so-called responsible private disclosure) or that alternative just wasn't appealing enough. Since we're dealing with a company that generates income (indirectly) through the product, they risk financial consequences from this sort of exposure. It follows that doing more to incentivize and raise awareness of their disclosure policy would reduce their risk, which would have a financial impact. It's up to them to decide how much money / effort / resources to spend on reducing that risk.

My stance is that public shunning doesn't solve the problem of releasing buggy software. I'm actually a Google fanboy, but (to me) they could do better. Instead we get "The site is completely removed from their index without any notification." Maybe we need to elevate browser security to the level of Space Shuttle safety? Obviously that costs more and takes longer, slowing innovation, but IMO the market should determine that.

TL;DR: The idea that the individual is responsible for exposing a company's bugs is completely absurd to me. I'll respect you having a different opinion on it.


Please don't do this.

https://news.ycombinator.com/newsguidelines.html (see "idiotic")


I mean, Google has been told (over and over) for a long time that HTTPS doesn't fix trust on the web being broken, and that the back button shouldn't have an API. These are both well documented security problems. What has happened now is that Google is under public pressure and scrutiny to actually fix these things. A fire has been lit under their bum, and rightly so.
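For anyone unfamiliar with the back-button complaint: pages can call history.pushState to stuff junk entries onto the session history so that "back" never leaves the site. Here's a rough sketch (my own illustration; the real API lives on window.history in a browser, so the entry stack is modeled with a plain array here):

```javascript
// Minimal model of History-API back-button hijacking. FakeHistory mimics
// the relevant behavior of window.history: pushState drops any forward
// entries and appends a new one; back() moves the index toward the start.
class FakeHistory {
  constructor(initialUrl) {
    this.entries = [initialUrl];
    this.index = 0;
  }
  pushState(url) {
    this.entries = this.entries.slice(0, this.index + 1);
    this.entries.push(url);
    this.index += 1;
  }
  back() {
    if (this.index > 0) this.index -= 1;
    return this.entries[this.index];
  }
  get current() {
    return this.entries[this.index];
  }
}

// User arrives on a hostile page from a search result...
const hist = new FakeHistory("https://search.example/results");
hist.pushState("https://trap.example/");

// ...and the page immediately stuffs junk entries onto the stack,
// so the next several "back" presses stay on trap.example.
for (let i = 0; i < 10; i++) {
  hist.pushState("https://trap.example/#" + i);
}
console.log(hist.back()); // still a trap.example URL, not the search page
```

This is exactly why people argue the back button shouldn't be scriptable at all: the user's one reliable escape hatch becomes something the page can intercept.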


I believe they did remove the green lock from https sites to avoid implying trustworthiness. And removing the back button API is something Google can't decide on their own; it has to go through the standards process.


> I believe they did remove the green lock from https sites to avoid implying trustworthiness.

Nope. I'm on build 68.0.3440.106, the latest public stable build, and as I'm writing this comment there's a little green lock and "Secure" right next to https://news.ycombinator.com.


It's a process. The green lock will eventually be removed, and instead of the "LOCK Secure" you see now, you'll see nothing, while http-only sites will be marked "Not Secure". You can already see this if you go to a non-HTTPS site like neverssl.com: there's a "Not Secure" banner in white.


Trying to be objective and understand my own motivations here. Obviously I didn't do anything out of malice. But yes, I could have told Google directly about the problem, but then I'd have no cool story to publish on my blog. At the end of the day, that's what it boils down to. Now that I got too much attention from it, I regret all of it.


"I could have told Google directly about the problem, but then I'd have no cool story to publish on my blog"

First of all, you definitely would. Standard practice is 1) report the bug privately, 2) wait for a fix, 3) get the go-ahead to publish your report and take credit publicly. That's how it always works; that's how security researchers build their reputations and careers. I guess you just weren't aware of that.

Second of all, even if you wouldn't get to publish it, that is horribly selfish reasoning. Putting millions of people at risk of having their information stolen for the sake of a popular blog post?


I fail to see how dejanseo put people at risk. Exposing how a tool is dangerous and poorly conceived isn't the same as conceiving a dangerous tool.

In this case, Google put millions of people at risk, and dejanseo actually contributed to saving them.


Right; but sometimes someone really is the first to have an idea or notice a vulnerability, even if it seems trivial to them. Once it's public, that obscurity is gone, so it's a good idea to give the vendor a chance to remove the vulnerability before disclosure eliminates it. Obscurity does actually matter in the real world, even though it is a useless design principle.


That's right.

But while there are a lot of domains where I don't accept the reasoning "someone else must have thought of this before", vulnerability finding is one where I can't help but believe that every publicly disclosed vuln has probably been secretly exploited and sold for years.

(The only data point I have behind that is that there are nation-state agencies pretty much dedicated to finding these, and they've gotten really good at it (cf. Stuxnet!).)

So, while this is conviction only, I highly doubt any independent white/gray-hat vuln finder will ever be the first to find one, and I applaud any kind of disclosure.


Yes, the reveal is required. But it doesn’t have to happen without the vendor’s knowledge. The rush to get it out without allowing the vendor to respond is unjustified and reckless. The TLAs using the vuln are keeping it a secret, after all, and the script kiddies enjoy public trashing of people, which I think is worse than the TLAs’ careful abuse.



