
Project Zero is a (very public) Google project though. If they stand behind their choices and policies, they should live by them.


In what way is Google not standing by their policies (for example, have they criticized or tried to prevent this person from disclosing publicly)?


The clear implication is that they didn't fix the bug within the same time frame.


What is the thing being implied? Like as far as I can tell, Google's position seems to be that "it is best if vuln researchers have the freedom to disclose unfixed issues, especially after reporting them".

People criticize P0 for publishing issues despite companies asking for extensions. But we're criticizing Google here for...what? They didn't ask for an extension, they didn't try to prevent this person from disclosing. Where is the hypocritical thing?


Might be worth noting: 90 days is how long Google thinks it is reasonable to keep a vulnerability secret without a fix.

Beyond that point, the benefit of the public knowing about it outweighs the risks of disclosure.

Not all vulnerabilities can be fixed in 90 days, but they can be disclosed.


The complaint is that Google's stance with Project Zero is "90 days is plenty sufficient; you're a bad vendor if you can't adhere to it", and then Google itself doesn't adhere to it, which implicates themselves here.

I see what they're saying if you lump them together; I just think it makes sense to treat P0 a little independently from Google. But otherwise it's got a point.


Can you point out the second part, specifically where "you're a bad vendor if..." is either stated or implied by P0?

See instead https://news.ycombinator.com/item?id=27680941, which is my understanding of the stance p0 takes.


> See instead https://news.ycombinator.com/item?id=27680941, which is my understanding of the stance p0 takes.

That's a common sentiment I just don't buy. People here love to hand-wave about some vague "benefit to the public", and maybe there is some benefit when the vulnerability can be mitigated on the user side, but that literally cannot be the case for the fraction of vulnerabilities about which entities other than the vendor can do nothing. The only "benefit" is that it satisfies people's curiosity, which is a terrible way to do security. Yet P0 applies that policy indiscriminately.

> Can you point out the second part, specifically where "you're a bad vendor if..." is either stated or implied by P0?

As to your question of where this is implied by P0: to me, their actions and the lack of a compelling rationale for their behavior, as I explained above, are already enough to imply it. But if you won't believe something unless it's in an actual quote from them, here's something you can refer to [1]:

- "We were concerned that patches were taking a long time to be developed and released to users"

- "We used this model of disclosure for over a decade, and the results weren't particularly compelling. Many fixes took over six months to be released, while some of our vulnerability reports went unfixed entirely!"

- "We were optimistic that vendors could do better, but we weren't seeing the improvements to internal triage, patch development, testing, and release processes that we knew would provide the most benefit to users."

- "If most bugs are fixed in a reasonable timeframe (i.e. less than 90 days), [...]"

All the "reasonable time frame (i.e. < 90 days)", "your users aren't getting what they need", "your results aren't compelling", "you can do better", etc. are basically semi-diplomatic ways of saying you're a bad vendor when you're not meeting their "reasonable" 90-day timeline.

[1] https://googleprojectzero.blogspot.com/p/vulnerability-discl...


They literally directly describe it as a benefit to users, the sentiment you don't buy, and they don't ever actually call vendors bad, unless you interpret the reduced benefit to users as a moral impugnment of the vendors.

What you cite proves my point!


> They literally directly describe it as a benefit to users

"It" in that sentence does not refer to their own unpatched disclosures.

> They don't ever actually call vendors bad, unless you interpret the reduced benefit to users as a moral impugnment of the vendors. What you cite proves my point!

Okay well now I'm definitely convinced.


They didn't fix it within that timeline. I don't know why everyone is saying "well they didn't stop disclosure in 90 days", but they didn't fix it in the timeline that they have allocated as being reasonable for all vulns they report.


At the limit, what you're saying would mean that vendors should feel obligated to fix issues they don't consider to be vulnerabilities, as long as they're reported as such. That'd clearly be absurd. Is there maybe some additional qualifying factor that's required to trigger this obligation that you've left implicit?


> what you're saying would mean that vendors should feel obligated to fix issues they don't consider to be vulnerabilities

Why would it?

> Is there maybe some additional qualifying factor that's required to trigger this obligation that you've left implicit?

That they consider it a vulnerability seems fine.


If you're leaving the determination to the vendor, they could just avoid the deadline by claiming it is not a vulnerability. That seems like a bad incentive.

There are things that literally cannot be fixed, or where the risk of the fix is higher than the risk of leaving the vulnerability open. (Even if it is publicly disclosed!)

It seems that we're all better off when these two concerns are not artificially coupled. A company can both admit that something is a vulnerability, and not fix it, if that's the right tradeoff. They're of course paying the PR cost of being seen as having unfixed security bugs, and an even bigger PR cost if the issue ends up being exploited and causes damage. But that's just part of the tradeoff computation.


I don't know what point you're trying to make here. Google acknowledges that this is a vulnerability ("nice catch"), Google pushes every other company to fix vulns in 90 days (or have it publicly disclosed, which is based on the assumption that vulns can be fixed in that time), and Google did not fix it in 90 days.

If you're asking me to create a perfect framework for disclosure, I'm not interested in doing that, and it's completely unnecessary to make a judgment of this single scenario.

> A company can both admit that something is a vulnerability, and not fix it, if that's the right tradeoff.

Google's 90 days policy is designed explicitly to give companies ample time to patch. And yes, this is them paying the PR cost - I am judging them negatively in this discussion because I agree with their 90 day policy.


I am saying that there are things that are technically vulnerabilities that are not worth fixing. Either they are too risky or expensive to fix, or too impractical to exploit, or too limited in damage to actually worry about. Given the line you drew was that there must be a fix in 90 days, if the company agrees it is a vulnerability, the logical conclusion is that the companies would end up claiming "not a vulnerability" when they mean WONTFIX.

If you think this particular issue should have been fixed within a given timeline, that should be argued on the merits of the issue itself, not just by following an "everything must be fixed in 90 days" dogma. All that the repeated invocations of PZ have achieved is to drown out any discussion of the report itself: how serious/exploitable it actually is, how it could be mitigated/fixed, what might have blocked that being done, etc. Those would have been far more interesting discussions than a silly game of gotcha.

(If you believe there is no such thing as a vulnerability that cannot be fixed, or that's not worth fixing, then I don't know that we'll find common ground.)


> Given the line you drew was that there must be a fix in 90 days, if the company agrees it is a vulnerability, the logical conclusion is that the companies would end up claiming "not a vulnerability" when they mean WONTFIX.

OK, but that doesn't apply here, which is why I don't get why you're bringing up general policy issues in this specific instance. Google did acknowledge the vulnerability, as noted in the disclosure notes in the repo.

So like, let me just clearly list out some facts:

* Project 0 feels that 90 days is a good timeline for the vast majority of vulns to be patched (this is consistent with their data, and appears accurate)

* This issue was acknowledged by Google, though perhaps not explicitly as a vulnerability, all that I can see is that they ack'd it with "Good catch" - I take this as an ack of vulnerability

* This issue is now 3x the 90 day window that P0 considers to be sufficient in the vast majority of cases to fix vulnerabilities

I don't see why other information is supposed to be relevant. Yes, vendors in some hypothetical situation may feel the incentive to say "WONTFIX" - that has nothing to do with this scenario and has no bearing on the facts.

> If you think this particular issue should have been fixed within a given timeline, it should be on the merits of the issue itself.

That's not P0's opinion in the vast majority of cases - only in extreme cases, to my knowledge, do they break from their 90 day disclosure policy.

> Not just by following a "everything must be fixed in 90 days" dogma.

Dogma here is quite helpful. I see no reason to break from it in this instance.

> Seems like those would have been far more interesting discussions than a silly game of gotcha.

I'm not saying "gotcha", I'm saying that:

a) 9 months to fix this feels very high, Google should explain why it took so long to restore confidence

b) The fact that they have an internal culture of 90 days being a good time frame for patching merely makes it ironic - it is primarily the fact that I think this should have been patched much more quickly that would bother me as a customer.

> (If you believe there is no such thing as a vulnerability that cannot be fixed, or that's not worth fixing, then I don't know that we'll find common ground.)

Nope, 100% there are vulns that can't be fixed, vulns that aren't worth fixing, etc. But again, Google didn't say this was a "WONTFIX" though, and they did ack that this is a vuln. If it wasn't possible to fix it they could say so, but that isn't what they said at all, they just said they weren't prioritizing it.

If it's the case that this simply isn't patchable, they should say so. If they think this doesn't matter, why not say so? It certainly seems patchable.


> OK, but that doesn't apply here

It's not what happened, but the logical outcome of what you propose. Right now the rules are simple: "disclosure in 90 days, up to you whether to fix it". What you're proposing is that it is no longer up to the company to make that tradeoff. They must always fix it.

> That's not P0s opinion in the vast majority of cases - only in extreme cases, to my knowledge, do they break from their 90 day disclosure policy.

Again, that is a disclosure timeline, not a demand for a fix in that timeline. In general it's in the vendor's best interest to release a fix within that window, especially given the deadline's immutability. You're trying to convert it into a demand for a fix no matter what. That is not productive.

> a) 9 months to fix this feels very high, Google should explain why it took so long to restore confidence

So why not argue for that explicitly? It seems like a much stronger approach than the "lol PZ hypocrisy" option.


You're trying to talk about consequences of my statement, which I'm trying very hard not to talk about, because I don't care. I'm only talking about this very specific instance.

> Again, that is a disclosure timeline. Not a demand for a fix in that timeline.

Yes and it is based on the expectation of a fix within that timeline being practical.

> You're trying to convert it to a demand for a fix no matter what. That is not productive.

No I'm not, you're trying to say that I am, repeatedly, and I keep telling you I don't care about discussing disclosure policy broadly. I'm only talking about this one instance.

> It seems like a much stronger approach than the "lol PZ hypocrisy" option.

Take that up with the person who posted about P0 initially. I'm only saying that it's ironic and that I support the 90 day window as being a very reasonable time to fix things, and that them going 3x over is a bad look.


Sure, but how is that hypocritical, which is the question I asked that you initially responded to?


Replace "ironic" with hypocritical and I think it's still pretty fair. Less so, strictly.


> Again, that is a disclosure timeline. Not a demand for a fix in that timeline. In general it's in the vendors best interest release a fix in that timeline, especially given its immutability. You're trying to convert it to a demand for a fix no matter what.

I don't see what form it would come in if it were a demand in your view. We have a disagreement over private entities over a vulnerability; how would one "force" the other to do that except by disclosing it? Hold someone hostage?


> Google pushes every other company to fix vulns in 90 days (or have it publicly disclosed)

I believe you're mistaken about the conditional publishing. The 90-day clock starts when Google reports the bug - they will make it public whether or not the vulnerability is remediated (with very few exceptions). By all appearances, Google is very willing to be on the receiving end of that, on the basis that end users can protect themselves once they have the knowledge - in this case, GCE users now know that their servers are exploitable and can make changes, like moving to AWS. I think the 90-day clock is a reasonable stance to take, for the public (though not necessarily for the vendor).


I'm totally aware of all of this and I strongly agree with P0's policy.



