Google's Project Zero exposes unpatched Windows 10 lockdown bypass (zdnet.com)
225 points by _o_ on April 21, 2018 | 118 comments



To people calling this a dick move by Google, I encourage you to look at the actual issue in Monorail. The reason given for not extending the deadline was that the issue is not particularly severe, and there are also similar bypass issues which are currently unpatched. If it isn't going to help protect customers, what's the point in granting an exception?

https://bugs.chromium.org/p/project-zero/issues/detail?id=15...


My thinking would be exactly the opposite - severe bugs have to be disclosed sooner while non-severe can be left lingering around.


And Google has actually had a track record of using that reasoning as well, so it's not like their words actually mean much. Great example of a drive-by high-sev exploit getting exposed long before Microsoft could patch: https://www.digitaltrends.com/computing/google-project-zero-...

I appreciate the work they do, and I sure as hell appreciate the talent, but Google is mostly treating this entire endeavor as a giant marketing and recruiting trick.

(It worked well; they stole my favorite pentester from one of my preferred boutique consulting firms.)


“Long before Microsoft could patch” meaning when Microsoft decided to cancel a patch Tuesday? If Microsoft decided that the correct patch cadence was quarterly or annually (because so much QA work goes into a release), does that change what a disclosure deadline should look like?

Also in the bug you’re referring to, Google expresses surprise Microsoft let it get through because of the severity, and declined to comment on details that would only help exploitation. I also don’t see anything on the bug re: will disclose _because_ sev:hi (your core argument AIUI); but I agree that it doesn’t really impact what their policy should be.

You say Google’s words don’t mean anything, but it sounds like you’re advocating for their 90 day disclosure policy to not mean anything whenever a company doesn’t get its act together in time, which I’m sure isn’t your intention. P0 might have some of the best pentesters in the world (it does) but that doesn’t matter much if everyone else gets a ton more time to find and exploit bugs.


> If Microsoft decided that the correct patch cadence was quarterly or annually (because so much QA work goes into a release), does that change what a disclosure deadline should look like?

Microsoft uses a regular patch cadence so enterprise users can allocate the necessary resources to review updates and roll them out to their computers. There are scores of enterprises that run tens of thousands or hundreds of thousands of computers per deployment.

The manner of Google's expected information release puts many end-customers at risk. It is true that in this case the vulnerability was due to Microsoft, but the release of exploitable information will put enterprises at risk. That is why Microsoft asked for additional time. This is a business decision not a technical decision.

Sure, Microsoft may have encountered technical issues in their fixing of the issue, but the risk upon exploit information being released is the issue. Google shoulders that risk all by themselves.


As for "Google shoulders that risk all by themselves": this is clearly not the case. If an attacker discovers this issue and discloses it, then by your logic the fault would be all theirs.

The fault is with the creator of the software, not the discoverer of the flaw.

This broken logic has been around since at least when "Stalking the Wily Hacker" was released. Those who make the error and release broken software and systems should be held accountable, instead of throwing teenagers in jail.

Don't release software, especially important software, unless it is finished.


I already demonstrated that this is false: there are tons of examples of Google giving MSRC grace periods to hit a patch Tuesday, and in the example GF cited, Microsoft _skipped the patch Tuesday_!


That doesn't demonstrate this is false. This is true, because these are the actions being taken at this time... That different things happened for other vulnerabilities is not relevant to this situation. Google is threatening to publicly release an exploit for which Microsoft has admitted they are already working on a fix.

Google is not a lawmaker, and if they continue to release 0-day exploits in this manner, even after being instructed otherwise by the vendors, at some point they will be made to shoulder some of the burdens of their having done so. Google knows this to be true, or they would not have held some recent vulnerabilities past their stated 90 day release window.

Thoughts about the relative technical merits of the companies or their source code don't come into play here. These are business decisions that affect real world companies and people.


Statements like this suggest a fundamental misunderstanding of what a vulnerability is. Google will never be held responsible for someone else’s mistakes, no matter how they make those mistakes known.

Your speculation about the reasoning for the extension could not be more off. All you have to do is read their blog discussing their disclosure policy to understand why those exist.


> Statements like this suggest a fundamental misunderstanding of what a vulnerability is.

That's simply absurd. Stop trying to turn a discussion into an argument.

> Your speculation about the reasoning for the extension could not be more off.

What was my speculation about Google's reasoning? I made no speculation whatsoever about it. Their stated reasoning in a blog post doesn't matter; their actions matter.


You're reading far more into my comment than I put to paper, but I'll indulge.

> If Microsoft decided that the correct patch cadence was quarterly or annually (because so much QA work goes into a release), does that change what a disclosure deadline should look like?

Absolutely and enthusiastically yes, and for absolutely the reason you wrapped in parens.

When so much software runs on your platform, availability matters (and is a critical component of security, which I feel Google doesn't quite understand for reasons not entirely related to p0). QA-test the hell out of a patch unless there's evidence of 0d or imminent exploitation. Plenty of examples exist where that kind of regression testing was provably necessary, such as this one case:

https://technet.microsoft.com/library/security/MS15-011

https://blogs.technet.microsoft.com/srd/2015/02/10/ms15-011-...

I'm happy as hell it wasn't p0 who found that one.


What did I read into your comment that you didn't say? You claimed Google uses that line of reasoning when it suits them; I refuted that.

Your argument only works if a few things are true:

* P0 is unwilling to budge from the 90 day disclosure if a bug is legitimately hard to fix. But that isn't true: for example, they kept Spectre/Meltdown under wraps for a very long time. It's not just bugs that conveniently affect Google, either: plenty of Windows issues were given grace periods (usually to hit a patch Tuesday). They've even re-restricted bugs after MSRC _failed to request a grace period in time_ (e.g. P0-395).

* If a bug was being exploited, you'd know. (If this isn't true, delay just means attackers have more time to exploit the bug.) But that isn't (generally) true: plenty of bugs are hard to detect remotely, and we have no clue what hoard attackers are sitting on.

Never mind the fact that the onus is on Microsoft to show that a grace period is warranted (attackers aren't nice enough to leave them a detailed reproducer), can we even come up with a plausible reason for this bug being delayed that isn't "we didn't prioritize it"? Is there code that legitimately tries to load the wrong DLL? If your argument is just "QA should win by default" and mine is "disclosure should win by default", we're just going to have to agree to disagree. Vendors do not get to arbitrarily model their business to manipulate how disclosure works. Attackers don't care.


> Vendors do not get to arbitrarily model their business to manipulate how disclosure works. Attackers don't care.

Right, hence my earlier point, emphasis added:

> When so much software runs on your platform, availability matters […]. QA-test the hell out of a patch unless there's evidence of 0d or imminent exploitation.

At the expense of sounding like a broken record: (from an arguably oversimplified angle), confidentiality, integrity, and availability all matter.


I already addressed why that doesn't really hold water: it assumes that you're likely to know if a bug is being exploited or not. Google found the bug. They're already doing the free research, giving Microsoft a reproducer, and giving Microsoft a well-established policy for when they're going to go public with the bug. Are you suggesting they're responsible for knowing if a bug is being exploited in the wild, or that we should take Microsoft's word for it if it is or not?

To be clear: Microsoft can do whatever they want with the bug they themselves found too. (I imagine their internal teams would want similar policies to make sure that they can hold internal teams accountable for fixing their bugs, though, but whatever, that's on them.)

You are again only interacting with a tiny part of my argument. We're taking it as read that somehow this bug requires significant QA. Can we agree that some bugs don't need 6 months of intense QA to fix? A UAF is a UAF.


> I already addressed why that doesn't really hold water: it assumes that you're likely to know if a bug is being exploited or not. Google found the bug. They're already doing the free research, giving Microsoft a reproducer, and giving Microsoft a well-established policy for when they're going to go public with the bug. Are you suggesting they're responsible for knowing if a bug is being exploited in the wild, or that we should take Microsoft's word for it if it is or not?

Google is operating with limited-to-zero information on what exactly breaks when the bug is fixed, and Windows (or even just the .net framework) is a behemoth. Google and Microsoft do both have threat intelligence groups, but that's a separate thread.

When you're producing software designed to run on many configurations with absolutely stable operation for at least a month at a time, it's extremely, extremely hard to say that "some bugs don't need 6 months of intense QA to fix." I'm not an engineer at Microsoft, but so long as Microsoft is giving routine updates on a private channel—which they did in this case—as to why it's taking so long, it at least signals that the team is in fact actively working towards a resolution.

In fact, with this specific defect, Google applied its own rules/practices to decide whether Redstone 4 (my assumed reading of what "RS4" means) would count as a broad patch, whereas Microsoft considers minor Windows 10 releases to have identical hardware requirements and to be entirely backwards-compatible.

Timeline (per the report):

-> 2018-01-19: Reported issue to secure@microsoft.com and received MSRC case number 43182

<- 2018-02-10: MSRC indicates that the issue has been reproduced and will determine if it's to be fixed.

<- 2018-02-12: MSRC indicates that due to unforeseen code relationship this will not be fixed in April PT

<- 2018-04-02: MSRC requests the 14 day extension.

-> 2018-04-02: Informed MSRC that as the issue will not be fixed with 90+14 days then the grace extension does not apply.

<- 2018-04-05: MSRC again requests withholding of disclosure until 2018-05-08, giving more context on the deadline miss.

-> 2018-04-06: Informed MSRC that this isn't possible. Made it clear that the issue isn't particularly serious and other .NET based DG bypasses are still unfixed.

<- 2018-04-11: MSRC again requests grace extension based on the upcoming release of RS4 which will have the fix

-> 2018-04-12: Informed MSRC that as there's no firm date for RS4 this couldn't be applied, and RS4 wouldn't be considered a broadly available patch per the disclosure conditions.

-> 2018-04-19: Issue exceeds deadline.

——————————


As an aside:

> You are again only interacting with a tiny part of my argument.

I'm developing RSIs and would prefer to minimize my interaction to what's most relevant to the debate. I apologize if that makes it more difficult.

Edit: though for what it's worth, I'm enjoying interacting. I bear no ill will towards you for your perspective; I've seen and learned quite a bit about balancing security and managing business impact as I've continued to climb the career ladder, things which were shielded from me when I was a lowly developer or security engineer.


Alice: "My bug isn't particularly severe, and there are similar issues from Bob and Carol. If it isn't going to protect customers, what is the point in fixing it?"

Bob: "My bug isn't particularly severe, and there are similar issues from Alice and Carol. If it isn't going to protect customers, what is the point in fixing it?"

Carol: "My bug isn't particularly severe, and there are similar issues from Alice and Bob. If it isn't going to protect customers, what is the point in fixing it?"


First off, this is about disclosure, not fixing.

Second, you're neglecting the time aspect.

This is a valid argument: "There's a similar issue that Microsoft hasn't bothered patching for months, so what's the point in keeping it secret?"

This is not a valid argument: "In a few months there will be a similar issue, so what's the point in keeping it secret?"

So there is no loop leading to mistakes.


Because attackers only need 1 of the bugs to work? The argument only tenuously holds for disclosure to begin with, but it doesn’t make sense at all for patching.

Also, the other bug has been known, with POC, for more than half a year.


That preexisting exploit has been known since at least August 3 2017, so it's not like MS is scrambling to fix these.


It seems that cartels are sometimes very useful.


The only "dick move" involved here is the fact that zdnet wrote this article. Minor security issue lapses standard disclosure deadline? Who cares. Instead we get this attempt to sensationalize this into some kind of big Google vs. Microsoft rivalry.



Could someone who’s more knowledgeable about how Windows works than I am provide a semi-technical explanation of how this works?


There is a lockdown mode which restricts which COM objects can be instantiated from .NET code to a short whitelist.

The whitelist is keyed by GUID. The lookup from GUID to the actual binary is done through the registry.

COM hosting implementations should check that the object they got back is the one they asked for. .NET doesn't. So if you can write to the registry, you can escape the sandbox.
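The flow above can be sketched as a toy model (pure Python, not actual Windows/COM code; the GUID and class names here are purely illustrative):

```python
# Toy model of the bypass described above: the lockdown whitelist checks
# only the GUID, the GUID-to-implementation lookup goes through a
# user-writable "registry", and the host never verifies that the object
# it got back is the class it asked for.

GUID = "{00000000-0000-0000-c000-000000000046}"  # illustrative GUID

class SafeScriptlet:              # the class the whitelist intends to allow
    def run(self):
        return "restricted operation"

class EvilClass:                  # what the attacker wants instantiated
    def run(self):
        return "arbitrary code outside the lockdown"

ALLOWED_GUIDS = {GUID}            # lockdown whitelist, keyed by GUID only
registry = {GUID: SafeScriptlet}  # HKCU-style keys: writable by the user

def create_instance(guid):
    if guid not in ALLOWED_GUIDS:      # the only check performed
        raise PermissionError("GUID not whitelisted")
    cls = registry[guid]               # registry lookup
    # MISSING: verify that cls is really the whitelisted implementation.
    return cls()

# The attack: repoint the whitelisted GUID at an arbitrary class.
registry[GUID] = EvilClass
obj = create_instance(GUID)            # still passes the whitelist check
print(obj.run())                       # -> arbitrary code outside the lockdown
```

A host that additionally verified the resolved implementation (rather than trusting a user-writable lookup) would reject the swapped-in class, which is the missing check the bug report describes.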


Yeah. It's not as severe as the article makes it seem. It's just a bypass in a very insignificant component for which you need 2-3 other vulns to exploit.

I presume the editor just wanted to put an article out with Microsoft and Project Zero in the title, rather than analyze the actual flaw in the context of its severity.


[flagged]


The press has many problems, but this isn't close to a lie. Sensationalistic? Maybe. But to accuse the journalist of lying is to claim (1) that the article is false, and (2) that the journalist knows it's false. Those are big claims, not to be made as throwaway snark. A functioning (and honest) press matters, and this kind of accusation does neither it nor ourselves any benefit.


Google reported the issue to Microsoft on January 19. Microsoft confirmed the issue about three weeks later

Microsoft should make a mental note that when you receive an email from a member of Google's Project Zero team you don't wait 3 weeks to respond.


Answering an email and confirming an issue is not really the same thing.


If it takes 3 weeks to confirm an issue reported to you by a Project Zero team member, especially when they provide you a detailed report on how to replicate the exploit, then you need to optimize your process.


Microsoft should attempt to verify issues quicker than 3 weeks, particularly when all details are provided?


Google, you have 90 days to stop tracking web users, then Windows will start asking desktop users if they would like to block tracking by filtering DNS requests


I would pay to see this happen.

"Google, we believe in our user's right to privacy and are looking for ways to improve their experience on our platforms. Due to your non-compliance with the upcoming GDPR and past misdeeds we have classified all your services as spyware and will be protecting our users accordingly should you fail to address this matter in 90 days from now.

Kisses, Microsoft."


I think you know why Microsoft won't do that. They do the same kind of tracking in Windows 10.

They had an opportunity to actually hurt Google by blocking tracking scripts long ago with their "Do Not Track" feature enabled by default in their browser. And they wasted it by simply asking advertisers like Google nicely whether they'd like to stop tracking users (you'll never guess what happened next!).

Microsoft has already been found violating previous, less strict EU privacy laws recently. I think Google, Microsoft, Facebook, Amazon - they'll all end up paying big fines in the EU within 18 months after the GDPR takes effect, because none of them takes it seriously enough, and they still think they can use "angles" to trick the regulators as well as users into handing over that data without real consent. They can't, and they'll learn it the hard way.

Oh, and the Privacy Shield will likely fall by the end of the year, too. So brace yourselves, it's going to be a wild ride for these privacy violators.


To be clear, DNT being enabled by default was probably more harmful to DNT than helpful. DNT should have been a choice made by users and a clear indication of intent; by making it the default, there was no conscious decision like there is with ad-blocking. DNT should have been (and should be) the clear message from users that they do or do not want something, an undeniable response to over-reaching TOSes.

Microsoft making it default felt less like something to help consumers and more just bandwagoning. Whether or not DNT was particularly effective as a means of __blocking__ tracking has always been irrelevant. The point was the __message__ sent by those who enabled it.


> they wasted it by simply asking advertisers like Google nicely if they'd like to stop tracking users or not

This is pretty blatant re-writing of history. DNT was a piss poor standard that relied on advertisers respecting it. As soon as Microsoft put it on by default they got shit on endlessly for it, and advertisers just bailed anyway.

Again, HN was upset at Microsoft for doing it.


On an unrelated note, I think that's a really smart move on the part of the EU. Most of these companies do what they can not to pay taxes in Europe, and counteracting that is difficult without hurting other businesses or creating other kinds of bureaucracy. But with the GDPR, the EU can easily put million-dollar fines on these companies.


Any fines would likely go to court many times, before they actually have to be paid. And all those large corporations will show up with a huge mountain of lawyers to try and get around any law that may exist.

Taxing the large companies is difficult, because of individual countries (like Ireland and Netherlands) free-riding to attract investment, while hurting all other EU members. "Tragedy of the Commons" that sort of thing.


Fortunately the GDPR allows class actions against corporations, and one such action (pre-GDPR) with 25,000 FB users was already rejected by the EU high court. Now it won't be rejected.


> Most of these companies do what they can not to pay taxes

This applies to all companies, everywhere. The problem is that the EU doesn't close its loopholes.


Google has no problem with the GDPR. They helped draft it and are very prepared for it.


> Google has no problem with the GDPR. They helped draft it and are very prepared for it.

The opposite is true, despite Google having lobbied during the drafting. The entire premise of how google makes money conflicts with GDPR.


Google has a huge problem with the GDPR; their whole business model stands on tracking users. They're only smart enough not to piss into the wind. They actually know far more about you than FB does - imagine: they have almost everything that's on your Android phone.

And let me explain "draft it". They lobbied. The proof is the FB and Google attack on Canada to prevent it legalising something similar to the GDPR.

Please (PLEASE, FOR GOD'S SAKE!), stop being protective of corporations, whether FB, CocaCola, Tesla, Google or whatever comes to your mind. None of them has making the world a better place as a priority; their only priorities are money and power, and they do not care about you any more than about a milking cow. If you disagree, you have a fundamental lack of understanding of how the world functions.


When will I finally be able to speak my personal thoughts without being downvoted by fanboys with low self-esteem who bind their self-worth to a popular brand? Guys, seriously, you need psychiatric care. You ARE WRONG, but due to your own personal problems you are being a pestilence to the world. It is simple to diagnose: just check your posts. If you are defending one brand over others, you are flawed. Every brand does something wrong, but your perception is that yours is better.

Here is a study about your behaviour. READ IT, you will thank me later; you have issues worth psychiatric care. Help yourself and stop annoying the human race: https://insight.kellogg.northwestern.edu/article/leave_my_br...


You're likely being downvoted because your comments break the site guidelines by being unsubstantive and/or uncivil. It's possible to make substantive points thoughtfully; when commenting here, please do that.

https://news.ycombinator.com/newsguidelines.html


As far as I have noticed, it always happens when you post criticism of Rust, JavaScript, Tesla or Google (I haven't tried with Apple, Samsung,... maybe someone should, it would be a nice research project). In the past the same was true of FB and Microsoft, while this has improved and today you can criticise them without downvotes. I think it is a clear case of this: https://insight.kellogg.northwestern.edu/article/leave_my_br...


> They can't, and they'll learn it the hard way.

I doubt it. The fines will be completely irrelevant, and they won't even need to notify more than 1% of their lawyers and lobbyists to ensure they can keep it up for the next decade.


GDPR fines can be up to 4% of a company’s turnover (at group level too, not just the turnover of the local subsidiary). Whether the courts will actually use the maximum fine remains to be seen, but on paper it’s enough to give you pause - 4% of revenues is a large chunk of money for anyone. I work for a large EU company and this possibility is taken very seriously.
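For scale, a back-of-the-envelope sketch (the turnover figure below is illustrative, not any particular company's actual number; note the cap is the greater of 4% of worldwide annual turnover or EUR 20M):

```python
# GDPR upper-tier administrative fine (Art. 83(5)): up to the greater of
# 4% of total worldwide annual turnover or EUR 20 million.

def gdpr_max_fine(annual_turnover_eur):
    return max(0.04 * annual_turnover_eur, 20_000_000)

# Illustrative: a group with EUR 100B worldwide turnover
print(gdpr_max_fine(100e9))  # -> 4000000000.0, i.e. up to EUR 4 billion
```

Whether a supervisory authority would actually impose the maximum is a separate question, but the ceiling itself is what makes large companies take notice.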


All I'm saying is I don't think the result will be a real improvement to privacy or data safety. I think it'll be some adjustments on paper, then business as usual.


" Due to your non-compliance with the upcoming GDPR "

Honest question, why is Google not in compliance with GDPR?


This could have happened 10 years ago; today Microsoft has embraced tracking and spying. Windows 10 grabs more info about you than Google does.


It's unfortunate that this is necessary, but you can use this if you'd like to limit that a bit. You'll have to put it on your router or on your own DNS server though, as I think Windows will ignore some entries in the system hosts file.

https://raw.githubusercontent.com/crazy-max/WindowsSpyBlocke...
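If you do run your own resolver, a hosts-format blocklist can be converted into a plain set of domains for DNS-level blocking. A minimal sketch; the input layout (`0.0.0.0 domain` lines, `#` comments) is an assumption about that file, and the sample entries are made up:

```python
# Convert a hosts-format blocklist into a set of domains suitable for a
# DNS-level blocklist (e.g. for dnsmasq or a Pi-hole).

def hosts_to_blocklist(hosts_text):
    domains = set()
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()      # drop comments
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "0.0.0.0":
            domains.update(parts[1:])             # a line may list several names
    return domains

sample = """\
# illustrative telemetry entries, not taken from the real file
0.0.0.0 telemetry.example.com
0.0.0.0 vortex.example.net settings.example.net
127.0.0.1 localhost
"""
print(sorted(hosts_to_blocklist(sample)))
# -> ['settings.example.net', 'telemetry.example.com', 'vortex.example.net']
```

Only `0.0.0.0` entries are treated as blocks here, so legitimate `127.0.0.1 localhost` lines don't end up on the blocklist.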


Does it? I wouldn't know how to tell it that much.


No it doesn’t. Not even close.


You need to check your facts.


Just think about it before you ask for that.

Imagine if DNS resolution on Windows was pay-to-play.


Please do this! :)


Why does MS struggle so much with security?


They do not struggle - Microsoft have one of the more sensible and well functioning security organizations in the software world. This just is not a very important bug.


Why does everyone struggle so much with security?


ChromeOS has been secure since the get-go. It just seems weird that Google can do it while MS struggles so much.

Now you even get GNU/Linux out of the box with security intact.


Are you aware that ChromeOS comes without the GNU userland?


Why 90 days? Why not 30, 14, or 7? Microsoft might have requested responsible disclosure for exploits affecting Windows, but what gave Google the right to set a deadline?

I feel the 2 US companies have a friendly competition with each other which can help secure their systems.


The right to freedom of speech means Google can say what they want when they want about the vulnerability, giving them the right to set a deadline.

There are certainly companies doing much worse than setting 90 day deadlines. For example VUPEN, Hacking Team, and Grayshift (maker of GrayKey) selling undisclosed vulnerabilities to "good" governments, and other companies servicing the shadier ones[1].

[1] https://www.bloomberg.com/news/features/2017-01-18/the-post-...


>The right to freedom of speech means Google ...

Surely it means specific people employed by Google may "speak". Does the right extend to corporations?



Corporations are made of people and you do not lose your rights because you form a corporation.


That doesn't answer the question. Do corporations have the right to bear arms separate to the rights of the members of the corp - can Google keep an arsenal even if none of the people in it had a gun license?


Knowledge of where Microsoft forgot to enable a security check is not “bearing arms”.

Imagine that world. I point out a mistake to you, and by reading or hearing it, you are suddenly holding a gun! We would have to criminalize coredumps :)


No, corporations aren't people. That headline is pure clickbait. Some rights are extended to corporations because a corporation is a group of people who come together for a purpose.


This speech was ultimately done by specific people employed by Google.


> Does the right extend to corporations

Unfortunately it does, to some degree.

Religious freedom too. The US is fucked.


Does freedom of speech extend to newspapers? Newspapers are corporations.


Yes, it does, at least since 2010.


Nobody has any right to set a deadline. The 90 days is merely Google being courteous.


For a bug that should take no more than a few days to patch and test 90 days seems like more than enough for a product with automatic security updates.

For a bug with completely unknown scope and very difficult fixes (such as the recent intel issues) the story might be different. But 90 days here? Why would Microsoft need more than that?


Because of an "unforeseen code relationship", they say. What more could we want to know?

It sounds like some other piece of MS software is relying on .NET not performing the checks that it should have been performing.


Is Microsoft allowed to make arbitrary business decisions that prevent it from responding to security vulnerabilities or is backwards compatibility given a special pass?

They own an operating system. We get to hold them responsible for whatever choices they make that impact security.


Windows' attempt to always remain backwards-compatible has left a huge tangled mess of dependencies. I applaud Microsoft's attempts at this, but a slow cycle of breaking changes would allow a much better long-term system. Of course, that also means you have to be committed to some sort of long-term roadmap (something that appeared to be lacking between XP and Longhorn/Vista/7).


The last time they did that was with the launch of Vista. People were not pleased.


For something installed on so many devices, 90 days seems like an incredibly tight timeframe to change anything.


Put another way: For something installed on so many devices, 90 days seems like an incredibly long time to leave so many devices vulnerable. Security fixes aren't a once and done thing. New exploits will be discovered that must be fixed. Missing deadlines for less severe exploits encourages getting better at sending security fixes out. Then, when more severe exploits are discovered, they can be patched in a reasonable amount of time. If extensions are always given, then deadlines become meaningless and "90 days" becomes "eh, 6 months given we can push for extensions."


Don't forget that these 90 days are also 90 more days where this vulnerability can be exploited by attackers.

Microsoft has set up a patch delivery infrastructure that's pretty effective and comparably fast by industry standards, if not deactivated by the people who got offended by the forced Windows 10 upgrade and feature creep.


Well, Windows Update is forcing me to deactivate it on one machine because it keeps making it unusable. I have come to terms with most quirks of the forced updates, but in this particular situation Windows is nasty and uncooperative.

Add to that Microsoft's well-established unwillingness to provide any useful diagnostic information, and suddenly the only way to use the machine is to not update it.


3 months is more than enough time if you care about your customers' privacy and security.


You are talking about a very large, complex, mission-critical, and incredibly widely used piece of software.

If they mess up a patch it's a big deal. If they break systems, introduce further bugs, etc...

90 days to understand the problem, fix the bug, verify the fix, plan the release, get it out to customers. There is a lot of work involved in such a thing.


Do you have any experience with complex systems where a security patch could possibly take more than three months to implement?


He might have or not, but I have. Not as complex, but similarly mission-critical and distributed. 90 days is nothing, as outlined.

Once you get into life-or-death situations, regulatory requirements apply, backwards compatibility matters, ... everything takes forever. It is not code, commit, test and deploy. Intake, risk analysis, project planning, approvals, alignments, etc. So many more processes. We should not fool ourselves that other platforms are better at this once you go for serious SLAs. A Linux kernel or userland patch might be fast, but RedHat delivery will take longer.

Welcome to Enterprise development.


Very very little of enterprise is life or death, and windows is not suitable for life or death.

When enterprise just falls on its face, I don't have much sympathy. "So many more processes" sounds like taking a handful of steps, splitting them up, and making each one require multiple days of memos back and forth. Can you provide any justification for this? Am I misreading?


>>Welcome to Enterprise development.

That is assuming whoever you are replying to is foreign to enterprises.


In that case - if you are not - congrats! Seems like you ended up in well organized places :)


I have that experience. Security patches take a long time. You need to ensure that mission critical operations are not impacted, that the private builds which had been supplied to customers are not impacted, that documentation gets updated, that laws and regulations are followed across the world and in specific regulated verticals, etc. Then customers need time to review the patches and follow through on their schedule of updating their devices.


Correct there is a lot of work to be done. It is not 90 days worth of work. If you think that is not enough time, you need to raise your standards.


People who think 90 days is not enough time don’t have insight into how long this actually takes and why it took so long. I don’t know where this meme came from but I know for certain that is not the reason that companies like Microsoft miss their deadlines.


What gives Microsoft the right to set the deadline? It’s their bug, and the bad guys aren’t going to stop using a bug because Microsoft has decided to be slow at patching.


I think there are many points of view here. I'm not going to defend Google or Microsoft, but imagine you're paid by Google to work on security issues. What would be the metric to prove your existence, if there is no public awareness of your work, like this zdnet article? Project Zero, IMO, needs to show from time to time that it exists and is doing a great job. I think that could be one of the reasons why they resist prolonging the standard 90-day period.


I have a hard time thinking of a more elite team than P0. They earned their stripes long before they got there. The disclosure policy is the right thing to do, not to make someone feel better about their job.


It's one point of view; I'm not saying it's the most significant one.


I read about the details. Wow, having a bug like this discussed so broadly shines a bad light on Google, IMHO. It appears like targeted news against Microsoft. It's not a very newsworthy defense-in-depth issue. If an adversary can modify the registry, they can do a lot more harm.


Denying the deadline extension to May 8th [1] is quite a dick move by Google, considering that it took them 6 months to fix the extremely harmful sitemap ranking bug in their search engine[2]. And after they fixed the bug, they only paid peanuts to the researcher for a bug that could've cost Google's customers tens of millions in misplaced ad campaigns.

1: https://bugs.chromium.org/p/project-zero/issues/detail?id=15...

2: http://www.tomanthony.co.uk/blog/google-xml-sitemap-auth-byp...


> that could've cost Google's customers tens of millions in misplaced ad campaigns

You're comparing an operating system security bug to advertisers' lives being made inconvenient.


Yeah so? Both involve tons of money.


One can have a material impact on people's security, and thus everything from financial stability to physical safety. The other results in some companies making less money via advertising than they otherwise would have.

Both involve money, but let's not pretend that the more money involved, the more important something is.


A company being driven out of business by advertising practices, having to let its employees go and merely hope they can find new jobs, certainly impacts their financial stability, and likely their physical safety.


First of all, Google has no responsibility to give any period of time. It isn't a "dick move". Second, the researcher in question:

1) Never gave a required date for disclosure.

2) Upon requesting the right to disclose, was told no, and he followed suit.

3) Was initially offered a bug bounty of $1,300, which was upgraded to $5,000. He apparently never bartered on that issue at all.

So your entire post was completely irrelevant.


> First of all, Google has no responsibility to give any period of time. It isn't a "dick move".

The motivation of the project is supposedly to protect Google's users. Being firm on disclosure deadlines helps ensure that vendors take the issue seriously. Did they have any indication that Microsoft wasn't taking this seriously? If not, then it sounds like their true motivation is elsewhere.


> Did they have any indication that Microsoft wasn't taking this seriously? If not, then it sounds like their true motivation is elsewhere.

That doesn't follow. The primary reason to be firm is to ensure that vendors take future issues seriously. Belief that the vendor is serious about a single issue removes only a tiny fraction of the motivation to be firm on deadlines.


There are two possible outcomes with any security issue:

1) The issue is so obscure that nobody else in the world will ever discover it, so not disclosing it to anyone but the vendor is the right choice.

2) The issue has been discovered by someone with malicious intent, and every second that you hide the details from the users, they're at risk.

You can't know which case applies, which is why policies about disclosure are useful. If a vendor is informed of a security hole and immediately fixes it, great, users are saved. If a vendor is informed of a security hole and does nothing... eventually users will have to mitigate the risk in their own way (which is usually "stop using the flawed product").

A disclosure deadline strikes a balance: in many cases it's pretty likely that no evildoers have independently discovered the flaw, but they would be able to exploit it if they knew the details, so giving the vendor a bit of time to fix the issue is the best solution. But given infinite time, all bugs will be discovered and exploited, so the longer you wait to fix or mitigate, the more risk you take on. Therefore, I think Google's policy strikes a very reasonable balance between protecting through patching and protecting by telling users to use something else.

With that in mind, I have no real qualms with people that disclose flaws immediately (letting users be aware of their risk), or vendors that fix an obscure bug that's not being exploited slowly. In the end, if users want to be free from all risk, they should be finding and mitigating these issues themselves... anything you get for free out of someone else's goodwill is a benefit.


They asked for multiple extensions, couldn’t give a hard deadline on the Redstone release, didn’t disclose why it was hard to patch (“unforeseen code interactions” is basically every patch ever, definitionally!), and a comparable bug has gone unpatched since being publicly disclosed in Aug 2017. Also, looking at the fix... if they have code that relies on being able to lie about which DLL a COM object maps to, maybe that’s a problem?!

The onus is on MS to show more time is warranted, and so far I’m only seeing evidence to the contrary.


> The motivation of the project is supposedly to protect Google's users

Exposing competitors security bugs is only a nice marketing side effect.


Bartering on reward money before disclosure can easily look like -or be- blackmail.


Not true. Bartering in this sense means suggesting that the reward is at a level consistent with a non-critical bug while this one is critical. Not "Pay me more or I'll release it".


> First of all, Google has no responsibility to give any period of time.

Any security researcher has a responsibility to disclose a vulnerability in such a way that it does not cause widespread damage. I don't know why you would think otherwise.


Not everyone agrees with you. Some people promote full disclosure.

>Full disclosure is the policy of publishing information on vulnerabilities without restriction as early as possible, making the information accessible to the general public without restriction. In general, proponents of full disclosure believe that the benefits of freely available vulnerability research outweigh the risks, whereas opponents prefer to limit the distribution.

https://en.wikipedia.org/wiki/Full_disclosure_(computer_secu...

https://hn.algolia.com/?query=author:tptacek%20responsible%2...

https://news.ycombinator.com/item?id=16579292


Letting vendors leave bugs unfixed is irresponsible. Google didn’t put the bug there. I don’t know why you would think otherwise.


Disclosing unpatched vulnerabilities when the company is asking for an extension of a few weeks to patch is absolutely a dick move and extremely irresponsible.


Seems the researcher's response was something like: "Hey, you know how your front and back door are both unlocked and everybody already knows about it? Well, your bathroom window is open too - so if someone gets past the front gate, across the moat, and into the castle - there's a third unlocked way onto the living quarters."

It's not like _this_ bug alone would get anybody RCE - they'd need to chain up some other way in, and if they've done that there are at least two known and unpatched bugs that'd get them the same place as this one already.

It's just as easy to argue that pushing for extensions to disclosures that aren't going to decrease security is the dick move here...


What is an acceptable amount of time? Why isn’t 90 days the correct number? A comparable bug has been out since Aug 2017 and MS hasn’t patched it. Is a year the right number? Does everyone get that extension by asking? How does that help anyone, except letting the people who own the bugs save face publicly and giving attackers more time to exploit the bug?


It seems that the researcher was relatively happy with the outcome. They could have certainly gone public far earlier if they had chosen to do so.



