Microsoft fixes 5-year-old Defender bug, reducing Firefox-related CPU use by 75% (bugzilla.mozilla.org)
1163 points by ylere on April 10, 2023 | hide | past | favorite | 422 comments



Quick napkin math of the wasted power : Firefox has ~300e6 users, let's assume the bug wasted 5 extra watts 4 hours a day.

That's 250 megawatts saved, the equivalent of an average coal power plant. Because some Microsoft engineer missed a bug.
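A quick sketch of that arithmetic (the user count, wattage, and hours per day are the assumptions stated above):

```python
# Napkin math: 300 million Firefox users, each assumed to waste
# an extra 5 W for 4 hours a day because of the Defender bug.
users = 300e6
extra_watts = 5.0
hours_per_day = 4.0

# Spread the 4 hours of waste over a full day to get average draw.
avg_power_w = users * extra_watts * hours_per_day / 24.0
print(f"{avg_power_w / 1e6:.0f} MW")  # 250 MW
```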


> That's 250 megawatts saved, the equivalent of an average coal power plant. Because some Microsoft engineer missed a bug.

Are you sure you want to invoke this logic? Because following it through imagine the energy savings if Firefox users switched to Chrome.


It's not too bad an analogy. Think of it this way:

- Switching from Firefox to Chrome might be similar to switching between two car models, one consuming less energy than the other.

- Fixing this bug is more like going to a car workshop to fix an injector issue in your car that was causing higher fuel consumption and more pollutants.

The first one is really a matter of tradeoffs and personal choices. The second one is less of a choice and more of an actual issue that was left due to negligence. Hardly similar.


Isn't it more like an auto maker issuing a recall to fix an injector issue in all their cars?


An analogy can only get you so far, but in this case the bug is caused by Microsoft Defender, yet Firefox, the car manufacturer, is a different entity. So I wouldn't call it a recall.


> but in this case the bug is caused by Microsoft Defender, yet Firefox, the car manufacturer, is a different entity.

Quoting the Mozilla engineer responsible for most of the recent activity on the bug:

"This problem has two sides: Microsoft was doing a lot of useless computations upon each event; and we are generating a lot of events. The combination is explosive. Now that Microsoft has done their part of the job, we need to reduce our dependency to VirtualProtect."

It was also noted elsewhere in the thread that similar, though less severe, CPU impact is seen with other antivirus products.

Microsoft was doing something wrong that made this operation more expensive than it needed to be, but Mozilla is also doing this far more than any other browser.


A bunch of cars across many manufacturers were recalled in the 2010s due to a defect in the airbags made by the same manufacturer.

One could also argue that the OS is the car, the browser is the chauffeur, and the user is the passenger.


With regard to browsers, I think we need more manufacturers, not just the GM (Google/Chrome) of browsers. I want Ford in there as well. (Sticking with US automotive companies).


What's wrong with Chrysler?


> Are you sure you want to invoke this logic? Because following it through imagine the energy savings if Firefox users switched to Chrome.

Ironically, Mac users routinely complain about how power-hungry Chrome is on the Mac. Safari is significantly more efficient.


>Safari is significantly more efficient.

Based on the increased laptop battery life I notice, so is using Edge on Windows.

It makes sense that both Apple and Microsoft can extract the best out of their own OS + browser combination. There's no way Firefox can compete on such OS-specific optimizations.


> so is using Edge on Windows.

The new Edge based on Chromium performs better than Chrome with regard to power consumption.

The old Edge was even better in terms of power consumption. Like a few extra hours of battery life.


To continue to overextend the analogy, though: asking people to use Edge over Chrome or Firefox is like trying to market the Smart Car. Few people want to be seen driving it, fuel efficiency be damned.


And no browser is more efficient than simply turning off your computer and reading by daylight.


Is that because of the quality of Chrome or because Safari is a "blessed" application and probably gets to do things other applications do not?

Entirely serious question. Apple is known to severely privilege their own applications over competitors.


Totally guesswork here, but I'd say Chrome has a lot more telemetry, profiling and tracking built-in and its users tend to use a lot more plugins, including things like ad-blockers that scan over each webpage and can be beneficial (battery-wise) or not depending on content. Safari users are more of a barefoot type. A power user is more likely to not be running Safari. And a power user may, well, prefer to sacrifice battery power to get the power they seek.

Besides, there's some precedent set in 1998 by a certain OS that "favored" their embedded browser over the competition, so I doubt Apple would want to tickle that fancy.


That's because optimizing for battery life is a stated goal of the Safari team; it's actively benchmarked.


If I had to guess, it's that Apple refuses to implement very CPU-heavy JS features they don't think add much value in Safari.

I recall some feature I was using that ran in an inferior way to save power on Safari, so much so that I had to kill that feature and use a different API.


I think it's more that Apple implements power saving behavior into the JS engine on desktop much like they do on mobile. For example, pausing background scripts on inactive tabs and disallowing autoplay on videos even on desktop.


I would be at least slightly surprised if Apple would deliberately hold back power-saving techniques from third party developers. It’s not like this is something that makes Safari much more appealing to users. Most users wouldn’t even notice that their choice to use Safari is responsible for their great battery life.


Third party developers are free to go here and see what Apple is doing:

https://github.com/WebKit/WebKit

> Most users wouldn’t even notice that their choice to use Safari is responsible for their great battery life.

Seeing that macOS has called out which apps are using the most energy for years, users can definitely tell which apps are draining the most battery.


Apple can and does leave things out of that repo.


So you think there is a conspiracy by Apple to not allow third parties to be battery efficient and make their hardware look bad?

But you are free to download the source code, compile and run it yourself to see if it is less battery efficient than the official builds.


I absolutely think there is a conspiracy (inasmuch as "conspiracy" can be defined as "not going out of their way to change it") to make the competitors to Apple applications look worse than their own offerings. It does not reflect poorly on Apple when Firefox performs worse and uses more memory and battery than Safari.


How is Apple supposed to change Firefox and Chrome's source code?

If either browser’s manufacturer wants to see what Apple does to make Safari more battery efficient, they are free to look at the source code to WebKit.


And as the other person told you already, Safari.app is not what you get if you build the WebKit source. In much the same way that you can't judge Chrome by Chromium, one is the base of the other, not the whole.


I did not say that.


It’s mostly that they care. Chrome is designed with the mindset that it’s the most important thing running on your computer.


Blessed or not, I still end up using Safari. The improvement in battery life is too significant to ignore


Have you compared versus Orion, which is also based on WebKit (and can run most Chrome and Firefox extensions)?

IME, Safari is no better than Brave in terms of battery, even when I’ve got hundreds of tabs open.


It's because of the quality of Chrome. Google's focus is on features and security, with efficiency secondary. Safari's primary focus is efficiency.

Google's only incentive to make Chrome more efficient is when they start receiving heat (no pun intended) about bad performance.


Google's focus is to collect as much data about the user as possible, nothing else.


It's not impossible, but I doubt it, if only because very few third party applications use as much power as Chrome does. The only exceptions are things that actively use a lot of CPU, like compilers or compressors.


It's been a really long time, but Safari on Windows was a thing, and it did run a lot leaner in the background than anything else available at the time (except Opera, if memory serves).

It's entirely possible that Safari is intentionally avoiding features that make it wake up.

I doubt that it does anything unavailable to other browsers; that's MS territory, because they wanted features. I feel like Safari, by contrast, doesn't want to add features.


If only the underlying engine were open source so people could see what “secret APIs” (sic) it was using…


I’ve not noticed a difference between safari and brave, so I’d say the former is not blessed.


Think of it more like this: this bug cost about an average coal power plant's worth of output, all other things being equal. I doubt it's quite that much, but it certainly did waste a lot of energy.

> imagine the energy savings if Firefox users switched to Chrome.

Imagine the privacy savings if Chrome users switched to Firefox.


There are good reasons not to use Chrome over Firefox, but few reasons to leave Firefox bugged. I don't think the same utilitarian logic applies.


Yeah, now that Firefox's market share is finally where it should be, Microsoft had no more reason to leave the bug in :D


> imagine the energy savings if Firefox users switched to Chrome.

Imagine the energy squandered on all the extra goods and services bought by users using a browser owned by an advertising company, instead of Firefox.


If one user switches to Chrome, the energy savings are only for that one user. If one Microsoft engineer fixes a bug, the energy savings are for the many thousands who use Firefox on up-to-date Windows.


> imagine the energy savings if Firefox users switched to Chrome.

Nah, I like my privacy. How about replacing Electron apps with native apps instead?


> Because following it through imagine the energy savings if Firefox users switched to Chrome

I've read everywhere that Firefox at this point is far more energy efficient than Chrome... is that not true?


Maybe compare manifest v2 friendly Firefox with uBlock Origin vs eventual Chrome without it :)


Serious savings indeed when the JavaScript cryptominer some ad network blithely serves up gets ad-blocked, but we prefer synthetic benchmarks.

In seriousness, though, this is an issue. Elsewhere, I observe arguments about e.g. UserBenchmark rankings, and the comparative relevance of single-core vs. multi-core performance. Are you playing a game, or rendering video 24/7 -- or running some entirely synthetic workload that allows for a peak performance the real world would never achieve? Same kind of problem.


Not sure if this is related, but Chrome on my MacBook regularly gets into a state where one process (a tab?) is consuming 4 GB of RAM and 20%+ CPU. This happens on both M1 and Intel CPUs, across multiple OS releases.

It seems to happen some time after visiting ad-heavy sites (yeah, I know I should install an ad-blocker). Have others noticed this? It seems specific to Chrome; I also use Firefox and occasionally Safari but haven't noticed it with those browsers.


Web browser benchmarks are typically designed to track real-world applications


I mean, sure, I could also just turn off my computer. Presumably people use Firefox for a reason, and making that option use less energy is pure upside; it's very interesting to see how big of an upside it might be.


The cause and effect exists whether or not some commenter on HN writes about it.

The reason it is not “invoked” is because energy prices are sufficiently low (due to not pricing in externalities) that there exists little incentive for end users to optimize for power usage.


>The reason it is not “invoked” is because energy prices are sufficiently low (due to not pricing in externalities) that there exists little incentive for end users to optimize for power usage.

You're right in principle, but in practice, even factoring in externalities, electricity prices won't be high enough for people to care. Using the current US carbon intensity for electricity generation[1] and higher-end estimates for the social cost of carbon[2] gives a carbon cost of $0.142 per kWh. The average price in the US is $0.168[3]. Adding in carbon costs would almost double the price, but there are countries with even higher electricity prices[4], and they're not exactly switching to more efficient software in droves to save energy.

[1] https://emissionsindex.org/

[2] https://en.wikipedia.org/wiki/Social_cost_of_carbon#Carbon_p...

[3] https://www.bls.gov/regions/midwest/data/averageenergyprices...

[4] https://www.statista.com/statistics/263492/electricity-price...
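For what it's worth, the parent's carbon-cost figure can be roughly reproduced with round numbers. The exact inputs aren't given above, so the intensity and social-cost values below are assumptions chosen to match the cited sources' ballparks:

```python
# Rough reproduction of the parent's ~$0.142/kWh carbon-cost figure.
grid_intensity_kg_per_kwh = 0.37   # assumed approx. US average CO2 intensity
social_cost_usd_per_tonne = 380.0  # assumed high-end social cost of carbon

carbon_cost_per_kwh = grid_intensity_kg_per_kwh / 1000 * social_cost_usd_per_tonne
retail_price_per_kwh = 0.168       # US average, per the comment

print(f"carbon cost = ${carbon_cost_per_kwh:.3f}/kWh vs retail ${retail_price_per_kwh}/kWh")
```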


Imagine the power savings if chrome users switched to lynx.

Imagine the power savings if everyone used pihole, ublock etc.

The second uses more power than the first and is better. Do it!


Does Chrome really use significantly less resources than Firefox? Are there numbers there?


According to Tom's Guide[1], Microsoft Edge beats out both when it comes to RAM utilization, but Chrome just edges out Firefox when loading >10 tabs. That was in 2021; I'd be interested to see any other comparisons or benchmarks.

1. https://www.tomsguide.com/news/chrome-firefox-edge-ram-compa...


This is with no extensions installed right?


It took a lot longer for Firefox to get GPU accelerated video playback on Linux iirc

Perhaps a "niche" use case for some, but there are a lot more Firefox users on Linux in particular.


Isn't Chromium hardware acceleration currently broken on Linux?


It works for a single case, but I'm not sure about the long term. We still have at least a little bit of competition. All browsers improve because the other ones improve. If we all switched to Chrome or Safari because of the power saving, we'd just get another version of the IE monopoly/stagnation, which could result in more power wasted long term.

The optimum is likely somewhere where most people use the currently most efficient solution and others keep alternatives alive and competitive.


Using firefox without memory errors is a pareto optimization over using firefox with memory errors.


Imagine the energy savings if all users installed a good ad-blocker.


Those are 2 very different things.

In switching browsers, you get different features, workflows and functionality, many of which you prefer on Firefox, which is why you’re using it.

With this bug fix, there is absolutely no loss or change in functionality (other than increased battery life), and yet you’re getting the benefits.

This bug is pure waste.


Or the energy used by all the Electron apps on all operating systems.


>Firefox users switched to Chrome.

Far worse due to privacy/adblock addons.


How many megawatts saved if everyone learned how to do all their work on text-only consoles running linux?


> imagine the energy savings if Firefox users switched to Chrome

This is why I left firefox.


> Are you sure you want to invoke this logic? Because following it through imagine the energy savings if Firefox users switched to Chrome.

And then imagine Safari on an M1.


Now do one for javascript.


> Because following it through imagine the energy savings if Firefox users switched to Chrome.

Enlighten us.


I love calculations like this and hope they are part of every engineer’s line of thinking. I originally came across this thinking in Andy Hertzfeld’s book - https://www.folklore.org/StoryView.py?story=Saving_Lives.txt

Performance is time, energy, heat. It’s one of the easiest features to get and there are lots of tools, research, and philosophies to help get it. Memory and storage are similar.

For anyone working on large scale apps that are on millions of devices, hundreds of thousands of servers, or even just some back office guy who has minutes less stress in his day, performance benefits the world. For programmers, it’s one of the easiest ways to Save the Planet™.


Don't forget the waste caused by people throwing away devices that are "too slow", and the resources required to build new computers/phones.

Somewhere I saw a rough figure about phones. Something like: if everyone was able to keep their phone one year longer, it would be the equivalent of taking 600,000 cars off the road. (Just looked it up - the source is possibly the founder of iFixit.)

But you know, development velocity or whatever.


Yep. Most things most users do on our computers could be done just fine on a computer from 10 years ago. I think we're on a performance treadmill like this:

- Developers get new, fast devices (yay). We make our software just fast enough to run smoothly (enough) on our own devices. Then we ship it.

- Users' computers are slower than the developers' computers, so their experience is bad because the software is slow.

- Users buy new computers too. Money is poured into R&D for even faster computers.

- Hardware companies release faster hardware

- Developers buy new, fast computers

And the cycle continues.

For general purpose computing, there's not really any meaningful difference between my computer today and my computer 15 years ago. But I can't use that old computer, because modern programs don't work well on it any more. I can't imagine how slow modern Discord or Microsoft Teams would run on my old 2012 macbook air - even though that machine is orders of magnitude faster than the computer I was using to chat on IRC in the late 90s.

I'm low key convinced that if computers and phones stopped getting faster tomorrow, the software industry would grumble but get on with things and adapt. Users wouldn't really notice anything change, except we'd all be richer because we would no longer need to buy new computers every few years.

There are a few notable exceptions of course: modern AI, video production & animation, and massive cloud services (like Google and Netflix) that actually optimize their code. I'm on the fence about compilers - do LLVM & rustc really need to be that slow? Netflix might never make it to 8K video (boo hoo). And ray-traced video games might not happen. But the average consumer would probably be delighted. And the savings to the environment would be insane.


This is interesting, but maybe our software is doing a lot more today than it was back then. I guess the question is whether those things are necessary. There is of course some software that is just slower because of negligence too.

I remember using a computer 20 years ago, and it would take multiple minutes to boot into something usable, and launching programs would take quite a few seconds because the HDD had to go searching for everything the executable required.

My current desktop has a Samsung 970 NVME SSD, it boots up in like 20 seconds or something, and everything is extremely responsive as far as reading from the disk goes.

It's interesting that the one thing that hasn't been completely neutralized by the treadmill is SSDs!

Also the LLVM and Rustc thing is interesting. I imagine they are doing much more advanced optimizations than what compilers did 20 years ago so they probably have a good excuse, but I think these optimizations are also neutralized by the treadmill eventually.


Chromium 111 source is 1.7 GB compressed with xz (LZMA). Google has completely lost whatever minds they may have had and included more code than one would imagine exists in the whole damn world.

It is an absolute abomination, and these words do not even remotely do justice to just how horrible it is.


Actually, in the PC/laptop space, I believe this phenomenon has been waning somewhat over the past... oh, the better part of a decade.

This is a result of:

* Single-core performance no longer dramatically improving - almost plateauing

* The rate or extent of "bells and whistles" and other OS overhead being added - decreasing.

* Budget consumer CPUs having reached smooth desktop performance (with sufficient memory and an SSD) already, even with multiple applications open.

... and none of these held during the 1980s, 1990s, and 2000s. Now, if your machine's hardware doesn't break down - and you're just a plain desktop user - your motivation for throwing away your machine is quite limited.

---

Of course, this is not the case for smartphones, we're still on the roller-coaster there.


I would say we're coming to that point with phones too. Phones have also largely reached the end of their breakneck big-annual-improvement phase; there's little annual improvement in performance, power consumption, form factor, or feature set. Screen resolutions are plenty high for most users, and similar for the cameras. Most people would also probably be quite happy with the version of Android or iOS they are using if it kept getting bugfixes and security updates.


> Most people would also probably be quite happy with the version of Android or iOS they are using if it kept getting bugfixes and security updates.

Absolutely! Most non-tech folks would be delighted by this. Regular users generally don't care one bit about whatever fad style or trendy feature got added with the latest update, they are just annoyed that their buttons or icons got moved around.


> Most non-tech folks would be delighted by this.

And many tech folks too. My phone is a tool, not a lifestyle. I don't need my tools constantly changing out from under me.


Here's an anecdote.

I like to look at websites that offer "last season" outdoor gear at discounted prices. The major frustration is that the sizes are never my size of shoe/shirt/etc because the popular sizes are gone sooner than the odd sizes.

There is no "domain-specific" improvement (the "domain" in this case being "last-season discount retailer") to the basic shopping-cart e-commerce experience that would give me a way to flag a product or product category I'm interested in so that I could be notified when the item is back in stock.

This basic feature would vastly improve my interaction with the websites and would probably result in higher sales.

In today's world, getting that kind of feature implemented is a huge ask.

Heck, even asking for a good "syntaxable" search seems to be an impossible ask. Imagine being able to search "Category: Shoes; Brand: "North Face"; Price: >50&&<120"
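A toy version of that search is not hard to build. Everything below is invented for illustration: the query grammar, the field names, and the product records are all hypothetical, not any real retailer's API:

```python
# Hypothetical sketch of a "syntaxable" product search like the one above.
def parse_query(q):
    """Split 'Field: value; Field: value' clauses into a filter dict."""
    filters = {}
    for clause in q.split(";"):
        field, _, value = clause.partition(":")
        filters[field.strip().lower()] = value.strip().strip('"')
    return filters

def matches(product, filters):
    for field, value in filters.items():
        if field == "price":
            # Handle compound conditions like ">50&&<120".
            for cond in value.split("&&"):
                op, bound = cond[0], float(cond[1:])
                if op == ">" and not product["price"] > bound:
                    return False
                if op == "<" and not product["price"] < bound:
                    return False
        elif str(product.get(field, "")).lower() != value.lower():
            return False
    return True

products = [
    {"category": "Shoes", "brand": "North Face", "price": 89.0},
    {"category": "Shoes", "brand": "Acme", "price": 40.0},
]
query = 'Category: Shoes; Brand: "North Face"; Price: >50&&<120'
hits = [p for p in products if matches(p, parse_query(query))]
```

A real implementation would want a proper grammar and a search index, but the point is that the feature itself is small.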

The fact that this is never implemented yet billions of dollars are poured into ecommerce development kinda reflects how the industry is focused on improving the tooling/back end and not the design.

I see vastly more posts in this vein: "Nobody really knows how this complicated spaghetti-like system works"

than in this vein: "X tool allowed us to build a Maserati of our domain."


It can be a bit dangerous (especially to your employer) to continue that line of thinking, though. How many pieces of software do we collectively work on which would make the world a better place if they didn't exist at all?


Meh. I feel like there needs to be an active movement to assess programs with huge scale (>10M users) to identify unnecessary power usage - whether because of a bug, because of unused functionality that nonetheless consumes resources, or because of intermediate steps that take unnecessary power.

Perhaps I’m getting into a bit of a niche here, but the rise of stringy formats for data transfer concerns me. There are many-stage pipelines on machines that agree on what a 64 bit integer is, yet each stage performs encoding and decoding of JSON twice (decoding upon receipt, encoding to pass it on to the right place, decoding the response, encoding it in another manner to reply to the original sender). Sounds like a minor concern, but the scale of this instinctively feels like it’d dwarf 250MW globally.
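A rough illustration of the point: two processes that agree on what a 64-bit integer is can still ship it through text. This is not a rigorous benchmark, and the timings depend entirely on the machine and JSON library:

```python
import json
import struct
import timeit

payload = {"value": 123456789012345}

def json_roundtrip():
    # Text path: encode to JSON, decode back, as each pipeline stage does.
    return json.loads(json.dumps(payload))["value"]

def binary_roundtrip():
    # Binary path: pack/unpack the same value as a little-endian int64.
    return struct.unpack("<q", struct.pack("<q", payload["value"]))[0]

t_json = timeit.timeit(json_roundtrip, number=50_000)
t_bin = timeit.timeit(binary_roundtrip, number=50_000)
print(f"JSON round-trip: {t_json:.3f}s  binary round-trip: {t_bin:.3f}s")
```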


Is that really a downside?

In some cases you convince your organization to shift focus onto more useful products, and that can be a really great feeling. In other cases (company is too large, management too committed) it helps you confront exactly who you're working for. Because if you're going to sell your soul, you should at least make sure you're getting a good price.


Oh no!

… anyway…


If your job and your conscience disagree, quit your job. You only get one life and we have so many options as engineers.

Why make the world worse for ourselves and our children when you have other options?


How did the idea of avoiding premature optimization get misapplied to client-side apps where the entity writing the software is not the one paying for electricity, cooling, and people's time when the software takes much longer to run than it could? When did a lot of software devs stop caring?

Pardon me, I think there are some electron devs at my door asking for a word. They might have baseball bats.


Premature optimization should be avoided client-side as well, I imagine? It just seems like lots of development shops skip optimization altogether, even when it stops being premature (when it matures?).

And it's not like those shops suffer for it, so it isn't very surprising they continue.


>> When did a lot of software devs stop caring?

I'm not sure the devs stopped caring as much as the powers that be. Software development has become more commoditized than we want to believe. Devs following an agile workflow, with every intent of performing multiple rounds of optimization, find that the product gets shipped as soon as it approximates the thing that had been conceived originally.

It doesn't look like an immediate failure, so the lesson that leadership takes from it is frequently that the level of maturity they shipped is safe. The cycle continues, and eventually folks lower down succumb to this shipping pattern. The only thing that gets them to optimize is competition that successfully drives home that its win was due to performance. This doesn't always lead to optimizations when you are an incumbent who can still close more feature gaps, because those often result in higher sales and revenue.


I use a 7 year old low-power laptop. Cooling, electricity usage, and performance of Electron apps are never an issue. Crashes, bugs, lost data, and bad usability still are. I’d rather have devs spend time on that stuff.

If Electron frees up organizational resources to do what’s actually important, I applaud devs for using it.


Not an issue for you, that you know of, because you have no equivalent software that's written in a more performant language (or at least critical portions of the codebase written in a more performant language). That's part of the problem; in most cases, software users don't know what life would be like with better software. They assume the performance they see is close to optimal, or at least that the devs made an effort to optimize. Users are willing to get used to whatever the software's performance is, in order to have access to the software's features. You can get used to almost anything, as suboptimal performance turns into background noise, but that doesn't mean you should. You get used to waiting 5 seconds for a piece of software to do something that you do on average once every hour or two, not realizing that those 5 seconds, 10 times a day, 365 days a year, cost you 5 hours every year.

Optimizing performance and fixing crashes/bugs/dataloss aren't mutually exclusive, either. Developers who care less about code quality than checking boxes for features requested by management or customers, will write code that's both suboptimal and buggy.


> Not an issue for you, that you know of, because you have no equivalent software that's written in a more performant language

How on earth would you know that? No, incorrect.


There's a similar calculation (in a slightly different context) in a good scene in the movie Margin Call, about all the miles and hours saved by one bridge: https://www.youtube.com/watch?v=m8Mc-38C88g


Takeaway: If you're working for a tech company with billions of users, your personal CO2 footprint is completely irrelevant. Rather than agonizing over flights vs. trains, beef vs. pork vs. vegan alternatives, heating etc. - find some small thing that you have access to and that you can optimize.

That's about 2.2 TWh per year. A decent estimate for average worldwide CO2 intensity of electricity is 400 gCO2e/kWh (that's 400 tons per GWh). A typical personal footprint including some international flights will be in the tens of tons.


It's worth considering that not all Wh are equal in environmental impact. In the case of many (most?) big tech firms, a large proportion of their energy consumption is sourced from renewable sources.


If it isn't on the Sprint board it doesn't exist.


You also have to assume that at least one Microsoft employee has Firefox installed. There's no bug if there are no users.


I work at MS, and I tried to use Firefox but couldn't, because FF doesn't integrate with the Windows cert store. Crucially, this keeps Windows Hello (TPM auth) from working, which makes it useless for any internal websites. For a while I used a hand-compiled PKCS#11 module that bridged to the cert store, but that was extremely fragile and eventually I gave up.

I think this is probably a major blocker for many enterprise users, and I wish Mozilla had fixed it.

edit: it looks like they may have fixed this in the past couple of years, though you might have to go poking around in about:config.


Current MS employee here. For a time this was true, but FF recently added this integration. No about:config needed, there’s simply a checkbox under the FF security settings. Since this was added, I have gone back to using FF as my daily driver, and I haven’t really encountered any other friction.



Firefox not integrating with the Windows cert store is actually a good thing in many use cases. The ability to have an alternate browser that's not integrated has saved my butt more than once.


Microsoft now blocks non-Edge browsers with conditional access policies.


You assume all Firefox users are on Windows (they're not) and that all Firefox users on Windows are affected (I and my SO were not).

Who knows what edge case triggered that bug to manifest but I for one haven't seen it in the wild in the years we've been using FF.

Probably difficult in such a large org to allocate dev resources to chase down and fix a bug few people were impacted by.


Around 80% of Firefox users are on Windows, per https://data.firefox.com/dashboard/hardware

That same site also suggests that Firefox has around 200e6 monthly active users, the average user uses Firefox 3.5 days a week, and for 5.5 hours per day.

My math could be wrong, but taking the above into account, and arnaudsm's 5 W estimate, I come up with an upper bound of around 80 MW. Discount that further by whatever proportion of Windows users you assume were actually affected. Not a whole coal power plant, but nothing to sneeze at.
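Spelling that estimate out (all inputs are the figures quoted above; a literal multiplication lands near 90 MW, the same ballpark as the ~80 MW mentioned):

```python
# Upper-bound estimate using the dashboard figures quoted in the comment.
users = 200e6          # monthly active users
windows_share = 0.8    # share of Firefox users on Windows
extra_watts = 5.0      # arnaudsm's per-user waste estimate
days_active = 3.5 / 7  # average days of use per week
hours_active = 5.5 / 24  # average hours of use per day

avg_power_mw = users * windows_share * extra_watts * days_active * hours_active / 1e6
print(f"{avg_power_mw:.0f} MW")
```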


Wow, that's fascinating. It really speaks to the utter dominance of Windows over Linux more than anything else. Even among Firefox users, as of last year, there were an order of magnitude more Windows 7 and 8 users than Linux 5.x users.


I don't have any data to back this up, but I would think that the average Linux user instantly turns off Firefox telemetry and won't show up in these graphs. It's one of the first things I do when I install Firefox: disable telemetry, set privacy mode to strict, and then install uBlock. Nevertheless, Windows has a huge market share even if no one turned off data collection, and the year of Linux on the desktop didn't happen.


Similar numbers can be seen with Steam demographics: Linux is nothing compared to Windows 7, to this day.



Gaming on a mid-tier modern GPU probably uses around 50-100w, the Steam stats probably have a number of users to multiply with. I'm sure it's a massive amount of power.

I don't like video games and they are not-necessary so I propose that we ban them globally, or only allow gaming if using renewable energy. If you don't live in a place where this is an option, too bad!

Maybe instead of this we require all games to be limited in graphical effect (imagine early source games or something). We could save a lot of power globally if we enforced this.

This is why I strongly dislike this line of thinking. I don't think power plants work that way anyways, they probably make a constant-ish amount of power rather than taking exactly 50w worth of fuel every time someone opens up Call Of Duty.

There are also much lower hanging fruit to get upset about if you care about the planet, like cars with large motors or people with heated driveways (yes, that's a thing).


This is a bad comparison, gaming presumably brings utility to someone whereas this was a pure bug with no upside.


The bug was an accident though, not something intentional!

My intention was to compare the listed things against Microsoft engineers making mistakes while programming Windows Defender (the quote below), or programmers writing bugs in general.

>Because some Microsoft engineer missed a bug.


People get entertainment out of games. They got nothing out of this wasted cpu.


Now I will have to spend the rest of my life trying to forget that heated driveways are a thing and pretend everything is going to be fine


It's 5.5 MW.

That's not how power or energy works.

5 [W*hr/day] (Kept in energy units per interval rather than average power).

200e6 users/day who surf for maybe 4 hours/day

5 * 200e6 * 4 / 24 = 133 [MW*hr/day] (133/24 = 5.5 MW)

A typical US home uses 11 [MW*hr/y] or 0.36 [MW*hr/day].

(Convert to power draw, that's 0.015 MW.)

mean consumption of roughly 400 US homes (133 / 0.36).

The smallest US grid-tied operational nuclear power plant produces 600 MW; even a very small coal power plant would be around 400 MW, so 133 MWh/day is nowhere near a whole plant.

https://data.firefox.com/dashboard/user-activity


Your math is wrong.

  5W/user * 200e6 user * 4h/24h = 167 MW
  11 MWh/a/home = 0.00125 MW/home
  167 MW / (0.00125 MW/home) = 133 000 home
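Sanity-checking the corrected numbers above (5 W per user, 200e6 users, active 4 hours out of 24, and 11 MWh/year per average US home):

```python
# Average continuous power saved, per the corrected math above.
watts_per_user = 5
users = 200e6
duty_cycle = 4 / 24                     # active 4 hours out of 24

avg_mw = watts_per_user * users * duty_cycle / 1e6
print(f"{avg_mw:.0f} MW")               # 167 MW, not 133

# Equivalent number of average US homes (11 MWh/year each).
home_mw = 11 / (365 * 24)               # ~0.00126 MW continuous draw
print(f"{avg_mw / home_mw:.0f} homes")  # ~133,000 homes
```

So the average draw comes out to ~167 MW, equivalent to roughly 133,000 average US homes.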


I have been hoping for many years that global warming would force devs to go back to a time of creating fast and efficient software in order to help the environment.

Unfortunately devs dont realize how needlessly power hungry software is when they don't need to be. But hey, its easier to use electron and javascript than it is to save the planet.


> let's assume the bug wasted 5 extra watts 4 hours a day.

How did you come to this?


Yes, the trace comparison in the link (which the author says is for browsing YouTube in Firefox) shows average CPU usage is down from roughly 5% to roughly 4%. That’s a great improvement, but I’d like to know why we should assume that translates to 5 Watts.


Great question. Based on my use, it would be a lot more than 5 watts/day.


Don't underestimate Microsoft Won’t Fix which helped IE dominate the browser market for over a decade.


doesn't this bug only manifest itself if one is using microsoft defender as their only security solution, and not a 3rd party AV/IS? if so, then the number of Firefox users in this calculation is much lower.


I don't know if that's the case. I'm a Firefox user but consider all the 3rd party apps nearly as much malware as the things they are trying to solve. I run strictly defender and try to make good choices when downloading and browsing.


I actually replace Defender with a 3rd party choice (Eset) for this very same reason - to wrestle some control over my OS from Microsoft. I find Defender to be overbearing in so many ways.


I agree with this and try to practice it myself. I download from portableapps.com, hoping they have a scanner, and stick to the open source ones


well, if we're taking strictly subjective personal experiences as some sort of a relevant benchmark, then I'm a Windows Firefox user that has never used MS defender for any length of time, and always strictly a reliable low-impact 3rd party AV like ESET or Emsisoft. so I guess the two of us cancel each other out.


So based upon rigorous analysis, approximately half of all Firefox users use the default choice, and half use a different AV.


> strictly a reliable low-impact 3rd party AV

Sounds good

> like ESET

What?! ESET used to burn constant CPU when wifi disconnected.


More complicated still, defender does not completely stop working when 3rd party AV is installed. Also maybe Firefox is not the only app triggering this bug?


I run an antivirus suite and have attempted to turn Defender off several times. Windows Update keeps switching it back on.


> Because some Microsoft engineer missed a bug

That might be a bit too kind given how much Google liked to Oops Firefox. Wouldn't be surprised if MS did too.

Oops:

https://www.computerworld.com/article/3389882/former-mozilla...


> the equivalent of an average coal power plant

Produces in an hour, four hours?


Continuous. We need one less coal plant to support the Firefox code after the bug fix.


Coal makes up only about 12% of US energy consumption, at least per the EIA; natural gas is around 33% and petroleum around 36%.

https://www.eia.gov/energyexplained/us-energy-facts/


What does that have to do with anything? "Coal plant" is being used as a unit of power here.


And a unit of pollution. I'm sure that one extra solar plant or hydro plant wouldn't draw as much attention.


I'm pretty sure you're mistaking power for energy. Watts are units of power, which is the rate of change in energy (joules per second). Asking for how much power something produces in an hour is like asking how many miles per hour your car goes in an hour.


Would be interesting to see the energy usage of Windows Update computed in a similar way.


While this feels significant, I don't think humanity is doing itself any favors by guilting itself over problems it didn't (or couldn't) anticipate. Hindsight is always 20/20. This is especially true when "don't over-optimize" is a core tenet in software development. Software would never ship in a reasonable timeframe if we scrutinized every release to this degree during development.


The units don't make sense. You might mean megawatt-hours?


No, it makes sense. The parent is talking about continuous power measured in megawatts, i.e. megawatt-hours per hour, or megawatt-days per day.

300 million users * 4 hours/day * 5 watts = an average continuous savings of 250 MW.


Ok, I get it now. This does make sense.


No typo, I meant Watts. I averaged the 4 hours per day


It was not so well explained, but the GP does mean that, averaged over 24 hours, the power requirement is 250 MW.


One way it could be MWh is if he/she meant “a coal power plant for the 5 years that the bug was active”


Only works if this was a constant drain for everyone, though? If it only saves 100 watts once a month, from a quarter of users, things change. A lot.


How are you calculating 5W? And subsequently 20Wh?


Imagine what could have been saved if you stop loading ads in your browser.


user must be running windows


This is just one bug in the world affecting power usage with Firefox. There are loads more like https://bugzilla.mozilla.org/show_bug.cgi?id=1404042 which caused me to abandon it on macOS as my primary browser.


> mpengine.dll version 1.1.20200.4 was released on April 4, so the fix should be available for everybody now. See the end of comment 91 to know what version you are using. Also, the latest discoveries in bug 1822650 comment 6 suggest that we can go even further down in CPU usage, with all antivirus software this time, not just Windows Defender.

Really nice to see open collaboration between Mozilla and Microsoft development teams resulting in a net improvement for everybody.


Yes. I mean it took 5 years, but who would count. /s


People care about open Firefox bugs much older than that. Basically any long-lived program will have ancient bugs that never made it onto someone’s todo list.


For example, it only took 20 years (!!) to stop Ctrl+Q from quitting Firefox on Linux. :)

IIRC, a couple of patches did get submitted, but never accepted for unknown reasons.


Isn't Ctrl-Q quitting the program normal and expected behavior? For example, right now on Firefox 111, I can see in the File menu that the "Quit" option has the keyboard shortcut Ctrl-Q. I'm fairly sure it works.


On linux, I'd want it to be dictated by my DE or WM. I have Super+shift+q set to close windows on my machine, so I wouldn't be crazy about software adding its own keybind to do the same.


It is a very common binding.


Thank you, I was wondering why it stopped working, but never took the time to investigate... :D


Well, a net improvement for the people who paid Microsoft for an OS that wasted their energy and wore down their computer (heat damage) for 5 years.


I'll note that mpengine.dll was updated for me on Windows 11 Pro 21H2 in spite of my blocking autoupdates via Group Policy.

To be clear: I'm aware Windows Defender updates itself independently of Windows Update, and I actually don't mind and even appreciate this behaviour since Defender updates almost always aren't intrusive unlike the rest of Windows Update.

Just an FYI for people with similar configs.


This update is causing intermittent browsing issues in Firefox and Chrome for me and several others (not Edge, though). I'm getting 'PR_END_OF_FILE_ERROR' in Firefox, with Chrome experiencing issues at the same time (though Chrome does eventually load the page)


If you're on a Mac and using FF (probably not FF specific), turning off "ambient mode" in youtube can save 30% cpu. I just found this out while searching why FF was taking 90% of my cpu while watching youtube videos in normal mode, but went down to 40% use if viewing in full screen. Turns out that this youtube "ambient mode" was the culprit. My lap is now cooler and the fan doesn't turn on anymore. I wonder how much power I've wasted due to this new "feature" they added 6 months ago that I didn't know about.


To save a search:

"Ambient mode uses a lighting effect to make watching videos in Dark theme more immersive, by casting gentle colors from the video, into your screen's background."


To save another search:

On desktop and mobile devices:

    While playing a video, select the Settings button.
    Locate the Ambient Mode setting in the list of preferences.
    Toggle it to off to disable Ambient Mode for all videos on YouTube (in that browser).
It's in the same popup used for video quality and playback speed.


For those who may be wondering, the Settings button referred to here is the gear button in the Youtube video player.


for those unfamiliar with visualising a gear, seek the doughnut with a notched circumference


for those unfamiliar with visualizing a doughnut, imagine a bagel-shaped treat of sweet cake-like dough, deep-fried and frosted, with optional sprinkles


For those unfamiliar with visualising a bagel, imagine a gear shape, smoothed out and without protrusions


What's a bagel?


a jewish torus, plural torah


A bagel is a first-degree pretzel.


For those people unfamiliar with a pretzel, imagine eating something that is making you thirsty.


For those of you unfamiliar with eating, it's similar to photosynthesis, but with matter instead of sunlight.


For those people unfamiliar with this joke, watch the TV show Seinfeld.


For those people unfamiliar with 'TV shows', they are similar to HBO series, but with less interepisodic narrative.


lmao


I don't have that option. Firefox on Windows 10.


only shows up when in 'dark mode'.


that is some amazingly crap UX. who the hell thought this was a good idea?


Ambient mode is a dark mode specific feature. If you don’t use dark mode then it’s not going to do anything.

Completely hiding it in normal mode isn’t the best choice, but it is somewhat understandable.


In dark mode. Don't see it. Windows 11, Edge and Firefox.


Windows 10 and Firefox. In dark theme. No Ambient mode.


this makes sense. I stopped doing per site dark mode and just use the darkreader extension to make everything dark mode, so the option never appears.


Neat idea, I bet the intern had fun implementing it, why was it on by default?


This seems unnecessarily passive aggressive. Everyone makes mistakes or bugs, intern or not. It makes no sense to get this salty about basic human error. Also there's nothing wrong with implementing minor UX enhancements.

If anything redirect the frustration to the leadership that doesn't prioritize fixing these kinds of errors.


I don't think there's any error to fix. It's a feature - casting light from the video onto the UI, using JS, surely takes that amount of CPU.

The question of why it is on by default stands - because it's a little bit of eye candy vs. people's laptop batteries, CPU that could have been used to get other stuff done faster (so also their time), device thermals, etc. I don't think it's just unnecessarily salty to point out that the choice to turn this on by default should have been more nuanced and thought through.


IMO the implementation sucks and the feature is questionable. Recently I set the browser to dark mode, which tells YT to also use dark mode, and if I hadn't read about it here I wouldn't have known that this is a toggleable feature. It's sad when we can't tell a feature and a bug apart.


Not being able to distinguish between a feature and a bug is a feature, not a bug.


It's a UI issue.


How much can websites determine about the power of the device they're running on? Obviously it'd be a security issue for them to know too much, but it would be nice to be able to progressively enhance the experience for more powerful devices that can handle it, beyond just mobile vs PC. Even just knowing whether a device was running off battery power could be useful.


Here's what's available, requires permissions:

- BatteryManager.charging

- BatteryManager.chargingTime

- BatteryManager.dischargingTime

- BatteryManager.level

https://developer.mozilla.org/en-US/docs/Web/API/BatteryMana...

https://caniuse.com/?search=BatteryManager


Isn't available in Firefox though...


I don't think you really want websites to make this determination anyway. There are a million reasons why a user might want a website to use fewer resources than their machine could support.

It should just be a setting the user can select. No probing of the machine necessary.


What should be a setting? This specific youtube ambience thing? That is, but it shows the issue - you need to pick a default, and most people won't know they can change it. Having some idea of the capabilities of the device you're running on could allow you to choose sensible defaults.

But if you mean there should be a setting in the browser that websites could check, I agree that could be better.


Sensible defaults go a long way, yes! Although in the case of this YouTube thing, I think the sensible default is to have it disabled regardless of the capabilities of the machine.


It's not unreasonable to hold YouTube devs and QA engineers to a higher standard than everyone else who doesn't work for a ~trillion dollar corporation or deploy code that runs on billions of devices.


Just to be clear I was being a bit snarky, but what I meant is that this is sort of a small, fun, less important project that could be easily given to an intern.

I don’t think there is a bug? It seems like a sort of image processing thing that might take a bit of compute to run. To the extent that there’s blame, I’d lay the blame at the feet of whoever decided it should be turned on by default.


This is definitely worth getting salty about when you consider the cumulative electricity wasted for something so trivial. Google should be strictly monitoring performance and CPU consumption of their changes on youtube since a screwup there is the climate change equivalent of paying for 747s to fly in circles.


We aren’t talking about misaligned element here, you know.

There are millions of FF Mac users, it’s not unreasonable to expect YouTube to do some basic testing. Never got any issues showing ads, though.


Because the target audience for the feature is not tech savvy people but common users who won't know it exists until it is shown to them/might be intimidated to delve onto FF settings

If you are tech savvy, you are then expected to be able to "bear the burden" of turning the feature off if it bothers you


Hell, I'm tech savvy - not a tech worker, but you'd better believe that you want me to be your end-user contact, I know a hell of a lot more than the people I work with - and I didn't even know this was an option. I'm not afraid of fixing FF settings, done it plenty of times. It's on by default. If someone who can install OpenBSD and make it a router for DSL over PPPoE in 2001 (side job) doesn't even know it exists and eats cycles (i.e., a "prosumer": not an expert, but not too far below a new hire and well beyond the masses), it's a bad idea. I don't have time to stay up on every way that people want to eat my electricity. I do know that YouTube spins up the fan on my iMac with disturbing regularity in a way that videos from alternative sources do not. So it's not the decoding.


> might be intimidated to delve onto FF settings

It’s a YouTube setting, not a Firefox setting.


Looks like it's a Youtube feature rather than a Firefox feature?


They are not the intern anymore - they are senior vice president of battery draining, this feature absolutely killed it at the end of year review.


For the replier who wanted a more substantive response: my comment can be expanded on by pointing out it is ultimately a commentary on the motivations involved in the employee-manager relationship within the bureaucracy of the mega-corporation, within a society where financial success drives, among other things, comfort, mate selection, breeding, health, longevity and day-to-day basic survival itself - factors whose importance has been determined by millennia of evolutionary biology.

Ultimately the result can be that an employee chooses health, survival and the social status that a job title brings and ships a product or feature that they can sell to their manager as innovative / disruptive / greatness reimagined.


I really like it!

Then again, I am using a real computer and not a toy.


> Neat idea, I bet the intern had fun implementing it, why was it on by default?

Total speculation, but Firefox seems to be pushing out a lot of UI gimmicks. Maybe they're trying to drum up interest in the browser that way, since they seem intent on killing many of their other differentiators.


This is a YouTube feature, not a Firefox one.


The "average color" (or whatever it is) could have been pre-computed server-side rather than tiring out the poor innocent client CPUs.


But then Google would be responsible for that one-time computation instead of making the clients do it billions of times.


They could do it on a few clients then ship the data back to the server. If they’re resourceful those clients don’t even need to be watching the video! (they could send it and compute the output in the background of another stream)


But that's a distributed problem now and those use up valuable developer time, which we know is the most important resource in the world...


Yeah, but if Google solved distributed processing, just imagine the cost savings elsewhere (imagine involuntarily crowdsourcing video encoding; it would save them millions)


The effect is created by scaling and blurring the storyboard images that are also used for the seek preview. The image is refreshed every ten seconds, with a CSS opacity animation fading from the old background image to the next over a couple of seconds.

This sounds like it should be relatively cheap if composition is properly accelerated.


Couldn't this be done cheaply on the GPU?


The browser isn’t the ideal place to do things “on the GPU” unless the site is designed around it.


It could probably be somewhat cheaply extracted during decoding.


They went for copying Philips' Ambilight on TVs, but in software. What could go wrong?


> make watching videos in Dark theme more immersive

the best way to make youtube videos more immersive is to block obnoxious advertisements, remove useless algorithm-driven recommendations, and delete the comment section


oh i saw this happen to me the other day, i was wondering if it was a new youtube feature or something. can't say i care for it.


Just noticed it recently too, though it might have been an update to the stylus theme I use, I actually quite like it


Thank you! I had no idea this was a thing YouTube did.


As I don't use edgelord mode I'm guessing I don't have to worry about it.


This is why I like terminal, rss, or other technologies where it’s hard to add this kind of fireworks to the UI.

When done right, sure, they improve the user experience by some percentage. But when done badly, the UX goes down by orders of magnitude.


If you turn off your computer power usage goes to 0% too.


Not for long! [1] It's an unconfirmed rumor for now; but perhaps we're heading there.

> The iPhone 15 Pro and Pro Max will use a new ultra-low energy microprocessor allowing certain features like the new capacitive solid-state buttons to remain functional even when the handset is powered off

[1]: https://www.macrumors.com/2023/03/20/iphone-15-volume-mute-b...


absolutely. besides, graphical UIs bombard the brain with everyone's unique take on visual aesthetics, consuming limited mental resources like attention


I honestly thought my monitor or GPU was having issues with weird colour banding around YouTube videos. Turns out it was an intentional choice they made to do that. I don’t know why it’s on by default.


Or stop using the crap YT frontend entirely - https://yewtu.be

Run your own alternate frontend - https://github.com/iv-org/invidious


Wow! didn't know that. Very helpful


That site doesn't even seem to support anything over 720p, so much for crap


Similarly, gifs and animated emojis in Slack chew up the CPU. Something like 20% at idle before I turned it off.


Where is that setting? In YouTube Settings? I don't see it, there.


It's not in the general settings - instead it's in the setting menu in the video player itself, where you'd select the quality and playback speed, etc.


It's not there for me. I don't see it in any settings anywhere.


Is Dark Mode in either your OS or Youtube turned on? It's not an option unless you're in DM.


I stopped doing per-site dark mode and just use the darkreader extension, which is why this option was never there. I call it a win to not deal with per site settings.


oh, I never use dark mode so I guess that is why I don't see it.


I don't see it either, maybe it's on an A/B rollout for desktop.


If you use uBlock, add the following to the filters:

  youtube.com###cinematics.ytd-watch-flexy


I think it is time to have a way to fine-tune consumption based on settings. I assume the least complex way to do this is, really, to use the telemetry information already gathered.


To be fair this wastes CPU cycles in all platforms. And it probably took more human effort than keeping Google Reader alive FFS.


Same behaviour for me using Safari.


as I don't care about the comments section or the recommender algo, I search (youtube-fzf) and launch (yt-dlp + mpv) youtube videos directly from the terminal. i have a bash pipeline for this and, naturally, it is very resource efficient


post the script pretty please


This is one of the myriad reasons why I have a strong preference for Linux.


Something happens

> This is one of the myriad reasons why I have a strong preference for Linux.


Because browser users on Linux have never, ever been shafted by a browser bug? Riiiiiight.


I just bought a Macbook because my dedicated Linux laptop, made by a popular Linux-only manufacturer, had so many issues that I got tired of diagnosing. I love Linux, but it's not a panacea for every computer issue under the sun, just a few of them. I, personally, am stoked I no longer have to deal with issues with this new machine, and can just take it into a Genius bar appointment to let someone else deal with it, for pennies a day. You can't get that on Linux!

Feel free to tell me I'm a sell-out, I am happy to be one today.


I switched to Linux. I like it and haven't really had any issues to speak of. Not with sound, video, wifi or any of the other things people complain about. My fan went, but likely it was a pet fur issue, and easy to fix... I'm not an admin. I know how to use the command line, and how to use it as a work machine. Really, my experience over the past 3 years, it's been as trouble-free as my Mac used to be. It really is a great development platform.

Glad you like your machine.


Can you hibernate your system without issues?


it "suspends" fine. I have 3 options, suspend, off and reboot. I usually suspend overnight. Sometimes I turn it off. It boots super fast anyway.


Windows can't be hibernated without issue. My wife had issues the past week already on Windows. She wants me to put Ubuntu back on the machine.


> I switched to linux. I like it and haven't really had any issues to speak of. Not with sound, video, wifi or any of the other things people complain about.

What's your machine+distro?


I'm running a last gen Pangolin (AMD 5700U), Pop OS. I don't do anything too crazy. I develop on JetBrains. I did keep it awake for 3 days doing some genetics work I usually do on a cluster. It's fast, quiet and decent on battery. AMD on the notebook is really impressive.


> I, personally, am stoked I no longer have to deal with issues with this new machine, and can just take it into a Genius bar appointment to let someone else deal with it, for pennies a day. You can't get that on Linux!

Honest question. If you could get that on Linux, would you? and what kind of pricing would you consider reasonable? Is it something that would have to come with the computer (i.e. would you pay for it separately or would you only use it if it was "free" aka included with your laptop purchase)? Did you stick with the vendor-provided install or did you wipe and install your own preferred distro?


I would pay the same amount for a Linux laptop that worked as easily as a MBP and had similar build quality, performance and battery life.

However, whatever crazy-stable, easy to use and well supported hypothetical Linux this is wouldn’t be compatible with my “real” Linux use cases, so I would then also install Arch or whatever and live with constantly borked everything and just swap between my Arch “Dev” OS and my “Linux Mac” business/work/consumer OS.

Current Linux cannot be made “MacOS”-stable. But maybe in 5 years.


Current Linux cannot be made “MacOS”-stable. But maybe in 5 years.

Stable?? Linux is a rock of stability, you must mean something else.


Just today I needed to run some software only available as a .deb and only on Ubuntu 22.04. So I whipped out an old laptop that had Ubuntu installed and ran sudo do-release-upgrade to upgrade to 22.04. Boom! GUI gone, and the terminal was flooded with weird filesystem errors. I had to spend the whole afternoon reinstalling from scratch.

I would love a Linux daily driver but I've had similar experiences to the above every time I've tried it for the last 15 years.


https://manpages.ubuntu.com/manpages/bionic/man8/do-release-...

It says use it for non-GUI newest release upgrades? Why are you surprised the GUI disappeared?!

Also, why do that if you wanted to keep the current version?

Anyhow, I never have such issues with Debian. Do bear in mind, Ubuntu (or any distro) is not Linux.

Are you going to call Linux "unstable" because you got a crappy phone with Android (which is just more Linux)?


Yeah, I’m talking about Ubuntu being hyper-aggressive about upgrading my GPU drivers even after I turn off all auto updates, borking my entire OS install - and this happening often enough that it’s faster just to keep scripts which auto-reinstall Ubuntu and all my versioned packages from scratch.

Or, non-Ubuntu, software just randomly falling apart after 2 years. Like I need a newer kernel for a new Wi-Fi card, but then I need a different GPU driver, and that’s incompatible with some software's UI library, so they update that, but it has an incompatibility with something else.

Or how Linux still has Wi-Fi 6 totally broken.

Sure, Linux is rock stable if you have a production environment where everything is nailed down. But from an actual daily consumer user point of view it doesn’t feel stable. At all.

I love Linux. I love fixing it when it borks. But my god does it break every week.


No. Apparently for you, Ubuntu breaks every week. Not Linux.

Don't blame the good, for the things bad does.

I go months on my Debian desktop without issues, through upgrades, and only reboot for kernel updates!


The kernel yes, the distros and userspace, not so much. Linux is my go to for hosting, but macOS and Windows are designed to prioritize for the desktop user experience.


You seem to be confused, Linux is the most stable OS available for desktop or servers.


You are not a sellout but just the average Joe. No problem with that, I guess. Have fun with your Mac and its soldered SSD, which, when it fails, makes the whole machine useless as well.


Nice part is that I have a warranty for hardware failures! The big issue I had initially was getting used to not having quite as much control over OS-level stuff as on Linux, but I was avoiding even using my Linux laptop, so any improvement in usability will get me more productivity than having full control of every aspect of my computing. Maybe someday I’ll have time like I did when I was young, but until then I just prefer to have everything work together seamlessly, as things should in 2023. Apple offers that kind of support and service for a price I’m willing to pay.

Being an average Joe is fine with me.


Meh hasn’t happened yet but I’d just buy a new one. That being said, I always also have a windows and Linux machine, they’re just not my daily drivers.


You just didn’t get my point: in 2023 it’s about the environment at least for me. That’s why I wrote my comment. Just buying a new one is somehow dated imho. But ymmv


With all the attention being paid to macOS these days, there are enough mods and add-ons that I don't miss Linux so much on my laptop. Hammerspoon lets me drag and resize windows how I want, and there's Rectangle.app for tiling-ish window management. There's no /proc, and all the rest of the cli utilities are just wrong (netstat, route, top, etc.), but I can live with my M1.

(brew addresses a lot of the issues though, even if I do have to remember to run gdu instead of du (for gnu du))


> With all the attention being paid to macOS these days, there's enough mods and addon's that I don't miss Linux so much on my laptop.

One problem that I have is all that all those mods and add-ons are out there, but there's a real mindset that "everything must be an app" and a pursuant mindset that "might as well charge for it". I don't mind paying for a complicated app, but there are certain basic features that used to be served by the freeware model on Macs that just aren't any more, and my impression is that things are heading ever more in that direction. (As well as in the direction of subscriptions over app purchases, which are right out for me for basic utility needs.)

You mentioned Rectangle.app and Hammerspoon, which are both open source. Do you have any good recommendations for where to look for other high-quality open-source mods and add-ons for macOS?


What are you looking for? I don't have anywhere specific in mind, but with hammerspoon I find myself writing lua to write the functionality I want. Which I used to do with Linux. Ideally I'd write up and post it on a blog or github somewhere though.

Personally I'm okay with paying for software (though I do whine about it from time to time). Yeah it should just be built in functionality, but someone devoted their time to building a thing, so I don't begrudge them a couple bucks for it - so long as it's a one-time purchase and not a recurring subscription model for unchanging functionality.


yabai is the full featured window manager for macos


You're a sellout but I am too, so welcome :).


This has nothing to do with macOS vs. Linux, though


This happens on Linux too. I was wondering if the weird CPU-hogging flickering was a bug in my compositor (picom) or window manager (i3) or browser (Firefox). Turns out to be a "feature".


not sure what your point is... ambient mode is a visual effects thing YouTube does, and reading the descriptions, I'm not surprised it causes increased CPU usage regardless of OS.


Recent and related:

Firefox engineers discover a Windows Defender bug that causes high CPU usage - https://news.ycombinator.com/item?id=35458746 - April 2023 (215 comments)

Is the current post significant new information* or just a repeat of that submission?

* https://hn.algolia.com/?dateRange=all&page=0&prefix=false&so...


It's so frustrating this discussion took five years.

I'd be grateful for an overview of the bug. I don't think I've seen it on my two systems but I can't be confident.


Five years is nothing for MS. You should see how long the bug in File Explorer has been there, where after navigating to a folder and pressing the down arrow, the second item is selected instead of the first. And it's one of those things that, even though I'm aware of it, still always catches me, causing extra keystrokes. It's like they're trying to force me to use the mouse for some reason.


That one I can almost agree with the reasoning for. The first item is selected by default, but also by default you have to intentionally trigger keyboard navigation for it to go into that mode, since most people don't intend to do that when hitting Enter on a freshly loaded directory. As evidence of this behavior: instead of hitting a directional key to change the selection, whacking Space should activate the highlight on the first item, and then another navigation action is needed to actually do anything.

I think it'd be more convenient (for me as a keyboard-centric user at least) if it were done differently, but I don't think it's actually a bug so much as an intentional decision made at the cost of keyboard users. This is unlike the Defender issue, where it served no purpose to be significantly slower than it needed to be.


Wanted to chime in with this. Agreed, it's not a bug. Just a UX decision that was made to prevent people from hitting Enter over and over and constantly launching/navigating stuff.

I often accidentally rename files in macOS Finder (hitting Enter activates rename).


That doesn't particularly sound like a bug. There may be varying opinions about whether it's good or not, but it kind of almost makes sense to me.


I think you can press Space to select the first item.


Windows Update and Windows Defender are notorious piles of shit that eat up huge amounts of CPU for seemingly no reason.

The problem is that there is zero incentive to get them right. Nobody is going to get promoted because they use 10% less CPU. Nobody is losing their bonus because 10% of all computers melt down. etc.


This bug would suggest that ETW is actually the giant pile of shit.

Which, in my experience, isn't far from the truth.

Other people think so too: https://caseymuratori.com/blog_0025


That's one way to look at it, but a very biased take. An equally valid take is that Firefox was calling an expensive platform feature too often, and even though it has been killing performance for years (possibly, for the entire history of the project) nobody noticed or bothered to fix it on the application side.


The platform feature in question was normally cheap and just made artificially expensive by Defender intercepting calls to it and blocking until analysis was performed. I don't think it's the Firefox team's responsibility to be aware of and take into account arbitrary software intercepting system calls.


> I don't think it's the Firefox team's responsibility to be aware of and take into account arbitrary software intercepting system calls.

One of the first, hard lessons I had to learn about web development (like, stare-at-a-wall-and-consider-my-career-hard) is that web development is way more about network effects than application architecture.

Real people run systems with real configurations, and when you're targeting "the public" as your userbase you must account for that. And Mozilla knows this: if you go into the source code (circa 2009, YMMV) and look through the initialization and boot-up logic, you would find places where the system used heuristics to figure out whether some extensions had been installed in odd places instead of the "Extensions" directory (because the tool had been installed before Firefox) and hot-patch paths to pull in that component. Because if a user installs Flash and then installs Firefox and Flash doesn't work in Firefox, it's not Flash that's broken... It's Firefox.

It doesn't matter if the bug is in "Microsoft's code" or "Mozilla's code." That's unimportant. If you're a Mozilla engineer, all that matters is whether this bug would cause a user to get pissed off and uninstall Firefox.

Thats. All. That. Matters.


I completely agree with you and have been on the other side of this too, having worked on a native enterprise app running on various MacOS, Windows, iOS and Android versions. Customers don't care if you have a great explanation why stuff with your app doesn't work. That being said, it's completely unreasonable to have the proactive expectation of something working well today (writing many files) breaking tomorrow (due to defender heuristics changing) and proactively trying to prevent this by optimizing. Mozilla reacting to this by both reporting the bug to Microsoft and optimizing to work around the problem is really the best you can do.

"They shouldn't have written so many files in the first place" is not a valid preventative strategy, but a one way road to premature optimization hell.


Yes, but it’s incredibly difficult to work out what is causing the problem. That’s what happened here.


It's the application owner's responsibility to make the app run as well as it can on a given platform. Platforms are messy, but you have to deal with it. You should escalate to the platform owner, sure, but you can't rely on them fixing it in any reasonable time-frame.

I worked on a desktop<->cloud file sync app. On Windows, only one badge can show up on a file's icon in Explorer. If there's multiple apps trying to set the badge, who wins? Well, it depends on the lexicographical order of the registrants' names. So what did we do? We added some spaces to our registration name to make them show up first. Good for the user, as best as we can know - since the user or their admin had to install the app to get these badges in the first place. And they were useful ones too - whether a file was synced or not. We tried our best, and escalated.
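The ordering trick above is easy to demonstrate: plain lexicographic string sorting puts a leading space (ASCII 0x20) before any letter. The handler names below are made up for illustration, not the real registry entries:

```python
# Windows picks which icon-overlay handler wins by the lexicographic order
# of the handlers' registered names; a leading space sorts before any letter.
# These names are hypothetical stand-ins for real registry entries.
handlers = ["DropboxExt01", "OneDrive1", "  OurSyncApp"]
winner = sorted(handlers)[0]
print(winner)  # "  OurSyncApp" sorts first thanks to its leading spaces
```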


> I don't think it's the Firefox team's responsibility to be aware of and take into account arbitrary software intercepting system calls.

Per the bug report, Firefox was generating up to ~14,000 calls where Chrome was generating ~300, though.

Surely it is the Firefox team's responsibility to use system calls in a sane way, say, not almost 50x more than the competition?


> Surely it is the Firefox team's responsibility to use system calls in a sane way, say, not almost 50x more than the competition?

The docs for that function don't say anything about performance: https://learn.microsoft.com/en-us/windows/win32/api/memoryap...

They also don't say anything about "sane" usage, and while I don't have an MBA, I'm pretty sure they don't teach anything about `VirtualProtect` ratios when doing competitor analysis.

One possibility is that the Chrome team's implementation was more efficient due to luck, or they invested the resources to identify the performance characteristics of this function call, whereas the Firefox team missed it. I don't think "Chrome has more development resources than Firefox" is news to anybody.


There are three facets to any protocol, API, or standard in software:

The spec, the intent of the spec, and the implementation of the spec.

Doesn't matter what the docs say; what matters is what performance testing shows. Docs lie.

And even if Chrome lucked into a cheaper implementation: that luck has given them a market edge.


Did you read the bug report? This is literally about writing to files in a temp folder. Surely you can optimize that but you should also be able to assume that this does not use excessive amounts of CPU on a modern operating system.


I usually assume that even vaguely considering looking in the same direction as a file on windows will melt my CPU.


Windows Search Indexer automates that for me. CPU keeps burning even when monitor is off and I'm working on another computer.


Why is Search Indexer constantly rescanning the same files? Can they not cache the results from the previous scan? That and OneDrive are constantly making my work laptop scream.


Come on, anyone that has even unzipped Linux-centric stuff on Windows knows how slow individual file operations are compared to Mac or Linux.

It's very common knowledge that on Windows you will get terrible performance if you have many many small files.

I don't know why Microsoft doesn't fix that. Maybe they can't for compatibility reasons or something. But that's the way it is, and any software that wants to run well on Windows needs to deal with it by using fewer bigger files.


Yes, I have read the bug report. It mentions that Firefox writes wayyyyy too much in the temp folder. It also mentions that the team should fix this behaviour independently of the fact that some of those calls are more costly than they should be because of the bug in Defender:

> With a standard Firefox configuration, the amount of calls to VirtualProtect is currently very high, and that is what explains the high CPU usage with Firefox. The information that the most impactful event originates from calls to VirtualProtect was forwarded to us by Microsoft, and I confirm it. In Firefox, disabling JIT makes MsMpEng.exe behave much more reasonably, as JIT engines are the source of the vast majority of calls to VirtualProtect.

> On Firefox's side, independently from the issue mentioned above, we should not consider that calls to VirtualProtect are cheap. We should look for opportunities to group multiple calls to VirtualProtect together, if possible. Even after the performance issue will be mitigated, each call to VirtualProtect will still trigger some amount of computation in MsMpEng.exe (or third-party AV software); the computation will just be more reasonably expensive.
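A rough sketch of what "grouping calls together" can look like, using POSIX `mprotect` as a stand-in for `VirtualProtect` (both change page protections, and each call is a separate event for monitoring software to process). The arena size and access pattern are assumptions for illustration:

```python
import ctypes
import ctypes.util
import mmap

libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
libc.mprotect.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int]

PAGE = mmap.PAGESIZE
NPAGES = 16

# A contiguous writable region, like a JIT's code arena.
arena = mmap.mmap(-1, NPAGES * PAGE)
base = ctypes.addressof(ctypes.c_char.from_buffer(arena))

# Naive: one protection-change syscall per page. Each call is a separate
# event for any monitoring software (on Windows: Defender, via ETW).
for i in range(NPAGES):
    assert libc.mprotect(base + i * PAGE, PAGE, mmap.PROT_READ) == 0

# Grouped: one syscall covering the whole contiguous range, so the
# per-call monitoring cost is paid once instead of NPAGES times.
grouped_rc = libc.mprotect(base, NPAGES * PAGE,
                           mmap.PROT_READ | mmap.PROT_WRITE)
```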


> It mentions that Firefox writes wayyyyy too much in the temp folder.

> > the amount of calls to VirtualProtect is currently very high

Calling VirtualProtect is not writing to the temp folder. The VirtualProtect call is to change the permissions of the in-memory pages. It should be an inexpensive system call (other than the cost of TLB flushes and/or shootdowns).


You really shouldn’t assume anything in software or any complex system. I know this wouldn’t fly at my job, and I don’t work at Mozilla.

This is basic testing.

Normally this is the mark of a bad software engineer, but attempting to blame the platform you're on for your lack of testing takes it to a new low.

Mistakes happen, admitting full incompetence that basic testing isn’t done is damning. This is not a good defense of Firefox nor Mozilla.


Not sure what your job is, but in my job:

- we implement a feature, test it thoroughly for functional and non-functional requirements

- when we are happy, we release it

I don't see myself being responsible for a third party software company coming along years later and introducing a bug in code that injects itself between my software and the operating system that users of the software I wrote happens to install at some point.


Maybe you're not responsible, but if someone says "something changed in the OS and your previous method is now adding substantial overhead", you could either a) report the change to the OS and mitigate or b) report the change to the OS and ignore the problem for years. It sounds like Mozilla chose b, for whatever reason.

As a software developer, I've had to workaround many many bugs in OSs, especially when dealing with updates to Android. It's just part of the job.


The OS isn't some random third party software, it's one of your dependencies. Your software doesn't work without the OS and if it also doesn't work with the OS, it just plain doesn't work.


That's really not a tenable mindset to be taking these days. With how much Windows has become a constantly-moving target rather than a stable platform, you need to regard it first and foremost as your adversary, whether you are developing against it or are simply an end user. And the days of being able to thoroughly test against every relevant version of the OS are long gone; Microsoft has ensured your QA will be Sisyphean.


At the end of the day, it's about your users.

If your users are on Windows, you have to be where they are. Moving target, wonky API, warts, and all.

Yes, it's Sisyphean. That's why my shop had a whole room stuffed with parallel Windows installs. We couldn't afford to have our users be the first ones to notice Microsoft pulled the rug out from under us again.


You basically just said you stop supporting things once they ship. Doesn’t work properly on Windows? Shrug.


Which is my original point about quality software engineering.. Apparently many don’t test, and if it’s broken, don’t care!


Windows Defender real-time protection is enabled by default.


> arbitrary software

Windows Defender isn't "arbitrary software" - it's built into the OS and enabled by default. To anyone building an application for Windows, it should be considered part of the platform.


I'm not sure how you can possibly qualify VirtualProtect as "an expensive platform feature". Looking at the operation that VirtualProtect actually has to perform, from first principles, it should be one of the cheapest syscalls in the entire kernel.

The bug was that ETW (in the antivirus process) was doing something braindead; zeroing a megabyte of memory unnecessarily every time someone called it just to get the size of a buffer.


> it should be one of the cheapest syscalls in the entire kernel.

That's an educated guess... that is unfortunately very easy to disprove :(


I'd love if you'd elaborate on this. I know very little about what VirtualProtect actually does under the hood but, in theory, it should just have to flip a couple bits in the address space mapping which says what the protection level is.


It also needs to flush TLB entries. Changing permissions on page table entries is unfortunately a bit more complicated than just twiddling some bits.


> I know very little about [...] but, in theory,

You are assuming things you are unsure about :) Even if your assumption was correct things could change from one Windows update to another.

When I worked on a time sensitive java project, our test suite had benchmarks for JDK functions as simple as Arrays.copy() to make sure we are the first to notice if something changed under the hood.
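A minimal sketch of that kind of regression guard in Python; the primitive benchmarked and the time budget are illustrative assumptions, not anything from the project described above:

```python
import timeit

# Benchmark a primitive you depend on, so a platform/runtime change that
# slows it down fails CI instead of surprising users in production.
# The 5-second budget here is a made-up illustrative threshold.
copy_seconds = timeit.timeit("b = a[:]",
                             setup="a = list(range(1000))",
                             number=10_000)
within_budget = copy_seconds < 5.0
```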


Exactly. If you're going to assume some call is free, write that down in a test that can be periodically verified and, preferably, is.


Branch prediction should be a super-dumb algorithm, but then Spectre comes along and, oh dear.

Malware protection algorithms make fools of us all.


I would not consider a fast AND accurate branch predictor a trivial matter at all


There's nothing about the bug that had anything to do with malware protection, or branch prediction, so I'm not sure how that statement applies to the conversation.

The bug was in ETW, which just happened to surface in a windows utility that ostensibly protects you from malware.


Also worth noting that the "expensive platform feature" you refer to in this specific case means "writing to a file". Something as basic as this should be assumed to be fast on modern operating systems.


No, it had nothing to do with Firefox writing files. Firefox was making a bunch of calls to VirtualProtect. Windows Defender (MsMpEng.exe) was then writing to a file (an sqlite database) every time one of these calls was made, which was slowing down the system.

This comment is a good summary of what the issue was once they understood the problem: https://bugzilla.mozilla.org/show_bug.cgi?id=1441918#c82


Where did you get that idea? Sqlite? Windows Defender isn't using sqlite at all.


It detects the use of SQLite, then copies it, etc etc. Read the bug for more details.


I can assure you that I perfectly understand the bug and corresponding patch/fix. The patch fixes Event Tracing for Windows (ETW) and how it allocates temp buffers.

The speculation about SQLite at the top of that Mozilla bug report is mostly irrelevant.


It is not a bug that there are overlooked optimizations in some platform features. Windows has a ton of slow features. Starting a process, for example, takes forever. It is the responsibility of application authors to write their performance-sensitive critical path in such a way as to avoid bogus platform behaviors. This goes for Linux, which has more than its fair share of brain damage, as well as Windows.


I generally agree with you. Having worked on lots of cross platform software, a big part of that job is to work around quirks of the underlying platforms, which can be significant. However in this case, it's not that Firefox was introducing the usage of these APIs and was then starting to have performance problems. They used the APIs without problems when suddenly Defender came along and slowed them down by orders of magnitude when they had been working fine for years.


Yeah, your program definitely should not do that many useless writes on the system it runs on; it's just bad behaviour. If every program did the same, the disk would grind to a halt, SSD or not.


Recent discussion of this here also cited a problem (not sure if it was the same problem) with Defender causing 100x performance drop with some PowerShell operations.


My only interaction with Windows Defender is the (undefeatable) nag popup every boot that warns me it is disabled.


If you use Windows Pro and Enterprise, you can use GPO to disable Defender. Just run gpedit.msc and edit a few of the policies to disable real-time protection etc.

Under Computer Configuration > Administrative Templates > Windows Components > Microsoft Defender Antivirus

  - Turn off Microsoft Defender Antivirus -> set to Enabled
Under Computer Configuration > Administrative Templates > Windows Components > Microsoft Defender Antivirus > Real-Time Protection

  - Turn on behavior monitoring -> set to Disabled
  - Monitor file and program activity on your computer -> set to Disabled
  - Turn on process scanning whenever real-time protection is enabled -> set to Disabled
Restart the computer and Real-time protection should be disabled permanently (until you reverse the same settings through gpedit.msc at least).


With 11 (or possibly newer versions of 10, haven't tried lately) this doesn't seem to actually disable MsMpEng.exe from loading anymore. Using something like https://github.com/jbara2002/windows-defender-remover seems to work though.


You can also elevate to Trusted Installer or System and completely remove this garbage from your computer.

Alternatively, if you run windows server as your workstation OS, you can perform an uninstall using Remove-WindowsFeature from powershell.

The old gpedit tricks don't really work anymore in my experience.


My car also nags me every time I unbuckle my seatbelt to park, yet that doesn't mean everyone should have it unbuckled all the time. There's a reason it's designed to be naggy.

Having everyone easily disable Windows Defender will not lead to a great outcome.

There's a reason malware on Windows has been on a steep decline from the Windows XP days and I'd prefer it to keep it that way.


Not all use cases for a car are the same. Some vehicles are kept entirely on private property and are used as work vehicles, where the seat belt chime would be unnecessary and distracting. Which is why most manufacturers provide a sneaky mechanism to disable it. I own the vehicle, why wouldn't they let me disable the nag?

Their solution? Make it intentionally complicated, but still possible:

Step 1: Turn your headlight switch off

Step 2: Unbuckle your seatbelt and turn the key to the off position

Step 3: Turn your key to the on position till the seatbelt warning light turns off

Step 4: Buckle and unbuckle the seatbelt three times and end on the unbuckled position

Step 5: Turn your headlight switch on for three seconds and then turn it off

Step 6: Repeat step number 3

Step 7: Wait for the seat belt warning light to turn on and off again, then buckle and unbuckle the seat belt


I remember doing this sort of song and dance with my RAM and Jeep. Sometimes I am just moving around a parking lot for a brief moment, or especially when off roading (read: stuck) and don't want the constant beeping.

Seat belts are 100% an immediate habit for me. Driving at any rate of speed without one makes me feel super sketchy and uncomfortable, so the nag is not needed at all.

On my Fords I would use FORScan to defeat it via the OBD2 port.

I do have a security gateway bypass module for my truck though so hopefully I will be able to start playing around with AlfaOBD soon.


Sounds like you are arguing that seatbelts do not increase the safety of their users when used on private property.

I know it’s not your main point. But anyways.. it does not increase the rhetorical power of your comment.


TBH the main reason I commented this was to get some kind of validation from the community (positive or negative). Sounds like I need to turn it back on :)

I really only use this machine for MWII, Halo and Titanfall. It's a glorified Xbox. I even contemplated putting it on a standalone VLAN to 100% physically isolate it from my core net.


Haha, you should enable it with exclusions. It's the best AV out there that isn't an EDR. I disable it in labs but I can't imagine running windows in prod with defender enabled. Don't use windows like it's Linux.


Defender, under certain licenses, is an EDR - https://learn.microsoft.com/en-us/microsoft-365/security/def...


Well they call a LOT of things Defender now anything from email and azure specific alerting to EDR and DLP. It's all "Windows Defender ______". But I meant the consumer license AV.

But also even with the most basic win10, cloud submission (if privacy is no biggie) gets you EDR detections to a point but without the edr console and logs.

When I simulate attacks with Defender on, I spend a lot of time bypassing it, but as soon as I break opsec (e.g. run whoami.exe), if cloud submission is on, I basically burn that technique because the EDR in their cloud blacklists it. With that off, I can last as long as I want, so long as I don't execute things flagged as malware by Defender on the host (and even then, usually that thing gets blocked, not my original technique, which I can still reuse).


What is an EDR?



It’s humbling to be in the presence of such greatness.


When I've heard people speak of changing Web browsers in recent years, I think the two most common reasons given are performance and privacy.

I wonder whether this situation with Microsoft Defender cost Firefox some market share.


I can count at least one user that Firefox lost to this bug. Pretty happy with Brave now, won't even bother trying FF again.


Conspiracy theory — could this have been done on purpose for browser share dominance purposes?


Sometimes, but probably not in this context.

a) That'd be a very untargeted way to get that effect; Firefox isn't the only app that's going to be making calls like that.

b) Mozilla doesn't need any help losing marketshare in this era.


Nope. MDE is shitty. It pushes updates to millions/billions of machines without sufficient testing and monitoring.

There was a MDE bug that caused Chrome to crash whenever dragging tabs.

Another time, MDE also appeared to "remove" all of some users' apps when it deleted shortcuts.


I would bet it is more likely that MS devs noticed but just didn't care. The farthest it would have gotten in conversation with QA triage would have been "does this issue affect any of our services? Ok then that is Mozilla's problem."


MS?


How fast would this have been fixed if was Microsoft Edge that was wasting CPU time?


Depends on how fast google patched it.


Interesting comment on Reddit from a Mozilla engineer: https://www.reddit.com/r/firefox/comments/12hxqjl/comment/jf...

Careful to talk about how this is entirely a fix for Windows and will improve the experience of folks using other software, not just Firefox.


What apps other than Firefox might this have affected that badly (75% CPU usage)?


It's not clear to me if it's the same bug, but recent conversation here about this issue had this to say [1]:

> It also has a bug(?) which makes method calls 100x slower in PowerShell 7: https://github.com/PowerShell/PowerShell/issues/19431

[1] https://news.ycombinator.com/item?id=35459984


I had an issue in early builds of W11 with use of WSL 2 & Node, GitHub and VS Code. Something in the git change detection process caused Defender to decide it wanted 100% of a single thread on the 5600X system I was using. While coding it would just have a core screaming at well over 4GHz. Just all of Mankind's greatest innovations that lead to 7nm lithography and incredible processor design, just to be a space heater. I never did get it figured out at the time. It also re-enables itself. So that's cool.


Defender (or other AV) can slow down a lot of things, but in terms of the exact way that Firefox ran into it, the other apps would be anything with a JIT. Well, a JIT that uses memory protection as a security measure, though that's very common. (After generating executable code, the JIT marks the pages as executable but non-writable, so an attacker can't change the code after it starts running.)
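The JIT flow described above (emit code into writable memory, then flip the page to executable-but-not-writable) can be sketched on a POSIX x86-64 host, with `mprotect` standing in for the `VirtualProtect` call a Windows JIT would make; the six bytes of machine code are just `mov eax, 42; ret`, and the x86-64 assumption is mine, not from the comment:

```python
import ctypes
import ctypes.util
import mmap

# x86-64 machine code for `mov eax, 42; ret` (assumes an x86-64 POSIX host).
code = bytes([0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3])

# Start with a readable+writable page, as a JIT would while emitting code.
page = mmap.mmap(-1, mmap.PAGESIZE, prot=mmap.PROT_READ | mmap.PROT_WRITE)
page.write(code)

libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
libc.mprotect.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int]
addr = ctypes.addressof(ctypes.c_char.from_buffer(page))

# The W^X flip: executable but no longer writable, so the generated code
# can't be modified after it starts running. On Windows, this is the step
# that goes through VirtualProtect (and that Defender was intercepting).
assert libc.mprotect(addr, mmap.PAGESIZE,
                     mmap.PROT_READ | mmap.PROT_EXEC) == 0

jitted = ctypes.CFUNCTYPE(ctypes.c_int)(addr)
answer = jitted()  # 42
```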

Although the V8 JIT stopped using this, at least in some configurations (?), for the stated reason that it's not perfect—another thread could sneak in and modify the executable code in between when it was generated and when it is protected in preparation for execution. They're instead planning to rely on memory protection keys, which should be faster and more robust, but are only available on some hardware.

JITs can show up in unexpected places. Regular expression engines will sometimes have a JIT.

So... I don't know?


I would think anything with a JIT that is toggling the page protection for machine code many times a second, based on a very quick reading of the bug report talking about VirtualProtect calls and the processing of ETW events for them by defender.


I don't think anything is toggling them back and forth, it's just that a lot of chunks of executable code are being produced. But I could be wrong; maybe if you have space left for more code on a page, you'll toggle it off and append some new code, then toggle it on again.

My guess is that this would mostly come from inline caches (ICs), since they're typically small and a lot of them are generated.


I'm hoping that this fixes other apps, because Defender active scanning is a huge and near constant strain on my CPU.


Newer versions of Thunderbird have been rendered completely unusable unless I exclude %userprofile%\AppData\Local\Thunderbird from real-time scans.


Thunderbird is atrociously slow even without an AV with any mailbox that isn't tiny. Could it be that yours has just grown over the years and Defender amplifies it?


It went from ~20 seconds of freezing on every server request to no freezing at all after adding the exception. That's quite the amplification.


All of them? From IDEs through games to email clients. Remove that malware as soon as you can. Either replace it with some more competent antivirus (not sure there are any) or don't use any antivirus at all - as a visitor of this site you should generally know what you're doing and what is and what isn't safe. I use https://github.com/jbara2002/windows-defender-remover and have been running my Windows machines without any antivirus and without any issue for years (if you ask how do I know Defender sucks if I don't run it - I do run it at work where I can't remove it - only disable it temporarily and it turns itself on again after a while).


Eclipse and IDEA both have tickets dedicated to Defender's shenanigans: https://github.com/microsoft/java-wdb/issues/9


Firefox-related CPU use is only reduced by 75% when this bug is triggered, NOT in the general case as this title implies


That's actually fairly clear in the title - the second clause depends upon the first.


Then why are comments assuming a large decrease in power consumption?


Because the bug is a frequent occurrence and the increased CPU usage is frequently noticeable?


Except it's not lol


Because a lot of commenters don't read the articles


Exactly


Looks like there's more work left to do to catch up to Chrome: https://bugzilla.mozilla.org/show_bug.cgi?id=1823634

That bug is more subtle. Apparently the various ways to use VirtualAlloc are not self-evident, and some variations have wildly different performance characteristics due to undocumented interactions with Event Tracing for Windows (ETW) events that get sent to anti-virus products.

So it's not only the original problem of the events being handled inefficiently, it's also that the way they're generated is a bit of a black box and hard to predict without detailed performance tracing work.


I have screamed about this like a crazy person and filed bugs and was always told, "Meh there's nothing there..."

But if you use Firefox to call yourself on Chrome... you'll see that Firefox takes up a TON more energy on an Intel MBP than Chrome does.

You can tell because Firefox literally heats your laptop up to do streaming videos. You hear the fans kick on, the laptop gets hotter to hold.

Anyway I'm sure there are more bugs like this! Glad Firefox is getting some of the people to fix their code... but look, Microsoft isn't the only culprit. Until Firefox takes as little power as Chrome in MacOS & Windows... I think we should all stay outraged! (=


I wonder if this is why Firefox often gets killed when I have other high-resource apps open?


Good thing it's a bug though, not a monopolistic attempt to sabotage the competition running on your platform by doing strange things with API rodeo. This surely ruined the performance of other software too.


Guess that's why I never found Firefox laggy while others said it is. The first thing I do after installing Windows is always to install some other antivirus to disable Defender, because Defender starts routine scans at weird times and lags games randomly, which is really annoying.

I really have no clue why engineers at MS think such behavior is OK. Shouldn't scans like these be scheduled for time slots when people are not actively using their computers?


I haven't run into this issue either. The first thing I did when I bought my windows laptop, before ever booting it, was plug in a debian installer disk and then boot to that, erase the entire drive and install debian. More windows users should try this little trick!


I would like anyone that considers Microsoft to be a recent champion of Open Source to reflect on corporate doublespeak. It's plausible that this bug was engineered as an attack on Firefox.


Have you any semblance of proof of this?

By the looks of it, it took Firefox a few years to figure out what the repro was; they reported it to MS, it was (very) promptly fixed, and they were warned that the syscall they were using isn't being used as intended and they should consider changes to FF for future use cases.


It's the AV that was calling TdhFormatProperty(), not FF. The problem was mostly on the AV side, not FF's. FF itself was generating many events due to too many VirtualProtect() calls, which in itself was only the smaller part of the problem.


I don't have proof. I'm presenting a theory based on circumstantial evidence. I think it says just as much to reject a theory without proof as it does to present a theory without proof. Let me break down the context in which I put forward my theory.

* Corporate doublespeak is a well documented tactic in which a business will project a message when the truth is the opposite of the message. Sometimes they use euphemisms, ambiguity, or omissions. I am stating that we cannot take Microsoft's press releases about being Open Source friendly at face value.

* Five years ago Edge was rebuilt with a chromium backend and Microsoft had a large campaign to increase adoption of Edge.

* Reduced Firefox performance would make Edge compare more favorably. This error was clearly in Microsoft's favor.

* It is common for companies that own a platform to create advantages for their applications running on the platform.

* Microsoft has a long history in the browser wars, highlighted by an antitrust lawsuit in the late 90s. Their anticompetitive behavior regarding browsers was a key part of the lawsuit.


> I think it says just as much to reject a theory without proof as it does to present a theory without proof.

Except you don't have any proof, and the proof that opposes your wild speculation is:

- 5 years ago, a bug was opened on the FF issue tracker that over the years had a bunch of derailments.

- A month ago, someone _actually_ investigated, found an issue and reported it to Microsoft.

- Issue was promptly fixed

You're spreading FUD, and attacking _me_ for asking for proof when you made the accusation in the first place. If you want to discuss this reasonably, I'm happy to, but in order to do that, the slightest modicum of evidence that points to any sort of effort is required. Meanwhile, to someone who has worked on software for a reasonably long time, seeing issues spin like this for years is par for the course, and it's great that someone fixed it.


>> It's plausible that this bug was engineered as an attack on Firefox.

> Have you any semblance of proof of this?

Does it need proof? Someone can make a statement like this solely based upon past behavior. They're merely stating that it is plausible.


> Does it need proof?

Yes, it does. Stringing together a narrative and ignoring the proof that contradicts your claims is conspiracy theory territory.

> Someone can make a statement like this solely based upon past behavior.

Yet all the evidence (the bug linked) implies nothing of the sort.

> They're merely stating that it is plausible.

I can state that it's _plausible_ that the US government conspired with Disney to fake the Moon landing as a show of power to the USSR in the 60's, but I'd expect to be asked for proof, and I'd expect to be accused of spreading absolute nonsense.


I've lived through the browser wars and I can tell you that this would not surprise me one bit.


There's a difference between something not surprising you and a wild, totally baseless accusation. I'll happily eat my words if there is a shred of proof, but right now it's "company fixes old bug when it was reported to them".


What a weird take. If this bug was engineered as an attack on Firefox, then it seems like the project has been infiltrated by bad actors, because the bug comes from Firefox's codebase. Indeed, the developers themselves contradict your comment in the linked bug conversation:

> This problem has two sides: Microsoft was doing a lot of useless computations upon each event; and we are generating a lot of events. The combination is explosive. Now that Microsoft has done their part of the job (comment 82), we need to reduce our dependency to VirtualProtect.

(https://bugzilla.mozilla.org/show_bug.cgi?id=1441918#c90)

Compare how many calls other browsers make (this is also quoted in the link): Firefox was generating up to 46 times more (costly) events than Chrome. It is a bit ludicrous to shame Microsoft for the whole situation.

> Firefox with normal configuration: ~14000 events, 98% of which are PROTECTVM_LOCAL;

> Firefox with the preferences from comment 83: ~6500 events, 95% of which are PROTECTVM_LOCAL;

> Edge: ~2000 events, 91% of which are ALLOCVM_LOCAL;

> Chrome: ~300 events.
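For what it's worth, the "up to 46 times" figure follows directly from the counts quoted above:

```python
# Event counts as quoted from the Bugzilla thread.
firefox_default = 14000  # Firefox, normal configuration
firefox_tuned = 6500     # Firefox with the preferences from comment 83
chrome = 300             # Chrome

print(firefox_default / chrome)  # ~46.7, the "up to 46x" figure
print(firefox_tuned / chrome)    # still ~21.7x even after tuning
```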


It is amusing that anyone thinks a company with > 200K employees and probably 10K products is organized enough for something like this.


Inaction is a pretty low "bandwidth" form of action, and can sometimes produce the results you're looking for just as well, if not more effectively.

Microsoft has a storied history of anti-competitive views leaking to public eyes/ears, something like this is quite literally a matter of not organizing anyone.


Why would Microsoft attack Firefox specifically and not Chrome? Chrome is the bigger threat to their business. Firefox has become almost too small to care about - little revenue, small browser market share.


There's an argument that Microsoft Edge's use of Chromium, and then the Surface Duo, would cause "don't bite the hand that feeds you" problems. Not agreeing with OP, but it would make sense.


Do we have to assume negative intent every time something like this happens?


Well no, but I also would question the inverse. Holding accountable companies that gain from possibly bad actions and asking the questions is helpful.

See: Microsoft's antitrust case over their preference for IE and forced bundling. While Microsoft "won" the case, the outcomes were exactly what the case feared; a "convenient" political climate helped them avoid traveling back to court, of course. Microsoft took extreme steps to avoid being broken up in the 1990s, however, and it's arguable that one of their political mitigation methods, investing in Apple, actually had worse effects on them. (Prior to the iPhone in 2007, it was assumed that RIM and Microsoft would be the big two players in the smartphone space; instead, Apple and Google have basically become the big two players in computing mindshare.)

https://en.wikipedia.org/wiki/United_States_v._Microsoft_Cor....


We should at least be aware of it as an option. Many call this "healthy skepticism". It becomes unhealthy when you veer into blind optimism/pessimism/cynicism.


Very interesting point. They might have had intentions of pushing everyone to use Edge, and it is not surprising, after their many consistent nags and misleading messages, that they'd present it as the "better" browser compared to anything else.


This seems incredibly unlikely and overly cynical just for the sake of being cynical.


If Microsoft were so good at software engineering that they could pull off such an attack on Firefox, then maybe they do deserve to have a monopoly. /s


Devil's advocate: why then did they fix it?


Because it became public knowledge that it was happening?


Slow walk... Or, by comparison: have you contacted your local city government to fix obvious holes in the road recently? Around here, a two-year wait to fix them is common.


The WWEification of every discourse is the worst thing about $current-year


Never attribute to malice…


Nowadays a lot of malicious acts are intentionally disguised as stupidity and incompetence. Not necessarily in this case, but that quote really is showing its age.


Incompetence and malice are one and the same.


This is a relic of Bill's tenure. Satya is different, in a good way.


Someone should create a website listing bugs that haven't been solved for years. A hall of shame. I myself could add a couple from Oracle and Hibernate.



Woof, that's a long time for a bug like that to sit around without Mozilla coming up with a workaround for it.


Yet another reason why I don't touch Windows for any professional/sensitive workflows.

I only keep a license around for the occasional gaming session. I disable all of the Windows features (i.e., firewall, auto-updates, antivirus) and telemetry, strip the OS to the bare minimum, and manage the GPU and mobile drivers manually. It's limited to games only.


When you say reduced by 75%, would that mean, say, going from 40% to 10%, or from 75% to 0%?


It means the former.

If you reduced something to zero, you reduced it by 100%.
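As a quick sanity check of the arithmetic:

```python
def reduce_by(value, pct):
    """Return value after reducing it by pct percent."""
    return value * (1 - pct / 100)

print(reduce_by(40, 75))   # 40% CPU reduced by 75% -> 10.0
print(reduce_by(40, 100))  # a 100% reduction -> 0.0
```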


I feel like this could/should be a metaphor for airport security...


Most likely their devs used GPT-4 and finally "fixed" it.


It’s hard to say that anti-virus isn’t worse than the virus.


Someone diverged the thread into Linux vs. Mac. The point is: how did the evil Microsoft monopoly get away with not fixing this bug for so long?


Wow, Microsoft should at least say sorry to Mozilla and somehow repay them for this!


"bug"


maybe AI helped them out.


[flagged]


You'd think they'd target Chrome (>60% market share on Desktop) rather than Firefox with < 8% market share.


The new Edge browser is basically a revamped Chromium, so that'd be a pretty dumb move.


Seems less dumb than targeting Firefox, though. Presumably, in the universe of this conspiracy hypothesis, they would do it in a way that wouldn't affect Edge.


Then they would lose any semblance of plausible deniability, which would expose them to being positively identified as bad actors. What it looks like now is mere incompetence in the face of enormous complexity, which means they lose a lot less face compared to doing what you suggest. Put bluntly, they're hiding within the space covered by Hanlon's razor.


if processName != Edge {}


“Bug”


DOS ain’t done till Lotus won’t run.



