I don't like encouraging mucking around in about:config, but I might as well mention this one. If you don't mind your window losing vibrancy and rounded corners, you should be able to significantly improve battery life on macOS by setting "gfx.compositor.glcontext.opaque" to true. This makes WindowServer stop drawing whatever is behind the Firefox window.
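(If you'd rather script this than click through about:config, the equivalent user.js line would be the following — user.js pref syntax, not standalone JavaScript:)

```js
// user.js: stop compositing the Firefox window with transparency on macOS
user_pref("gfx.compositor.glcontext.opaque", true);
```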
I have a new 2018 MBP with an i9. I've noticed I can squeeze upwards of six hours battery out of it using Safari. With Firefox I'm lucky to get three.
I made this change just to see. So far it does appear to have lowered the energy impact score a bit, but I'm not sure it's enough to matter yet. I guess we'll know in a couple hours.
Edit: 20 minutes in and I've watched my Time Remaining estimate creep up from 3 hours to 5.5 hours. Seems like a solid win so far (other than it looks like crap).
Yes, unfortunately true. I'm a Firefox user, as loyal as they come, but put me on a long flight without the ability to charge, and you bet I'll be using Safari the whole time. The battery life is amazing.
Sure, those will kill battery usage, but that's not what I'm doing. Typically I'm bouncing between the browser with about 10 informational tabs open, a terminal, and Sublime Text all day long. There's a night and day difference using Safari vs Firefox for just regular browsing.
I'm not the only one who experiences this either, I have a coworker who complains about the same issue with his couple year old MBP. Firefox is a battery vampire.
It'd be interesting to monitor Firefox vs Safari using e.g. powertop. Using Linux, I get excellent battery life using Firefox. But it might be the case that safari is still much better, or that it behaves differently on macOS.
2018 MBP with i9 here as well. I am getting 10-11 hours of battery life with Chrome. What exactly are you doing on your machine other than browsing? It isn't the browser causing such low battery life. Even when working in Xcode and IntelliJ I still easily get 7-8 hours.
Sorry to hijack this thread, but I noticed in your presentation (can't find it now) you could test out pathfinder with --features=pathfinder on Firefox Nightly. Any way to tell if this really enables pathfinder? I ask because I can't tell a difference on Windows or Mac (which may mean it's working as intended).
>I don't like encouraging mucking around in about:config, but I might as well mention this one.
One issue with fudging around in about:config could be increased fingerprintability. More battery life is cool, but will it come at the expense of an easily fingerprinted browser profile?
1) He's speaking in general terms. Some settings could quite possibly leak into fingerprinting, even if that particular setting doesn't.
2) Or does it? I could envision a timing attack in JavaScript that detects rendering speed accurately enough to determine whether the window is being composited with transparency.
There deliberately aren't many tricks like this (I don't know of any others, in fact), because having to choose between aesthetics and performance is a terrible tradeoff.
This is a blunt tool. Some sites use context menu events in a benign way. A better alternative is to simply bypass context handlers with shift-rightclick when necessary.
> webgl.force-enabled
This isn't really a performance improvement. If WebGL is disabled, it's because of troublesome drivers. Forcing it on can crash things.
Hey I didn't know about shift-rightclick. For me, disabling dom.event.contextmenu.enabled gives me both menus at once, then I tap Alt to get rid of the browser's menu if I want to use the app's menu underneath it. But I won't have to do that anymore.
They will start loading every resource on a page on every visit to every page, as no resources will be cached. One hour of browsing will use a month's data quota and the glorious no-caching detail of every pageview will be closely observed by the server-side logging metrics that are so desperate these days to extract targetable marketing data.
webgl.force-enabled = true
Browsers disable WebGL in some scenarios to protect users from hardware, software, and/or driver bugs that cause crashes when WebGL is enabled. This setting overrides that protection, which increases the risk of GPU, browser, and system crashes. Additionally, every crash is triaged as either "exploitable" or "not exploitable", so bypassing crash-mitigation safeguards increases the risk of an exploitable crash being used against your browser.
network.dns.disableIPv6 = true
Over the next ten years, the user will see that more and more of the web breaks down and mysteriously fails in their browser. Sites only load partially, videoconferencing never works properly, video streaming is jerky and slow. Providers shipped IPv6 to their customer endpoints years ago. Disabling it has real downsides and no upside, whether for "privacy settings" or anything else.
Not to mention the infamous `privacy.resistFingerprinting` option, which leads to a lot of random breakage and confused users. It's not ready for non-Tor Browser users yet, or else it would be default!
Presuming intimate knowledge of second- and third-order consequences from seemingly-innocuous preference changes is guaranteed to be a losing bet, even among tech enthusiasts and experts.
I missed the high risk resist-fingerprinting setting and had no idea it would cause so many problems. Those problems certainly are not documented in the gist and I would have fallen prey to them if I had applied it unaware.
Advanced knowledge is not the issue. Misrepresented knowledge is the issue. A document about “privacy settings” contains non-privacy settings and does not contain any mention of the lasting harmful side effects awaiting anyone who uses any of the settings within it.
EDIT: This is how to approach changing one of these settings with the respect and care due to such a suggestion:
So that's you, but don't say that they should not share their knowledge because it does not follow your standards.
I'm more than happy to see someone's experience with this, what worked for them, and compare it with mine. Better to have something than nothing; not everyone has time to write a blog post every time they change one setting.
You change it. Have issues. Realize it does not work for you. Change it back.
If you're not sure or don't have time to deal with it, don't do it. Not difficult, is it?
And as said previously, the responsibility is on the person making the change to their browser; no one is forcing them.
Sharing knowledge respectfully and without overstatement or misrepresentation is a bar I’m not willing to lower. We’ll have to settle for disagreeing on this point.
> Disable Google Safe Browsing and malware and phishing protection. Stop sending links and downloading lists from google.
To be clearer here: lists of partial hashes are downloaded, and entire links are only sent after a partial match. Still worth disabling for privacy reasons if you care more about that than safe browsing protection, but worth clarifying how it works lest one think all links are sent.
> and entire links are only sent after partial match.
Only part of a hash of the URL is sent, to get an update for all the URLs in the partial-match block. The actual URL is never sent.
The exception is when download malware protection is on. In that case, when downloading a file the actual URL is sent. That's a regular preference, though ("Block dangerous downloads"), doesn't need about:config changes, and is well described in the help docs.
Yes, my mistake, I should correct that to "only checked". It still makes an internet call to a third party (as does the original hash list download), which I think the OP wants to avoid with these settings, but yes, the URL is not sent to the list maintainer.
Still, it is unsettling that the only available browser that somewhat respects privacy sends anything to a third party, and completely unacceptable that it sends it to Google.
How do we, and more importantly how do adventurous end users, distinguish useful from useless from harmful configuration changes? Why should I trust this person? Why should I think it's worth my time?
I find that these accumulated lists of security ideas are usually dangerous - in any domain, not just Firefox or browsers. They collect information and misinformation and add a veneer of legitimacy. Some settings don't behave as you think; some aren't documented; some are poorly implemented or tested by the vendor, who didn't design them for end users, and their malfunctions can create more security holes. Some are implemented for reasons you don't understand ('before you tear down a fence, know the reason it was put there'). Some are dependencies for other settings, features, and subsystems.
It would have to be work by an expert in browsers or Firefox or security, such as a Mozilla engineer or someone like Giorgio Maone, for me to trust a list like this one.
Colouring is not very universal, and so not generally a good choice. Browsers have taken different approaches to this, two popular ones are:
* Identify TLD registry operators who have a sane approach that prohibits or otherwise is effective for controlling homographs, whitelist their TLDs, default to showing punycode (the A-labels used by the DNS system which are always just ASCII). This has the effect that if your name looks "wrong" that's a problem to take up with your TLD registry. Note that com doesn't have such policies at all, it's a vast sleazy market and it remains interesting to me that huge global brands would rather be in that market, trying to shout over the crowd, than leave it to rot.
* Identify cases like you've described with "confusing" mixtures of scripts and display those as punycode.
Both have problems. The former requires that you effectively police TLD registry operators. Find out what their policies are, check they actually implement those policies effectively, and take action if this changes. The latter requires you figure out how all the world's language communities use different scripts, and how that interacts with Unicode, in order to avoid penalising combinations lots of people want, while still detecting attacks.
> when you go to раураӏ.com that last character shows up in red.
All of them would show up in red; all of them are Cyrillic. If all but one character were Cyrillic, Firefox would detect this and render the URL in punycode. As implemented now, Firefox renders the URL as shown because all characters are from the same script (~character set). Chrome is more suspicious and renders it in punycode (https://xn--80aa0cbo65f.com/), though it would presumably render the confusing version if my locale were Russian.
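For the curious, you can reproduce the A-label conversion with Python's built-in punycode codec. This is a sketch of just the encoding step; a browser additionally applies IDNA normalization that this skips:

```python
# Round-trip an all-Cyrillic label through the punycode codec.
label = "раураӏ"  # looks like "paypal", but every character is Cyrillic
ascii_form = label.encode("punycode").decode("ascii")
a_label = "xn--" + ascii_form  # the ASCII form a browser shows in punycode mode

# Decoding the ASCII form recovers the original Unicode label.
roundtrip = ascii_form.encode("ascii").decode("punycode")
assert roundtrip == label
```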
It's supposed to display characters outside your locale in punycode.
Maybe whitelisting specific characters instead of only locale would be best.
Your color idea is neat, but look at it from the perspective of people who actually use IDN domains and don't speak English. What if раураӏ.com was раураӏ.fans-ùøê.com, where ùøê was part of the same charset as ӏ and the user reads both locales? Even if you color it, it would be difficult to train users to pick up on that.
Some of these are strange choices and conflict. For example, disabling caching and then changing the cache size. Or disabling caching at all since it will have a significant negative performance impact. If you're concerned about what caching leaves behind use memory caching only.
This list is somewhere between worthless and dangerous.
Chesterton's Fence: Presumably Mozilla has already optimized the privacy and performance of Firefox as much as they've felt comfortable doing. If they could change each of those settings as recommended without tradeoffs to help the user, they would have done so.
Without listing the tradeoffs for each one, this list cannot be relied upon.
Dangerous is assuming Mozilla optimized only for privacy and performance. Why would you assume that and discourage people trying to optimize for that? This list wouldn't exist if Mozilla offered an equivalent, optimizing for only those two metrics, and explaining the tradeoffs you're asking for.
What you'll find, as has been the case for Mozilla in some recent decisions, is the tradeoff includes (but is not limited to) profitability and ease-of-use, two things you left off your list of what you think Mozilla optimizes for. That blind trust and invalid set of optimization metrics shouldn't be perpetuated. Granted blind trust in this list is unwise too.
You say "still", but this will be the first time in a few years that that will probably be true. The details of the Google deal aren't public, but I doubt it forced them to spend the past few months secretly building Google tracking into the browser.
I agree. I also try to keep it simple to avoid too much maintenance hassle. Sadly about:config is too imperative, so you need to explicitly undo changes. I wish it was like regular dotfiles.
Along with uBlock Origin and HTTPS Everywhere, that covers a lot without breaking much of anything.
A quite famous user.js hardening project is https://github.com/pyllyukko/user.js. But I've found that keeping such a complex configuration functional requires a lot of constant effort.
This list is dangerous for the average user, because it disables safe browsing. 95% of people are going to suffer more from being phished than from being tracked.
When I said they'd optimized it "as much as they've felt comfortable doing" I was referring to things like profitability and ease of use as limiting factors.
If you understand that changing one of the defaults only affects ease-of-use, and you're a power user, more power to you! If you want to turn off analytics, cool!
But it gets trickier when it gets to the safebrowsing feature. The gist says to turn off all of those components, rather than just the ones that send hashed or partial file and address metadata to Google. That's not helping anybody.
(To be fair, I'd actually reword my original comment to change "worthless or dangerous" to something milder at this point, but HN won't let me do it. Oh well.)
It helps the people that optimize for privacy above their own security in this case. From my understanding of the algorithm, you can't really leverage safe browsing properly without being willing to check the full hash via an internet call at some point. So might as well have it off or on. An extremely paranoid user might not want a safe-browsing download to occur at all, informing Google when their computer is on. These are paranoid stances, but it is a list of settings for that. I use a Chromium-based browser that doesn't use safe browsing lists and I accept the risks.
Am I supposed to intuit from that discussion that they have different incentives and/or goals, or that they haven't found time to get it done? Because even though I think it is somewhat poor wording, I think it's the latter.
Also, I think the proposed "upload" is poor wording, as many people seeing it will be confused, because buttons that say that usually then ask what to upload (e.g. "hmm, maybe it's asking for a local image to upload to the cloud for storage?"). You actually are saving in both cases: in one case to the cloud, in the other case to the disk. Thus, "Save to cloud" and "Save to disk" (or "Save locally") are much better options in my mind.
> It's been a year and it's a relatively small UI change to at very least make the local save button the primary colored one.
That doesn't really make a lot of sense though - the experiment was specifically launched with the goal of trying to solve the problem of sharing screenshots with others; a problem they identified as common through user research. So while I think there's a lot to say for changing the wording, the primary action will always be sharing it.
Correct, and that's why I don't like encouraging messing with about:config. The few settings that can have any positive impact invariably come with drawbacks, which is why they're off by default.
What Mozilla did in the default Firefox configuration, is made the trade-offs on behalf of an average browser user. It's neither worthless, nor dangerous for a savvy browser user to make the trade-offs for himself. That's exactly what software configurations are for.
That might be a good argument if the link was to an article with a large amount of discussion about the pros and cons of each configuration change, instead of just a list of changes with a brief overview of what they do.
As the one who posted this link here, I found the list to be a great starting point for my research. Instead of switching to something like Waterfox or Brave, I can now remain with a major web browser by just enabling the features I need.
Yeah right, because me being forbidden to right click on the site is really in my best interest. I might, you know, inspect DOM and block annoying permanent menus or ads that ublock origin doesn't filter out. And being able to do copy&paste of the content should send me straight to prison.
Well, that is a good example actually. Of course you don't want to lose the ability to right-click, but there are also various web apps that override the right-click menu for legitimate reasons. Two examples that come to mind are Jupyter Lab and Outlook 365, both of which use a custom context menu. Disabling the context menu setting presumably breaks this functionality.
My idiot bank disables right-clicking, and I'd be all over the option to disable right-click hijacking, but yup I also use a web app that overloads right-clicks in a sane and appropriate manner.
Chesterton's Fence is about a technology somewhere between thousands and tens of thousands of years old. Fences are extremely well understood.
All of software has been around less than a century; the concept of personal computers running software that an end-user might adjust settings on is about sixty years old; thinking about Internet security is roughly thirty years old; the modern web ecosystem is no older than JavaScript, 22 years old.
You're still right that blindly changing settings is not a great idea, but the Chesterton's Fence analogy is misplaced.
Notably, the Safe Browsing Update API never sends the URL itself to the server: it does a local lookup with hash prefixes of the URL, and only if that finds some threat does it then send the hash prefix to Google (which gives some degree of deniability, given they never know what actual URL you've visited, and only rarely get anything).
Says Mozilla, and the source code is available for anyone to double-check. Google doesn't have control over what data Firefox sends to the safe browsing API.
For instance, disabling geolocation. You need to explicitly approve geolocation requests anyway. If you're sure you'll never ever want to share your location with a site, sure, disable it globally. But recommending this to others as a way to protect their privacy is a bit silly.
Because those features can involve uploading your browsing history to 3rd parties... does Microsoft check locally against malicious site lists, or does it upload every address you visit in order to tell you it's safe?
I'm unaware of any anti-malware/phishing tool that uploads every address, because it'd be far too slow (you want to know if it's malicious before you show the user anything, to avoid them having the chance to interact with it). They all have some local partial list that is used for a first match, and only if that matches do they interact with any online service.
MS doesn't have a publicly documented API, which makes it harder to know what's being sent (short of reverse-engineering it), but Google's never uploads actual addresses (it almost always uploads only the uppermost 32 bits of the SHA-256 hash of the address, and it never uploads the full hash).
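To make the Update API flow concrete, here's a toy Python sketch of the local prefix check. The URLs are hypothetical, and the real protocol first canonicalizes each URL into several host/path expressions, which this skips:

```python
import hashlib

def hash_prefix(url_expression: str, n_bytes: int = 4) -> bytes:
    """First 32 bits of the SHA-256 of a (canonicalized) URL expression."""
    return hashlib.sha256(url_expression.encode("utf-8")).digest()[:n_bytes]

def locally_flagged(url_expression: str, local_prefixes: set) -> bool:
    # Only on a local prefix hit does the client then ask the server for
    # the full hashes under that prefix; the URL itself is never transmitted.
    return hash_prefix(url_expression) in local_prefixes

# Pretend the locally downloaded list contains one bad prefix:
prefixes = {hash_prefix("evil.example/phish.html")}
assert locally_flagged("evil.example/phish.html", prefixes)
assert not locally_flagged("example.org/", prefixes)
```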
> These settings are best combined with your standard privacy extensions (HTTPS Everywhere, NoScript/Request Policy, uBlock origin, agent spoofing, Privacy Badger etc)
Side question: how many of these ought to just be standard behavior of the browser? For example, will Firefox's new tracking protection make Privacy Badger obsolete? Should the HTTPS Everywhere extension, which attempts to route any HTTP request to an equivalent HTTPS, be the default behavior?
I'm sure plenty of people on here would object to HTTPS Everywhere being default: "but the user typed the HTTP scheme, the browser should do what the user asked!".
That said, the bigger problem is that too many things that HTTPS Everywhere tries to upgrade to HTTPS are only partial versions of the site; you'd probably need something more conservative to avoid too much breakage.
I moved from HTTPS Everywhere & Privacy Badger to DuckDuckGo Privacy Essentials. It does the job of both of those with one add-on.
In response to HTTPS Everywhere breaking things: it does, and I've done tests comparing it and DDG PE and found sites that would break using HTTPS Everywhere did not with DDG PE. They may simply be doing more testing.
Plugin fingerprint protection was removed 3 years ago by a poorly-reasoned patch, favoring the large bug of unimpeded surveillance over the very tiny bug of sites looping through non-existent lists if and only if they need to interact with a plugin. Seeing the conversation, I'm looking for a different browser.
"remove_plugins-enumerable_names.patch
Bug 757726 hid most plugins from navigator.plugins enumeration to reduce fingerprinting. Plugin detection scripts could ask for a plugin or MIME type by name, but they couldn't get a list of all installed plugins. Unfortunately, the feature had to be disabled because it broke pretty much all plugin detection scripts because they naively search for the desired plugin using an O(n) loop instead of a O(1) query.
This patch removes the disabled code because it is unlikely we could ever re-enable it. In addition to removing the obsolete navigator.plugin tests for detecting hidden plugins, it adds tests for detecting click-to-play and disabled plugins."
The site bug wasn't "sites looping through non-existent lists if and only if they need to interact with a plugin". It was sites trying to detect whether Flash is installed by looping over the list, deciding it's not installed, and not trying to play the video (or the game or whatever), instead pointing the user to a "Download Flash" page.
This affected enough sites that it was a serious problem for users, not a "very tiny bug".
In any case, at this point Firefox supports exactly one plug-in, so all enumeration can tell you is whether Flash is installed or not. That's still one bit of data, of course, but that bit could be extracted even with the "no enumeration" patch by explicitly querying whether Flash is supported.
Am pleased to discover the "network.IDN_show_punycode" option and wonder why it doesn't default to true, given the way other browsers seem to handle this.
Other browsers vary their behaviour depending on locale, if I'm not mistaken, so that people in countries where IDNs are common don't constantly see meaningless punycode.
I noticed recently that unless I manually went in and set a master password, anyone can easily go into the Firefox settings and view all saved passwords.
I've been using Firefox Quantum as my primary browser for 3 or 4 months now and am really trying to love it, but Chrome is just so much snappier.
Depends what system you're on. It might make use of libkeyring if available on Linux, for example. I prefer not to save passwords in browsers and use a password manager instead.
Probably not. OTOH, you can put the changes into a file called 'user.js'. Go to the folder of the FF profile you're using. Backup the file 'prefs.js' to restore in case you don't like the result. Then drop 'user.js' into that folder and restart.
When you're sure you're happy with the changes, then you can just drop 'user.js' into the profile folder(s) on your other machines.
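To make that concrete, a minimal user.js might contain something like the following (these two prefs are chosen purely as examples):

```js
// user.js is read at startup and re-applies these on top of prefs.js
user_pref("network.IDN_show_punycode", true);
user_pref("browser.urlbar.trimURLs", false);
```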
After a number of experiments with Panopticlick, the single biggest fingerprint was ... screen depth and resolution. Even after disabling most of the other main fingerprinting mechanisms, my maximized browser window on a 30" display was a very small subset of the last 45 days test results. This was surprising to me.
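That matches the arithmetic: fingerprinting tools report identifying information as surprisal, so a trait shared by only one in N visitors carries log2(N) bits. A back-of-envelope sketch (not Panopticlick's exact methodology):

```python
import math

def surprisal_bits(users_matching: int, users_total: int) -> float:
    """Bits of identifying information in a trait shared by
    users_matching out of users_total visitors."""
    return math.log2(users_total / users_matching)

# A 30" screen resolution shared by 1 in 1024 visitors: 10 bits.
assert surprisal_bits(1, 1024) == 10.0
# A trait everyone shares identifies nobody: 0 bits.
assert surprisal_bits(500, 500) == 0.0
```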
One thing that's frustrated me about FF (and kept me using Chrome) is that local development is a pain in general... for example, when I type `localhost` in my nav bar, Chrome autofills the port; FF doesn't (it just puts `localhost/`, which is useless). Next, FF interacts oddly with NGINX, telling me `The plain HTTP request was sent to HTTPS port`, whereas Chrome just passes me right along, giving me a little `Not Secure` text next to the URL.
Given I have to go to this website about 1000x per day, you can see why using FF is painful...
Anyone have any advice (on config, for example) for fixing this? And if anyone on the FF team is listening... I'd like to use your browser, but this has kept me from doing so.
I don't know what is going on with your nginx setup. Most likely you've configured it differently than you think you have, or are making different requests in Firefox and Chrome without realizing it. Both Firefox and Chrome behave the same for me when I make an http request to a server using https (e.g. http://www.example.com:443/).
I have the opposite experience with Chrome as well: I like it when the URL bar behaves just like a URL bar and does nothing extra. I worked at a company that used a custom TLD for all sites on the internal network, and every single time I entered such a `site.tld` in Chrome's URL bar it insisted on _searching_ for it instead of just appending the protocol and '/' path and visiting it like a reasonable browser.
To be honest, Firefox likes to mess with input too, but it can be easily tamed with a few prefs, namely:
keyword.enabled=false // no implicit searching from URL, must use explicit keyword or Searchbar
browser.fixup.alternate.enabled=false // prevents trying www.….com or other configured suffixes when a domain-like URL fails
browser.urlbar.trimURLs=false // do not hide protocol and slash
browser.urlbar.filter.javascript=false // bookmarklets FTW
It must be a different port, otherwise localhost without a port number would work fine. Presumably 8080 or something, which is common for user apps to use since without privileges you can't open a port below 1024 normally.
If you type it so often, just make a single character keyword for it, /goddamit/. Or gesture. Or global hotkey. And delete the wrong protocol and/or port entries from history. If you visit that site so often, I bet your URL Bar will suggest / autofill it for you just after typing 'l' or 'lo'.
Somewhat tangential, but can someone explain why Firefox users need to explicitly enable the new Streams API support? [1]
Safari, Chrome implement these APIs out-of-the-box but FF wants users to open `about:config` (and ignore the ominous warning) and set `dom.streams.enabled` and `javascript.options.streams` to `true`.
It seems a little backwards to me since this has been available since version 57 and their webpage is the first result when searching for this documentation!
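(For reference, as user.js lines those two prefs are:)

```js
user_pref("dom.streams.enabled", true);
user_pref("javascript.options.streams", true);
```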
While talking about Firefox I have a quick question I am hoping someone here can help answer.
One feature of Chrome I like a lot is the super simple per-site settings options. I use this to disable JS on a number of sites without impacting any other sites. As far as I can tell there is no option built into Firefox that allows me to do this. Does anyone know of a simple way to get per-site JS blocking?
I have looked at a few extensions which work, but I was wondering (hoping) for a hidden Firefox option to do such a thing. I have tried to get Firefox policies to work but they appear broken?
This is superficially not a simple way, but if you get uMatrix and spend a minute changing the default config to allow all, you can use it to block JS on a per-site (and per-origin) basis.
This does things like blocking analytics and browser fingerprinting, and it's simple to install or revert: just drop the user.js file in your Firefox profile directory.
Before anyone tries this, I really really really encourage you to read the `Known problems and limitations`. eg it'll erase all your saved passwords, bullet point 23 (at least it's in bold :/)
toolkit.telemetry.cachedClientID will be repopulated by Firefox even after you clear it. I have just locked the preference [1], hopefully this stops it.
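For anyone wondering how locking works: user.js can't lock a pref, but the AutoConfig mechanism can. A sketch, assuming the standard AutoConfig file locations (adjust for your install; locking to an empty string is just what's shown here, not an official recommendation):

```js
// defaults/pref/autoconfig.js, inside the Firefox installation directory
pref("general.config.filename", "firefox.cfg");
pref("general.config.obscure_value", 0);

// firefox.cfg, at the top of the installation directory
// (the first line of this file must be a comment)
lockPref("toolkit.telemetry.cachedClientID", "");
```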
// Our internal trim prevention logic is effective on 2K/XP at maintaining
// the working set when windows are minimized, but on Vista and up it has
// little to no effect. Since this feature has been the source of numerous
// bugs over the years, disable it (sTrimOnMinimize=1) on Vista and up.
timing attacks, maybe? a script could make a request and measure the response time to heuristically determine whether a given resource is cached locally.
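The heuristic is crude but real: compare the observed load time against a threshold well below typical network round-trip latency. A toy Python sketch with purely illustrative numbers:

```python
def looks_cached(load_time_ms: float, network_floor_ms: float = 20.0) -> bool:
    """Classify a resource as locally cached when it loaded far faster
    than a network round trip plausibly allows."""
    return load_time_ms < network_floor_ms

assert looks_cached(3.0)        # disk/memory speed: probably cached
assert not looks_cached(120.0)  # full round trip: probably a cold fetch
```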
I assume the less data your browser stores about your browsing history, the better your privacy is (at least in terms of someone snooping around your computer applications to determine your day-to-day behaviour)
This bug tracks the proper solution: https://bugzilla.mozilla.org/show_bug.cgi?id=1429522