
Firefox already includes many of these; some of those plugins are redundant or unnecessary.



> already includes many of these

Make sure you turn them on though!

Firefox doesn't block canvas fingerprinting by default; that's a setting you need to enable in `about:config` via the `privacy.resistFingerprinting` preference.
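
For the curious, here's a minimal sketch of what a canvas fingerprinter typically does (hypothetical TypeScript, not any specific tracker's code). With resistFingerprinting enabled, Firefox gates the data-extraction call behind a permission prompt and hands back blank data if you refuse:

```typescript
// Illustrative sketch of canvas fingerprinting; runs in a browser context.
async function canvasFingerprint(): Promise<string> {
  const canvas = document.createElement("canvas");
  canvas.width = 200;
  canvas.height = 50;
  const ctx = canvas.getContext("2d")!;
  // Text rendering differs subtly across GPUs, drivers, and font stacks,
  // so the rendered pixels act as a device identifier.
  ctx.textBaseline = "top";
  ctx.font = "14px Arial";
  ctx.fillStyle = "#f60";
  ctx.fillRect(0, 0, 200, 50);
  ctx.fillStyle = "#069";
  ctx.fillText("fingerprint me", 2, 15);
  // This is the call resistFingerprinting intercepts: it prompts for
  // permission and returns blank image data when denied.
  const data = canvas.toDataURL();
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(data)
  );
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```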


I've been treated to endless Google captchas with that setting, and they get more difficult with each passing day, never mind the time sink.


Google's captcha works by uniquely identifying you: the more easily Google can track you, the easier your captcha. That's why it's nearly impossible to solve a captcha with resistFingerprinting on (i.e. Tor mode).


If true, that seems illegal in some places. The premise is that it proves we are not robots, and that's all.


By tracking you, they can remember that you are not a robot. As much as I dislike tracking, I don't think having the captcha always be hard would be all that great either.

Ideally I’d like to see fewer captchas. But there’s no good alternative to it really. I mean, requiring phone verification instead is an alternative. But I don’t necessarily want to hand out my phone number to each and every site on the net that I interact with either.


It's absolutely not true that there is no alternative to tracking for captcha "hardness".

https://privacypass.github.io/


You can prove trust once and get several tokens to spend later. Various parties, like Cloudflare and Google, are looking at this.
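
To make the idea concrete, here's a toy sketch of blind-signature token issuance, the classic way to let an issuer verify a token later without linking it back to the signing session. Privacy Pass itself uses an elliptic-curve VOPRF rather than RSA, and real keys are 2048+ bits; the tiny numbers here are only to keep the math visible:

```typescript
// Square-and-multiply modular exponentiation for bigints.
function modPow(base: bigint, exp: bigint, mod: bigint): bigint {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

// Modular inverse via the extended Euclidean algorithm.
function modInv(a: bigint, mod: bigint): bigint {
  let [oldR, r] = [a % mod, mod];
  let [oldS, s] = [1n, 0n];
  while (r !== 0n) {
    const q = oldR / r;
    [oldR, r] = [r, oldR - q * r];
    [oldS, s] = [s, oldS - q * s];
  }
  return ((oldS % mod) + mod) % mod;
}

// Issuer's toy RSA key.
const p = 61n, q = 53n;
const n = p * q;                          // public modulus
const e = 17n;                            // public exponent
const d = modInv(e, (p - 1n) * (q - 1n)); // private exponent

// Client: pick a token, blind it, send only the blinded value.
const token = 1234n; // in practice, a hash of random bytes
const r = 7n;        // random blinding factor, coprime with n
const blinded = (token * modPow(r, e, n)) % n;

// Issuer: signs the blinded value after one solved captcha.
// It never sees `token` itself.
const blindedSig = modPow(blinded, d, n);

// Client: unblinds. (token, sig) is a spendable pass the issuer can
// verify later without recognizing where it came from.
const sig = (blindedSig * modInv(r, n)) % n;
console.log("token verifies:", modPow(sig, e, n) === token); // true
```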


Wow didn't know that. Do you have a source?

How do they decide which captcha is harder?


The parent used the wrong phrasing, but it's real. The captchas don't become harder to solve, but longer. If Google detects that you're blocking trackers (or possibly cookies), it will throw a huge number of captchas at you until you give up. Imagine having to click through like 12 panels of street lights, cars, bicycles, trucks, buses, etc., sometimes multiple times. This is getting out of control, really. I miss the Web 1.0 of 1999 so much.


What I do with captchas is hit the skip button until it prompts you to verify. You can try whether that helps. Captchas should be toned down to something bearable. I've even gotten to the point where I'll drop a site because of this.


The worst part is when Cloudflare puts you on their shitlist and you have to fill out these dumb captchas for six minutes before you can even access just about half of the sites on the internet.


It's really sad, the situation we've come to.

I think only two things can defeat this madness:

1. Legislation

2. Breaking captchas to the point where they're no longer effective


I use Firefox with all the privacy stuff turned on. I have to do multiple, difficult recaptchas all day. I've even missed out on Yosemite campsite reservations because of the delay from all the captchas presented to me when trying to book :(


I use Buster. It fetches the audio captcha and sends it back to Google for speech recognition: https://github.com/dessant/buster


Buster's readme says it needs resistFingerprinting set to false.


This just made me snort, I think it's comic genius.


I tried it but the captcha told me my request looks automated. Any way to circumvent that?


That's the sad thing. We are getting to a point where software that would be used by malicious actors is required by normal, legitimate users because of how broken the web is.


Instead of resisting fingerprinting and suffering captchas everywhere, can't FF feed false data into the fingerprinting "sensors" so that the fingerprint changes every time?


I've wondered the same. The answer seems to be maybe, but the challenge is doing so in a way that doesn't blow up normal webpage functionality.

For example, a fingerprinting script might try to measure the viewport height and width; reading window.innerHeight can give it that info, but if Firefox were to fake that info when a friendly script asks for it, the page might try to reflow to the new size, etc. All kinds of desired behavior use these same values; the challenge is determining who's a bad actor. See the sketch below.
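
To make that concrete, a hypothetical sketch of why lying is hard: the reads are identical, and only the intent differs:

```typescript
// A tracker and a layout script read the exact same values, so the
// browser can't tell which caller to lie to.
function trackerReads(): string {
  // A fingerprinter folds these into part of a device identifier...
  return `${window.innerWidth}x${window.innerHeight}@${window.devicePixelRatio}`;
}

function layoutReads(): void {
  // ...while a "friendly" script uses the very same values to size the page.
  const sidebar = document.querySelector<HTMLElement>("#sidebar");
  if (sidebar) {
    sidebar.style.display = window.innerWidth < 768 ? "none" : "block";
  }
}

// If the browser returned a freshly randomized innerWidth on each read,
// the layout above would flicker between states. Tor Browser's
// letterboxing takes a different route: it rounds the viewport to coarse
// buckets, which stays stable within a session while still shrinking
// the fingerprint.
```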


I used privacy.resistFingerprinting until I realized it was the reason websites were displaying times incorrectly, as they receive the default timezone value of UTC [0], and I couldn't find a way around it.

[0] https://bugzilla.mozilla.org/show_bug.cgi?id=1330890
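
For context, these are the two APIs a page typically uses to read your timezone; resistFingerprinting spoofs both to UTC, which is why every site shows UTC times:

```typescript
// Standard timezone detection; both values are spoofed under rF.
const tz = Intl.DateTimeFormat().resolvedOptions().timeZone; // e.g. "America/New_York"; "UTC" under rF
const offsetMinutes = new Date().getTimezoneOffset();        // e.g. 300; 0 under rF
console.log(tz, offsetMinutes);
```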


That's a feature, not a bug. Your time zone is one of the ways you can be fingerprinted.

That said, it wouldn't hurt to split it out into a separate about:config preference. I'd probably disable it, since I don't use a VPN, so my time zone can be deduced from my IP anyway.


I know it's a feature, but even sites which allow you to set a timezone (e.g. Slack) still displayed times incorrectly. It was a tradeoff I was not prepared to make.


If you're considering resistFingerprinting, you might be better off using Tor Browser, which maximizes fingerprinting protections regardless of breakage to websites. I believe the Tor folks share patches that sometimes get integrated into the Firefox rF pref, but you'd only want to enable rF if you're after the best possible protection regardless of other desires (such as addons, fonts, or being able to access all websites), and for that, there's Tor Browser.

(Or, if you’re just interested in helping advance the anti-tracking ecosystem! In which case you can test resistFingerprinting and file Webcompat issues when you encounter them — but be sure to mention that resistFingerprinting is enabled or your issues will probably be closed “unable to reproduce”.)


You're right: a lot of the resistFingerprinting features are being uplifted (with some modifications) directly from Tor. There are a couple of ways of looking at this, and more than one of them is valid.

The first is, like you said, that resistFingerprinting can be kind of a gateway to Tor in general, since Tor will do everything resistFingerprinting does, and better.

The second is that uplifting Tor features to "normal" browsers and allowing "normal" users to enable them makes it harder for website operators to say, "well, I don't need to worry about this because it's just Tor users and they're all criminals." Right now, enabling these features in Firefox will result in some website breakage, but as more people say, "well, this is a mainstream browser thing", maybe more website operators will start to accommodate the protections.

I think there's value in continuing to blur the line between Tor and other browsers, if only to push the idea that the kind of privacy protections Tor offers should be available to everyone across multiple browsers. Not to mention that it's nice to be able to take advantage of a few Tor features while still getting stuff like fast video streaming.

But agreed, there's definitely a continuum here, and it might be valuable for some people to explore farther down it.


Why aren't they on by default? What's the downside?


Google bullies you with Captchas if you have `privacy.resistFingerprinting` on.


One solution for this, at least for search, would be to use something like https://github.com/benbusby/whoogle-search


I've also found DDG to be more than adequate for search.

The issue I had more often is random captchas on sites I actually need to use not letting me through. (Thanks, school.)

My solution for this is to keep de-googled Chromium installed, and just use it when I run across these sites.


I've used DDG for about a year straight now. It's adequate, but I find myself needing a !g at least once a day. DDG's results for error messages and for recent events from the news tab really lag behind what Google provides, unfortunately.


I use !sp if I want to see Google results. I only use it two or three times a year, for critical searches where I want more than one result set, but it may suit your needs. Google News is pretty good, but I prefer Inoreader anyway; !ddgn is usually good enough for me for specific news searches.


Increased captchas; sometimes sites will also break for non-obvious reasons (usually because you need to flip the setting to the left of the URL bar that enables canvas data reading); and your time zone will be set to UTC on every website that you visit.

There are a few ways of looking at the captchas; the optimistic lens is to see them as a response to people who say it's impossible to meaningfully reduce fingerprinting. If that were true, Google wouldn't be so mad at me for flipping this setting on.

But it does make some browsing more annoying, especially if you're not technically savvy enough to realize what's going on when something unexpected happens. I think it's the right decision for them to have it off by default (at least for right now).


A couple of the less debilitating yet still somewhat annoying downsides I've found:

If you have the "restore previous session" option enabled and have grown accustomed to Firefox remembering all the windows you had open before, you may find it annoying that it no longer remembers the size of your windows; it just puts them to the default size. Although now that I think of it, this might possibly be specific to the X11/Linux version, as other window systems might handle window size in such a way that it's not affected by this.

Also, if you like having websites automatically detect if your system uses a dark color scheme and adjust their CSS accordingly, that no longer works. Again, speaking from an X11/Linux perspective here.
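
For anyone wondering how that detection works, it's a one-line media query; under resistFingerprinting the query always reports the light default. A sketch, assuming a hypothetical site that toggles a `dark` class:

```typescript
// Standard dark-theme detection; always false under resistFingerprinting,
// so auto dark mode stops working.
const prefersDark = window.matchMedia("(prefers-color-scheme: dark)").matches;
document.documentElement.classList.toggle("dark", prefersDark);
```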


This was infinitely more annoying to me than the endless captchas, since my eyesight is getting worse with age and I have different zoom settings for the pages I visit.


Besides captchas, I mostly have issues with trying to watch content from tv channel websites. Especially when trying to link a channel with a cable service.



