Clean Links removes the redirect wrappers added by search engines, Facebook, Twitter, etc. to hide the fact that you clicked a link. Google doesn't learn which link you clicked in its search results, so if you also block GA, it can't track you. https://addons.mozilla.org/en-US/firefox/addon/clean-links/
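To illustrate the idea, here's a minimal sketch of redirect unwrapping, assuming Google-style wrappers that carry the real target in a query parameter. The parameter names are illustrative; real redirectors vary quite a bit.

```typescript
// Minimal sketch of redirect unwrapping, in the spirit of Clean Links.
// Assumes the wrapper carries the real target in a query parameter such
// as "q" or "url"; these names are examples, not an exhaustive list.
const TARGET_PARAMS = ["q", "url", "u"];

function cleanLink(href: string): string {
  const parsed = new URL(href);
  for (const param of TARGET_PARAMS) {
    const target = parsed.searchParams.get(param);
    // Only unwrap if the parameter itself holds an absolute URL.
    if (target !== null && /^https?:\/\//.test(target)) {
      return target;
    }
  }
  return href; // not a recognised redirect; leave untouched
}

// Example: a Google result wrapper resolves to the real destination.
console.log(cleanLink("https://www.google.com/url?q=https://example.com/article"));
// -> "https://example.com/article"
```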
I have used NoScript, Adblock Plus and Ghostery before but found them lacking in functionality, flexibility and performance.
I used Privacy Badger too but, if I remember correctly, it is based on the same engine as ABP and suffers from the same performance problems.
uMatrix provides a tabular view, sorted by host, with toggleable category columns (e.g. allow/deny frames, scripts, etc.). It's granular client-side resource whitelisting.
uMatrix offers more granularity. You can choose exactly what each third-party site can do in terms of cookies, CSS, images, plugins, JavaScript, XHR (!), frames and media (audio, video, PDF, etc.).
After years of using uMatrix (formerly HTTP Switchboard), many sites "just work" with respect to YouTube, Vimeo and the like, even without first-party JavaScript enabled.
I've considered sharing parts of my global ruleset so others can just copy-paste the sections/sites they want to whitelist without having to discover what's required themselves.
Considering the number of variables that contribute to the browser fingerprint, you would be forced to conclude that the only way to avoid being so unique is to run a browser in a vanilla VM (although the OS is already a variable in itself).
I think this is a topic that gets discussed by (for example) the Firefox developers, but I get the feeling that this is one of the hardest problems to fix.
I would like to see a browser mode, akin to the private mode most browsers already have, that reduces the number of identifying variables (at the cost of features). So instead of telling the world that my time zone is CET and I prefer English (GB) as my language, it would select a random time zone and locale (although this does inconveniently mean that sites might suddenly serve me content in Portuguese).
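As a rough illustration, here's a hedged sketch of how such a mode might randomize two fingerprinting inputs (timezone offset and reported language). In a real browser this would have to run in the page context before any site script, and the value lists below are arbitrary examples.

```typescript
// Hedged sketch: randomise two fingerprinting inputs the way a hypothetical
// strict-privacy mode might. Offsets and locales are arbitrary examples.
const FAKE_OFFSET = [-480, -300, 0, 60, 330][Math.floor(Math.random() * 5)];
const FAKE_LANG = ["en-US", "pt-BR", "de-DE", "ja-JP"][Math.floor(Math.random() * 4)];

// Sites read the timezone via Date.prototype.getTimezoneOffset().
Date.prototype.getTimezoneOffset = function (): number {
  return FAKE_OFFSET;
};

// navigator.language is a read-only accessor, so shadow it with a getter.
Object.defineProperty(navigator, "language", { get: () => FAKE_LANG });

console.log(new Date().getTimezoneOffset(), navigator.language);
```

Picking a new random value per session and per domain (as suggested further down) would avoid the weirdness of a site seeing your locale change mid-visit.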
Come to think of it, the Tor Browser probably does a couple of these things. Disabling JavaScript is surely the biggest factor, although that does make the modern web pretty much unusable.
> Considering the number of variables that contribute to the browser fingerprint, you would be forced to conclude that the only way to avoid being so unique is to run a browser in a vanilla VM (although the OS is already a variable in itself).
It'd have to be more like a VM running the OS with the highest market share (Windows), the browser with the highest market share (Internet Explorer), with the most common language used, with the most common time zone of users of the site you're accessing (varies by site and time of day), etc.
Anything else and you could stand out in the crowd. Using Linux or OS X, for example, really makes fingerprinting easier for sites, which is quite disturbing.
Randomizing the values of certain attributes, as you've described, may help a lot if more people adopt it, making fingerprinting a futile exercise for those using it. :) If the people doing the fingerprinting see millions being successfully tracked and just a handful they're unable to track, they won't even care. It's kinda like ad blocking: a few do it and it's not seen as a problem; if the majority does it, then the sites take notice. For a larger-scale effect, browser makers should get into this. Mozilla, Apple, Microsoft and Google, in that order (with Opera somewhere in the middle), may be interested in thwarting browser fingerprinting.
A lot of factors that make up the total fingerprint have an influence on how sites react to your browser, so I would have it change per-session and per-domain to prevent weirdness.
Request filters: These add-ons filter requests to third-party hosts, effectively blocking everything if set to deny by default. Most sites, other than web applications or ecommerce, only need to connect to at most a single remote host to pull down their CSS files; the next most common requirement is Google Hosted Libraries. A minimal sketch of this deny-by-default model follows the list.
* RequestPolicy: No longer developed, but still works for me
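Here's that sketch, using the Firefox WebExtensions blocking webRequest API. The whitelist and the third-party test are deliberately simplistic; a real filter would compare registrable domains (eTLD+1) rather than hostnames.

```typescript
// Sketch of the deny-by-default model behind RequestPolicy/uMatrix, using
// the Firefox WebExtensions blocking webRequest API.
declare const browser: any; // provided by the WebExtension runtime

// Illustrative whitelist; Google Hosted Libraries is the common exception
// mentioned above.
const WHITELIST = new Set(["ajax.googleapis.com"]);

// Minimal shape of the details object we use; the real API provides more.
interface RequestDetails {
  url: string;        // the request target
  originUrl?: string; // Firefox-specific: the document making the request
}

// Naive third-party test: hostname comparison only.
function isThirdParty(details: RequestDetails): boolean {
  if (!details.originUrl) return false; // top-level navigation
  return new URL(details.url).hostname !== new URL(details.originUrl).hostname;
}

browser.webRequest.onBeforeRequest.addListener(
  (details: RequestDetails) => {
    const host = new URL(details.url).hostname;
    // Deny every third-party request unless the host is whitelisted.
    if (isThirdParty(details) && !WHITELIST.has(host)) {
      return { cancel: true };
    }
    return {};
  },
  { urls: ["<all_urls>"] },
  ["blocking"]
);
```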
uMatrix, if you are a control freak. I usually last two weeks before giving up. And yup, I've trained it to know the sites I regularly visit. Thing is, I also surf NEW sites all the time.
Now I use Self-Destructing Cookies, uBlock Origin and HTTPS Everywhere. That works just fine without taking the fun out of the web.
NoScript has become a bigger part of my browsing habits lately. I wait for the moment a site goes haywire, maxing out CPU cores, at which point I just nix its script privileges. This isn't nearly as disruptive as distrusting all websites.
There's really no need to use both Privacy Badger and Disconnect as they both do pretty much the same thing. I'd ditch them both and just use uBlock Origin with the "Privacy" filter lists enabled.
I thought Disconnect was based on preexisting lists and Privacy Badger automatically worked out which sites seemed to be setting cookies and using them for tracking across sites. I'll need to look into it more, thanks.
You're correct. I'm not sure where people get this idea that Privacy Badger's supposed "lists" are included in uBlock, but I've seen it around here a lot. Personally I use both.
CsFire is the result of academic research, described in the following publications: "CsFire: Transparent client-side mitigation of malicious cross-domain requests" (International Symposium on Engineering Secure Software and Systems, 2010) and "Automatic and precise client-side protection against CSRF attacks" (European Symposium on Research in Computer Security, 2011).
Firefox lets you do that natively too. Self-Destructing Cookies deletes the cookies after you close the browser tab, not the entire window. If the browser doesn't provide complete isolation between tabs, this at least ensures the cookie isn't there to harvest. You can also set each site to tab/browser/never.
There are DuckDuckGo privacy settings you can set. I'm not a big fan of a cloud store for your settings, and thankfully DuckDuckGo allows settings to be passed as parameters in the URL [0]. You can do things like require POST instead of GET, disable redirects, force HTTPS, etc. Once you get your search and privacy settings how you like them, take the resulting URL and make an OpenSearch plugin out of it, manually or with something like my Mycroft project [1]. Now any searches use your settings, no account/cloud settings needed. It's then easy to throw into every browser you use. Technically a plugin.
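As a sketch of the idea, here's how you might build such a URL template. The parameter names and values below are examples of DuckDuckGo's URL settings parameters and should be checked against their current list.

```typescript
// Sketch: a DuckDuckGo search URL that carries settings inline, so no
// cloud-stored preferences are needed. Parameter values are assumptions;
// verify them against DuckDuckGo's URL parameter documentation.
const SETTINGS: Record<string, string> = {
  kl: "uk-en", // region (assumed value)
  kp: "-2",    // safe search off (assumed value)
  k1: "-1",    // advertisements off (assumed value)
};

function ddgSearchUrl(query: string): string {
  const url = new URL("https://duckduckgo.com/");
  url.searchParams.set("q", query);
  for (const [key, value] of Object.entries(SETTINGS)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}

console.log(ddgSearchUrl("browser fingerprinting"));
```

The resulting URL template is exactly what you'd wrap in an OpenSearch plugin so every search carries your settings.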
I wish there were a way to isolate third-party cookies/HTML5 data per (site visited, third-party site) pair instead of the current third-party-site-only model. The third-party site could still track you within the site visited, but you would appear as a different user when visiting a different site. There would be no way to track you across websites.
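That double-keying idea can be made concrete with a small sketch. The class and hostnames are hypothetical, purely to illustrate the model.

```typescript
// Sketch of the double-keyed model described above: third-party state is
// stored under the pair (site visited, third-party site), so the same
// tracker sees a different cookie jar on every first-party site.
class PartitionedCookieStore {
  private jars = new Map<string, Map<string, string>>();

  private key(firstParty: string, thirdParty: string): string {
    return `${firstParty}|${thirdParty}`;
  }

  set(firstParty: string, thirdParty: string, name: string, value: string): void {
    const k = this.key(firstParty, thirdParty);
    if (!this.jars.has(k)) this.jars.set(k, new Map());
    this.jars.get(k)!.set(name, value);
  }

  get(firstParty: string, thirdParty: string, name: string): string | undefined {
    return this.jars.get(this.key(firstParty, thirdParty))?.get(name);
  }
}

// The tracker sets an ID while you browse news.example...
const store = new PartitionedCookieStore();
store.set("news.example", "tracker.example", "uid", "abc123");

// ...but sees nothing when the same tracker loads on shop.example.
console.log(store.get("news.example", "tracker.example", "uid")); // "abc123"
console.log(store.get("shop.example", "tracker.example", "uid")); // undefined
```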
There's no need to use Disconnect anymore. Disconnect's list is now included in Firefox's native tracking protection feature (https://support.mozilla.org/en-US/kb/tracking-protection-pbm), and is also available through uBlock Origin subscriptions.
It wouldn't hurt to get SQLite Manager and periodically check what's in the browser databases. For example, if you buy anything online you might find your credit card number in there.
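If you'd rather script the check than click through SQLite Manager, here's a hedged Node/TypeScript sketch using better-sqlite3 against Firefox's form history database. The profile path is an assumption for your machine, and the digit-run heuristic is deliberately crude.

```typescript
// Sketch: scan Firefox's saved form history for long digit runs (e.g. card
// numbers); the scripted version of poking around with SQLite Manager.
import Database from "better-sqlite3";

// Assumption: adjust to your profile; Firefox keeps one formhistory.sqlite
// per profile directory.
const DB_PATH = "/path/to/firefox-profile/formhistory.sqlite";

const db = new Database(DB_PATH, { readonly: true });
const rows = db
  .prepare("SELECT fieldname, value FROM moz_formhistory")
  .all() as { fieldname: string; value: string }[];

for (const row of rows) {
  // Flag values that look like card numbers: 13+ characters consisting of
  // digits, spaces or dashes.
  if (/^[\d -]{13,}$/.test(row.value)) {
    console.log(`${row.fieldname}: ${row.value}`);
  }
}
db.close();
```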
They claim to only sell your data if you opt in to something called "GhostRank" [1]. It's proprietary software, so there's no way to actually confirm that, though.
There's really no reason for privacy-conscious individuals to use Ghostery when uBlock Origin can do the exact same thing.
In addition, it asks you a series of invasive questions when you uninstall, which, I suspect, means they sell that data once you're no longer using the software.