Hi, I created this one-time account to address some comments in this news item
I'm Thorin-Oakenpants, the creator and maintainer of the ghacks user.js. I don't want to get into a long discussion; if you want that, then bring it up in the GitHub repo.
---
The FF67+ changes that introduce cryptomining and fingerprinting protection have nothing to do with anything in the user.js (except the Tracking Protection preferences - which we have not included, and the TP prefs that are there are all inactive and not applied - we're happy to leave the defaults up to Firefox). TP's "fingerprinting" protection is about blocking known fingerprinting scripts, whereas actual anti-fingerprinting - privacy.resistFingerprinting, disabling WebGL, etc - is about stopping JS etc from leaking too much entropy in the first place.
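To make that distinction concrete, here's a rough sketch in user.js syntax. The pref names are the FF67-era ones and the values are purely illustrative - as said above, the TP prefs in the template are inactive.

```js
// Tracking Protection / the new FF67+ protections: block known-bad scripts via block lists.
// The template leaves these at Firefox's defaults - shown commented out purely for contrast.
// user_pref("privacy.trackingprotection.enabled", true);
// user_pref("privacy.trackingprotection.cryptomining.enabled", true);
// user_pref("privacy.trackingprotection.fingerprinting.enabled", true);

// Actual anti-fingerprinting: reduce the entropy JS etc can read in the first place
user_pref("privacy.resistFingerprinting", true);
user_pref("webgl.disabled", true);
```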
Very interesting concept, although not very usable for me as it stands right now. This collection of settings disables features like HTTP/2 and websockets for some reason.
Furthermore, if you're the only person in your city using Firefox with very different behaviour, that just makes you easier to fingerprint. If you want to resist fingerprinting, wait for Firefox's fingerprint protection to advance and keep everything as close to default as possible instead.
It also disables IPv6, form auto-filling, and automatic updates of Firefox. Like 99% of these projects, lots of questionable choices here. I never understand why people would recommend applying hundreds of settings without understanding what they do. It’s registry-cleaner levels of irresponsible (breaks tons of things for little to no benefit). To be fair, in the “Implementation” page (???) of the project wiki, the authors state “ULTRA UBER IMPORTANT. Do not just take the user.js and use it with your profile. There are some considerations to make first that concern your online security etc”, but nothing of the sort is in the readme or even the overview wiki page.
The problem is that those comments are often speaking from a position of unwarranted authority - for example, the section on HTTP/2 is simply wrong but if you didn’t understand the technology the tone would make you think the author is making a reasoned trade-off.
I refer you to my earlier comments about my background (for your info: take it or leave it), and about HTTP2 in particular.
I agree with you that OFTEN the web is full of shit and bullshitters, but I would just like to say that I am NOT one of them. I'm not always right, so please, if you can correct me in any way, I am more than willing to listen and correct my mistakes. Thanks
Yes, but the caveats are not properly explained for many of these things. Playing around with settings without fully understanding what they do can lead to nasty surprises. I find it questionable to promote such things without the necessary disclaimers.
I've taken great pains to try and provide as much info as possible, in as easy non-jargon terms as possible, for the layman. I have also gone to great lengths (days on a single item, weeks to follow up on research and contact with experts) in order to verify things, and provide relevant links. You can take me on my word, or not: but I've basically been living this stuff 24/7, 365 days a year, for 8 years.
I can still get things wrong, and I can always improve things, but please, don't label this as
- promoted: I have never promoted it, and internally it has always been labeled as a hardened setup and a template
- not understandable: there is a limit to how simple comments can be made. Some jargon is needed. At least I provide links to back up what I say and also for those interested in digging deeper. And at least I provide those comments in the first place.
- questionable: I know what I'm doing
I personally know full well what each and every setting does - well, at least the outcome and consequences of them being changed. It's hard to find a balance - this is not a simple task. The user.js is primarily "fairly hardened" and as a result, breaks things. That cannot be helped. It is promoted internally as a TEMPLATE, not a cure-all.
Where am I lacking in disclaimers? Caveats? Do you mean I need to point out that things break with some prefs? We do already, and we're adding in [SETUP] tags that pinpoint items that can cause breakage, e.g. [SETUP-WEB] indicates that the pref might cause web breakage, and we say what breaks or it should be painfully obvious without saying (or at least that's the plan: the comments can always be improved, for sure).
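Purely as an illustration of the idea - this is a made-up entry, not copied from the repo - a tagged pref with its breakage note looks roughly like this:

```js
/* disable WebRTC [SETUP-WEB]
 * illustrative entry only: the tag flags possible breakage, and the comment
 * spells out what breaks (video/voice conferencing sites need WebRTC) ***/
user_pref("media.peerconnection.enabled", false);
```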
"but the caveats are not properly explained for many of these things" - can you enlighten me as to where. Feel free to contribute and improve it at the github repo (where it will be easier for me to deal with, since your criticism seems to indicate I have a lot of editing to do)
form auto-filling is disabled for a reason. It is trivial for third parties to steal form data. It's an inconvenience for some, but it does not break anything - websites still work. This has been a known bug/issue for at least 8 years - and yesterday I confirmed the leak can still happen in the latest Firefox Nightly. However, this is something I could possibly get addressed (or at least get some action on: I wasn't BS'ing in an earlier comment about what I do and where I have a voice: it's a little voice, but it is a voice), maybe via the Tor Uplift, and by applying first party isolation to it (which also reduces the usefulness of the leak). IPv6 I addressed in an earlier comment.
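For reference, these are the kinds of prefs involved - the exact names are from roughly the FF55+ era and the set shown is an assumption on my part, so check the template itself:

```js
// legacy form history: the autofill of previously entered values that scripts can phish
user_pref("browser.formfill.enable", false);
// the newer autofill of saved addresses / credit cards
user_pref("extensions.formautofill.addresses.enabled", false);
user_pref("extensions.formautofill.creditCards.enabled", false);
// first party isolation, mentioned above, which also limits what a third party can harvest
user_pref("privacy.firstparty.isolate", true);
```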
I'll address automatic updates in another comment
I agree with you that 99% of these sorts of projects have questionable choices. But every single one of ours has solid reasons (and research, and validating that research, even creating our own Proof of Concepts, lots of testing, and more), and the user.js has only ever been billed as a TEMPLATE - never as a cure-all, grab it, set it & forget it. It was also started as a means to be comprehensive, and documented, and I've tried to make comments as easy for the layman to follow as possible.
I have also never recommended applying this without reading it and making changes etc. I even tell people to test it in a new profile first. And I have gone out of my way to make sure nothing is ever lost (the exception being that cookies and history are cleared: I can't bring those back) - i.e. any pref can be changed back, and nothing is lost. I also encourage users to ask in the GitHub issues. And I provide as many easy-to-follow comments as I can. This is not some irresponsible collection of prefs just thrown together with no thinking.
I have re-arranged the wiki page on implementation. The README on GitHub DOES tell everyone to read the implementation page. It's the second hyperlink. That README is also kept very short so no-one misses anything. The readme in the user.js DOES point at the implementation page and tell users to read it. And because I know human nature says a lot of people won't, I even ADDED the SAME info in super-shortened form to the actual user.js - at the top, right after version, author, etc. What more can I do?
I have now added a link and message (almost at the very top) to the Overview page as even more fool-proofing. So thanks for that. Always happy to improve things.
I'm not liking the "registry cleaner level of irresponsible" comparison, at all. The user.js changes around 300 prefs from their default (that's not all the prefs listed, that's the ones that are actively changed if you applied it). Of these, at a rough estimate, there are only about 30 or so that people complain about. And they can change them. The vast bulk of them don't affect/break anything at all (or extremely rarely) - i.e. site breakage, etc - they ONLY serve the prime objectives of increasing privacy, security, etc.
I have toyed with the idea of making the template much more relaxed, with less breakage (i.e. those 30-odd prefs that people complain about made inactive), and then tagging items as "harden", and/or supplying a hardened section, and/or providing a hardened set that users can tack on as their overrides. But that doesn't quite fit with the original purpose of what I set out to do.
It has always been my aim, and intention, to make things as easy as possible. Hence setup tags etc. And part of that, a long time coming, has been to provide a relaxed.js or a sticky issue at GitHub that people can apply to reduce most of the breakage. I've even taken a scientific approach and done polls on breakage, collated data from users about it, and so on. I have a basic list and should get that out, as soon as I kick myself in the backside to finish it. Once again, the user.js is a TEMPLATE, but I'm more than happy to make things easier for end-users.
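As a sketch of what that relaxed layer could look like - the prefs picked here are hypothetical examples of "breakage-prone" items, not a published list - the idea is a small overrides file appended after the template so these win:

```js
// user-overrides sketch: relax breakage-prone items after the template has been applied.
// Each user decides which of the ~30 complained-about prefs to flip back.
user_pref("network.cookie.cookieBehavior", 0);     // allow third-party cookies again
user_pref("privacy.resistFingerprinting", false);  // drop RFP if its side effects annoy you
user_pref("webgl.disabled", false);                // re-enable WebGL for maps/games
```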
Files like this are posted every month, and every month they’re upvoted, and people end up using less secure browsers because they’re misled into turning off TLS 1.3, HTTP/2, and client autoupdates (all real examples from posts like this).
It’s like watching a dog bark at its reflection in a funhouse mirror. You don’t dare interfere because the dog is obviously rabid about something, but it’s just as obviously unglued from reality, and so you watch helplessly and hope that others have the wisdom to stay away too.
I've already typed quite a few replies, and I don't want to come across as all preachy.
"into turning off TLS 1.3... (all real examples from posts like this)"
The user.js does NOT disable any TLS settings. HTTP2 I talked about in another reply. Auto-installing of client updates is disabled (as that really fits with our user base), but auto-checking for updates is not, and hasn't been since I moved to GitHub (it was, from memory, way back when I had it on ghacks, where it was shared as MY settings).
Not from this comment, but I was already aware of some of our defaults putting users at risk, which is why Tracking Protection and Safe Browsing are not disabled (except real-time binary checks), and why auto-update checks have never been disabled, and so on. And why I pointed to and tried to make the wiki implementation page highlight those important settings.
I was aware, but this thread prompted me to actually change it - I moved the pref for auto-updating extensions from disabled to inactive. The reasoning here is that APP update reminders are in your face - you get a dropdown panel notification, repeatedly - but extension updates are not in your face. So best practice here is to leave them on so no-one is disadvantaged, and it's one less thing to list at people and overwhelm them with. Keep in mind that this is still not aimed at the average person.
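In pref terms it's roughly this - the names are from that era, and app.update.auto in particular has since been moved around by Mozilla, so treat them as an assumption:

```js
// auto-INSTALLING of application updates is what's off - update CHECKS still run,
// and Firefox repeatedly nags you with a notification until you apply them
user_pref("app.update.auto", false);
// extension auto-updates: previously false, now left inactive, i.e. at Firefox's default
// user_pref("extensions.update.autoUpdateDefault", false);
```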
I also updated the wiki implementation page to make it a bit cleaner and really point some things out. Thanks everyone
IIRC, these preference tweaks are aimed mostly at users who don't approve of Mozilla's decisions regarding Firefox improvements and guardianship, who prefer having control over the browser, and who refuse to move to Blink-based browsers for various reasons (the main reason being that it would support Google's dominant position). The file was created out of user comment contributions on Martin Brinkmann's gHacks news blog.
It's not just about anti-fingerprinting or privacy.
First of all, these were my tweaks, which I shared with ghacks.net. Martin Brinkmann from ghacks.net published the first draft (the commentators at ghacks inspired me to clean it up etc). They have always been my settings, pushed as a template. Over the next few years it got updated with changes as Firefox changed, and as the information was refined, and things were added, removed, changed. Some changes came from suggestions by commentators.
THEN, I moved it to GitHub (over two years ago) where it has essentially been me and earthlng driving the whole thing.
It has never been anything about Mozilla's direction or guardianship etc. It is solely focused on providing information on privacy, security, anti-tracking, anti-fingerprinting, etc. It's just that, because it is so comprehensive, it is used by what I can only call "nutcases" who scream malware, spying, and other BS to promote their causes and scream LOOK: telemetry, etc. I am not one of those guys.
I personally have no issues with telemetry, I trust Firefox, and as long as it allows me to continue to tweak and helps me be more private, anonymous, etc, then I'm a happy camper. Do I care about things like Mr Robot or Cliqz, or other essentially inconsequential things - no. Every company makes mistakes; what matters is that they learn from them. I don't care if they rearrange the about:config page as they strip out the last of the XUL. I don't care unless it's to do with our primary goals. I even ban GitHub accounts from my repo who start spouting this sort of carry-on. And I actively have to ASK to get myself removed from utter rubbish like Librefox, which took all our hard work (it's free, no issue with that) and bastardized it into an absolute mess. I had to get my name removed from that, and post at ghacks.net that I was not involved with it. It still proudly refers to MY repo.
It IS just about anti-fingerprinting, privacy, anti-tracking, security and anonymity.
And it was created out of me (and earthlng), not ghacks. On a side note: I basically stopped doing anything at ghacks.net over a year ago: the negative commentators and anti-Firefox brigade just made me shun the place. I'm sure they're just a super tiny minority, but man, did they find a place to congregate. I'm not one of those guys!
True, but on the other hand it's not so much about being fingerprint resistant, but about making fingerprinting not worth it. If you can identify 90% of users, you're good; you'll likely not spend twice the time to be able to identify another 2%.
"Furthermore, if you're the only person in your city using Firefox with very different behaviour, that just makes you easier to fingerprint"
If you do NOTHING, you are already unique. And not all FP'ing techniques carry the same threat. Also, and I'm sure you're all aware of this, sites that calculate your entropy are only good for seeing what you return as values. They do not have "real world" data but rather comprise biased data from repeat visitors with a vested interest in tweaking and constantly rechecking their configurations. The data sets can also be long-lived and out of date. Everything I now know about fingerprinting is based on science, sound principles (of methods to mitigate it), and in the last year validated by some large-scale real-world studies. There's also the building and usage now of OpenWPM (which is now part of Mozilla). For example, we're looking at the usage of DOMRect across the top 10,000 sites to evaluate the threat (the FP'ing is certainly possible and easy to add, but is it used, and how?). This gives us a sense of priority, and ideas on mitigation (there are a number of ways this could be done, but some will break more than others). Real data, real-world cases, real solutions - lowering entropy, using the right methods, practically zero information paradoxes, minimum breakage: and an all-in buy-in. THIS is your only salvation (or use the Tor Browser, which uplifts and helps out with RFP). See the next point though - you're still unique.
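To give a concrete feel for the DOMRect vector, here's a minimal sketch of my own (not taken from any real fingerprinting script): sub-pixel rectangle measurements vary with fonts, DPI, zoom and rendering engine, so hashing a few of them yields entropy.

```js
// measure a styled element's sub-pixel geometry - the fractional values differ across systems
const el = document.createElement("div");
el.style.cssText = "position:absolute; left:1.5px; top:3.3px; font: italic 13.7px serif;";
el.textContent = "fingerprint me";
document.body.appendChild(el);

const r = el.getClientRects()[0];  // a DOMRect with fractional x/y/width/height
const sample = [r.x, r.y, r.width, r.height].join(",");
document.body.removeChild(el);

// hashed together with other measurements, 'sample' contributes entropy to a fingerprint
console.log(sample);
```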
"If you want to resist fingerprinting, wait for Firefox's fingerprint protection to advance and keep everything as close to default as possible instead."
I actively work on RFP. Even with RFP, you are still unique. It has a long way to go - years. Keeping everything at default is a sure way of always being unique. I will say that FP'ing, which is my real passion, is not as important as the other factors. Eliminate unnecessary third-party calls, limit JS functionality (or just use Tor Browser, for goodness' sake). But it is taken into consideration.
For example: we do not disable geo requests (they are behind a prompt), we do not disable prompt permissions (they are behind a prompt), we do not disable gamepads or VR. All of these are fingerprintable, yes, including your default prompt permission! We don't change any TLS settings (that's server-side entropy) - we could make the minimum TLS 1.2 for security, but less than 1.5% of the web still uses anything lower, and we're happy to let Mozilla decide when to change the value. We don't change any crypto prefs, as they are also server-side entropy, and again, while they may harden security, the risk really isn't there - we'll let Mozilla take care of that. There's more, but I'm a bit knackered (been up way too long), and I'm not quite up to trawling through the user.js to pick out more examples. Fingerprinting is my passion. Totally immersed in it. I'd like to think I'm qualified to talk about it.
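In user.js terms, "left alone" just means inactive entries. These are illustrative - the exact commented-out set in the template may differ:

```js
// left at Firefox's defaults: the user is already protected by a prompt, and flipping
// these would only add more "non-default" signals to fingerprint
// user_pref("geo.enabled", false);
// user_pref("dom.gamepad.enabled", false);
// user_pref("dom.vr.enabled", false);

// server-side entropy: we let Mozilla raise the floor when they're ready
// user_pref("security.tls.version.min", 3);  // 3 = TLS 1.2
```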
Most FP'ing comes from "simple" scripts such as fingerprintjs2. The corporate surveillance world is basically not interested in spending money, using server-side FP'ing, or developing new techniques and so on, when they have 95% of the world already serving it all up - via IP, referrer headers, SSL session IDs, cookies, persistent local data, logins, third-party connections, and OMG, the nightmare of smartphones with location tracking and dodgy apps and so on. FP'ing is slightly overrated as a threat - but yes, it is being used: e.g. as a first-party script on reddit.
RFP here almost has you covered (basic FP techniques, scripts). So don't sweat the FP'ing too much. It's more about us being proactive and anticipating possible threats and mitigating them ahead of time. If you need it, then use Tor Browser.
Sorry for the long post again. I love talking about this stuff.
Hi. Read my first comment at the top. I'm the owner, creator and maintainer of the ghacks user.js.
---
I understand it's not very usable for most people out of the box - but it's not aimed at most people. I have never gone out of my way to promote this AT ALL (this is my first time posting here, and I don't post anywhere else: I simply let word of mouth and the quality of the work/reputation do its thing) - those who find it are usually knowledgeable enough to handle the gist of it (no pun intended). It's a niche product, but it IS a TEMPLATE. No one size fits all. It's a fairly hardened setup (it could be a LOT more hardened, trust me) and every decision has solid research and considerations behind it. I've gone out of my way to make it as responsible as possible, while still serving its users. And I've tried to make it as easy as possible, with setup tags, a wiki and more. Always happy to hear things that can help make it better (more on that in another reply), and I've always been mindful of breakage and inconvenience - but it cannot be avoided if you want to "harden" things. The internet was never designed for privacy, anonymity or security. Everything has a trade-off.
A little background: I'm basically retired, and have spent the last 8 years doing this as my "hobby". But to be honest, I actually spend more than a full-time job on it. It's like being a full-time Ph.D student, 365 days a year. All free and voluntary. It's my passion. I'm not sure how much I want to share here, but let's just say that you are benefiting from my input at the Tor Project (a little, especially upcoming changes), and Firefox (quite a fking lot: RFP in particular). Do I know everything? Hell no, no-one can. That's why I have contacts, that's why I research everything to the nth degree, and validate it. And of course, I am always happy to be corrected (because no-one is right all the time). I share this little background not to sound all important (I'm not), but because of some comments further down. Of course, I could be lying - that's up to you to decide, not me.
HTTP2 I will address below, but websockets aren't disabled. The preference for that, network.websocket.enabled, was removed in FF35. There is a websocket pref for HTTP2 which is not strictly needed but is included for practical reasons - the master pref for HTTP2 alone would disable HTTP2 websockets. Perhaps this is what you are referring to.
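For reference, the prefs in question look roughly like this - the names are the FF60-era ones under the old "spdy" branch, so verify them against the current user.js:

```js
// the HTTP/2 master prefs (historically under the "spdy" branch)
user_pref("network.http.spdy.enabled", false);
user_pref("network.http.spdy.enabled.http2", false);
// the HTTP/2-over-websockets pref mentioned above: redundant once the master prefs are off,
// but included for practical reasons
user_pref("network.http.spdy.websockets", false);
```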
I agree with you that changes to FF's behavior absolutely can make you stand out. Fingerprinting is the one area that I would consider myself VERY (extremely) knowledgeable in. Server-side entropy such as HTTP2 is still entropy, and the effectiveness of that depends on how widespread end-user uptake is. I'll expand this to include AltSvc and IPv6. Also note that the uptake is always changing (HTTP2 use is massively up from a couple of years ago), and we are always happy to revisit, and we do. Such as keeping an eye on what Tor Browser does (they too had HTTP2, AltSvc, and SSL Session IDs disabled until the last release, i.e. TB8 - they do that for a reason, you know). We (I say we, because I do make most decisions after talking with a few other people) don't just turn things off because we don't understand them - there are always solid reasons (and considerations).
I have always said IPv6 should be handled at an OS level, and we only flipped that pref to active about 6 months ago, for several reasons. Worldwide, IPv6 is not enabled at an ISP level at the rates you may think it is (even I was surprised). It's hard to get actual stats. I'm in a rich, western, almost want-for-nothing country, and none of our ISPs have it (telcos may be a different story, and the js is aimed at the desktop, especially once Fenix is released). The setting doesn't break any web pages or connections, but it does provide POSSIBLE protection (in case a VPN drops: and not all VPNs are equal, there are a lot of shit ones that don't even cover this), and then there is of course the documented issue of how IPv6 can be a privacy nightmare (why do you think top VPNs make a point of this). At worst, IF (and that's a big IF) this was used to facilitate fingerprinting, then at my best guess, I would say it's 50/50. There's also an assumption here that you're masking your IP, otherwise you're already giving away much more commonly used tracking data. So VPN users: this setting does not hurt, it actually helps. Non-VPN users: the setting might make them stand out more (e.g. the only user in the IP range with no IPv6), but the threat model is not there. Fingerprinting and tracking companies have far more lucrative and easier ways to do what they desire. And of course, we say in the README at GitHub, and the readme in the user.js, that users should consider using Tor Browser first.
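For reference, this is the single pref involved - it's a long-standing name, but verify it against the template:

```js
// don't resolve AAAA records in Firefox - IPv6 handling itself belongs at the OS/VPN level
user_pref("network.dns.disableIPv6", true);
```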
HTTP2 is also not that widespread. It's around 50/50 of the web now. But every modern web browser supports it. So anyone falling back to HTTP1.1 on an HTTP2 website would stick out, a lot. We do disable this, and it's contentious, but there are privacy issues with it. There's even a study and proof of concept that shows it leaks entropy/data - sorry for being so vague, but I don't have the information super handy right now. Disabling it does not break anything - no harm, no foul. But it really is a 50/50 call on enabling or disabling it. The threat probably just isn't there for most people. The PoC etc is not entirely convincing. And I seriously doubt anyone is bothering to use it (to track etc - there are easier and cheaper ways to do that). We followed Tor Browser's lead on this, and have debated it to death yet again (and unearthed more info, including the "PoC"). In the end the consensus was: no breakage, so let's leave it disabled for now.
Just to clarify something before I continue, the user.js is only concerned with issues that can improve privacy, security, anonymity, anti-tracking, anti-fingerprinting - but does try to balance the trade-off with usability and functionality. Speed is not considered. So yes, HTTP2 does speed things up, more than likely imperceptibly, but isn't really a factor. Don't get me wrong, if a pref caused things to be really bogged down, janked, etc, then that would be a usability issue.
AltSvc also gets a lot of flak (usually lumped in with HTTP2). I'll just leave you to read the actual link provided in the user.js. It has serious privacy and security issues. If this threat doesn't apply to you, then comment it out. It's that simple. It's a template.
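Again for reference, the Alternative Services prefs look roughly like this (era-specific names, so double-check them):

```js
// Alternative Services - see the link in the user.js for the privacy/security concerns;
// comment these out if the threat doesn't apply to you
user_pref("network.http.altsvc.enabled", false);
user_pref("network.http.altsvc.oe", false);
```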
Thanks for putting up with my long ass reply, and thanks for the criticisms (I do listen, and learn). More replies below to follow.
This has a bunch of dangerous advice such as disabling updates, protection against unsafe downloads, disabling saved passwords, and things which make the web worse for users such as disabling IPv6, HTTP/2, caching, etc. with a rationale which is basically “I don’t understand what this does so it must be bad”.
Using the default Firefox with the suggested privacy features enabled is much safer than installing something like this.
Last reply. Sorry for bombarding you guys and gals.
It doesn't disable updates, it only disables the auto-installing of updates. Bit of a difference. However, I was reminded/prompted to change our default on extensions auto-installing their updates. For APP updates you get repeated notifications from Firefox, in your face. So I do not consider this a risk, especially given our user base. I also updated the wiki to make all of this clear. Of course users should update - I've never said otherwise.
Safe Browsing has never been disabled, except the real-time binary checks with Google. And this has always been pointed out. And the user.js itself says not to mess with TP or SB any further - i.e. at your own risk. And we even had Francois from Mozilla, who worked on all of this, give us the inside scoop and the correct details on exactly how all of these work. The header says that there are no privacy or security issues (outside the real-time binary checks). Given the user base, this has never been an issue as far as I can see. But as a default, I can see it maybe putting someone at risk - but I'm not here to babysit the internet - the end user has to take some responsibility for where they get binaries and so on.
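In pref terms, that looks roughly like this (names I'm confident existed at the time, but verify against the template):

```js
// only the real-time download/binary checks against Google are disabled...
user_pref("browser.safebrowsing.downloads.remote.enabled", false);
// ...the local-list phishing and malware protection stays at Firefox's defaults
// user_pref("browser.safebrowsing.malware.enabled", true);
// user_pref("browser.safebrowsing.phishing.enabled", true);
```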
The user.js has NEVER ever disabled password saving.
IPv6 and HTTP2 have been commented on elsewhere. There are reasons for them being disabled - a 50/50 call on those, really. The thing is, they don't break anything.
Cache is a lot trickier. There are certainly tracking/privacy issues with cache, but usually you also need to disable memory cache - we only disable disk cache. It's also not really a speed factor - I've browsed like this for nigh on 8 years. Most content isn't pulled from cache: it's dynamic. Cache is almost a throwback to dial-up days. What is pulled from cache is usually tiny: the js libraries, css, nav sections, footers. The big content is media and images, and that's usually distinct per page. That's a pretty broad generalization. Yes, you will get a speed boost, and it may be worthwhile for you.
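For reference (long-standing pref names, but verify against the template):

```js
// disk cache off (disk avoidance), memory cache left alone
user_pref("browser.cache.disk.enable", false);
// user_pref("browser.cache.memory.enable", true);  // default - not touched
```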
One of the reasons the very few issues you guys have brought up (cache, http2, etc) come up at all is that, while this project is not trying to be the Tor Browser (we actively tell people to use that if it fits their needs), it does follow its design: disk avoidance, app isolation (e.g. not leaving MRUs in external places), and more, even going further with shoulder-surfing issues - but I've relaxed that quite a bit in the default user.js, as I believe this should be in the hands of the end user: e.g. encrypt your device, don't allow shoulder surfers, just basic safe OpSec. Otherwise all it does is reduce functionality.
I never just disable things because "I don't understand it, therefore it must be bad" - in fact, I do the opposite. There must be a benefit: it enhances security, privacy, etc. And it's always weighed against breakage, how much the API is used, risk assessment, etc. There are almost always pros and cons. For example, we don't disable geo, because that's behind a prompt, and the user is already protected. But you'll find hundreds of other "lists" doing just that. I could name dozens more that we don't do - because while they sound good, they actually achieve nothing and only create barriers for those who actually use them (gamepads and VR instantly spring to mind; the default for permissions is another).
Anyway, thanks guys. Got to say what I needed to. And made changes thanks to your comments - 1 pref change (so far, maybe another) and a revamped cleaner wiki page. Hope you now see this project in a different light.
https://blog.mozilla.org/futurereleases/2019/04/09/protectio...