Firefox Configuration Guide for Privacy Freaks and Performance Buffs (12bytes.org)
321 points by bigbugbag on March 2, 2018 | 118 comments



One thing not mentioned in the list is Firefox Multi-Account Containers [1]. It helps by segregating your on-line information (cookies) into separate containers that you set up.

It doesn't stop sites from setting cookies, but it reduces the amount of information being sent: only the cookies in the same container are accessible.

[1] https://addons.mozilla.org/en-US/firefox/addon/multi-account...


Also worth mentioning is the Temporary Containers add-on [1] [2] which works in conjunction with Multi-Account Containers.

If you browse the Internet in default Tabs or in a specific Container you still collect Cookies, Storage and Cache in one place — which is something advertisers and other data-collecting services really appreciate — it makes tracking you easy.

Fortunately there’s an easy way to automatically create new Containers every time you open a new Tab and delete the Container if it’s not needed anymore: the Temporary Containers Add-on. By default you can open new Tabs in Temporary Containers with the Toolbar Icon or the keyboard shortcut Alt+C. If you enable the “Automatic Mode” in the options, however, it will override your standard ways of opening websites in new Tabs and external programs opening links. Instead of opening the website in No Container, it will open the website in a freshly created Temporary Container.

You’ll notice how the names of the Containers keep counting up every time you open a new tab and visit a website: tmp1, tmp2, tmp3. As soon as you close the last Tab in such a Container, it will automatically get removed and with it all the data that makes you easy to track.

[1] https://addons.mozilla.org/en-GB/firefox/addon/temporary-con...

[2] https://medium.com/@stoically/enhance-your-privacy-in-firefo...


Nice! This is exactly the feature I had been pining for, with Containers already being a core part of my browser experience. Works as advertised. Always pays to open the comments, thanks!


Sounds very nice. I'll take a look at it. It should be useful for when I open tabs to links from websites that I usually don't go to (suggested by social media friends, for example).


I abandoned that. Multi-Account Containers have lots and lots of usability issues.

I had some tabs (mostly YouTube) opened thrice when clicking on a YT link.

They don't automatically switch back to the default container. That's a big problem. You open your FB container (and Firefox can do this automatically when entering a FB URL or following a link there), you follow a link elsewhere or enter another URL, and you keep inadvertently surfing in the FB container for the next hour.


> I had some tabs (mostly YouTube) opened thrice when clicking on a YT link.

Some of the multiple tabs issues have been fixed in version 6.0.0. And the upcoming version will fix them completely.

> They don't automatically switch back to the default container. That's a big problem

I've added an "Isolation" feature to the just published version 0.67 of the Temporary Containers Add-on that gives you several ways to avoid accidentally staying in the same container - including an option to only allow "Always open in" assigned domains to load in their container.


There is a new pref in Firefox, privacy.firstparty.isolate, that is similar. But I can't use it or the various referer-related about:config prefs on my work Mac because they break JIRA.

First-party isolation originated in the Tor Browser; it partitions cookies by the first-party domain, somewhat like containers do.


There's a First Party Isolation add-on[0] that allows you to enable it by default, and then disable it for a period of five minutes by clicking on an icon.

That way, you could click on it when you're using JIRA and have first party isolation when you're not using JIRA.

[0] https://addons.mozilla.org/en-US/firefox/addon/first-party-i...


It's a promising idea but has issues currently. In recent Nightly builds I found it completely broke websites that override control keys [1], which is loads of them (GitHub, Facebook, Google Docs, etc.). It'll be great when they iron out all the bugs, because this guards against all sorts of more advanced, sinister ways of deanonymising people.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1433592


This is great and fairly simple to use. I personally use it to separate out Google into its own container, but it is also more practically useful in that it allows you to log in to the same site with different accounts.

It just works! But the one problem I've encountered is that clicking links in Gmail causes them to be opened within the Google container.


I've added an "Multi-Account Containers Isolation" feature to the Temporary Containers Add-on. Enabling that will only allow Websites that are assigned to "Always open in" with Multi-Account Containers to load in their container - every other request will automatically opened in a new Temporary Container.

There's also a discussion about implementing such an Isolation feature directly in Multi-Account Containers.


Right click > Open link in new container tab > No container


Yes, this is what I always do... after my muscle memory has already clicked the link normally!


Thanks for this. I've been looking for something like this for ages. I used to use VMs for this but it was a right pain.

As an aside, I used this guide having discovered it a couple of years ago. I can't remember how, but I ended up on a site that had been hacked - might have been by following a link from a forum - and the page I was looking for wasn't there. Instead there was a link on the page saying it had been moved. I stupidly clicked the link and off I went to a random site that I can only assume was meant to drop some form of malware or take control of my browser. Anyway, all I was left with was a message congratulating me on how secure my browser was; I didn't stay around.


thanks for that - i'll look into adding that information


And what you don't mention is that that privacy extension now also has more telemetry. Sad.


I just checked all the links in the add-on's repo, and the only links to Mozilla's domains are links to the MPL license.


I’ve had it installed before and received the prompt “to install the update of this extension please accept the new telemetry conditions.” I don’t claim every user sees it (maybe I’m a/b tested?). I’ve rejected it, and checking the repo wouldn’t help for that.

The newly requested permission, different from the ones I previously accepted, is (if I remember correctly):

"Monitor extension usage and manage themes"

What's that if not new telemetry? In a privacy-oriented extension!

It's explained here:

https://github.com/mozilla/multi-account-containers/wiki/Per...

""Monitor extension usage and manage themes": Required to provide interoperability with other container Add-ons by checking if they have the required permissions."

How about not being required? Older versions really didn't require it. Knowing the management (see "Looking Glass"), even if the extension is not using the telemetry at the moment, it's just a case of "hey, the user already agreed!" Especially troublesome as I haven't agreed to this anywhere else.


> How about not being required? Older versions really didn't require.

Unfortunately there's no other way for us to check whether an Add-on that tries to access the "API" has the needed "contextualIdentities" permission. I can assure you that it's in no way about telemetry and never will be. If the Multi-Account feature were a Firefox platform feature, then such an API would have the same requirements.


But why "required" to agree? If I know that I don't want other add-ons to call that extension, why can't it use it as it behaved before, that is, without effectively having to "agree" and "accept" telemetry?


I agree that it would be nice to have a way to show that permission prompt only if an external extension wants to use the API - however, that's not possible because such prompts can only be triggered after an explicit user interaction, not from the Add-on itself or external Add-ons.

Also, you didn't effectively "agree to telemetry". The same way you didn't "agree to send all website data you visit to the Add-on developers" when you accepted the "Access your data for all websites" permission that the Add-on also requires. If you don't trust the Add-on or its developers, then that permission should scare you more than the "Monitoring extensions" permission. But as you probably know, it's just wording you have to take with a grain of salt, and a lot of Add-ons/Extensions require permissions which would in theory allow them to inject arbitrary content into websites you visit, or read data from them for that matter. It's how permissions work.


You didn't explain WHY I'd ever want to allow every other "external extension" to automatically "use the API" of the extension to which, as you properly note, I already gave a lot of major permissions. And why the extension forces me to grant something to the "other extensions" automatically.


You didn't allow every other Add-on to access the API, but explicitly only the Add-ons with the "contextualIdentities" permission ("Container feature"). Which is the reason the "management" permission ("Monitor extension usage") was needed in the first place - to check if the Add-ons accessing the API have that permission. Also, the API doesn't give external Add-ons access to the permissions you gave the Multi-Account Containers Add-on, but only one very specific piece of information: whether a given domain is assigned to a container (you can read about it here https://github.com/mozilla/multi-account-containers/wiki/API).

Now, why "automatically" for Add-ons that have the "contextualIdentities" permission? It's simple, if you grant an Add-on that permission, you already gave your consent that the Add-on can access the Container API; and the Multi-Account Containers Add-on itself being from Mozilla, is just additional Container functionality, but as an Add-on. So if you grant the Container permission, you also get API access to some additional informations from the Add-on; and with that increase interoperability between Container Add-ons.


For privacy, on a Linux box are there any downsides to simply creating one or more extra accounts and running Firefox in them ('DISPLAY=:0 firefox')? I use this approach to set up Firefox as I like it on a spare account, then copy '.mozilla' to '.mozilla_base'. Then it's just a simple case of 'su -l guest' and (via a script) 'rm -fr ~/.mozilla; cp -a ~/.mozilla_base ~/.mozilla; DISPLAY=:0 firefox; rm -fr ~/.mozilla' (actually the script deletes the local cache as well).

Net effect is that firefox starts exactly as I like, but forgets everything that happened in the session ('groundhog-day mode').

Edit: added 'su -l' step.

Edit: As an addendum, note that this technique can be extended to complete 'guest' accounts as well, e.g. 'cd /home; rm -fr guest; cp -a guest.base guest; su -l guest'; the entire 'guest' account is then 'groundhog-dayed'.

  #!/bin/sh
  #
  export DISPLAY=:0
  # Set up clean copy
  cd ~
  rm -fr .mozilla
  cp -a .mozilla_base .mozilla
  cd - > /dev/null
  #
  /usr/local/bin/firefox "$@"
  #
  echo "Holding...."
  sleep 2
  echo "Cleaning...."
  # Clean out junk (so we start clean next time)
  cd ~
  rm -fr .mozilla .cache/mozilla*
  rm -fr .adobe
  rm -fr .macromedia
  cd - > /dev/null


You should look into using firejail. You can do this exact same thing by basically:

    firejail --private=/tmp/firefox /usr/local/bin/firefox
You can even enable things like seccomp to further restrict the FF process.


Thanks - I hadn't seen it; it looks like a worthy project. I tend to prefer simple low-tech solutions though, and given that the first two commands to start it are to do with fixing pulseaudio bugs and desktop integration, it's (for me personally) somewhat of a put-off.


There's also bubblewrap, which I haven't used, but promises to be an even lighter way to sandbox applications.

Generally, I agree with you that the lighter the implementation is, the better, but when it comes to sandboxing and other security measures, I would prefer not to roll my own.


From my brief look at firejail, how would I do the rollback? (i.e. reset everything back to exactly how it was before the run). Also n.b. I'm not rolling any security measures of my own here - it's reliant on the kernel correctly separating users.


You can use --private-home to 'import' an existing set of files (e.g. a 'clean' FF profile) into the sandbox, then any modifications made to it are discarded when FF quits.
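For the 'clean profile every run' behaviour specifically, something along these lines should work (a sketch; it assumes your clean profile lives in ~/.mozilla and that your firejail version supports these options - check the man page):

    # Build a throwaway home seeded with a copy of ~/.mozilla;
    # everything written during the session is discarded on exit.
    # --seccomp additionally filters the syscalls available to the process.
    firejail --private-home=.mozilla --seccomp firefox

As far as I recall, the path given to --private-home is relative to your real home directory.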


Thanks for sharing the script! This is a more "complete" version of Firefox Multi-Account Containers. Is there any concrete advantage in doing this versus opening Firefox in a new profile? For example, do Firefox profiles share some browser download caches?


I'm not familiar with Firefox Multi-Account Containers; do they use separate Unix accounts? (Since my approach means Firefox is running as a completely different user, a simple Firefox compromise or bug shouldn't be able to access anything sensitive on the main account - like SSH keys, say.)


They're not containers as in Linux containers, they're "contextual identities" within the same Firefox process that isolate things like cookies. Much weaker isolation (technologically) than process/user privilege separation, but OTOH highly usable. (Well, there's the e10s process separation, but nothing extra for multi-account containers, as far as I understand.)

What things it isolates are listed here: https://wiki.mozilla.org/Security/Contextual_Identity_Projec...

FWIW they work great.


I don't think they offer much in the way of security, but they do let you isolate the websites you choose from each other. I wrote a very small blog post about them: http://iamqasimk.com/2017/11/21/firefox-containers/


AFAIK Firefox profiles should give you the same kind of privacy as running them under different user accounts.


Indeed, that's the approach I use to run multiple copies of Thunderbird. Using a separate account does add protection against browser exploits though (if visiting an interesting-but-risky site, the 'guest' account approach prevents access to the main account - at least as long as a browser exploit doesn't then lead to a privilege escalation).


Although exploitation is more difficult, all programs that run in an X session have complete access to all the other programs in it, regardless of user ids or for that matter the host the program is running on.


As a simple demo of this, running 'gimp' as a 'guest' user allows you to take screenshots of windows owned by the user who started the session. Thus, a compromised copy of firefox on the 'guest' account could, for example, easily capture the contents of any window on the system.

Still, the main reason I run the browser as a different user is to isolate it for privacy; there are some security benefits too, but I agree it's not something that would defeat a targeted attack.


You can use xauth to get an Xauthority cookie for untrusted clients, so they can't meddle with other X clients.

The biggest downside is that you lose access to the X clipboard - which is also good, so its data doesn't get compromised.
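A rough sketch of how that can look in practice (assumes the X server has the SECURITY extension enabled; the file names here are just examples):

    # generate an untrusted MIT-MAGIC-COOKIE in a separate authority file,
    # valid for one hour (adjust the timeout to taste)
    xauth -f /tmp/xauth-guest generate :0 . untrusted timeout 3600
    # hand it to the guest account and start the browser with it
    sudo install -o guest -m 600 /tmp/xauth-guest ~guest/.Xauthority-untrusted
    su -l guest -c 'XAUTHORITY=$HOME/.Xauthority-untrusted DISPLAY=:0 firefox'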


I did not know that. Turns out it's a bit weak though: all untrusted clients have full access to all other untrusted clients. So you only have two levels. Still, better than nothing.


I'm curious: if you try https://www.nothingprivate.ml/ from two different instances of Firefox using your script, is it still able to track you?

FWIW, I use different Firefox "profiles" and that site is able to link the two profiles.


In both cases (on separate accounts, same computer), it said "Thank you, xx xxxx! Let's see the magic..." (I used the same user name and IP address from both).

Restarting (exactly the same version) of Firefox a second time and revisiting the site gave:

"Are you anonymous? Do you think that switching to your browser's private browsing mode or incognito mode will make you anonymous?

Sorry to disappoint you, but you are wrong!. Everyone can track you. You can check it out for yourself. Just type your name below."

Which seems to suggest that whatever the site does failed in my (admittedly unusual but still simple) case.

However, I have little doubt that my (rather atypical) setup could be fingerprinted accurately - assuming, of course, I was part of a big enough minority to be worth advertising to.

N.B. other local factors could affect the results here; the more obvious ones are local DNS and a firewall between the ADSL router and the LAN.


Thanks. So running it under a different Unix account gives at least one more level of isolation, compared to Firefox profiles.


An opinion: The most serious threats to user privacy from advertising companies are the "features" of the browser that allow data to be sent to or from the user's computer without any input from the user. In other words, the features that let developers of websites trigger GET and POST, "pushing" media to the user's computer and "pulling" user data indiscriminately, without explicit consent and sometimes without the knowledge of the user.

For example, there is the feature of automatically loading resources, such as images. No user input required. There is the feature of automatically loading the contents of iframes. No user input required. There is the feature of cookie headers sent automatically. No user input required. There is the feature of XMLHttpRequest triggered through Javascript. No user input required. There is support for HTTP/2. Imagine websites pushing media to users' computers with greater efficiency than ever before (advertisers rejoice). No user input required.

If one is serious about regaining control over the sending of user data to these corporations and websites ("privacy"), then IMO one needs a browser that either lacks or can disable the features above and any others that allow media to be "pushed" to the user without any user input. Such a browser would only execute GET or POST upon user input, not upon input from other sources, such as websites.

Perhaps users could have two browsers: one for commercial activity and running "web apps", and another for non-commercial activity, which may not need to be compatible by default with "web apps" that push media to the user. This is an alternative to having to become an expert in browser settings.

Instead of disabling features or installing add-ons, the latter browser is incapable of pushing unsolicited media or leaking user data because it lacks the necessary features to do so. (I have been using such a browser for many years now. While this is probably not for everybody, I like it.)


This is a great post, and is exactly the line I've been thinking along for a long time.

I currently use uMatrix for this and it implements this almost perfectly. However, its scope is too narrow: it only controls requests within the webpage, so it doesn't have access to the many requests the browser will make outside of that scope.

If you start by broadening the scope from webpage to browser, you eventually get to the operating system level, at which point we're really just talking about a firewall/proxy tool with granular control. I've used things like Privoxy and Proxomitron for this in the past; Little Snitch is the best I've used in terms of UX and control, but it's still nowhere near as good as the uMatrix interface.

There are a number of challenges with making such a tool, the primary two being: (1) mitming secure connections, (2) contextual control, differentiating iframe, js, css, image, etc. requests becomes more difficult once you're working at a global level.

Given these limitations, uMatrix in combination with a good, strict about:config that allows granular control over everything may be the best we can ask for in the short term.


>it only controls requests within the webpage, so it doesn't have access to the many requests the browser will make outside of that scope.

You also have the "behind the scene" settings:

https://github.com/gorhill/uBlock/wiki/Behind-the-scene-netw...

That's for uBlock Origin but I seem to recall it works similarly in uMatrix (can't check now).


Wow. That's not a feature I was aware of at all.

My only concern here is: does the extensions API used by this feature definitely cover all requests made by the browser? E.g. I don't see requests to geolocation services from the Navigator.geolocation API, Google Safe Browsing or CT auditing included in the list of example request types there.


>… so it doesn't have access to the many requests the browser will make outside of that scope.

You have access if you choose to modify/recompile the .cpp/.rs files that deal with sending the requests those higher-level functions use; this is what I do.

Some particular places of interest on /mozilla-central/:

- /servo/components/style/gecko/urls.rs, which pertains to requests made from CSS image functions

- /netwerk/protocol/http/nsHttpHandler.cpp, some classes that deal with handling things related to sending all HTTP requests


To clarify, you are modifying and recompiling a Mozilla browser from source?


Correct. If anyone else is interested you can get started here and modify things for yourself for your own purposes [0].

[0] https://developer.mozilla.org/en-US/docs/Mozilla/Developer_g...
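If anyone wants to try, the build itself is only a handful of commands (a Linux sketch; expect the clone and the first build to take a while):

    # get the source and build prerequisites, then build and run
    hg clone https://hg.mozilla.org/mozilla-central/ mozilla-central
    cd mozilla-central
    ./mach bootstrap     # installs toolchains and dependencies
    ./mach build
    ./mach run           # launch the locally patched build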


Check out uMatrix, from the creator of uBlock Origin and original creator of uBlock.


This isn't the most unreasonable list I've seen, but beware you will break many websites with this, so you need to be prepared to deal with the fallout. Notably, a lot of the breakage is hidden in user.js, including:

* No WebGL or WebRTC

* Aggressive TLS settings (will break many websites)

* Mixed-content upgrading (Nightly ran an experiment on this recently and it also broke a lot of websites)

* No history

The text warns about this, but it should at least be clear why Mozilla doesn't ship this as default.


Yeah I've tried some of these extremely hardened configurations but ultimately there's too much breakage. My config now is basically:

- uBlock Origin in default configuration

- No 3rd party cookies (breaks some things, but not too many)

- Clear history and cookies on exit

Combined with an /etc/hosts file, and rather frequent browser restarts (generally daily).
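The "clear on exit" part of that can also be pinned in user.js so it survives profile resets - a minimal sketch (pref names as of Firefox ~58; verify them in about:config):

    // wipe history, cookies and cache when the browser closes
    user_pref("privacy.sanitize.sanitizeOnShutdown", true);
    user_pref("privacy.clearOnShutdown.history", true);
    user_pref("privacy.clearOnShutdown.cookies", true);
    user_pref("privacy.clearOnShutdown.cache", true);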


> Clear history [..] on exit

Why the history? That's not readable by anyone except you, right?

> Clear [..] cookies on exit

So do you have to keep logging in to websites daily? Isn't that very annoying?


I use Cookie AutoDelete, which autocleans cookies shortly after closing tabs. I whitelist sites where I want to stay logged in or save settings.


I'm someone who clears cookies on exit for the sole purpose of having to log in again every time. And I don't even use a password manager.

It does get annoying because I have to type my credentials all the time, but it just takes a few seconds so it's not a big deal. I'm sure this is not preventing me from doing anything better with my time.


I'm a person who clears cookies on exit. With a password manager with autofill it's not much of a burden but it does ensure that people can't resurrect old sessions and can't access my accounts without my password DB unlocked.


You can whitelist exceptions.

Which of course trades off against privacy.


> Why the history? That's not readable by anyone except you, right?

history sniffing


> Combined with an /etc/hosts file

Check out pi-hole and manage that for your whole family


If you're using OpenWRT/Lede you can implement this with the adblock package: https://github.com/openwrt/packages/tree/master/net/adblock/...


/etc/hosts travels with you. A LAN DNS server does not. It's a good idea to use both.


A much shorter guide that will get you 80% of the value for 2% of the effort:

* Install extension "uBlock Origin"

* Install extension "Cookie AutoDelete"

* Go into Preferences -> Privacy & Security, set "Accept 3rd party cookies" to "Never"

Done.
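If you'd rather pin the cookie setting in user.js than click through Preferences, the equivalent is, to the best of my knowledge:

    // 0 = accept all cookies, 1 = block third-party cookies
    user_pref("network.cookie.cookieBehavior", 1);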


Also, block all tracking servers at the hosts level by adding rules in your /etc/hosts. I've been using https://github.com/StevenBlack/hosts for a few years now and it's incredibly useful.
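The format is nothing fancy - standard hosts-file entries pointing known tracking hosts at an unroutable address, e.g.:

    # excerpt of the idea (the real lists contain tens of thousands of entries)
    0.0.0.0 doubleclick.net
    0.0.0.0 www.google-analytics.com
    0.0.0.0 ssl.google-analytics.com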


How do you automate its updates?


Cron and git would be my idea.
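Something like this in cron.daily would do it (a sketch - it assumes the repo is already cloned at /opt/hosts and that your own custom entries live in a hypothetical /etc/hosts.local that gets prepended):

    #!/bin/sh
    # /etc/cron.daily/update-hosts
    git -C /opt/hosts pull --quiet || exit 1
    cat /etc/hosts.local /opt/hosts/hosts > /etc/hosts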


Also check out Firefox's "privacy.resistFingerprinting" [1] and "privacy.firstparty.isolate" [2] prefs in about:config. These are Tor privacy features that Tor and Mozilla are merging into Firefox.

resistFingerprinting reduces the uniqueness of various Firefox properties that are visible to JavaScript and web servers.

First-party isolation will isolate third-party cookies by first-party domain. So Facebook Like buttons on cnn.com will see different cookies than Facebook Like buttons on nytimes.com. Both of these features can break some websites.

[1] https://www.ghacks.net/2018/03/01/a-history-of-fingerprintin...

[2] https://bugzilla.mozilla.org/show_bug.cgi?id=1299996
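Both are plain booleans, so they can go straight into user.js if you want them to stick (they default to false, and either one can break sites, so enable them one at a time):

    user_pref("privacy.resistFingerprinting", true);
    user_pref("privacy.firstparty.isolate", true);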


Is there any reason not to just block third-party cookies ("Accept third-party cookies: Never" in settings) all-together? I've never encountered anything breaking as a result of doing this.


I've encountered several sites that break with blocked third-party cookies. Most of the website of one local bank around here and a payment form for a local puppet theater are the ones that come to mind offhand.

Typically, it's small sites that outsource part of their site to a third party, but don't want to open a separate tab for it, that will be affected by this, obviously. If you only browse major sites doing everything in-house you're not going to run into problems.


Some sites do break. The problem is that by the time you run into one you'll have forgotten you changed this setting.


That's what I do too. Denying all third-party cookies, plus uBlock Origin with a sizable filter list, prefetching and link auditing blocked, and video autoplay disabled, gets me most of the security benefits with minimal setup and cognitive load. Privacy is not much better than running vanilla, because although third-party cookies are blocked, most sites can still identify me by device fingerprinting. That can only be blocked by not running Javascript, but that's personally too much of a hassle to handle and unbreak for practically every site out there.
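For reference, the prefetch/ping/autoplay part of that setup maps to roughly these prefs (a sketch for Firefox ~58/59 - the autoplay pref in particular has been renamed in later versions):

    // disable link prefetching, speculative DNS lookups and <a ping> auditing
    user_pref("network.prefetch-next", false);
    user_pref("network.dns.disablePrefetch", true);
    user_pref("browser.send_pings", false);
    // stop videos from autoplaying
    user_pref("media.autoplay.enabled", false);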


If you ever visit the Facebook website directly, the cookie is set because then it's a first-party cookie. The browser then keeps sending this cookie on other sites.


// NOTICE: Breaks Firefox addon "Cookie AutoDelete" as of February 2018

https://github.com/pyllyukko/user.js/commit/d6ac49a531b58c8f...


What I don't understand is why someone would go to the trouble of essentially spending a lot of time breaking their Firefox installation instead of using Tor directly when they care about privacy. Most of the tweaks boil down to turning Firefox defaults into Tor defaults, but without the benefit of actual anonymity, unless you go ahead and install and trust the VPN provider which you also need.

The majority of these settings aren't the default because they cause significant site breakage (uBlock is maybe the exception). If they could have been enabled by default, they would have. Mozilla has been taking gradual steps in that direction, and backed off/reverted a few times when too much stuff broke.

Use Tor Browser. If a site breaks, at least you'll know why, and you have the choice of whether it warrants lowering your privacy.


>>Mozilla has been taking gradual steps in that direction

I disagree with this. I do not see Mozilla taking any steps toward privacy at all; in fact, in many ways they are taking steps AWAY from privacy with many of their recent actions and blunders. From their use of opt-out rather than opt-in for various privacy-invading features, to their pushing adware to all users via what was supposed to be a QA/feedback feature, to their purchases of and investments in certain companies...

No, the Mozilla Foundation from the 90's is long dead; it has been replaced by the Mozilla Corporation, which is really no different from the Google Corporation or the Microsoft Corporation.


> I do not see Mozilla taking any steps toward privacy at all

https://bugzilla.mozilla.org/showdependencytree.cgi?id=12609... are some concrete steps being taken.

Or the containers work. Or the tracking protection work. If you're not seeing those, it's because you're not looking.


Indeed, Moz://a's revenue stream is at odds with privacy.


Privacy is a spectrum, and not all of us have the patience/time to deal with Tor.

I consider myself somewhat tech-fluent (not in the industry) and privacy-oriented, but there's a happy medium - installing FF on my dayjob computer and loading it up with uBlock Origin/Privacy Badger/etc. - versus some of the other suggestions. I'm OK with the occasional day-to-day leak if most of the footprint can be obscured.


Because if you don't use Tor browser correctly, you might actually be leaking some of your private data.


Wow, this is a great post. Thank you for taking the time to write this up. I've been using Firefox for a while now, but kept most settings fairly close to the defaults. I'm unhappy with many of their defaults, but hadn't been motivated to start tweaking stuff.

I'll note that disabling custom fonts breaks certain sites. I don't consider it a deal-breaker, but it's worth being aware of. Many sites abuse fonts for icons. Developers, please consider using SVG icons instead.

Another comment mentioned how user.js disables WebGL and WebRTC. IMO, that and many other browser features should be disabled by default. If a site requires their functionality, I should be able to whitelist it. Safari used to let you conditionally enable WebGL access for only certain sites, showing a prompt when the functionality was accessed. It's a damn shame they removed the feature. I don't think most sites should have full access to all these browser APIs. Heck, all the storage APIs should probably be limited to the current session by default, with the option of requesting longer-term persistence for trusted services.

I'd really love it if we had an easy way to create fully isolated containers for each web service or group of web services, with varying tweaks in their security preferences.

Since we're already on the topic of configuring Firefox, I have a tangential question. Does anyone know how to configure Firefox to automatically save rar files? You usually receive the option to always save different file types, but the choice isn't available for rar files, so you always receive a download popup. It's quite annoying, and I have no idea why it's happening. A cursory search didn't reveal any useful information on the matter. It's perplexing, because tar and zip files can be set to automatically save without any problem.

I hadn't seen uMatrix before, but it looks promising. Does anyone know of any user-friendly OS tools that let you monitor and inspect requests? On macOS I used Little Snitch for a long time, but I'm trying to shift away from closed-source tools (no problem with paying, but I want to be able to compile it myself), especially for something so critical. Also, it doesn't let you inspect requests.


> Since we're already on the topic of configuring Firefox, I have a tangential question. Does anyone know how to configure Firefox to automatically save rar files? You usually receive the option to always save different file types, but the choice isn't available for rar files, so you always receive a download popup.

Maybe I'm missing something but for me, going to the Options tab, selecting General tab, then going to the Applications section and modifying the entry for RAR file in the list from 'Always Ask' to 'Save file' does the job.

Does it work for you?

For your second question, how about Wireshark? It's open source and does let you inspect the traffic.


There's no RAR file entry in the list. It might be a macOS quirk, or it might somehow be caused by some sort of conflict with The Unarchiver. I'll probably play around with uninstalling it and trying other tools to see if that helps. Perhaps there's some sort of unexpected sandbox restriction with the default file handlers due to The Unarchiver having been installed from the App Store, or a bug with the app itself.

I just noticed that in the popup RAR files are identified as binary, while ZIP files are properly identified. Gonna have to dig into it a bit.

I've used Wireshark before, but most requests nowadays are using HTTPS. I vaguely recall at some point having tried to snoop on local HTTPS requests with Wireshark and ending up frustrated.


Most likely the site is sending a content-disposition: attachment which forces Firefox to show the save file dialog. This is a very old bug.
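For anyone curious, the header in question looks like this (the filename is just an example):

    Content-Disposition: attachment; filename="example.rar"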


> NoScript Security Suite: since uMatrix will be used to block scripts, this functionality is not required from NoScript

This is a mistake. uMatrix will block requests which would pull source code, but it does not stop script execution, i.e. those embedded in the page itself. NoScript stops script execution completely.

NoScript also activates `<noscript>` tags which will allow content to render on Medium. And which break Twitter by redirecting away just after it loads...

Caveat emptor: NoScript is somewhat buggy in its WebExt form. Sometimes it needs a click on the global revoke button to render `<noscript>` again. Sometimes it pops up blank windows. I'm not aware of any alternative though.


> NoScript Security Suite: since uMatrix will be used to block scripts, this functionality is not required from NoScript

> This is a mistake.

thanks for bringing this up - i couldn't quickly find info regarding how uM handles inline scripts, but i see that uBO does have that specific option - this has caused me to look deeper into uBO and i'm now considering revising the guide and dumping uM completely since i personally don't require all the granularity of uM (i always allow images/css globally for example)

so although there may be some caveats with not using NS, it seems that uBO can basically eliminate the need for NS, at least for those of us that just want stuff to work for the most part


> uMatrix will block requests which would pull source code, but it does not stop script execution, i.e. those embedded in the page itself

You are mistaken. You could have taken a few seconds to try for yourself before making this erroneous claim.

> NoScript also activates `<noscript>` tags which will allow content to render on Medium.

uMatrix can also "activate" the `noscript` tags, and this can be disabled/enabled on a per-site basis.


Can you explain how to enable/disable the noscript tags? I can only find a global "spoof noscript tags" option which does indeed seem to make <noscript> tags work.

I did base my comment on an observation. I wouldn't be using NoScript myself if I wasn't fairly sure it was the only way.

Seems I was wrong indeed, because there are no scripts listed in the Debug tab of the Inspector if uMatrix blocks them.

Thanks for pointing it out! *removes NoScript*


> Can you explain how to enable/disable the noscript tags?

It's one of the per-scope switches, see: https://github.com/gorhill/uMatrix/wiki/Per-scope-switches


Thanks!

Unfortunately, Twitter's forced redirect still takes place even after it's off :(


> NoScript also activates `<noscript>` tags

Ah, that explains why I occasionally see <noscript> on links. I use NoScript too, but only allow scripts from a select list.


For the user.js bit, I prefer pyllyukko's relaxed branch[1]. I've got a setup quite close to it[2] and it works very well with next to no breakage.

Also, using both uBlock Origin and uMatrix is somewhat redundant. Gorhill himself has advocated using per-domain permissions and not having different settings for each element type in uMatrix (if I remember correctly; I can't be bothered to look up the source right now), which is easily done in uBlock Origin using Advanced Mode. One can also replace Neat URL and Skip Redirect with Request Control[3], which is a more flexible solution, imo, though it requires one to make their own rules.

[1] https://github.com/pyllyukko/user.js/tree/relaxed

[2] https://github.com/savyajha/dotfiles/blob/master/Firefox/use...

[3] https://addons.mozilla.org/en-US/firefox/addon/requestcontro...


Just a general comment: the author mentions and links a VPN provider several times, and the links contain referral codes, though the author never discloses this.


Getting pwned via plain C++-induced memory-unsafety bugs can also lead to privacy trouble, so recommending turning off security updates or recommending forks that aren't staffed well enough to fully track Gecko security patches is not great advice.


> Firefox Configuration Guide for Privacy Freaks and Performance Buffs

I'm surprised that this post is linked to a non-https URL while the website supports https.

I wish HN had some policy/recommendation to prefer https URLs over non-https ones (if the URL supports both).


Privacy used to be a hobby for people with tinfoil hats or the moniker 'cypherpunk', but soon even Grandma will be going through a key signing ritual while muttering something about a return to the gold standard.


It's actually sad how many people don't care about their privacy. Sure, you can give a little, but it should be an option, not mandatory - and in many respects it is mandatory, or done without the person's knowledge.

The least they could do is ask for it...

Edit: I shouldn't write early in the morning, tons of grammar mistakes.


I am consistently surprised that this isn't a bigger deal given the gravity of what is happening. I see news headlines all the time for very transient things, but I have yet to see an easily accessible explanation of why privacy is so important that I could hand my mother. If anyone could show me one, I would greatly appreciate it.


> C:\Program Files\Mozilla Firefox\browser\features\

WOAH.

That's some genuinely nasty stuff that no one would normally want on their machines AND visible only from an obscure about:support page AND with no clear way of disabling it, save for deleting .xpi files:

    followonsearch@mozilla.com.xpi [1]
    shield-recipe-client@mozilla.org.xpi [2]
There are also these two that explicitly disrespect and ignore one's updating preferences:

    aushelper@mozilla.org
    webcompat@mozilla.org.xpi
These appear to be a way for Mozilla to push "urgent" patches bypassing the normal update mechanism and user consent.

    ---
This is completely unacceptable. This sort of functionality should be in the main UI and it should be possible to disable it with one click, permanently.

[1] https://blog.mozilla.org/data/2017/06/05/measuring-search-in...

[2] https://wiki.mozilla.org/Firefox/Shield - generalized engine for running "study" recipes.
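For what it's worth, the Shield part at least can also be switched off from user.js - a sketch based on the prefs I've seen in that era of Firefox (verify the names in about:config before relying on them):

    // opt out of Shield studies / the recipe client
    user_pref("app.shield.optoutstudies.enabled", false);
    user_pref("extensions.shield-recipe-client.enabled", false);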


Shield Studies are in the "Privacy & Security" settings: "Allow Firefox to install and run studies"

"Follow On Search" is AFAIK controlled by the Telemetry settings in the same dialog in "Privacy & Security".

> ignore one's updating preferences

Are you sure? They're add-ons, so they should follow the add-on update preferences. Given that aushelper apparently does nothing aside from modify the update URL to include info whether the system is affected by some bug, it's hard to see how it wouldn't respect the settings.


While I understand you not wanting Mozilla to be able to push urgent patches, this is (in my opinion) necessary in sufficiently serious cases. Imagine a remote code execution hole in Firefox, being actively attacked through ad networks. In that situation, every minute counts, and an attack could do serious damage.


"The road to hell is paved with good intentions."

It's not about denying Mozilla an option of pushing zero-day patches. It's about the fact that it's a built-in always-on _concealed_ feature.


The source code is public, there's probably been a blog post about it, there's an about-page for it, and there actually is a setting for it in the main-UI.

I really don't see how it's particularly concealed. If they actually tried to conceal it, you would not know about it at all.


Why are those concerned with privacy "freaks", but those concerned about performance "buffs"? As a person who cares about privacy I find this a bit offensive and it's not helping the privacy debate.

Nice article otherwise and I have to congratulate the web designer - what a beautiful, readable site!


i meant no offense - i used the term "freaks" because a lot of people who don't give a crap about privacy may consider us 'privacy freaks' - since the article is about privacy however, i think it's obvious that it's just a poke in the ribs, if you will


gotta love the fact that a guide for privacy freaks is served via plain http :)


While searching on github for more user.js privacy and security hardening modifications (https://github.com/search?q=%22user.js%22&type=Repositories ) i've added https://raw.githubusercontent.com/CHEF-KOCH/NSABlocklist/mas... to uBlock. take that NSA!

Also check out https://www.privacy-handbuch.de/handbuch_21.htm for more interesting firefox modifications (german site).


I like a lot of the suggestions in the article and the comments here, but at the same time I feel like attempting to evade is a battle already lost.

Ruining the analytics is a better tactic in my opinion. Flood the tracking with useless data


Nice, I hadn't seen uMatrix before (I use uBlock) and I'm used to using NoScript. NoScript is kind of "complex" to use and many websites will not work or require a lot of setup to get videos or other media running. With uMatrix it seems a bit easier.


Very thorough and useful, thank you!

A non-Firefox-related privacy addition I will suggest is a strong hosts file [1]

[1]: http://someonewhocares.org/hosts/


The one thing I'd like configurable in FF Quantum is the number of processes it spawns. I run Selenium tests and if I choose to run five or six instances I end up with two dozen processes. Fucking hell.


But it is configurable!

Preferences > General > Performance > Uncheck the box next to Use recommended performance settings.

You will then be able to change the following settings:

- Use hardware acceleration when available

- Content process limit


Nope, that doesn't work. I have it at one (default) and it's still spawning too many processes. I guess it's part of how things work now with FF. You get a snappier experience in exchange for higher memory usage because it's multiprocess. If only we could turn that thing off.


My bad, sorry... What about this, from [1,2]?

> To disable e10s/multiprocess go to about:config by typing it in your URL bar. Search for browser.tabs.remote.autostart using the search box on about:config. There may be multiple results. Set them all to false and restart the browser (if there are no entries, create it as a boolean and set it to false).

[1] https://support.mozilla.org/es/questions/1191898

[2] https://support.mozilla.org/es/questions/1191898
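If you're setting this up for the Selenium profiles, the same thing can go into a prefs file - roughly this (a sketch; the .2 variant only exists in some releases):

    // cap the number of content processes, or disable e10s entirely
    user_pref("dom.ipc.processCount", 1);
    user_pref("browser.tabs.remote.autostart", false);
    user_pref("browser.tabs.remote.autostart.2", false);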


Yeap, that did the trick. Thanks a million dude.


The setting that Severine mentioned controls the number of content processes, which is the number of processes that Firefox uses for its tabs.

It will still have another process that controls all the UI and all those browser tab processes, a process for sandboxing extensions/plugins, and then even more processes for miscellaneous things - for example, I think asynchronous scrolling has another process, and when doing performance improvements, they often just stick long operations into a separate process.


Great article.

However, the days when we could install some plugins and tweak a few settings to restore our privacy are, unfortunately, pretty much over. There’s only so much a plugin can do when it doesn’t have access to the core APIs of the rendering engine or the network stack.

As long as Google and Firefox are incentivized to make money by ads, user tracking and all of the rest, they won’t stop.

Long story short: the business model of the web has to change from one where the default is to monetize the invasion of our privacy to one where we control who gets to advertise to us and recognize that our attention is valuable; we should be paid for it.

In short, that’s what the Brave browser is all about: https://brave.com/com465. By default, it blocks ads, tracking scripts, fingerprinting and 3rd party cookies in such a way that most pages don’t break. It even blocks those cryptocurrency mining scripts that some sites like Salon are using: https://www.cnbc.com/2018/02/14/salon-disable-ad-blocker-or-....

Brave allows you to pay content creators with a cryptocurrency called Basic Attention Token (BAT) based on the amount of time spent on their sites or as a percentage of a monthly contribution. BAT is based on the Ethereum token standard.

Later this year, Brave users will be able to opt in to getting paid to watch high quality, relevant ads if they wish. How? By using zero-knowledge proofs, Brave can show you these ads, based on your browsing history that never leaves your machine, without leaking your personal information.

Be aware: Brave is in beta; it’s not done yet. It’s based on Chromium but the rest of the tech is under heavy development. It has come a long way in the 3-4 months I’ve been using it regularly. And there are lots of good things in store, including Tor on a per-tab basis, which I’m looking forward to: https://github.com/brave/browser-laptop/wiki/Brave-Tor-Suppo...

Brave runs on macOS, Windows, Linux, iOS and Android; even if Brendan Eich of Javascript and Mozilla fame weren’t involved, I’d feel this is the spiritual successor to Firefox: https://brave.com/com465.


This sounds 100% like an ad just FYI


I didn’t intend to sound like an ad - my apologies if it came off that way. No more 4am posts. ;-)

But in all seriousness, I stand by what I said - just installing plugins isn't going to do it any longer. Blocking ads, tracking scripts and the like needs to be built into the browser, and that's what Brave has done.

Brendan Eich's explanation is on point: https://vimeo.com/209336437


Are there any mods on this site? This is spam with referral links.



