All extensions disabled due to expiration of intermediate signing cert (bugzilla.mozilla.org)
1318 points by xvector on May 4, 2019 | 874 comments



There's a workaround that involves going to about:config and setting xpinstall.signatures.required to false.

However, if you're running the Stable or Beta version, it will only work under Linux. On Windows and MacOS you'll need to download Nightly or the Developer Edition.
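If you want the change to persist without revisiting about:config, a minimal user.js sketch (dropped into the profile directory; same pref, and the same channel restrictions above apply) looks like this:

    // user.js sketch: persist the workaround pref across restarts.
    // Only honored on builds that respect it (Dev Edition, Nightly, ESR,
    // Linux distribution builds), per the caveats above.
    user_pref("xpinstall.signatures.required", false);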

To fix this on MacOS I did the following:

1. Downloaded and installed Firefox Nightly

2. Ran /Applications/Firefox\ Nightly.app/Contents/MacOS/firefox-bin --profilemanager

3. Changed the profile to "default" so my normal Firefox profile would be used

4. Started up Firefox Nightly, opened about:config, then set xpinstall.signatures.required to false

Not sure if it's a good idea to use my default profile in Nightly. It might be a wiser idea to copy it instead.


Upgrading your profile from Release to Nightly, which happens automatically when you open it with Nightly, is a one-way, irreversible step. Depending on what work is underway in Nightly and whether it happens to be backwards-compatible, this can leave the profile crashing when used with Release again, or lose profile data such as bookmarks or saved passwords when it's later used with Release. Be sure to back up your profile if you choose to switch channels.

Note: I am told that Developer channel uses a separate profile, but there are instructions below showing people how to override that, at which point this warning becomes relevant once again.


FWIW I started using beta, nightly and the old "UX" channel, first on Mac and then on Linux, and before I knew it could be a problem I switched between them with the same profile all the time. Maybe there were subtle bugs I wasn't aware of, but nothing I ever noticed.


I haven’t run into any issues in a while, but you only have to get hit by lightning one time to lose your profile data. Best to be consciously careful about it.


I do agree, and I'm more careful now. Always keep a backup, at the very least. I now symlink ~/bin/firefox to nightly because some apps seem to have it hardcoded to open "firefox" rather than what's set as default.


Oof. Would you happen to know if it's the same with the developer edition as well?


Yes, the risk remains. If I read this right (from my phone), Release is 66, Developer is 67, Nightly is 68. This isn’t guaranteed to be a problem, but it’s not guaranteed okay either. YMMV.

(See reply about Developer, though.)


The developer edition has its own user profile.


That’s a good point. However, some of the instructions below specifically tell people how to force any channel onto using the existing Release profile. I’ll update my post.


And I told the developer edition to use my regular profile because that's the one with all my settings and add-ons, and I didn't realize the risk was there. Guess at this point all I can really do is hope, and cross that bridge when I get to it.


If you’re on Mac, you should be able to recover the old profile with Time Machine. Or on Windows, if you have another backup set up.


Looks like it would have been better to copy the profile instead. I managed to get most of my profile back using Firefox Sync, though for some reason it didn't transfer across my preferences and I had to redo those.


Gotta love the Linux release team for not disabling this ability.


And Linux desktop for being pretty usable. :)


The following workaround works on regular editions: https://www.reddit.com/r/firefox/comments/bkhzjy/temp_fix_fo...



Firefox stopped respecting the signature-required setting in the mainline version in 2016. I know because I got burned by it and made a Hitler parody.

https://youtube.com/watch?v=taGARf8K5J8

And frankly, this is an extra absurdity on top of that. If you’re going to require signatures for all extensions, regardless of user preference, shouldn’t you be keeping an eye on the signing process?


Why does Mozilla do this? Same with removing the option to not update. Why not let users choose (in the case of updates, maybe with an about:config setting)?


Because (stable) users are dumb, are easily manipulated and can't be trusted. Thus the mothership has to be in control for the greater good. They also argue that enduser computers are already effectively "compromised" from a mozilla perspective because adware runs installers with admin privs and thus could insert things into the program folders. Thus anything the user can do adware could do too and therefore they can't give them any choice.

They put it in nicer words though.

To their credit, you can opt out, but only if you switch to the dev edition, nightly, or custom builds, which is either a one-way road (since downgrades corrupt profiles) or tedious (because custom builds don't receive auto-updates).

But what they should really have done is allow additional signing roots. Even Secure Boot does that.


This sounds like a threat model and mitigation developed by a college intern.

How, exactly, is a userland application going to protect itself from modification by a computer admin? I think DRM, anti-virus, and OS vendors everywhere would love an answer to this.

This threat model completely fails to account for live patching, trusted cert root modification, dll hooking, etc. Either the Mozilla security folks are incompetent / winging it, or this isn't the real reason.


Here's the official reason in case you don't trust my grim representation of it: https://blog.mozilla.org/addons/2015/04/15/the-case-for-exte...


I get the ostensible justification, but attacking this way requires the user to dig into obscure dev settings and load an XPI from outside the browser[1]. Is there even one case of a user being compromised that way?

[1] or at least they could have allowed that as a compromise


I updated my previous comment. They say there exist crapware installers that use elevated privileges that do inject stuff into the browser and that's why we can't have nice things, yes.

But I disagree with their value tradeoffs. They want to add a little "protection" - which is really flimsy since there is no privilege separation - for users who already compromised their systems with adware at the expense of the freedom of everyone else.


I'm totally fine with software already running on my machine being able to install addons into my browser. It can also already install a keylogger and record the screen, what's the big deal?


Are you fine with calling “editing of crypto certs” a study? And do you endorse all Orwellian doublespeak, or just this instance?


Because they don't want trojans to hijack the browser. If the user can change the signing preference, any application can.


It is not possible for a userland application to prevent root processes from hijacking or modifying it. Such protection requires the protecting mechanism to run at a higher level of trust / security ring than the attacker.


Yes, the sibling comment and thread already brought that up.


This worked for me on Firefox 60.6.1esr on Debian 9 Linux—changing the setting instantly restored my addons.


No go for me on Firefox 59.0 / Debian 64bit. I even restarted Firefox but they're all still "Legacy Extensions". :(


Legacy Extensions are a different issue.


BINGO... X-ring.

I OWE you, dude.


> On Windows and MacOS you'll need to download Nightly or the Developer Edition.

The workaround also works if you're running Firefox Extended Support Release on MacOS. Thankfully.

For me missing extensions aren't just an inconvenience. I simply don't browse with JS on. Firefox is dead to me without NoScript.


Same is true for ESR on Windows.


Works on android too.


Thank you!

Saved me tons of ultimately pointless thrashing.


It is probably safer to use an unbranded build with the same version as the currently installed Firefox (take note that it will not update). Page with links to the latest release builds: https://wiki.mozilla.org/Add-ons/Extension_Signing


This also works if you build from source, even if you build off mozilla-release. (Just tried it.)


Doesn't work for me. Using Arch Linux. I was already on Nightly when this happened.


What timezone are you in? I'm in UTC-4 (Detroit), and haven't seen any problems so far. (Also running Nightly on Arch Linux - I haven't made any previous changes to the addon signing either)


To clarify, by 'not working' I meant none of the addons with signing issues are re-enabled after changing xpinstall.signatures.required. I might have wrongly assumed this would happen. However, I tried installing a new addon I had never installed before and that works, but reinstalling one that I had previously installed still doesn't, even after uninstalling it (uBlock Origin).

My timezone is America/Los_Angeles.

EDIT: Sorry, I'm dumb. I actually have two versions of FF installed and I chose the one that wasn't Nightly.


This does not work with Firefox 66.0.3 in Arch Linux ...


Update: We have rolled out a partial fix for this issue. We generated a new intermediate certificate with the same name/key but an updated validity window and pushed it out to users via Normandy (this should be most users). Users who have Normandy on should see their add-ons start working over the next few hours. We are continuing to work on packaging up the new certificate for users who have Normandy disabled.


I've been through all of Firefox `about:config` a few times in the past, fixing preferences to, e.g., try to disable umpteen different services that leak info or create potential vulnerabilities gratuitously, but this is the first I recall hearing of Normandy.

Apparently I missed `app.normandy.enabled`, because I think I would've remembered a name with connotations of a bloody massive surprise attack.

Incidentally, `app.normandy.enabled` defaults to `true` in the `firefox-esr` Debian Stable package. Which seems wrong for an ESR.

For personal use (not development), I run 3 browsers (for features/configurations and an extra bit of compartmentalization): Tor Browser for most things, Firefox ESR with privacy tweaks for the small number of things that require login, and Chromium without much privacy tweaks for the rare occasion that a crucial site refuses to work with my TB or FF setup.

Today's crucial cert administration oops, plus learning of yet another very questionable remote capability/vector, plus the questionable preferences-changing being enabled even for ESR... is making me even less comfortable with the Web browser standards "big moat" barrier to entry situation.

I know Mozilla has some very forthright people, but I'd really like to see a conspicuous and pervasive focus on privacy&security, throughout the organization, which, at this point, would shake up a lot of things. Then, with the high ground established unambiguously, I'd like to see actively reversing some of the past surveillance&brochure tendencies in some standards. And also see some more creative approaches to what a browser can be, despite a hostile and exploitive environment. Or maybe Brave turns out to be a better vehicle for that, but I still want to believe in Mozilla.


I too use Debian's Firefox ESR. I noticed the "Allow Firefox to install and run studies" option in Privacy & Security Preferences a long time ago. It was unchecked and greyed out (i.e., unclickable), and a label below it says "Data reporting is disabled for this build configuration", so I gave it no further thought. This morning I woke up and launched Firefox, noticed this headline, and then noticed my extensions were still running. I looked in about:config and lo and behold, app.normandy.enabled=default [true]. I'll be filing a bug with debian to disable this in the build configuration.

Edit: There are some questions about whether Normandy is really enabled in Debian Firefox ESR even if the about:config setting defaults to true. I've filed a bug report, and I'm sure once a Debian maintainer has a chance to look at it we'll find out the answer.

https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=928433

Edit2: It should go without saying, but please do not spam this bug report with "me too" and its ilk.


I had mine disabled. So let's think about this for a second. If I disable a security hole that you can drive a semi-truck through, I remain foobar'd. If I run my "secure" firefox configuration, with the security hole enabled, then they un-foobar me first. Before anyone else. So I could effectively get rewarded, for always keeping a security hole open. But I didn't keep it open, so... yeah... they'll get around to me sometime.

?

>:-(

Grrr.

I'm just getting old and curmudgeonly maybe? I've decided though, I'm starting an animated security blog to show people the ludicrousness of all this kind of stuff in plain language. I'll be Statler, and I just need someone to be Waldorf. Because this stuff really is getting Statler and Waldorf level ridiculous.


> I'm just getting old and curmudgeonly maybe?

You're not. You just have standards.

We need people with standards in this industry, because that's the only source we have of market signals that prevent the market from going full user-hostile.


>If I disable a security hole that you can drive a semi-truck through, I remain foobar'd. If I run my "secure" firefox configuration, with the security hole enabled, then they un-foobar me first. Before anyone else. So I could effectively get rewarded, for always keeping a security hole open. But I didn't keep it open, so... yeah... they'll get around to me sometime.

That's needless drama. They will be rolling out the fix in a point release. Whatever way you use to update your browser will install that and get the fix. So the worst case is just going back to the old days where you'd have the issue until your distro issued a new package or you manually updated the browser version on Windows or OSX. What exactly would you expect that's not exactly what's happening?


> So I could effectively get rewarded, for always keeping a security hole open.

That's the way it always works, isn't it? Security and convenience are opposing concerns.


Good security designs increase convenience (e.g. SSH, Touch ID, single sign-on).

The three goals of computer security are integrity, confidentiality and availability.

All three of those expand the usefulness of the system to the end user.


But the point here is not about integrity, confidentiality, or availability. It is about whether you trust Mozilla, and how trustworthy they are.

A configuration where Mozilla cannot push remote updates is neither more secure nor less secure. Mozilla is often under fire for not allowing a privacy conscious, minimal trust use case.


Automatic updates aren't a security hole. They are a security enhancement.


Unless the entity that pushes the updates becomes malicious; then they're a security hole.


How do you audit Firefox updates? Because if the answer is “I don’t”, Mozilla already controls the most important piece of userspace code on your computer. And if the answer is “I don’t install them”, then everyone with a few grand to spare already controls the most important piece of userspace code on your computer.


I rely on the Debian system to assist with that. Normandy bypasses that system, if it's enabled. (The jury is out whether it's actually enabled in Debian Firefox ESR.)


What do you think the median size of a Firefox release is, what do you think the resources (let’s call it US dollars FMV) are to audit that, and what do you think the resources Debian has to devote to it?

Clearly more eyes are good, but... In between “Wild West WebExtensions” and “Mozilla backdoors my Firefox and it gets used for nefarious purposes” and “delays in browser updates increase exploitation windows”, I know which threat models I’m buying.


I agree an unpatched vulnerability is probably more risky. However, this feature can change settings the user explicitly set. The bigger issue is that it gives me no indication the settings have been changed.


I audit software updates by looking at developers' announcements and community discussions (on social media, forums, etc.) before installing updates.

Then, even if developers' keys and computers are compromised, I would notice something is wrong.

* No, of course I don't always do that. I don't even do it often. But I did do it in the past, and I'd like to have the option to do that.



Yes, that's right: if you install software that had a bug, then if you give someone permission to modify your software, you can get bug fixes faster.


There are already channels for bug fixes, and some of the friction on those channels is intentional, such as for visibility and oversight/approval.


Exactly.

Why even have an official channel, providing visibility and official oversight, if when it comes down to it, you're just gonna push remote code updates through the same side channel a potential hacker would use?

People are saying it's for convenience. OK, but then they have to understand that doing things in that fashion is a really bad look. And now your users are set up to believe that, at least some of the updates coming from the side channel are "trust"-able.


This is a piece of code downloaded from Mozilla servers to re-enable extensions, which are other pieces of code you download from Mozilla servers. If your threat model includes not trusting Mozilla servers then you've presumably disabled browser extensions (or sideloaded them) and this issue is irrelevant to you. If you do use extensions and get updated versions from Mozilla I don't see any way in which this increases your attack surface.


> noticed my extensions were still running.

Reportedly, Firefox only checks the date once per day, so if it hasn’t yet checked for you today, this will be the result.

> I looked in about:config and lo and behold, app.normandy.enabled=default [true].

I would assume that the config setting only has any effect if the feature is available in the build. Which it isn’t in Debian.


That would be good news. How can I verify that the Normandy feature isn't available in the Debian build?

Has Mozilla provided instructions to manually fix the issue? If so, where? (XORcat was helpful to provide a solution, but I refuse to apply it if it doesn't come from Mozilla itself...)


> how can I verify that the Normandy feature isn't available in the Debian build?

In the Debian bug about this issue¹ it says “Firefox from the Debian package has data reporting disabled so using studies is not possible.”

1. https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=928415


Thank you for reporting it. Debian's pro-user stance is one of the things I like about it... now tell me they've disabled Pocket too and I might just switch from Arch to Debian Sid.

FYI, I have learned from other users' comments and the wiki page below that Studies and Normandy are different things. The former depends on the latter, but not vice versa. So it is possible that Debian disabled the studies program but did not disable the underlying Normandy tool. You might also want to look at whether firefox is affected in addition to firefox-esr.

https://wiki.mozilla.org/Firefox/Normandy/PreferenceRollout#...


On Windows I have the "Allow Firefox to install and run studies" option disabled, and yet in about:config Normandy was still enabled. I haven't received the fix. Could be that Firefox simply hasn't checked for it, or it could be that there's more than that one about:config setting determining whether Normandy runs.

Weird.


Posting an update for posterity. It appears that even with app.normandy.enabled=default [true], as long as app.shield.optoutstudies.enabled=false, Normandy is disabled. app.shield.optoutstudies is the key controlled by the UI element "Allow Firefox to install and run studies".

I've closed the above bug report as it's not really a bug.
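For anyone who wants both switches flipped explicitly, here is a minimal user.js sketch with the two prefs discussed in this thread (needing both is my assumption; per the above, the optoutstudies pref alone may be enough):

    // user.js sketch: keep Normandy/Shield from applying remote changes.
    // Pref names are the ones discussed in this thread.
    user_pref("app.normandy.enabled", false);
    user_pref("app.shield.optoutstudies.enabled", false);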


Can we get a clarification:

Unchecking "Allow Firefox to install and run studies" in the UI does not change "app.normandy.enabled" to "false".

Then, does unchecking "Allow Firefox to install and run studies" really disable Normandy, or not?


As explained on Normandy's wiki page, they are related but two different things:

> Preference rollout is meant for permanent changes that we are sure of. Shield is meant for testing variations and figuring out what, if anything, is the best thing to do.

https://wiki.mozilla.org/Firefox/Normandy/PreferenceRollout#...


Except, as we have learned, "preference rollout" is also "installing extensions". So this is much the same as studies, but studies was disgraced, so now this is studies 2.0, with no option to disable this time around.

And if you look at the big normandy JSON, hey, it's all the same Pocket and heartbeat shit we've seen from studies.


That doesn't answer the question: does Normandy get disabled by the UI option or not?

One can guess based on the wiki page that the answer is "no", but that's just a guess.


"Explained" is perhaps too generous a word. I'm a software engineer and I found that page to be confusing. It seems to be written for internal Mozilla employees, not for the general public.


Okay I will try


Is there a CVE for this "normandy" backdoor?


ESR means extended support, not "never changes". As long as it is supported (and it is: the bug got fixed), it's appropriate for ESR.


As I said elsewhere: There are already channels for bug fixes, and some of the friction on those channels is intentional, such as for visibility and oversight/approval.


And this is another channel. How can you be sure this one has less friction than the others?


I read at https://discourse.mozilla.org/t/certificate-issue-causing-ad...

>12:50 p.m. UTC / 03:50 a.m. PDT: We rolled-out a fix for release, beta and nightly users. The fix will be automatically applied in the background within the next few hours, you don’t need to take active steps.

>In order to be able to provide this fix on short notice, we are using the Studies system. You can check if you have studies enabled by going to Firefox Preferences -> Privacy & Security -> Allow Firefox to install and run studies.

>You can disable studies again after your add-ons have been re-enabled.

>We are working on a general fix that doesn’t need to rely on this and will keep you updated.

I refuse to enable studies, even temporarily. This comes very close after the IE6 conspiracy revelation, where the ends justify the means.

Please provide a link to the certificate file, and step by step instructions for installing it, without enabling and conflating with mozilla studies...


JSON response from the `normandy` API here: https://xor.cat/assets/other/random/2019-05-04/normandy_sign...

hotfix-update-xpi-signing-intermediate-bug-1548973: https://storage.googleapis.com/moz-fx-normandy-prod-addons/e...

From the looks of it, it installs the above plugin and changes `app.update.lastUpdateTime.xpi-signature-verification` to `1556945257`.

I can't get it to work in ESR 60 though. Getting file not found on "resource://gre/modules/addons/XPIDatabase.jsm"

edit: The linked XPI definitely seems to add the new certificate; whatever mechanism is used to reverify the signatures just doesn't seem to work in 60.

edit2: Restarting Firefox appears to have forced the reverify... Possibly a flag that I twiddled with though, hard to be sure. Either way, the above should help people get everything running again without having to enable studies/normandy.


Yes, this is broken on ESR, but only somewhat broken.

The hotfix extension does two things:

1) Install a new certificate for "CN=signingca1.addons.mozilla.org/emailAddress=foxsec@mozilla.com", effectively replacing the old certificate that expired. This should work.

2) Then it tries to import the internal "resource://gre/modules/addons/XPIDatabase.jsm" module and calls XPIDatabase.verifySignatures().

This does not work on ESR, as "XPIDatabase.jsm" is a new-ish thing that isn't present in ESR yet. In ESR the function is still in "resource://gre/modules/addons/XPIProvider.jsm" (XPIProvider.verifySignatures()). Thankfully, the non-existent module is imported using ChromeUtils.defineModuleGetter, which only lazily loads the module on first use of the imported property, so after the certificate-adding code has run.
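If anyone wants to poke at this themselves, a rough sketch (not the actual hotfix code) of the ESR-side equivalent is below. It assumes the replacement certificate has already been added (e.g. by the hotfix add-on) and that it runs in a chrome-privileged context such as the Browser Console:

    // Sketch only: on ESR 60 the verification entry point lives in
    // XPIProvider.jsm rather than XPIDatabase.jsm. Re-run the add-on
    // signature check once the new intermediate cert is installed.
    var scope = {};
    ChromeUtils.import("resource://gre/modules/addons/XPIProvider.jsm", scope);
    scope.XPIProvider.verifySignatures();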


So not only does this 'normandy' thing exist, but it goes to a google server? So much for using Firefox to keep google out of my life. :(


That's an interesting question: when we install add-ons or extensions, are these hosted on Google servers? I'd rather not have Google know what versions of which add-ons I am running...


Hey, if you just click on that storage.googleapis.com link it installs the hotfix directly without having to enable normandy ;)


Unrelated to the cert problem: Yes, clicking on the link installs the plugin, but it is surprising to see that Firefox claims that it is news.ycombinator.com, not storage.googleapis.com, that wants to install the plugin. Could it be a security issue? If an attacker somehow manages to post/inject a link to a malicious plugin on a credible site, Firefox will claim the plugin is from that site.


oh wow! that's really bad


I just installed some random googleapis link. This is so stupid, and very disappointing from Mozilla.


This should be a sticky comment somewhere at the top of the thread. It brought all the addons back for me.


It does, but it didn't fix anything for me. All my extensions are still gone. :(


You might have to reinstall them, unfortunately. On the system where I figured that out, Firefox had decided to uninstall them (I think because I first had to update the browser from the ancient version the user was running).


Just tried on Android. Hooray!


Clicking the URL was the only way I was able to get the hotfix on Firefox mobile for Android


Thanks for the sleuthing, but who does this repository belong to? I'd like to apply it, but only if Mozilla provides such instructions on their issue page; I don't know who the actual owner of /moz-fx-normandy-prod-addons/ is...

https://storage.googleapis.com/moz-fx-normandy-prod-addons/e...

Can mozilla please verify, confirm authenticity, and list this instruction on their issue page?


I would have the same question if I didn't see the response come back from https://normandy.cdn.mozilla.net/ myself.

I encourage you to go through the whole Normandy process yourself in a test environment, and even better (if possible), check out the code to see whether it looks legit or benign.

I'm happy, because I went through and checked it out myself without needing to enable Normandy on my actual Firefox, but ultimately, it will be great when Moz can get instructions for manually applying the fix out.


>hotfix-update-xpi-signing-intermediate-bug-1548973: https://storage.googleapis.com/moz-fx-normandy-prod-addons/e...

This fixed it for me. Thanks. W10/FF 66.0.3


The last sentence you quoted literally said that they will provide you with the option to fix this without needing to enable studies.


correct, and I am emphasizing and pointing out my choice to wait such that others can make the same informed choice if they so wish.

(I would have wanted to read my comment if someone else had written it, so by the golden rule I make the comment I wish I had read)


> This comes very close after the IE6 conspiracy revelation, where ends justifies the means.

What?!


you probably missed this story: https://news.ycombinator.com/item?id=19798678


I actually did read that story but I don't understand what that has to do with anything being discussed here.

Yes, Youtube put up a banner asking IE6 users to move to a more modern browser 10 years ago. How is that in any way related to Firefox pushing a hotfix in 2019 to fix a certificate issue? Are you worried there is a big evil conspiracy to use this mechanism to uninstall Internet Explorer from peoples' computers?!


Okay, so, youtube targets a small subset of users, and changes their experience capriciously, and to suit their own purposes.

Firefox, it turns out, has a built-in telemetry system that defaults to enable exactly the same behavior: changing your system, to suit their desires.

Your words “a big evil conspiracy to use this mechanism to uninstall Internet Explorer from peoples' computers” are misleading. No one would propose that the intent is an attack on Microsoft applications. Rather, the intent is to blindfold users on a whim, should a Firefox component prove inconvenient to the providers of Firefox. Ostensibly, in the event that some add-on or extension threatens the bottom line for major backers of Firefox’s funding.


> Firefox, it turns out, has a built-in telemetry system that defaults to enable exactly the same behavior: changing your system, to suit their desires.

An example of the typical use of this system: say Mozilla wants to enable video hardware acceleration in Firefox but they don't know if bugs in video drivers or in Firefox will make crashing more frequent. So they enable hardware acceleration for 1% of users instead of 100% and compare the reported crash rate between the two to determine if it's ready to be pushed out universally.
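Purely to illustrate how that kind of staged rollout usually works (this is not Mozilla's actual code, and the function name is made up): the client hashes a stable per-profile id into a bucket and compares it against the configured sample rate, so the same 1% is selected deterministically.

    // Illustrative sketch only: deterministic ~1% sampling by hashing a
    // stable profile id into one of 10,000 buckets.
    function inSample(profileId, sampleRate) { // e.g. sampleRate = 0.01
      let h = 0;
      for (const ch of profileId) {
        // Tiny non-cryptographic hash; real systems use something stronger.
        h = (h * 31 + ch.charCodeAt(0)) >>> 0;
      }
      return (h % 10000) / 10000 < sampleRate;
    }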


At some point in the next five-ten years we will see this "feature" abused. Maybe Mozilla will use it to "soften" commonly used ad blockers to enable "acceptable" ads for Firefox users. Maybe Mozilla will be hacked by some government that wants to enable MITM attacks against its citizens, and Normandy will make that happen. Or maybe Mozilla will just cooperate with the government trying to do so.

You say it is "typically" used for benevolent purposes, but why should we trust Mozilla? Mozilla does not have a stellar history with this sort of thing and in my experience they do not take security as seriously as they should if we are to trust them with such a feature.


The level of paranoia throughout this thread is truly through the roof.....

Mozilla has had several "PR nightmare" decisions that a vocal set of users didn't like, and sometimes were genuinely ill advised/bad/shitty. But as far as I can see they do not have a bad track record when it comes to security/privacy. Do you have any examples of actual serious security/privacy fuck ups by Mozilla/Firefox? I mean that stood up to scrutiny beyond the sensationalist headlines?

Their defaults might not be your defaults, but they are even working on bringing Tor into mainstream Firefox. None of this means they are above criticism of course, but... context!

The sum total of their actions points towards an organisation that has some internal problems but that is genuinely pursuing privacy and an open web as a goal for as many users as possible.


> But as far as I can see they do not have a bad track record when it comes to security/privacy. Do you have any examples of actual serious security/privacy fuck ups by Mozilla/Firefox?

I mean, they are currently shipping real actual ads on the new tab page that aren't blocked by ad blockers - and possibly can't be (there are limits to what WebExtensions can modify on Firefox internal pages). Sure, maybe your parent comment was exaggerating a little bit, but what if Mozilla instead starts inserting "privacy-friendly" "recommendations" into webpages in order to "enhance users' browsing experiences"? That doesn't sound at all far-fetched for the Mozilla we know today.


Besides your claim not being true as far as I can tell [1] (there are no ads on my new tab page, and as far as I can tell there was no incident of paid-for content showing up on people's new tab pages), how exactly would shipping ads be a privacy/security violation?

This is exactly the sensationalist misrepresentation I was talking about. You don't like what they are doing, fine. Misrepresenting it as something that it's not is not fine.

Besides: Mozilla is funded in large parts by having Google as the default search provider. This means they are funded by Google selling ads. Them starting up new revenue streams and getting away from that funding model would be a pro privacy step.

[1] If you are referring to something else that I missed, feel free to enlighten me.


Maybe you've opted out of studies or otherwise disabled Pocket? That's how they're bundling much of this new stuff in.

See: https://help.getpocket.com/article/1142-firefox-new-tab-reco... especially the part that says "From time to time, the occasional sponsored story may appear as a recommendation from Pocket. These stories will always be clearly marked, and you have control over whether they’re shown on your new tab page."

All so-called recommendations I've seen have been spammy, the sort of stuff you see linked as "other articles you may enjoy" when you disable your ad blocker on bad sites. Regardless, this directly contradicts your claim that there haven't been incidents of sponsored content on the new tab page: this is explicitly what is happening according to Pocket's own website. Mozilla themselves explicitly said they are introducing sponsored stories to the new tab page: https://blog.mozilla.org/futurereleases/2018/01/24/update-on...

I think there's a world of difference between making a search engine that sells ads the default, and selling ads yourself and inserting them into the browser's chrome. Among other issues, if I help someone install an ad blocker, that ad blocker will block ads on Google, but will not block ads in the browser chrome.

So, given this and other recent behavior by Mozilla, I have to say I don't think seeing "related stories" inserted into the browser chrome for certain web pages is at all far fetched. That should worry us.


I thought you were referring to the snippets.

I actually don't see the pocket recommendations on my desktop (maybe the Linux Mint build has them disabled by default), but they are there on mobile. There is a UI setting to disable them of course. It's explained right on the page that you link to.

More importantly, that page also explains that no data gets sent to Mozilla or pocket or anyone else for these ads to show up.

So again, no privacy violation here. I also think it's an extreme leap from "they show this in the new tab page which they design and control" to "they could start showing it overlaid on other people's content".

I think they got some decisions very wrong. Among them not implementing a way to allow people to override signing of addons, which people did warn about. Having signatures enforced as a strong default is certainly good and right, but if they had included a "right click on addon, use without signature (WARNING THIS IS SKETCHY REAL ADDONS DON'T ASK YOU TO DO THIS)" option this signing issue would have been relatively mild.

But their track record on privacy/security simply isn't as bad as people make it out to be.


> Do you have any examples of actual serious security/privacy fuck ups by Mozilla/Firefox?

Sadly I don't, but others argue they have top notch standard security practices like automated alerts etc. regarding certificate renewals...


…This one?

This one isn’t very privacy-friendly or open. And that raises all the previous questions again. Should they maybe have learned something about clandestinely fucking with people’s systems?


> Okay, so, youtube targets a small subset of users, and changes their experience capriciously, and to suit their own purposes.

They added a dismissable banner. That falls far short of "changing their experience", in my mind.


> pushed it out to users via Normandy (this should be most users)

Is the existence of a back door method of updating Firefox preferences something that will be disclosed to users? What about a UI knob to disable it?


This is the first I hear about Normandy[1]. Firefox has been my main browser for a long time, only because I could use uBlock origin. Now, all of a sudden that is disabled, and with the recent version they got rid of my ability to always prevent autoplaying of videos.

Apparently, there is no one associated with browsers can be trusted in the least.

[1]: https://wiki.mozilla.org/Firefox/Normandy/PreferenceRollout


In the recent version we added the ability to always prevent autoplaying of videos, in the next version we will be adding further UI to let the user disable all (not just muted) videos from autoplaying - https://bugzilla.mozilla.org/show_bug.cgi?id=1543812


That does nothing to mitigate the wholesale disabling of already trusted plugins like uBlock.


Well, if you follow the OP, you'll realize that it's a bug and people are working to fix it :)


The bug is that plugins can't be manually enabled. Nobody is working to fix that.


I have spent ~10 years using Firefox daily, tweaking the config and getting the addons set up the way I want. I was a professional web developer for most of those years.

This is the first I have heard of Firefox changing my config settings invisibly in the background. This is obscene. Who on earth thought this was a good idea? The security ramifications are limitless.

I understand all too well that most companies have decided to start A/B testing things on subsets of users, but that doesn't mean you should force that mode of thinking into everything. What a horrible decision. I don't recall ever seeing any news or notifications or checkboxes about studies or "Normandy" at any point.

Are there some other good open source alternatives to Firefox? I remember hearing about Brave but also that it was tied into some cryptocoin nonsense, so I'm not sure what else to look at.


>>This is the first I have heard of Firefox changing my config settings invisibly in the background.

you must not have been paying attention the last 3 or so years

Mozilla is doing all kinds of, IMO, unethical things with Firefox that go against the core values of the Mozilla Foundation's mission statement.

They are too busy trying to replicate Chrome to care about privacy, security, or basic user rights


I read all about the DRM stuff but I figured that was just the awful standards boards being awful standards boards.

I didn't realize what a true mess Mozilla had become.


Not just DRM

Looking Glass, Pocket, banning plugins based on political ideology, backdoors like Normandy and the STUDIES system, their creation of what amounts to a Mozilla version of the Ministry of Truth, their partnership with Cloudflare to send everyone's DNS to Cloudflare over HTTPS, and a whole host of other things.


The way that Firefox needs 5-10 privacy extensions to be usable isn't just inconvenient when the certs fail, but you also have to trust all these strangers and their extension code.

I've been using Brave because of that: all of that is baked in, so my only extension is my password manager.


So why do you use a browser made by an ad company, one that is all about analysing your browser history and targeting ads at you?


Exactly the same here, Brave + a password manager after 25y of Firefox/Netscape/Mosaic.


> Is the existence of a back door method of updating Firefox preferences something that will be disclosed to users?

It will even be documented for them: https://wiki.mozilla.org/Firefox/Normandy/PreferenceRollout

> What about a UI knob to disable it?

app.normandy.enabled


> app.normandy.enabled

That is not what I meant by a UI knob, and I sure hope you knew that. By UI knob I mean something easily discoverable and self-explanatory. Rooting around a gated (with a mighty strong warning, I should add) config section for something called "normandy" is not intuitive, and it's not self-explanatory.

And I sure hope you knew that by "disclosed to users" I did not mean some Hitchhiker's Guide-esque disclaimer on a wiki page. Something as (potentially) insidious as a preferences backdoor should absolutely be disclosed to users with the same level of visibility as the stories nonsense.

Perhaps "normandy" is entirely harmless, but you guys lost a metric fuckton of credibility by using your backdoors to spam people[1]. Playing coy does nothing to improve your credibility or reputation.

1: https://www.theregister.co.uk/2017/12/18/mozilla_mr_robot_fi...


I'm sorry to break it to you, but a fuckton is not actually part of the metric system...


This unit modifier was specified under RFC 69420


Well, it should be, but that's an entirely different discussion.


The UI knob is

    Options -> Privacy & Security > Allow Firefox to install and run studies
They're using the studies system to push this hotfix faster for those that have it enabled.

Edit: Source:

See: https://discourse.mozilla.org/t/certificate-issue-causing-ad...

> In order to be able to provide this fix on short notice, we are using the Studies system. You can check if you have studies enabled by going to Firefox Preferences -> Privacy & Security -> Allow Firefox to install and run studies.

Normandy seems to be the internal name for this system: https://github.com/mozilla/normandy


Why is it supposed to be reassuring that their “studies” can override the cryptographic infrastructure?

Edit: rephrase for clarity


Thank you.

I happen to be one of the users with Normandy disabled, so I'm foobar'd anyway. That said, the reason I disabled it is because it is a security hole you could drive a semi-truck through. And now they want us to enable it to provide a "fix" for the secure way in?

I thought I was the only one who saw a problem with that. Your post is evidence that I'm not completely off in my thinking.


The studies system is also code-signed, but with a different certificate chain, hence why it wasn't affected. What security hole do you think this opens in Firefox?


And thank you for assuring me I wasn’t alone in worrying about that!


If you don't trust your software provider, "studies" don't matter. The same thing could come through a regular update. If you don't want to be on the bleeding edge, that's fine, and if the UI for Normandy is bad, that's an issue, but it's nonsense to accept updates and then say you don't want updates.


No, it's not. This Normandy nonsense and studies are two separate, yet creepy, features. I've already disabled studies, but it looks like Mozilla still retains control of my preferences (without disclosing it).


I sure wonder how people so suspicious of Mozilla dare use their browser.


Easy: There's a difference between static, shipped code and a capability to modify software at a distance (which could even be hijacked by an attacker who infiltrates Mozilla's infrastructure).


If your threat model includes the hijacking of Mozilla's infrastructure, I assume you read and verify the entirety of the Firefox source with every new version before using it, right?


Obviously not?

But there are trustworthy people working with and integrating that code, there's a good chance they'll notice a hinky commit, and they're very close to having completely reproducible builds—which means that there can be verification that the shipped binary matches the inspected source.

https://gregoryszorc.com/blog/2018/06/20/deterministic-firef...


Because Mozilla is easier to lock down than Chrome.

I guess "easier" isn't the word really, because Chrome can't really ever be locked down. It's pretty much always, effectively, an open book to Google.

You can lock down everything in Firefox. The drawback being, of course, times like this, when you can't get the fix unless you leave Normandy enabled. (Which I didn't.)

>:-(

Grrrr.


Setting preferences really should not be shocking, given that they have the capacity to run automatic updates. I'm more surprised that they can push code without certificates.


The expired certificate seems to be in a chain concerning extensions. Not necessarily the same chain concerning core browser updates...


> I'm more surprised that they can push code without certificates.

Where are you getting this from? AFAIK all Mozilla code / prefs they can push should be signed -- this very issue seems to stem from the cert used to sign AMO extensions expired.


They are using the Studies system in complete violation of the way they said they would use it when it was announced. This is not surprising, since Mozilla is becoming about as trustworthy as Google or Facebook.


>The UI knob is > Options -> Privacy & Security > Allow Firefox to install and run studies

Well it's a half-assed knob then, because it was unchecked and still I had app.normandy.enabled = true somehow.


[flagged]


I'm nearly certain Normandy does not log all of your browsing history for what it's worth.

I agree Mozilla approach to stuff like this is... less than ideal.


> now I find out all my browsing history has been logged to Firefox servers.

Where are you getting this from?


I am getting it from the simple fact that when I looked at Normandy-related settings, a unique ID and an API endpoint screamed at me... Let's assume the explanation given here regarding Normandy's endpoint is legitimate. Why am I assigned a unique ID? How hard is it to conclude that, for the past N years, despite telemetry and studies being turned off, my browser has been pinging Mozilla with this unique ID? Until proved otherwise, it is safe to assume that this was used to track browsing.


Disclaimer: I work for Mozilla on the operations team responsible for Firefox's backend services, including Normandy.

TL;DR you are not sending us your browsing history.

If telemetry and studies were turned off, your browser wasn't sending us this unique id.

If you had kept them enabled, for normandy telemetry you would have been sending us the data described at https://firefox-source-docs.mozilla.org/toolkit/components/n...

You can read more broadly about what data Firefox sends by default at https://www.mozilla.org/en-US/privacy/firefox/

And learn more about the review process any data collection has to go through at https://wiki.mozilla.org/Firefox/Data_Collection


>> Is the existence of a back door method of updating Firefox preferences something that will be disclosed to users?

> It will even be documented for them:

That sounds like you do not think the concern is warranted. I've used Firefox since the first time it was available, and Netscape starting with the first ever betas. At no point was there a dialog that said "Do you want us to be able to change your browser settings remotely?"

>> What about a UI knob to disable it?

> app.normandy.enabled

That is not a "UI knob" by any stretch of the imagination. Looking in about:config revealed:

app.normandy.logging.level

Is there a way to find out what is being logged and why?

So, the question can be rephrased as "is the fact that Firefox has been logging all users' entire browsing history despite the fact that the user has not chosen to set up a Firefox account going to be disclosed?"


> So, the question can be rephrased as "is the fact that Firefox has been logging all users' entire browsing history despite the fact that the user has not chosen to set up a Firefox account going to be disclosed?"

Chill out, this preference only determines what is logged locally (never sent to the server). It's a debugging tool.

Sources:

- https://searchfox.org/mozilla-central/source/toolkit/compone...

- https://searchfox.org/mozilla-central/source/services/common...


Look, at this point it’s not the user’s responsibility to “chill out”. It’s very much Firefox’s responsibility to try to repair their reputation by:

1. being completely transparent about all the mechanisms that data or code can be pushed to or pulled by the browser, or pushed from or pulled from the browser; and

2. having a toggle for all of them, yes every single one, in Privacy & Security.


From the wiki entry.

> Normandy Pref Rollout is a feature that allows Mozilla to change the default value of a preference for a targeted set of users, without deploying an update to Firefox.

Rolling out a new certificate goes beyond changing the default value of a preference which rightly raises questions about what else Normandy allows which is not documented.


> app.normandy.enabled

I fail to see both the "UI" and the "knob" part of this. Why is this not a checkbox in the preferences? Why do so many people not know about Normandy?


Agree with this concern.


One result of this: when I use Firefox from now on, I'll be disabling "Normandy".


I've deleted my extensions thinking it was the extensions' issue. I'm trying to download them again, but it's telling me I don't have an internet connection. Any workarounds?



Any way to preserve this workaround's functionality beyond 24 hours?


Works like a charm!


Thanks!


How do you enable Normandy on Firefox For Android? There's no Normandy in the about:config.

This is such a gigantic mess; even for a very loyal Firefox user it's hard to swallow.


Is it weird that this problem only happened to me (add ons all being disabled) roughly 10 minutes ago?

I was browsing fine this morning for maybe four hours and now all my plugins/addons are gone.


The check is done every 24 hours and it seems for you these 24 hours were over roughly 10 minutes ago.


No. That’s not weird. The validity check runs once a day.


No, because the browser only checks for updates to your addons every day or so, not every five seconds.
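If you don't want to wait for the next daily check, my guess (untested, and purely an assumption based on the lastUpdateTime pref mentioned elsewhere in this thread) is that resetting that timestamp makes the verification run again sooner:

    // Guess only: the daily signature check appears to be keyed off this
    // timestamp (seconds since epoch); resetting it to 0 in about:config
    // or user.js should make the check run again shortly after restart.
    user_pref("app.update.lastUpdateTime.xpi-signature-verification", 0);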


Thanks. This must be terribly stressful to you.


Meanwhile, having to browse the web without an adblocker has been nothing but relaxing for everybody else.


Firefox includes content blocking, so it's not that bad.


Sorry, the stress someone on the Firefox team must be experiencing would easily be magnitudes beyond what we are.


No offense, but they're not getting inhumane shock treatment either. If you're going to pull "our stress is holier than thou" and say it's magnitudes higher, I'll be waiting for scientific backing on this, or else it's just rude... plus they can always just not make me periodically re-install all my addons with no option to just bypass verification... except this time it doesn't work even with the fixes. (Actually, one time I just didn't see it was set to update automatically, downgrading one of them. I'm no expert on these issues, really. They just could have asked first, in my opinion.) Thanks, Mozillama.

Edit: I had to click "Restart with addons disabled (safe mode)" for those wondering.


There is a before and after with adblockers. It's a real pity they are the worst privacy-violating tools ever.


> It's a real pity they are the worst privacy-violating tools ever.

What do you mean?


They read your entire page content, they have access to all your information.


But there are free software ad blockers like uBlock Origin.


If uBlock got compromised as an extension, even for one day, the amount and value of data uBlock can collect would make anyone very, very rich.

It's on the level of saving passwords as plain text, in terms of privacy.


How is it different from any other browser extension? They are all like that, aren't they?


It depends on the permissions: uBlock has permissions to read the entire page on any domain. Some extensions only limit themselves to specific domains, or don't touch that at all.

uBlock still reads your bank website, etc.


Meanwhile, as the "partial fix" is deployed, here is what I think fixed the issue for me: in the Preferences, under "Privacy & Security", check "Allow Firefox to install and run studies" (then wait for the current hotfixes to appear in about:studies then restart the browser).


I don’t like you editing my preferences. How can I switch to this new certificate manually?



I still get “Download failed. Please check your connection” when attempting to re-download the add-on you took away without asking me. That thing was pretty much the only reason I used Firefox at all.


What is Normandy?



So is that a backdoor into my prefs? How can I check if Normandy is active on my installation?


Something with a public wiki page describing what it does exactly is hardly a backdoor.

Also here's the code for the server: https://github.com/mozilla/normandy


The wiki entry evidently doesn't describe what it does because according to the wiki entry it allows for the enabling and disabling of preferences. The updating of a certificate is beyond what is described in the wiki.

Mozilla should follow up with a post describing exactly how Normandy works and the full capabilities it gives them.


From what I understand, Normandy is an infrastructure for delivering some changes to some (or all) Firefox users. There are two major use cases: preference rollouts and studies. In the first case, default values of preferences get changed (if your pref has a non-default value, it won't affect you). In the case of studies, some piece of code gets delivered and executed, which can do anything. In this hotfix, the study installs an add-on, which in turn installs the certificate.


[flagged]


Users shouldn't have to search for, and then be able to understand, the code behind such a feature. When a remote capability such as this exists, it is Mozilla's responsibility to document how the feature works and the exact capabilities it gives them. Instead of doing so they have produced a wiki entry which appears to falsely describe the capabilities of this remote feature by stating it is used to change default preference values.


Hacker News

I think people here can be expected to read some code if they are interested in how something works.


[flagged]


[flagged]


Please read and follow the site guidelines when commenting here.

https://news.ycombinator.com/newsguidelines.html


All code is available – as a tar.xzipped archive of Firefox source code containing over 150k files and measuring over 1GB in size when unpacked.


grep -iR normandy

I expect code related to normandy to be ~1k LOC in size and probably written in JS. I haven't checked though, because I don't really care today.


And you shouldn't have to care. No one should. The very fact that this exists and that we are expected to trust it is very disappointing.


Open source software can't have a backdoor because the code is available to review.

Got it.


Type about:config in the address bar and search for 'app.normandy.enabled' flag.


Well that's interesting. I see Normandy enabled, but if I go to the "Privacy and Security" section of the preferences page I see all the data collection and use stuff disabled. There's no obvious way to disable the Normandy back door.

Oh well, at least we don't have another season of Mr Robot spam to look forward to.


Presumably the logic is something like:

    if (studies.enabled) {
      if (normandy.enabled) {
        ...
      }
    }


> There's no obvious way to disable the Normandy back door.

???

It's a publicly documented feature with a publicly documented way to disable it.


With an obscure name and no correlation to all the other spying and backdooring Mozilla are doing. Is this really the best option to get a privacy-focused browser? I think this is all very worrying.


Can you elaborate on what 'other spying' Mozilla does? Do you mean their telemetry?


Spying was the wrong word. But yes, the telemetry. The Google Analytics that are hidden on the extensions page, which only respect Do Not Track, but not the turn-off-telemetry checkbox. Sadly it just doesn't seem to stop.


It's named after a world famous beachhead of an invasion. The name isn't that obscure for a feature that invades the userbase with a takeover.


I am not sure what 'spying' Mozilla is doing, but I agree this should be better named and better highlighted.


Here the flag is true, but the extensions are still unsupported.


It may take a little time for the partial fix to be distributed.


Not so in Firefox for Android. No normandy to find.


Thanks! Do you know how many active users were affected by this certificate error and subsequent addon disabling? I guess most users were spared due to the timing and short duration of the error.


Their system runs a check every 24 hours. So a lot of users were affected. They've rolled out a partial fix via their Normandy thingy, and according to them it will fix the issue for most people over the next couple of hours.


Some info on the mozilla.org landing page would be useful.


I got the update through studies (I can see it in about:studies), but it didn't fix anything for me.


It seems to have worked. My extensions reappeared.


Help my business account


Payment method messenger all app link stor all removed link


What bothers me is the fact that everyone seems to be accepting the usage of these newspeakish romantically sounding euphemisms that are completely opaque about the purpose of things they stand for. "Normandy"... What is this - a place, a sort of champagne, a hotel or the internal name of some freaking area51 document?..

Like, "In order to disable Normandy, uncheck Vaduz, Monterrey, Vologda and select Newcastle in the Xinjang drop-down menu - we'll ship Bronx with next update".

P.S. Keep flagging, I'll repost, no problem.


Just discovered the same message in the Tor browser, and it seems that NoScript got disabled. So people running Tor are a lot more vulnerable right now.

Also, wow, the web has a ton of ads. I've been running uBlock origin so long I forgot how bad it had gotten :(


Considering that JavaScript has been used in the past to unmask Tor users, this is a frightening security bug and is not "fail-safe" behavior. The extension should remain enabled, but with a warning.

It is doubtful that Mozilla will change this behavior, as they will likely consider it a niche case, but the Tor browser should probably look into alternate means of changing the behavior (patching).

Edit: apparently the packaged versions of NoScript and HTTPS Everywhere were not affected. See thread here https://old.reddit.com/r/TOR/comments/bkg7vf/due_to_a_bug_in...


That's really bad because it means it also affects Firefox ESR, which is what lots of large enterprise users have installed.


That really sucks that Tor was vulnerable to this too.

Tor needs to fork Firefox "properly" and remove all of Mozilla's bullshit like I've seen some forks do (Waterfox?). I thought this would be common sense for the people at Tor.


That was predictable.


Well, it was. I mean, it's Firefox!

But see https://trac.torproject.org/projects/tor/ticket/30388 for the same temporary fix as at https://news.ycombinator.com/item?id=19823928


Yikes.


The more people who use ad blockers, the more ads websites need to show to make the same amount of money. It's been brought up that many Twitch streamers don't receive ad revenue from more than half their viewers.

I can only find a source right now for YouTube, but they're out there for twitch too.

[1] https://www.vg247.com/2015/10/30/around-40-of-pewdiepies-aud...


And? If ads were ads and not malware trackers I wouldn't care. Calling these franken-programs ads stretches the word past its breaking point.

No one can stop actual ads - this comment was brought to you by Pepsi, Pepsi for the love of it. See? Everyone had to read the last sentence even if they had adblock on.


Requiring ads like that will only kill small businesses. Big companies can get custom ads like that. Small ones have to rely on Adsense.

You can justify it however you want, the fact is if you remove automated ads from the internet, the internet would be a lot smaller than it is today.


To add to the specific example of twitch, their ads are broken and annoying as hell.

The broken:

- they still don't have the volume of ads under control

- the android app regularly freezes during ad display

- sometimes it disrupts and buffers the stream without then displaying the actual ad

And possibly more, I wouldn't know since all these are enough to make me either not watch twitch or block ads. I disable it once every few months to see if it got better though.

The annoying:

- the same ad every time often (when The Grand Tour started again this year, it was the only ad that ever played for me)

- most ads seem to be trailers for TV shows or movies. Most of those spoil half the story

- if you just want to see what some streamer is doing you have to watch an ad first

Twitch Prime was the only reason I still had Amazon Prime when it removed ads officially. Not anymore.

Twitch turbo was great before Twitch Prime and I had it. But now it's 9,99€ per month which I find outrageous, especially because the streamers will see very little of this money anyways, afaik.


> Also, wow, the web has a ton of ads. I've been running uBlock origin so long I forgot how bad it had gotten :(

Try turning it off. I got rid of uBlock after Ars Technica complained about a lot of their users blocking ads years ago, and it honestly isn't that bad. Every once in a while I do back out of a page for maxing out one of my CPU cores but otherwise, nothing ever bad happens. With ads: either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and i follow the ad because I am interested and I want to support the website.

The alternative is websites charging insane amounts of money with paywalls (the Wall Street Journal has their "best" price for 12 months at $360 a year). That is horrible because it means only rich people can pay for high quality news, as ads are one of the most progressive forms of payment (rich people's ads are way more valuable than poor people's, and yet everyone gets the same quality services/news with the ad model despite their income/net worth).


The alternative is those websites not using third party ads with third party trackers on them. Adblockers already do not block those (because they're indistinguishable from regular image links). If they really just want my eyeballs they know how they can get them.

But they really want to track me. And I'm not having that. The moment they stop tracking their users through third party ad networks, most adblockers stop blocking (because there's no AI involved and they wouldn't know what to block except images in general).

It's in their hands, really. If they want to show me ads they can do it in a normal and decent manner.

News websites should in fact be the first to adopt this model, because it's exactly the same thing as ads in print media. But they chose to get those disgusting third party tracking networks involved. And not just one or two.

I don't have to put up with that, but I really don't see why any action should be required on my side to stop blocking those tracking ads.


Just FYI, that's not really true. Ad blockers mostly all use the same filter lists, and those do regularly block ads that are just regular images, and even plain text. https://www.troyhunt.com/ad-blockers-are-part-of-the-problem... is an example, even if that specific one got resolved.

Adblock Plus has the ability to not block ads that conform to a certain standard, but in addition to conforming to the standard, ad publishers need to pay for that. At least that's what they claim.


"Adblock Plus has the ability to not block ads that conform to a certain standard"

Not my standard. Ad blockers should be rebranded as "tracking blockers" so everyone calls them that. Then sites would have to ask you to "disable your tracking blocker", which sounds scary as hell to users, as it should.


If the images are hosted on the site instead of a third party the list won't matter.


Not true, here is an example of easylist blocking OpenStreetMap advertising OpenStreetMap events on openstreetmap.org [0].

[0]: https://github.com/easylist/easylist/pull/900


Even the example I linked in the comment you replied to is not about an image hosted on a different server. It's not an image at all. And uBlock Origin still blocks it today.


Which adblocker do you trust not to track you?


It’s the sheer number of trackers online that really pushes me to use lots of security extensions. I really can’t support that kind of malicious behavior.


> With ads: either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and i follow the ad because I am interested and I want to support the website.

If ads weren't doubling as tracking beacons and the occasional malicious drive by download, that certainly would be an option.


What?

This has Nothing to do with "Ads". It has to do with malicious scripts and gratuitous webtrash that sucks up resources.

The instant Mozilla turned off my "Noscript", I got one of those phishing popups that pretends to be from Microsuck and totally locks up Firefux.

On a machine with limited memory, cores, or what-have-you, every webprogrammer's special cute little "Animation" will run, gratuitously, and slow your machine down so much that it becomes unusable.

Maybe "Advertisers" need to finger out how to write adaptive code that doesn't depend on cutesy little videos that choke older systems to death. (I notice Amazon has done that... you can run it on nearly anything. Which is why... oh, never mind.)


>nothing ever bad happens. With ads: either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and i follow the ad because I am interested and I want to support the website.

You just described something bad.


Assuming you mean that half second looking at the ad: Name a better alternative for funding the internet. Paywalls at every website?


An open, transparent, convenient, anonymous protocol for micro payments, with good cost contol build into browsers.



You really seem to care a lot about this, let me guess, you work in adtech?


>Assuming you mean that half second looking at the ad: Name a better alternative for funding the internet. Paywalls at every website?

Funding the Internet? What you're talking about (ads) is a revenue stream for what amounts to a handful of websites. google.com, amazon.com, ycombinator.com, reddit.com, thefacebook.com, tweeter.com, etc. could all go offline right now and the Internet would still be here.


That doesn't sound right. What about all the other websites with ads, like recipe sites, guitar chords, porn, diy, etc.? or apps on the Google play store with ads?


I run sites that don't have ads. I don't make any money off of them. I still run them. Seems like a lot of people in software development think similarly.


This is the web that I like. Hobbyists and volunteers running low-fi websites for common interests. I'm not against commercial sites like Netflix but don't think every last blog should be monetised.


How do you pay your bills? If running those websites were your full time job, would you still be okay not making any money off of them? Or have you just decided that only people who have other income should have websites?


I don't understand your question; what about them? The websites are just nodes of the Internet. And I don't understand at all why you brought up Google app store apps, so I'll refrain from commenting on that until I better understand your point.


It doesn't feel like a handful of websites. It feels like the dominant experience of the internet for most people. Ads are a source of revenue for many more websites than just a handful. They are also a source of revenue for more than a handful of apps. I


What's bad about supporting a site you like while learning about something that interests you? If this were really all ads were I wouldn't block them.


That's not what the post says at all.


The person who I am responding to said the following describes something bad:

> either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and i follow the ad because I am interested and I want to support the website

The second half of that sentence is precisely what I'm describing. Do you disagree with my characterization of that sentence?

I assume they included that part in the quote rather than cutting it off earlier because this was part of what they were saying is bad. Do you disagree with me there?


To re-enable all disabled non-system addons you can do the following. I am not responsible if this fucks up your install:

Open the browser console by hitting ctrl-shift-j

Copy and paste the following code, hit enter. Until mozilla fixes the problem you will need to redo this once every 24 hours:

    // Re-enable *all* extensions

    async function set_addons_as_signed() {
        Components.utils.import("resource://gre/modules/addons/XPIDatabase.jsm");
        Components.utils.import("resource://gre/modules/AddonManager.jsm");
        let addons = await XPIDatabase.getAddonList(a => true);

        for (let addon of addons) {
            // The add-on might have vanished, we'll catch that on the next startup
            if (!addon._sourceBundle.exists())
                continue;

            if( addon.signedState != AddonManager.SIGNEDSTATE_UNKNOWN )
                continue;

            addon.signedState = AddonManager.SIGNEDSTATE_NOT_REQUIRED;
            AddonManagerPrivate.callAddonListeners("onPropertyChanged",
                                                    addon.wrapper,
                                                    ["signedState"]);

            await XPIDatabase.updateAddonDisabledState(addon);

        }
        XPIDatabase.saveChanges();
    }

    set_addons_as_signed();
Edit: Cleaned up code slightly...


Much better, thanks! It's scripts like this that make me miss the old XUL addon interface; sure it was difficult to maintain, but it granted a level of control over the browser that wasn't (and now, sadly, isn't) possible anywhere else.

I was able to piece together most of my compact dark interface theme [1] with userChrome.css by sacrificing the all-tabs menu for its JS binding, but the all-tabs helper addon is a shadow of what it once was, and the Private Tabs addon is dead with no hope of revival due to the lack of a WebExtension API [2]. I can't even switch browsers to get the functionality back since the others are even less configurable.

[1]: https://github.com/techwolfy/rainfox-theme

[2]: https://bugzilla.mozilla.org/show_bug.cgi?id=1358058


Super useful, thanks.

In my case ctrl+shift+j opens a dumb console that only shows messages and doesn't take any input. I had to go to about:addons, hit F12 for the Dev Tools and paste it in the console there. Worked well.


If you go to about:config and set "devtools.chrome.enabled" to true, the cmd-shift-j thing should work


I also have to do this to make it work on Win10 firefox console (F12)


If some addons DISAPPEARED then you have to restart Firefox, go to the add-on manager, find the disappeared addons and disable/enable them.

When I say "disappeared" I mean that the addon's icon (or the like) is gone.


This was the step required for me to make the Browser Console accept input.


I recommend downloading the Firefox Unbranded version: https://wiki.mozilla.org/Add-ons/Extension_Signing#Unbranded...

There you can change the following options in `about:config` (these options do not work in the main version of Firefox):

    xpinstall.whitelist.required = false
    xpinstall.signatures.required = false
    extensions.legacy.enabled = true

And then every extension works, even experimental and a large part of those based on the former API.

In principle, the Firefox Unbranded version should be the most heavily promoted one because it has fewer restrictions on extensions. Ideally a version with these options set by default in `about:config` would exist, along with a larger extension base than AMO.
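
For illustration, the same three prefs expressed as a user.js snippet, which unbranded/developer builds read from the profile directory at startup (a sketch; as noted above, release builds ignore the signature-related prefs):

    // user.js - only meaningful on builds that honor these prefs
    user_pref("xpinstall.whitelist.required", false);
    user_pref("xpinstall.signatures.required", false);
    user_pref("extensions.legacy.enabled", true);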


This does not work for FF versions older than v57.

I use v56.0.2 because that was the last time we actually got to customize the browser (yes, boo me for using an old version).

So I dug around a little (okay, a lot) and worked out a solution for v <= 56.

Version for FF v <= 56

  // For FF < v57 >...?
  async function set_addons_as_signed() {
      Components.utils.import("resource://gre/modules/addons/XPIProvider.jsm");
      Components.utils.import("resource://gre/modules/AddonManager.jsm");
      let XPIDatabase = this.XPIInternal.XPIDatabase;
      
      let addons = await XPIDatabase.getAddonList(a => true);
  
      for (let addon of addons) {
          // The add-on might have vanished, we'll catch that on the next startup
          if (!addon._sourceBundle.exists())
              continue;
  
          if( addon.signedState != AddonManager.SIGNEDSTATE_UNKNOWN )
              continue;
  
          addon.signedState = AddonManager.SIGNEDSTATE_NOT_REQUIRED;
          AddonManagerPrivate.callAddonListeners("onPropertyChanged",
                                                  addon.wrapper,
                                                  ["signedState"]);
  
          await XPIProvider.updateAddonDisabledState(addon);
  
      }
      XPIDatabase.saveChanges();
  }
  
  set_addons_as_signed();
Please let me know which versions are compatible, and where it breaks down!

Don't forget to enable devtools.chrome.enabled in about:config and use the actual browser console, not the web console


Firefox ESR 52.5.0: this fails with the error:

    Promise { <state>: "rejected", <reason>: TypeError }
    TypeError: this.XPIInternal is undefined [Learn More]

However, I've re-enabled all extensions by:

1. Type about:config in a new window's address bar.

2. Type 'signatures' in the search bar. You should see a line saying 'xpinstall.signatures.required'.

3. Double click on it and 'true' should change to 'false'.

4. Restart Firefox.


Hey, so if you're going to do this, you're porting the wrong code. My code (which you ported) will need to be rerun every 24h to reset the signing state (assuming old firefox works like new firefox, it probably does). You should figure out how to make the following code (pulled from the .xpi mozilla created) work in firefox 56's browser console instead:

    /* eslint no-unused-vars: ["error", { "varsIgnorePattern": "skeleton" }]*/
    ChromeUtils.defineModuleGetter(this, "XPIDatabase", "resource://gre/modules/addons/XPIDatabase.jsm");

    var skeleton = class extends ExtensionAPI {
        getAPI(/* context */) {
            return {
                experiments: {
                    skeleton: {
                        async doTheThing() {
                            // first inject the new cert
                            try {
                                let intermediate = "MIIHLTCCBRWgAwIBAgIDEAAIMA0GCSqGSIb3DQEBDAUAMH0xCzAJBgNVBAYTAlVTMRwwGgYDVQQKExNNb3ppbGxhIENvcnBvcmF0aW9uMS8wLQYDVQQLEyZNb3ppbGxhIEFNTyBQcm9kdWN0aW9uIFNpZ25pbmcgU2VydmljZTEfMB0GA1UEAxMWcm9vdC1jYS1wcm9kdWN0aW9uLWFtbzAeFw0xNTA0MDQwMDAwMDBaFw0yNTA0MDQwMDAwMDBaMIGnMQswCQYDVQQGEwJVUzEcMBoGA1UEChMTTW96aWxsYSBDb3Jwb3JhdGlvbjEvMC0GA1UECxMmTW96aWxsYSBBTU8gUHJvZHVjdGlvbiBTaWduaW5nIFNlcnZpY2UxJjAkBgNVBAMTHXNpZ25pbmdjYTEuYWRkb25zLm1vemlsbGEub3JnMSEwHwYJKoZIhvcNAQkBFhJmb3hzZWNAbW96aWxsYS5jb20wggIiMA0GCSqGSIb3DQEBAQUAA4ICDwAwggIKAoICAQC/qluiiI+wO6qGA4vH7cHvWvXpdju9JnvbwnrbYmxhtUpfS68LbdjGGtv7RP6F1XhHT4MU3v4GuMulH0E4Wfalm8evsb3tBJRMJPICJX5UCLi6VJ6J2vipXSWBf8xbcOB+PY5Kk6L+EZiWaepiM23CdaZjNOJCAB6wFHlGe+zUk87whpLa7GrtrHjTb8u9TSS+mwjhvgfP8ILZrWhzb5H/ybgmD7jYaJGIDY/WDmq1gVe03fShxD09Ml1P7H38o5kbFLnbbqpqC6n8SfUI31MiJAXAN2e6rAOM8EmocAY0EC5KUooXKRsYvHzhwwHkwIbbe6QpTUlIqvw1MPlQPs7Zu/MBnVmyGTSqJxtYoklr0MaEXnJNY3g3FDf1R0Opp2/BEY9Vh3Fc9Pq6qWIhGoMyWdueoSYa+GURqDbsuYnk7ZkysxK+yRoFJu4x3TUBmMKM14jQKLgxvuIzWVn6qg6cw7ye/DYNufc+DSPSTSakSsWJ9IPxiAU7xJ+GCMzaZ10Y3VGOybGLuPxDlSd6KALAoMcl9ghB2mvfB0N3wv6uWnbKuxihq/qDps+FjliNvr7C66mIVH+9rkyHIy6GgIUlwr7E88Qqw+SQeNeph6NIY85PL4p0Y8KivKP4J928tpp18wLuHNbIG+YaUk5WUDZ6/2621pi19UZQ8iiHxN/XKQIDAQABo4IBiTCCAYUwDAYDVR0TBAUwAwEB/zAOBgNVHQ8BAf8EBAMCAQYwFgYDVR0lAQH/BAwwCgYIKwYBBQUHAwMwHQYDVR0OBBYEFBY++xz/DCuT+JsV1y2jwuZ4YdztMIGoBgNVHSMEgaAwgZ2AFLO86lh0q+FueCqyq5wjHqhjLJe3oYGBpH8wfTELMAkGA1UEBhMCVVMxHDAaBgNVBAoTE01vemlsbGEgQ29ycG9yYXRpb24xLzAtBgNVBAsTJk1vemlsbGEgQU1PIFByb2R1Y3Rpb24gU2lnbmluZyBTZXJ2aWNlMR8wHQYDVQQDExZyb290LWNhLXByb2R1Y3Rpb24tYW1vggEBMDMGCWCGSAGG+EIBBAQmFiRodHRwOi8vYWRkb25zLm1vemlsbGEub3JnL2NhL2NybC5wZW0wTgYDVR0eBEcwRaFDMCCCHi5jb250ZW50LXNpZ25hdHVyZS5tb3ppbGxhLm9yZzAfgh1jb250ZW50LXNpZ25hdHVyZS5tb3ppbGxhLm9yZzANBgkqhkiG9w0BAQwFAAOCAgEAX1PNli/zErw3tK3S9Bv803RV4tHkrMa5xztxzlWja0VAUJKEQx7f1yM8vmcQJ9g5RE8WFc43IePwzbAoum5F4BTM7tqM//+e476F1YUgB7SnkDTVpBOnV5vRLz1Si4iJ/U0HUvMUvNJEweXvKg/DNbXuCreSvTEAawmRIxqNYoaigQD8x4hCzGcVtIi5Xk2aMCJW2K/6JqkN50pnLBNkPx6FeiYMJCP8z0FIz3fv53FHgu3oeDhi2u3VdONjK3aaFWTlKNiGeDU0/lr0suWfQLsNyphTMbYKyTqQYHxXYJno9PuNi7e1903PvM47fKB5bFmSLyzB1hB1YIVLj0/YqD4nz3lADDB91gMBB7vR2h5bRjFqLOxuOutNNcNRnv7UPqtVCtLF2jVb4/AmdJU78jpfDs+BgY/t2bnGBVFBuwqS2Kult/2kth4YMrL5DrURIM8oXWVQRBKxzr843yDmHo8+2rqxLnZcmWoe8yQ41srZ4IB+V3w2TIAd4gxZAB0Xa6KfnR4D8RgE5sgmgQoK7Y/hdvd9Ahu0WEZI8Eg+mDeCeojWcyjF+dt6c2oERiTmFTIFUoojEjJwLyIqHKt+eApEYpF7imaWcumFN1jR+iUjE4ZSUoVxGtZ/Jdnkf8VVQMhiBA+i7r5PsfrHq+lqTTGOg+GzYx7OmoeJAT0zo4c=";
                                let certDB = Cc["@mozilla.org/security/x509certdb;1"].getService(Ci.nsIX509CertDB);
                                certDB.addCertFromBase64(intermediate, ",,");
                                console.log("new intermediate certificate added");
                            } catch (e) {
                                console.error("failed to add new intermediate certificate:", e);
                            }

                            // Second, force a re-verify of signatures
                            try {
                                XPIDatabase.verifySignatures();
                                console.log("signatures re-verified");
                            } catch (e) {
                                console.error("failed to re-verify signatures:", e);
                            }
                        }
                    }
                }
            };
        }
    };


Oh, thank you for the pointer in the right direction! In the mean time, I was just glad to have any way to use my browser again!


In the mean time, installing the hotfix extension and running the script seems to help :)


I am also using 56.0.2 (64-bit). Can you help me find a solution? I have set my system date back to keep using FF, but I am already 2 days in the past. I tried to use your script:

  // For FF < v57 >...?
  async function set_addons_as_signed() {
      Components.utils.import("resource://gre/modules/addons/XPIProvider.jsm");
      Components.utils.import("resource://gre/modules/AddonManager.jsm");
      let XPIDatabase = this.XPIInternal.XPIDatabase;

      let addons = await XPIDatabase.getAddonList(a => true);
  
      for (let addon of addons) {
          // The add-on might have vanished, we'll catch that on the next startup
          if (!addon._sourceBundle.exists())
              continue;
  
          if( addon.signedState != AddonManager.SIGNEDSTATE_UNKNOWN )
              continue;
  
          addon.signedState = AddonManager.SIGNEDSTATE_NOT_REQUIRED;
          AddonManagerPrivate.callAddonListeners("onPropertyChanged",
                                                  addon.wrapper,
                                                  ["signedState"]);
  
          await XPIProvider.updateAddonDisabledState(addon);
  
      }
      XPIDatabase.saveChanges();
  }
  
  set_addons_as_signed();

All I got back was: Promise { <state>: "pending" }

EDIT: have this one now: ado.config({ consent: true }); inpl.anc.js:40

Also, what hotfix do you refer to? I can not install hotfix-update-xpi-intermediate@mozilla.com-1.0.2-signed.xpi on the old version :/

EDIT2: Installed this fix using debugging, but it is not for an old FF and I got some errors:

Reading manifest: Error processing hidden: An unexpected property was found in the WebExtension manifest. Reading manifest: Error processing experiment_apis: An unexpected property was found in the WebExtension manifest.


Someone else posted this link https://www.reddit.com/r/firefox/comments/bkspmk/addons_fix_...

I can't really vouch for it, but sounds like it worked for them.


I installed hotfix-update-xpi-intermediate@mozilla.com-1.0.2-signed.xpi as a temporary extension and then was somehow able to install it in the standard way.


I LOVE YOU SO MUCH. Firm 56.0.2 user here as well. Refuse to go with a newer version. I don't even care that we're basically putting it on life-support at this point. Anyway thanks for the help. Couldn't stand youtube or basically any other website without adblock.


Heheh, more like they've taken it OFF support - but anyways, have you seen the way to install Mozilla's intermediate certificate on older versions? It seems like that one actually solves it! :)))

https://www.reddit.com/r/firefox/comments/bkspmk/addons_fix_...


// Re-enable all extensions

    async function set_addons_as_signed() {
        Components.utils.import("resource://gre/modules/addons/XPIDatabase.jsm");
        Components.utils.import("resource://gre/modules/AddonManager.jsm");
        let addons = await XPIDatabase.getAddonList(a => true);

        for (let addon of addons) {
            // The add-on might have vanished, we'll catch that on the next startup
            if (!addon._sourceBundle.exists())
                continue;

            if( addon.signedState != AddonManager.SIGNEDSTATE_UNKNOWN )
                continue;

            addon.signedState = AddonManager.SIGNEDSTATE_NOT_REQUIRED;
            AddonManagerPrivate.callAddonListeners("onPropertyChanged",
                                                    addon.wrapper,
                                                    ["signedState"]);

            await XPIDatabase.updateAddonDisabledState(addon);

        }
        XPIDatabase.saveChanges();
    }

    set_addons_as_signed();


TypeError: Components.utils is undefined[Learn More]

what did I do wrong? (It's all Greek to me)


What version of firefox are you running?

Apparently beta and nightly need to change `Components.utils.import` to `ChromeUtils.import`.

But anyways, don't use this now, use the semi-official fix of clicking on this link and letting it install: https://storage.googleapis.com/moz-fx-normandy-prod-addons/e...

This is the fix Mozilla has published to be installed via shield studies, but skipping the shield studies part. You can be sure it's not malicious because it is signed by Mozilla... and if your browser installed unsigned extensions you wouldn't be looking for this solution in the first place.


Thank you for this, all of my add-ons were immediately re-enabled except for my selected theme.

Mozilla was warned beforehand about this, this problem was completely avoidable which is upsetting. I've been a fan of this browser for years but this is the 2nd time this has happened to add-ons that I can recall and to be blunt it's unacceptable.

It makes absolutely no sense that add-ons the user installs can be disabled like this without user consent, whether the add-ons in question are considered safe or not. Take into account how easy it is to migrate all of your bookmarks/etc. to another browser and this is clearly bad practice by Mozilla. It's one thing if we had a way to bypass this through Firefox directly but they chose not to include a bypass for situations such as this. It wouldn't be so bad if it wasn't for the fact that this is affecting all add-ons, adblockers/dark mode/greasemonkey/everything.

"Date expiration on code signing cert should only prevent new signatures from being considered valid -- it should not even prevent installation of old software. The fact that an expired cert disabled software is the most retarded thing I've seen this decade on any web browser." <This quote nails it on the head, this whole situation is bull.


Interestingly enough Greasemonkey is among the add-ons that are still running on my browser.


Help please somebody. I'm so upset. Firefox "disappeared" off my desktop and I got a notice that it couldn't find my profile file. Neither could I. I had to reinstall the entire operating system for it to work again, and my data is gone (bookmarks). My bookmarks are important data for me. If I can retrieve them, I'm leaving FF forever. And Mozilla. FF should be held responsible for my expenses and time. Help. I had no FF account. I just chose it as my default browser.


Thanks to advice from @midlandsfirst, all is restored to its rightful place. Just leaves me wondering if I would've got the auto fix from Mozilla eventually if I hadn't done this myself. When I first logged on today (Monday in Australia), it still wasn't fixed.

To see all add-ons and theme disappear before my eyes with no explanation was pretty disconcerting and leads me to say that I totally disagree with Mozilla (or anyone else) having that kind of control. Bad policy (which it is) aside, though, someone dropped the ball big-time. I'm still astonished this was even allowed to happen, bad policy or not.


Hey ConeBone, hoping you see this here since your comment is marked as dead.

The .xpi has already fixed the problem permanently (I think). You can just leave it, or if you want you can uninstall it now just as a matter of cleanliness. I'm linking to this comment about how to uninstall because I'm not satisfied with my solution and I'm hoping someone will contribute a better one: https://news.ycombinator.com/item?id=19827428

You can see the addon in about:support, but it doesn't give you a way to uninstall it, just see that it is installed.


>You can see the addon in about:support, but it doesn't give you a way to uninstall it, just see that it is installed.

you can uninstall it from about:studies


Thank you for sharing this. It worked fine.

Couple of questions, where does this fix appear? I can't see it under addons or studies.

Will we be able to uninstall once a proper general fix has been released?


Fixed everything immediately upon installing before even restarting. Afterwards went to https://news.ycombinator.com/item?id=19827428 for uninstalling, had trouble finding my profile folder on my own and quickly saw the about:support has a direct link, opened, closed firefox, deleted the xpi. All good.

Thank you, 10/10 would prefer it never happens again


I've just done what you suggested but I'm running an older version of Firefox: 52.9.0 ESR (x86 en-GB), so it does not work!!!! PLEASE HELP ME, PLEASE!!!!! I have an old notebook: Microsoft Windows XP 2002, Intel Pentium M processor, 1.60 GHz, 760 MB RAM


I'm not sure if you'll see this. HN doesn't send notifications so I only just saw your post now.

This reddit post might help you https://www.reddit.com/r/firefox/comments/bkspmk/addons_fix_...

I don't feel right responding without saying this, so even though you might have heard it before:

Using Windows XP and firefox 52 is in my humble estimation crazy. You're asking for viruses. I'd strongly recommend installing linux on your machine and using that instead.

Linux Mint Xfce 32 bit might manage to run on your system, but you're pushing up against the minimum requirements for doing so. If it does run well enough it is IMHO the easiest distro for a new user to use.

If you find it doesn't, Debian with (again) xfce should run just fine. Debian isn't exactly scary to install, but it's scarier than mint for a new user.

I can't promise full support, but if you need a pointer in the right direction while installing linux, feel free to email me at morenzg@google's mail service here.com (since I'm unlikely to see any replies here).


Thks, it worked, I installed the fix. But do I have to uninstall the .xpi afterwards? And how? Also, my addons are back, but I still have the error message (see screenshot: http://i.imgur.com/t1wb316.png ). Also, do I still have to allow the "allow Firefox to install and run studies" option? Thks.


I don't see any reason that you need to uninstall the .xpi, but you might as well. See here (hoping someone replies there with a better method of uninstalling, my method is a bad hack) https://news.ycombinator.com/item?id=19827428

You don't need to allow firefox to install and run studies, that's just a way of letting firefox automatically install this xpi.

Did that error message only appear after you installed the .xpi fix? If so it's mildly worrisome, but probably not worth spending time figuring out what it's about if all your addons are back. If it appeared when the addons were initially disabled it's not an issue at all, it's just that nothing closed it.


"You don't need to allow firefox to install and run studies, that's just a way of letting firefox automatically install this xpi."

In that case I am wondering why Mozilla didn't provide a direct link... it would have been faster. Maybe there is another reason to ask us to run studies... wondering.


"If it appeared when the addons were initially disabled it's not an issue at all, it's just that nothing closed it.

Yes, it appeared when the addons were initially disabled.


In my case I had to copy the link and paste it into a fresh tab, as clicking actually caused Firefox (Nightly) to block the install with a message: "news.ycombinator.com - Nightly prevented this site from asking you to install software on your computer"


Doesn't work for me. Not sure if a user.js alteration is blocking the fix or what...but I don't want to (even temp) remove it and I certainly don't feel like going through all those prefs to figure out the issue.


66.0.3 (64-bit), OS X 10.9.5. I only managed to change the status of devtools.chrome.enabled to 'modified' instead of 'default'; not sure how you 'enable' it? New to this :) Clicking or copy-pasting the link gives me the error msg too.


"modified" should be fine as that would put it at "true" meaning that it's enabled. I then copy/paste the link and the fix add-on says that it's installed. however, ctrl-shift-j browser console displays:

"""WebExtensions: failed to add new intermediate certificate:"""


When I say "enabled" I mean set to a value of "true" (instead of "false"), "modified" is a different column.

What error do you get on the link though? That link is a better fix.


Ah okay yeah I set it to true, default was false. Error is connection failure (same error as for the add-ons)


Not sure what would be causing a connection failure. I just verified that the link is still up for me, and obviously you have an internet connection if you're replying to me.

You're not behind a firewall that might be blocking it are you? E.g. being in China?


Hey doop, replying to you here since your post is showing up as dead so I can't reply to it.

If you installed the xpi you shouldn't need to do anything in the browser console, and your addons should have come back. Obviously the latter didn't happen.

Chances are a connection failure in the browser console is unrelated, the browser console is basically constantly spewing error messages, you should just ignore them unless they are in response to something you did.

All I can really suggest over the internet is to try reinstalling your addons - that might work - in which case I would assume they just got uninstalled somehow. If it doesn't I'm not sure what to suggest, and I can't realistically debug something too complex over HN comments. You might just have to wait for mozilla to publish an update to the browser that fixes this properly.

I do want to emphasize that I'm just some dude on the internet being helpful by the way, not associated with Mozilla or anything.

Edit: Just saw this error message you also posted: """WebExtensions: failed to add new intermediate certificate:"""

That sounds like an issue that happened when installing the .xpi? Did it give any other related debugging information?


66.0.3 64 - Win10

FF shows the fix add-on as being installed with the standard pop-up notifications in the menu bar. However, in the browser console, I only see the error msgs that I've listed in my other posts.

???


I can report a connection failure while being on an unrestricted connection (university internet in the UK).


So, I don't have any good ideas why. Can you give me more information about what exactly happens? Can't promise anything but details might help.

I.e. something like

- I open the browser console

- I click the .xpi link

- The following appears in the browser console immediately after clicking the link:

    WebExtensions: new intermediate certificate added api.js:15
    WebExtensions: signatures re-verified api.js:23
- My addons do not come back

- If I try to install an addon from addons.mozilla.org I get <this message about the addon signature verification failing> in the browser console.


Thanks for the response!

So,

I click the link. I click "Add", and Mozilla says addon could not be downloaded due to connection failure. Nothing appears in browser console.

Downloading an addon gives me "Download failed. Please check your connection." on the Addons site. In console, I get:

  Events to handle the installation initialized. BigInteger.js:27
  [GA: OFF] sendEvent {"hitType":"event","eventCategory":"AMO Addon Installs Download Failed","eventAction":"addon","eventLabel":"uBlock Origin"} BigInteger.js:27
  Error:
In the studies, I have https://i.imgur.com/fqrd5Jo.png. I enabled it, and have tried setting both first run, and the update interval is set to 21 (to try and force it to update it quickly).


Hmm, what happens if you right click and save-link-as on the .xpi link? Or if you download it via curl or wget or something?

Edit: That studies image looks like it has already been installed, which is weird if your extensions aren't back...


OK, so... I right clicked, saved-as and then ran the XPI... and that worked. So thank you for that suggestion.

As for the studies image, that's only half of the fix according to the blog post [1], mine is only verification-timestamp, not signing-intermediate-bug.

[1]: https://blog.mozilla.org/addons/2019/05/04/update-regarding-...


Glad to hear it worked.

You're right about the image, that's weird, I can't think of any reason why the verification-timestamp one would be necessary.


No, not in China, no firewall. Thanks for reaching out & your suggestions though!


I'm not a developer though, so it's not super urgent. Just felt like figuring it out.


I'm running 58.0.2. (i thought I read somewhere that this should work from 57 on.)

When I turn on Studies, and do about:studies, I see:

"What's this? Firefox may install and run studies from time to time. Learn more"

Is this what I'm supposed to be seeing?


Hey, I think so (but I didn't go that route), now you either need to wait or use one of the tricks to cause it to update right away.

If I was you I would ignore Mozilla's advice to do it their way and do what I suggested above, of just clicking on the .xpi link and disabling shield studies. It will act immediately, which will give you a better idea of whether or not it will actually work with 58.0.2. I'm not sure it will (I suspect it won't in fact).


I'm on 66.0.3 and it worked for me. All my addons worked. I removed the study from about:studies and everything is fine.


"""Error while detaching the browsing context target front: Connection closed, pending request to server1.conn0.parentProcessTarget1, type detach failed..."""


Applying the fix via the link above, will anything need to be done after this is all resolved? For example, will this fix need to be removed, or any settings it may edit be restored in about:config?


Not really, you could uninstall the .xpi but it's just a matter of "cleanliness", hoping someone replies with a better method, but I replied to this comment with one way: https://news.ycombinator.com/item?id=19827428


Worked for me, awesome! Now the only question left is why this isn't mentioned in the blog post as a fix for the no-studies people like me... Everybody put on your tinfoil hats.


Honestly, I suspect it's to minimize the support load; see how many questions I got here as just a dude suggesting it unofficially whom no one should really trust? It's fine at HN scale, but it's probably not fine at Mozilla scale.

They'd rather that the people who are having issues with it just wait for a new build than waste engineering time. Probably rightly so.


"The add-on could not be downoaded because of a connection failure." <--- Grrrr!

Win 10 FF 66.0.3 64-bit (Studies checked by default anyway). In Australia.


right click, download to desktop. Drag it into your firefox window. It will install then.


Thank you!


I have tried to use this and i still can't get my addons back. Is there anything else that can help me?


Installing hotfix-update-xpi-intermediate@mozilla.com-1.0.2-signed.xpi on firefox-esr 60.6.1esr-1 in Debian did not work for me.


I think restarting firefox after installing it should make it work with esr.


Plugins work on the Microsoft Windows platform, but do not work on the Linux platform.


Thanks for this! My NixOs build of Firefox was apparently built without support for studies, but your link worked fine.


Thanks, I'd been going in circles for a while. I'm telling others to come here; no more problems so far.


Nothing happens for me once it's installed in Firefox 55.0.3 32-bit. Can't activate my addons.


That's a really old version of firefox, probably an API has changed with my workaround.

I know of an API change that means the official hotfix won't work (whether you install it my bootleg way or the official studies way).

My recommendation is to upgrade your firefox version.


The reason I don't want to upgrade is because I don't want to lose all my legacy addons; a lot of them disappeared and there are no alternatives. I can't understand how Mozilla can break my browser remotely and force me to upgrade to a more restricted browser...


Mozilla didn't really "break your browser remotely", it's more like there was a time bomb included that just went off.

If you try hard enough you can probably fix your browser, unpack the .xpi (it's just a zip), look at the source in experiments/skeleton, and try to figure out how to run something similar in the browser console.

But I really can't recommend doing this, I appreciate it's painful, but what you're running right now is massively insecure, there are published exploits. You're just asking for viruses by interacting with the internet using something that old.


Install WaterFox, same old FF56 but the legacy addons are still working.


I'm running Firefox 56 and installed this fix yesterday, but it does nothing for me.

Is there a way to fix this that works on older versions? Updating the browser isn't an option because I have legacy addons running that I can't work without.


The best way is to install Waterfox, because it's compatible with all the legacy addons of FF56, and you are not gonna see any big difference.


It fixed my addons, and after I uninstalled it my add-ons are still working.


Worked for me, awesome. Thank you for sharing. It worked immediately.


I installed the fix, nothing happened (my extensions still don't work)


After using the .xpi file?

Have you tried reinstalling the addons? I know it's less than ideal, but on one of my installs firefox had decided to uninstall the addons after disabling them (I think because it updated firefox version after disabling them). If it did that I don't think there is any way to get them back short of reinstalling.


Yes, I installed the .xpi file, firefox still says my extensions are unsupported/unverified.

Reinstalling seems to work, but there are certain extensions that have data I do not want to lose. For example, if I reinstall uBlock Origin, I will lose all of my dynamic filtering rules.


Possibly you just need to force it to reverify again. If this is still the state of things try setting `app.update.lastUpdateTime.xpi-signature-verification` to 0 in about:config, and then restarting your browser.
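
If you prefer, the equivalent from the Browser Console is a short snippet; a sketch using the same pref name as above:

    // Set the "last verified" timestamp to 0 so the next startup re-runs the signature check.
    const { Services } = ChromeUtils.import("resource://gre/modules/Services.jsm");
    Services.prefs.setIntPref("app.update.lastUpdateTime.xpi-signature-verification", 0);
    // Then restart Firefox.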

But no guarantees. Do you have reason to believe the addons are still installed?


That worked, thank you!

As a side note, the value for that setting got reset after I restarted firefox.

Also, I uninstalled the fix by simply removing the extension in about:addons. Is this the correct way to do it?


Yep, you're all good. I'm surprised it let you remove the fix from about:addons but I wouldn't worry about it.

That value resetting is expected, it's the time when Firefox thinks it last checked signatures, it resetting just means this convinced it to recheck as intended.


This worked immediately on Firefox Quantum 66.0.3 (64-bit) Thank you.


66.0.3/x64 Win10 - running good with your fix. Thank you!


66.0.2 (64-bit), Win 10


Maybe you pasted it in the wrong console? You need the 'browser console' which is different from the one you open on random webpages.

Go to "about:config" (in the url bar), search for and enable devtools.chrome.enabled, then hit ctrl-shift-j (or you can open it from Menu -> Web Developer -> Browser Console).


Ha! That did the trick! Though it gave me these cryptic errors, the add-ons are now enabled, thanks!

TypeError: setting is undefined[Learn More] ExtensionPreferencesManager.jsm:90:7 No matching message handler for the given recipient. MessageChannel.jsm:924 1556983316679 addons.xpi-utils WARN Add-on fxmonitor@mozilla.org is not correctly signed.

Edit: another weirdness: I decided to take a look at a different computer (unrelated to this one, at work via remote desktop) which is running exactly the same FF and Windows, and all add-ons are enabled. It's on 24x7; I didn't do anything to it.


As for your other weirdness: signatures are only checked once every 24 hours, and it's only been about 20 since the cert expired, so your other computer probably just hasn't re-verified its signatures yet. You can find some comments here about how to delay the check if you want.


That's an expected error. It has to do with how that addon (which is non-critical but published by Mozilla and really just a part of Firefox) is installed; once Mozilla publishes a new version of Firefox with an updated signing key it should fix itself.


How do I verify that this is signed by Mozilla?


Well, uh, one way is to try and install it in firefox.

Assuming you meant without installing it in firefox, I don't quite know. You're going to need to find mozilla's public keys somewhere (maybe just extract them from firefox), unpack the xpi (it's just a zip file with a different extension) and find the signature contained within, and then figure out how to verify it.


You are a lifesaver, all addons have returned, easy peasy!


56.0 (64-bit), Win 10


That's a really old version of firefox, probably an API has changed with my workaround.

I know of an API change that means the official hotfix won't work.

My recommendation is to upgrade your firefox version.


Try WaterFox, it's a FF56 "clone" with working addons.


hmm, should it be installable by default, not only on the about:debugging page?


It was for me, literally just had to click on the link, and then "Ok" on the "do you want to install this" dialogue that popped up. Not sure if firefox beta would be different.


verified it works on Ubuntu 16.04 with Firefox 66.0.3


56.0


Works, ty!


Not working in 56.0.2 :((((


Try WaterFox, same FF56 but with working addons.


thanks brother


Confirmed working with 66.0.3 (64-bit) on macOS 10.14.4 (18E226). After installation all plugins immediately came back.

This should really be one of the official ways to apply the patch, instead of only telling users "it can take up to 6 hours for the fix to be installed" after enabling Shield studies. I can only imagine what kind of havoc this creates in businesses that use Firefox and are working over the weekend.


Hi, you're shadow-banned, and this comment, which seems alright, was only visible to people who have 'showdead' enabled.

This is not the case anymore now that I have vouched for it. But all of the comments you make in the future, and (I'm guessing here) a lot of the comments you have made, cannot be seen by normal visitors, and are greyed out and delisted for people with 'showdead' enabled.

In your case, a lot of your comments are good and make fair points, I think (I only did a quick scan down your comments page). It might be worth for you to contact `dang` or one of the other administrators to see if they would remove the shadow-ban in your case.


[flagged]


Hi 'semenguzzler', as the other person pointed out, the extension in question has been signed by Mozilla. If Firefox accepted unsigned or badly-signed addons, then the problems with extensions would not exist in the first place.


I just set xpinstall.signatures.required to false in about:config and that fixed it after a restart.


I can confirm this on Ubuntu 19.04 with Firefox 66.0.3. Changing xpinstall.signatures.required from the default 'True' to 'False' resulted in addons working again.


Android too.


Worked for me on Linux, didn't even require a restart (some addons just reappeared immediately). Using Firefox from Ubuntu package management.


AFAIK that only works for the nightly/dev versions.


but only for dev/beta/nightly. Note that this won't work for the standard version of FF.


Curious, that shouldn't work on branded stable installs.


It works on Linux but not Windows or Mac installs. I assume that repo maintainers use different compiler options than the official Firefox binaries.


For me this is False ;-)


Are you using an Extended Support Release (ESR)? That's expected then.


I wonder how long until the "security vultures" come upon this workaround and stop it from working... would be ironic if that happened sooner than the expired cert getting fixed.


Seems unlikely. If you’re willing and able to run code like the above, sourced from a random comment on the Internet, there’s no amount of security vulture that’s going to protect you from skillfully making your Internet experience unsafe for yourself.


Isn't that how most programming, security or not, works these days anyway?

Joe Random Developer googles for a problem, hits SO, tries a couple of the different proposed snippets and keeps the one that happens to work. (For given values of "work".)

This would be a great spot to end the post with a "</snark>", but sadly that'd be lying. Up until ~2 years ago the most common solution to requests between different subdomains failing was... "just use CORS: *"


More dangerous than copypasting code from SO is using some 0.0.1alpha library you found on Github/crates.io/npm... At least with the copypaste snippet you had a cursory look at the implementation.


I believe the Mozilla Observatory has given CORS: * a -50 score penalty since the day it launched, precisely because of how horrifically dangerous that advice is.


You can even automate the procedure: https://gkoberger.github.io/stacksort/


It doesn't stop them from trying, however, and severely damaging the experience of users who do know what they're doing. They even invented the term "self-XSS" and contributed to the decline of JS "bookmarklets".


If you're still not being hit, and if the browser console doesn't accept the input, is it enough to get from the WebConsole

    Math.floor( (new Date()).getTime()/1000 )
and paste the number in "app.update.lastUpdateTime.xpi-signature-verification" to be off the hook for the next 24 hours?


Uh... if that outputs a unix date yes (I think) except you need to restart the browser afterwards as well.


> if that outputs a unix date yes

Yes, JavaScript's getTime() is in milliseconds since Jan 1, 1970, and the C time_t (which is what I think is used as the timestamp) is in seconds.


I used this code and it worked. However, I just received Mozilla's fix. Do I now need to delete this code from the browser console?


No, the mozilla fix should have wiped out everything this did.

Edit: I.e. you're good to go.


thank you so much for your help!!


This worked like a charm. Thanks very much!


Doesn't work for me (latest FF on MacOS). I get TypeError: Components.utils is undefined


Same here, with FFv66.0.3 on Arch. On FFv66.0.2 as well. Tried some suggestions I found on the internet, but no luck there either.

#Edit: found you need to be on about:addons, then paste the code.

Thanks!


Thank you for this.


Run this in your Browser Console[1] to delay signature checking for a day:

    function set_xpi_sign_time_now() {
        const {Services} =  ChromeUtils.import("resource://gre/modules/Services.jsm");
        const now = (new Date()).getTime() / 1000;
        Services.prefs.setIntPref('app.update.lastUpdateTime.xpi-signature-verification', now);
    }
    
    set_xpi_sign_time_now();

EDIT: Changed `Components.utils.import` to `ChromeUtils.import` because apparently Beta and Nightly versions have removed the former, while the latter was introduced in 60.

This does the equivalent of setting in about:config the time of last signature verification to the current time. By default, Firefox re-checks signatures in 24 hours (or so I read somewhere here). I like the temporary effect of this, compared to the permanent disabling of signature verification suggested elsewhere.

----

1: https://developer.mozilla.org/en-US/docs/Tools/Browser_Conso...


Outside of about:addons I get this in Firefox 66.0.3 in Linux:

> ReferenceError: ChromeUtils is not defined

Also, this doesn't seem to help with currently disabled add-ons, unless I'm missing something. Trying to reinstall Adblock Plus, for example, still results in

> Download failed. Please check your connection.


> Outside of about:addons ...

You seem to be trying in a standard Web Console, not the Browser Console.


Beginning in Firefox version 52 released March 7, 2017, installed NPAPI plugins are no longer supported in Firefox, except for Adobe Flash. Some of the plugins that no longer load in Firefox, even though they may be installed on your computer, include Java, Microsoft Silverlight and Adobe Acrobat. See this compatibility document for details. https://support.mozilla.org/en-US/kb/npapi-plugins?as=u&utm_...


Components.utils is undefined (Firefox 67.0b16 macOS)


Try `ChromeUtils.import("resource://gre/modules/Services.jsm")`.


Can you set this to the future and get say a week of buffer time for the fix to get pushed?

edit: nope, you cannot:(


1. I don't think that'd be necessary; I believe Mozilla will fix it in a day, or two at most (PS: they're currently testing a fix); and

2. I don't know if there's sanity-checking code in Firefox to ignore times in the future.


I tried it and it was set to 0 when I restarted firefox (and the extensions were gone).


I tried essentially the same thing but via about:config. All it did was make it happen immediately on next restart.

Also if it hasn't happened to you yet, make a backup of your profile right now in case it wipes out your addons data as some have reported.


I'm a bit confused.

I thought that the way signing works in general is that the signer issues a certificate for the thing being signed (domain, code, whatever) that contains identifying information for the thing signed (host name for an SSL certificate, checksum of the code for a code signing certificate), the valid from and valid to dates of that certificate, and assorted other information, and either a reference to or a copy of the signer's certificate, and it signs the whole issued certificate with the signer's certificate.

Someone checking the signed thing is supposed to consider it validly signed if:

1. The date is in the valid range for the signed thing's certificate,

2. A check of the signature of that certificate against the signing certificate passes,

3. The signing certificate is recognized as being from an issuer considered trusted by the checker,

4. Neither the signed thing's certificate nor the signing certificate have been revoked, and

5. The signing took place during the valid date range of the signing certificate.

Note there is no "the date of the check is in the valid date range of the signing certificate". A signing certificate expiring should not invalidate things signed by it. It should just prevent signing anything else with it.

So why is a signing certificate expiring for Firefox breaking already signed extensions? Shouldn't it just be stopping new versions of extensions from being signed?


If a signing certificate expires and is stolen, it can backdate signatures. On the theory that expiration is useful because either people keep poorer track of key material over time or algorithms get weaker over time (which is not an unassailable theory, but it's a coherent model), you want expiration to prevent future use just as if it were revocation. Because you can't trust the date of a possibly-forged signature, you have to check the current date.

The model you're suggesting is closer to the "timestamping" one commonly used in code signing (IIRC Windows and Mac both do this) where a third party that's particularly trusted to handle key material well long-term gives you a second signature over the message "I saw this signature at this time" (effectively they are analogous to a notary or witness for real-world signatures). Then you can trust that signatures from expired signing certs were actually made in the past, and not by an attacker who got hold of the key. That is, without timestamping you have no proof of #5 in your list.

(I suppose you could do this now for the SSL PKI with Certificate Transparency logs.... it isn't exactly what they were built for but it's probably sound.)
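
To make the distinction concrete, a toy sketch of the two verification policies (illustrative pseudocode, not Firefox's actual logic):

    // With a trusted timestamp we can prove *when* the signature was made, so it only
    // has to fall inside the signer cert's validity window. Without one, a stolen
    // expired key could backdate signatures, so the check has to use the current time.
    function isSignatureAcceptable({ signerCert, trustedTimestamp, now }) {
      const ref = trustedTimestamp !== undefined ? trustedTimestamp : now;
      return ref >= signerCert.notBefore && ref <= signerCert.notAfter;
    }

Point 5 in the parent's list is exactly what the timestamp supplies evidence for; without it, the verifier falls back to requiring that the cert still be valid right now.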


As noted, in practice, without additional info, there's no way to tell when a signature was created and so all signatures die when the signing cert expires.

That said, existing signatures that have already been verified, for existing extensions, should still be trusted.


Yeah, this response is a bit strange because if I trust my own system's clock, I just have to remember for each signature when I first saw it.


They have acknowledged the defect and are working on a fix. While the impact is severe, I am still with Firefox. There are enough alternative browsers to tide us over for now. The fact that alternatives exist is the reason why we should support projects like Firefox.


>I am still with Firefox

that's kinda the problem. there's plenty of reasons to be "with" firefox still, but you shouldn't need reasons other than it's the best browser. when it starts requiring loyalty to be a user, that's a big problem.


For me, it is the best browser. Yes, this is a big fuck up, but it's not like this has caused me material harm. It's easy for me to switch over to Chrome until this is fixed, and I doubt the same mistake will be repeated in Mozilla.

I expect perfection from plane and car manufacturers, and I pay for that. My browser, I can live with an occasional hiccup.


Don’t you think this is awfully dramatic?


Just curious, should we expect that the fix (issuing a new signing cert and re-signing all the addons and whatnot) will result in the addons being automatically updated and re-enabled? They certainly seem to have streamlined disabling the addons; I wonder if it is equally simple from a user's perspective to bring them back. Also now wondering just how hard their network/CDN is going to get slammed when those new re-signed addons go live and every user automatically redownloads them.


It looks like they have a hotfix to automatically re-enable the addons: https://twitter.com/mozamo/status/1124627930301255680


Looks like all extensions have been disabled for all Firefox users.

I think this fail-closed behavior is more of a security issue than the one it is trying to solve. All of my security add-ons - Privacy Badger, NoScript, Decentraleyes, and many more - were disabled. Even worse, it happened without notice to the user.

One moment I was browsing the internet (just barely) secured by these add-ons, and the next moment, all of them disappeared (without warning) and I only noticed when I saw my password manager was missing.


If it failed open, anyone unlucky enough to update their extensions could end up having a malicious version installed. It also would have taken longer to notice.


It should fail 'locked': continue to allow installed addons to work, notify the user of the issue, and disable any updates unless explicitly requested by the user.


This disables NoScript on Tor Browser. That's much worse than the slim chance of a malicious extension being installed.


So why not just disable extension updates instead of disabling the extensions themselves?


Presumably because how would it differentiate between a legit "already installed" extension with a signature that cannot be verified, and an extension installed by malware that also cannot be verified?


Browsers can only protect against malicious websites and malicious extensions. They can't protect against malware. Even without any cert problems, malware on your machine can modify the browser executable/process to insert whatever code it wants.

With this reduced threat model, it's easy to simply keep existing pre-installed extensions available, and disable updates. Your only problem is if a pre-installed extension is malicious or has a vulnerability, it will remain.


> Presumably because how would it differentiate between a legit "already installed" extension with a signature that cannot be verified, and an extension installed by malware that also cannot be verified?

This is why a signature can also be accompanied by a trusted timestamp which confirms that the signature was made while the certificate was valid.

This is how Windows software is commonly signed, to avoid exactly this kind of problem.
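
For instance, with Microsoft's signtool the timestamp is requested from an RFC 3161 service at signing time (the timestamp URL and file name are just examples):

  # sign and attach a trusted timestamp, so the signature stays valid
  # after the signing certificate itself expires
  signtool sign /a /fd SHA256 /tr http://timestamp.digicert.com /td SHA256 myprogram.exe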

Yes, that implies this is a known and solved problem. It’s embarrassing for Mozilla to not have prepared for this.


If an extension was already installed, it passed the signature check at the time of installation. I'm not sure what benefits we get from periodically re-running the exact same check -- particularly when balanced against the risks of the re-checks, which are now obvious.


Personally I despise the idea of the software already on my pc being dependent on signatures stored on a remote server. I installed it and Mozilla can fuck right off. It's my responsibility to police what software is on my computer, not theirs.


According to https://news.ycombinator.com/item?id=19824520 the signatures are on the extensions themselves, not on a remote server.


because that would make too much sense.


Updating with an expired cert doesn't automatically result in compromise.


Yes, but what's the point of cert expiration? Is it safe to have certs that never expire? I believe there is a security benefit to expiration. Expiration is useless if it's never enforced.

Probably the correct behavior is to have some sort of semi-annoying popup when it expires, and then only a week later do the full blocking. You need to strike the right balance: annoying enough that it can't be ignored by everyone (otherwise you just have the exact same problem, just delayed a week) and that fear of it happening is a sufficient motivator to stop people lazily relying on the grace period, but not so annoying that it makes a lot of people quit. You also want to avoid permission fatigue.


It's a great argument against centrally controlled walled gardens. Basically breaking or compromising a single certificate has a widespread impact. Even Tor browser is impacted. For most of us this is a temporary inconvenience but there are people whose personal security depends on some of the extensions that just stopped working.

On a positive note, it's been a while since I browsed without a lot of extensions. Ads are still annoying and I noticed some extensions apparently had more of a performance impact than you'd hope.


Mine still work. I tried to set my clock to two days ago to avoid it and promptly got errors on every HTTPS site I visited. Damned if you do, damned if you don't :/


A warning yellow bar appeared for me below the URL bar.


It wasn't quite all for me, it left 3 of the 20ish I have installed.


As a temporary fix, go to about:debugging, and click "load temporary addon", then paste in the download link of the missing add-on. Then just try and not restart Firefox until they fix the broken cert.


This works. On a desktop, you can reload installed addons from the firefox profile folder >> extensions.


Yes, that seems to work on desktop. Thanks.


That's most likely the easiest way to temporarily fix the issue until it is resolved.


I’ll still keep using Firefox since I recognize the importance of browser diversity and the hazards of a Chrome monoculture (that and vertical tabs), but, yikes.

Still, this type of oversight seems all too common even in large companies. I remember several cases from Fortune 500 companies in the past few years alone. What would be a good way to automate checking for them? Has anyone developed a tool designed specifically to avoid certificate expiry disasters?


> Still, this type of oversight seems all too common even in large companies. (...) Has anyone developed a tool designed specifically to avoid certificate expiry disasters?

LetsEncrypt renewal is supposed to be automated. [1]

I know of a company that hosted blogs for thousands of customers. They used LetsEncrypt, but the CTO considered automatic renewals a possible security risk, so they did it manually. Problem is, the expiration happened over a weekend and they "forgot" to update the certificates before that. Suffice it to say that the next Monday wasn't pleasant. They automated after that.
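
For comparison, the usual automated setup is little more than a scheduled job, something like this (schedule and reload hook are illustrative):

  # /etc/cron.d/certbot - run twice a day; certbot only actually renews
  # certificates that are close to expiry, so frequent runs are harmless
  0 3,15 * * * root certbot renew --quiet --post-hook "systemctl reload nginx"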

[1] https://letsencrypt.org/about/


I have no idea why you'd deliberately wait the full 90 days to do a manual renew. For reasons, I renew manually, but every 60 days or so. Nowhere close to the deadline.


Exactly.

Their FAQ [1] recommends exactly that: renewing every 60 days.

[1] https://letsencrypt.org/docs/faq/#what-is-the-lifetime-for-l...


Just curious, are you talking about Webflow? Because I had to hunt down and make sure our Let's Encrypt auto renewal was working until I realized the certificate was served by them. They wait until the last 12 hours to renew the certificate. I have no idea what type of rationalization would lead to that decision.


90 days is 4 times a year. 60 is 6 times, 50% more expensive when you’re paying someone to perform the task.


I had the same thought, but I still find that absurd. Say they host 500,000 websites with HTTPS. That's 1,000,000 renewals saved per year, or roughly 2 renewals a minute spread across the year. That is pennies. A t2.medium could handle that type of load increase.


A bit OT, but what's up with this usage of Amazon EC2 tiers as a unit of computational power?


i think it’s a combined “fixed cost” rather than just computational power... like you could do it with x, thus it should cost at most y

similar to saying that you could do it with a raspberry pi


It is a clearly priced unit of computational power maybe?


Nope, content marketing company


Not webflow. We auto renew way before LE expires the cert.


They didn't have to renew automatically, but they could have automated notifications, alerts, or even banners in their internal apps when 60-70% of the lifetime was exhausted. If I were given such a restriction, I'd still automate it 100% but require a human to authorize it every time by clicking a magic link in their email, Slack, or some dashboard, and nag them with notifications until someone authorized it.


Some shared hosts like Bluehost now provide LetsEncrypt by default for all their sites with auto-renewal (but I don't recommend Bluehost shared plans for anything even close to a serious hobby, due to absurd downtimes, like most other shared hosting).

I used manual renewal for LetsEncrypt for about 4 websites on other shared hosts, and renewing them every 3 months was a pain; I had to keep reminders and schedules just not to miss renewals, until I synchronised their renewal schedules so I could batch the (manual) renewals.

I had automated renewal for 1 website on a cloud server; it was a one-time effort, I never had to think about the SSL cert for that site again, and it was the most favourable setup of them all.


Another option is using a Web Server/Reverse Proxy that supports Let's Encrypt automatically, like Caddy [1]. I believe Apache HTTPD has partial support [2], too.

[1] https://caddyserver.com

[2] https://httpd.apache.org/docs/2.4/mod/mod_md.html
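
In Caddy's case, a minimal Caddyfile is all it takes to get automatic issuance and renewal (exact directives vary a bit between Caddy versions):

  # Caddyfile - Caddy obtains and renews the certificate for this host by itself
  example.com {
      root /var/www/example
  }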


Traefik is another option here for a reverse proxy with automated renewals; I use it in a ton of places.

https://traefik.io


Apache HTTPD looks interesting, so with that we can renew a LetsEncrypt cert without using certbot?


It requires some fiddling and it's in experimental state, but yes! Here's the documentation:

https://github.com/icing/mod_md/wiki/Migration
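
As far as I understand it, the core of the configuration is just a couple of directives, roughly like this (domain is a placeholder; mod_md and mod_ssl need to be loaded, and you also have to accept the CA's terms via MDCertificateAgreement):

  # mod_md manages certificates for these hostnames; an SSLEngine-enabled
  # vhost covering them triggers ACME issuance and renewal
  MDomain example.com www.example.com

  <VirtualHost *:443>
      ServerName example.com
      SSLEngine on
  </VirtualHost>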


Nginx works well and there's a tool that automates most of the extra config stuff for you.


> had to keep reminders and schedules just not to miss renewals until I synchronised their renewal schedules to batch (manual) renewing them.

Another use case for the app I am developing! The basic idea: You can enter an item (e.g. "MyOwnShop Cert") into the list. From that time on, it tracks how much time has passed since the item was entered or renewed (by clicking the renew button). The item with the longest time since entering/renewing is at the top of the list.

Compared to schedules and reminders it has the advantage that the item is not out of mind once the reminder or schedule passes. It just sits there dutifully and its timer keeps increasing.

I use it for keeping up with medium-term contacts ("Wow, I have not written Carl for 3 weeks?") and health-related issues. Logging stuff that spoils easily would be another use case. And, apparently, cert renewals :)


I own a webhosting provider. We offer Let's Encrypt with automatic issuing and renewal, securing 184,961 hostnames (SANs) at this moment.

We issue certificates automatically if none exists when a visitor connects to a website, and renew the certificates in batches 30 days before they expire. When renewing, we merge certificates/hostnames into bigger certificates with 90 hostnames so we don't have so many moving parts.

If renewal were to break, however (as it did once or twice before), nothing bad would happen, because a new certificate would be issued on page load.


So did they conclude it wasn’t a security concern or did they conclude the security risk was worth the uptime?


When pressed, they admitted it was just "gut feeling". The team audited a couple ACME clients and couldn't find anything to justify not automating.


Having a root process with write privileges to /etc on production machines that is also able to communicate over the Internet definitely is a security risk.

To mitigate that you end-up building a series of privilege-restricted jobs flowing from the DMZ back into the internal network. And maintaining that might be more complicated than just manually renewing, depending upon the processes and architecture of the company.


Why would a process need to run as root or have write privileges to /etc in order to automate LetsEncrypt renewals?

I run Caddy (which uses acme-go/lego as its ACME provider) as a non-root user with no access to /etc at all. It seems to be running fine.


Depends on setup, but frequently private keys are inaccessible to the web server worker process. (Which starts as root, loads keys, drops privs, etc.)


Most popular ACME (Let's Encrypt) clients allow you to provide a CSR instead of generating the keys themselves. That means a bunch more work for you, but if you're worried about this, that's what you should do. Have your safe (even manual if you insist) process make keys, make CSRs for the keys, and put those somewhere readable. The ACME client will hand them over to the CA saying "I want certs corresponding to these CSRs" without needing access to your TLS private keys at all.
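
With certbot, for example, that looks roughly like this (paths are made up; the key and CSR come from your separate, better-protected process):

  # request a certificate for a CSR generated elsewhere; the ACME client
  # never sees or touches the private key
  certbot certonly --webroot -w /var/www/html \
      --csr /etc/pki/requests/example.com.csr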


That does mean you aren't automatically rotating keys anymore.


If you trust your automation, you put private key rotation into it.

If you don't trust your automation, you rotate the keys manually, as you would normally.

There are no valid reasons to throw the baby out with the bathwater.


Using http renewal requires listening on port 80 which, by default, requires root.


This is technically true, but contextually lacking.

acme-go/lego doesn't use HTTP validation unless you disable just about every other form of validation first. TLS-ALPN validation is much more likely, so port 443.

That said, it is very easy to allow software to bind to privileged ports without providing it root access; this has been solved for a very, very long time.
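
On Linux, for example, a file capability is enough, so the process never needs root at runtime (binary path is just an example):

  # allow the binary to bind ports below 1024 without running as root
  sudo setcap 'cap_net_bind_service=+ep' /usr/local/bin/caddy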


You can just use the web server that is already running on the machine.

You (normally) don't want downtime in your website, so you just let your regular webserver serve the acme challenge instead of stopping it.


I'm curious as well. My intuition would be that it's not a concern, since servers already keep their private keys stored locally in order to be able to communicate with clients anyway? Being able to update them doesn't really seem to make things any different. But I feel like I could be missing something/not have thought through it properly. (I imagine security implications can get more complicated if a different server decrypts traffic vs. processes it, etc.)


The "manual" process used previously by the company already involved some form of automation, so it was more about trusting CertBot not to do anything horrendous.

But now that you mention it, I wonder what's the opinion of security experts like tptacek on cert renewal automation.


We could attempt a summoning. Quick, make a wildly inaccurate claim about the correct way to implement an encryption library.


It's automated but things can go wrong even when correctly configured and tested. Real world example: certbot version got old, the renewal server didn't support it anymore, the certificate didn't renew, the web site got the dreaded https warning page.

Of course that is also a kind of misconfiguration. The site has Debian security auto updates on but certbot is not among them. It should be forced to be updated. Furthermore there was no monitoring of errors in its log file.

Still it's not as simple as one believes Letsencrypt to be.


We update automatically AND manually check periodically to make sure the update took place. That company must be overly fond of drama...


Then the automatic update process stops for some reason and your certificate expires...

At the end of the day, someone needs to verify that new certificates get acquired and installed before the old ones expire. Automation makes acquiring them less tedious, but it doesn't do much to make sure someone pays attention.


Didn't Mozilla invent Let's Encrypt? That would make this disaster doubly embarrassing.


Let's not forget that multiple mobile networks across Europe went down on the same day last year because Ericsson(?) let a cert expire on some internal management system that had not been updated. SSL cert renewal is one of the great unsolved problems in computer science.

edit: not Europe, just UK and Japan apparently: https://www.zdnet.com/article/ericsson-expired-certificate-c...


There was also an issue last year where every single Oculus Rift was essentially bricked because they forgot to renew a cert (apparently, what with the chaos of the Rift launch and the Facebook acquisition between the cert issuance and expiration, they just kind of ... lost track).

It took like two days before there was any kind of fix available, and they couldn't even roll it out automatically because the expiration had also disabled the auto-updating.


>SSL cert renewal is one of the great unsolved problems in computer science

Certificate expiry really only exists to make money for CAs. It doesn’t solve any security problem that CRLs don’t already solve (and solve better). There’s lots of unsolved problems relating to ‘how do you make a reliable PKI’, but cert expiry is really just an unrelated business requirement for CAs.


If it really was only to make money for CAs we'd see LetsEncrypt offering very long lifetime certs. But:

* Very short lifetimes get people to automate, preventing problems where one cert lasts long enough to lose the institutional knowledge around it.

* CRLs don't work. For performance you don't want to check for a revocation in serial with the request, and you don't want to block all browsing if the revocation list server is down. Revoking a cert will cover some users, but lots will still get "https://" and no warnings.


CRLs are not equivalent at all. They are a last-ditch effort to fix a problem when all else (expiry) has failed.

CRLs require maintenance and distribution of a list by a 3rd party. Creating an accurate, all-inclusive CRL of all website keys that your browser should reject is far, far from easy. (Case in point: "how many web sites are there?" is not an easy question.)

Properly propagating such a list to any browser that might need it is another daunting task - less than 100% propagation means end users are exposed to security risks.

Certificate expiry is much more elegant: the client can check the certificate's validity by itself, without relying on input from 3rd parties.

If certificates didn't expire, CRLs would (by now) be huge and growing enormously every day. They'd be so big that by the time you'd have downloaded one, it'd be outdated.


CRLs can be sharded; the cert carries the URL for the relevant CRL inside it. So they wouldn't need to have grown as huge as you suggest.

But this sharding carries a cost for user privacy: if I shard certs 16 ways, then each CRL download gives me 4 bits of info about which sites you were visiting.

OCSP effectively takes this to the extreme, each lookup is tiny because it's just for one cert, but it gives away exactly which cert you cared about each time.


Besides leaking data through on-demand CRL checking, you also have a difficult fail-open vs. fail-closed decision.

Failing closed means failure of a third party immediately breaks your site. Failing open means a MitM can simply block the CRL check.

OCSP stapling and the 'must staple' header are a lot better for privacy, and OCSP responses have some validity so at least a 5 hour outage of your CA doesn't bring your site down immediately.

It is still vulnerable to a DOS and trust on first use though.


I would like to live in a world where OCSP stapling is widely deployed and we can require OCSP and advise people to set must-staple if possible while everybody who doesn't staple will just have to eat the privacy implications. But this is not (yet and for the foreseeable future) that world.

Apache and nginx both shipped OCSP stapling implementations that are very bad, awful enough that for almost anyone I'd say "No, don't enable that" rather than try to explain how they need to use it and get them to a place where it's useful and safe. Adam Langley wrote years ago about how to do this correctly, and there does seem to be a little bit of movement in the correct direction at Apache, but the situation remains pretty poor.


Cert revocation suffers from a very simple issue. If your check for revocation fails, do you fail open (ie accept the cert) or fail closed (ie reject the cert).

For any method, fail closed is user hostile and often a DOS vulnerability whilst fail open is another way for an attacker to use a revoked cert.

This is a big issue with on-line methods like OCSP as a MitM using a bad cert can probably block OCSP traffic as well.

CRLs grow out of proportion, and leak information to the outside world.

Cert expiry serves as a backstop to these other revocation methods, and as a bonus ensures that simply forgetting about a cert cannot bite you 10 years later.


All TLS failures fail closed. The idea that if a cert is compromised it will eventually expire sometime within the next five years is a completely laughable security control. Leaking information is a complete non-concern too. Have you heard of certificate transparency logs?

Short lived certs are quite obviously better from a security perspective, but the security difference between a certificate that expires in five years, and one that expires never is irrelevant.


A missing OCSP response does not fail closed, nor does a CRL URL 404-ing fail closed.

The information leakage of CRLs is stating to the public that a cert needed to be revoked.

Obviously, a compromised cert that will expire in 5 years is horrible. However, a non-compromised cert you are no longer using that will never expire is more of a risk than a disused cert that will expire in a year. Not to say you should leave the one-year cert lying around. However, there is no desire to put the one-year cert on a pre-shipped CRL.


I'd argue it's a blunt-hammer extra layer of defense: if a certificate gets compromised and the owner never finds out, at least it eventually stops working. This kind of compromise is pretty common.


> I’ll still keep using Firefox since I recognize the importance of browser diversity

Also, Chrome is not immune to "crashes for everyone at the same time" bugs. Like that time when the start of daylight saving time made it crash for a full day (a quick search tells me it probably was https://bugs.chromium.org/p/chromium/issues/detail?id=287821).


That bug seems to have affected only users on Android versions earlier than 4.3 and in Brazil or Chile.


> "crashes for everyone at the same time" bugs

What else would you expect for auto-updating software that relies on the internet to work? It's a monoculture attached to a firehose of disease.

This is exactly the same as "pushing out a security fix to all users," except it apparently wasn't intentional. You can't have one without the other.


I love "firehose of disease", and will steal it. And I agree that bugs are bugs; every time you add a new capability, you add all the possible bugs that can occur with that capability.


ACME / Let's Encrypt go in the direction of making expiry happen so often that renewal gets automated, rather than being a rare manual process that can be forgotten about.

Not sure that's viable for a signing certificate like this, but that's the way to solve it for the web PKI.


See also: GPS vs GLONASS time encoding. GPS rolls over every 19 years, so devices, cars and even Boeing aircraft saw their GPS-based clocks turn back to 1999 last month. Meanwhile, GLONASS epochs are only four years long, so every device that uses it as a time reference is built to handle rollover.


It’s funny to me that people talk about this limitation as if it were some kind of virtue.


Short-term certs _are_ a virtue. Not only do you not have a manual event rare enough for people to forget how to do it, you also don't have to worry about which 15 services someone granted a 10 year wildcard cert to early in the company's history.


Having once had to regenerate 600+ self-signed certs, test that everything still worked, and then insert them into the 600+ live app servers without breaking anything, all within a two week window because no-one had realised the 10 year expiry was just about to bring everything down, I concur.


It's also more secure. Long-lived certs risk the possibility that someone who used to own the domain got a certificate on it and it still works after the domain is resold. Once you automate it, there is no downside to short-lived certs.


If only there were a way to revoke certificates. Like, some kind of list.


Revocation lists get huge, ultimately becoming another reason to limit cert lifetime (you don't have to tell people you revoked a certificate which is expired naturally).

Very few things check revocation, unfortunately - it puts an extra hop on the fast path of connecting to a server. OCSP stapling is pretty much the only thing a browser would care about - having the server fetch a signed OCSP response that is good for a limited period of time (say, hours), and send that along with the certificate during negotiation.

Or, you could just have the server fetch a certificate that's good for a limited period of time.


If only such a list were actually effective rather than the majority of clients not bothering to check it.


CRLs do not work in practice, and major clients routinely ignore them.


OCSP stapling together with OCSP Must Staple is the way to go here. All major browsers support these.

Firefox still does normal OCSP requests, Chrome does not. So if you are a Chrome user, to my understanding, there is no way to know if the server certificate was revoked or not, other than OCSP stapling together with OCSP Must Staple. Additionally, both Chrome and Firefox ship a list of revoked certificates, but it may not be updated quickly enough, and as far as I can tell it mostly contains roots and intermediates.
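
On the server side, stapling itself is only a few lines of config, e.g. for nginx (paths and resolver are placeholders; the Must-Staple flag has to be requested in the certificate itself, e.g. via certbot's --must-staple option):

  ssl_stapling on;
  ssl_stapling_verify on;
  ssl_trusted_certificate /etc/ssl/certs/chain.pem;
  resolver 1.1.1.1;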


Revocation requires the private key


This is not true. In Let's Encrypt/ACME for example, you can simply obtain authorizations for all the domains a certificate is valid for and request revocation [1]. The only thing you still need to revoke the certificate, is the certificate itself. The certificate can be obtained from CT logs.

[1] https://tools.ietf.org/html/rfc8555#section-7.6


This is just abusive to the vast majority of users who do not care but still want to use SSL for their servers, frankly. I should be allowed to choose a near unlimited lifetime for my server's certificate if I don't care about the risks that may present.


Security tends towards the lowest common denominator. I'd rather you just figured out how to run a cron job.

The problem comes if your keys ever get compromised or cracked: all your historical traffic becomes vulnerable instead of just the most recent window.


Yeah "just" a cron job except the implementation changes several times a year. Somehow this automated process was more time-consuming than the previous, manual one.


Many cloud providers will make this process pretty much entirely automated. But let's say you don't want to do that: when is the last time the way you run caddy changed? Or the last time python-certbot-nginx changed?


This was a few years ago, so things may have changed by now. But as they say, once bitten twice shy, and the wisdom of "just cron it" doesn't work with highly experimental tools like LE was for what I estimate to be the majority of its lifetime.


I'm sure there's a way to make your LE experience consistently suck, but the way to run caddy for a static website has been the same for about as long as caddy has had support for automatic HTTPS, and that's also true for python-certbot-nginx. But more importantly: we can argue about what it was 4 years ago, or we can just observe that it's really easy now.


A tool not working well or being "experimental" does not dismiss the premise that frequently run automated tools are better than infrequently run manual tasks, when those manual tasks can take down your infrastructure if done improperly, missed, or forgotten.

All it being new means is that depending on your risk ratio you need to decide whether updates to the software need testing or whether you need to invest in your own solution - or, how about just wait until it matures and keep the old process until then.

Waiting doesn't invalidate the premise either. It just means you lack the resources to implement it safely and that's ok.


As the service provider, you shouldn't get to decide. I think it's the users who can decide how long lived certs they're willing to trust.


That's cool and all, but what percentage of users do you think even know certs expire? I'd put the over/under at 1%.


It's not your risk to decide on. You will not always own that domain name, and allowing you to still have a valid cert for it afterwards is silly.


Actually it could be not negligence but a way to perform an attack.

Register a domain, get a certificate lasting forever, let the domain expire and somebody buy it. Then somehow redirect all or part of the traffic to that domain to your own server with a valid certificate. Chances are that few people will notice something has changed in the details of the certificate.

However you'll have left traces all over the place: credit cards, phone numbers, etc.


That's a great question.

I've never seen a bulletproof solution for organizational tasks that need to be done yearly.

If someone's in charge... and both they and their manager happen to leave in the same year... and whatever system they had in place to remember (probably their personal calendars) is gone... and the manager's manager has 1,000 other things to remember...

...how does an organization ensure the task still gets done?


"...how does an organization ensure the task still gets done?"

With something almost stupidly simple and low-tech: checklists.

(I'm reading "The Checklist Manifesto" right now, and the points it makes seem to fit perfectly with everything you mention.)


A year is enough time for everybody who knows about the checklist to leave.


We resolved this issue at my last company with sufficiently large mailing groups for cert renewal reminders. Once you get to 12 people on a mailing list, with new employees being added all the time, it's hard to miss. Usually a manager on that list is pinging people about it. There is the chance of the tragedy of the commons occurring, but I never saw it.

Once you do this, the only checklist that matters are procedural checklists to add a new client or new cert to the renewal notification list. When you use a standard group email for all cert purchases, that one becomes tough to miss.

In my 7 years of being involved, we never missed a cert renewal with this process for ~300 client sites with multiple or wildcard certs.


Put "make sure someone else knows all this person's checklists" on the employee exit checklist.


Put the checklist on the home page of the company website!


There should be a separation between the things that need to get done and the people that do them. As in, tasks should be created first and then assigned.


Surely tasks are performed by and assigned to roles, not individuals (who just happen to be in those roles at some moment in time). If a role disappears, e.g. in redundancy, then the role's tasks are evaluated for either transfer to a role that remains, or being discarded.


Just one: Emacs orgmode


Realistically: reduce your own cert renewal window to weekly, if not daily. This forces you to have a good renewal system in place and alerts you to failures long before actual expiration.

Quixotically: make cert failure probabilistic, with the failure rate linearly related to how long ago the cert expired. This slowly introduces more and more failures over a certain “grace period”, which makes the problem less of an extinction-level event. It’s not a solution but it definitely would help.


It's not that complicated: just add scheduled health checks to the same system you use for checking whether the website and such is up. If the expiry date isn't updated within a week of expiry, start paging engineers.

I'm willing to bet Mozilla already does something like this but an engineer didn't set it up correctly for this certificate.
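
Such a health check can be as small as an openssl one-liner wired into whatever monitoring already exists (host and threshold are examples):

  # print the expiry date of the certificate a host is currently serving
  echo | openssl s_client -servername example.com -connect example.com:443 2>/dev/null \
      | openssl x509 -noout -enddate

  # exit non-zero if it expires within 7 days (604800 seconds) - page on failure
  echo | openssl s_client -servername example.com -connect example.com:443 2>/dev/null \
      | openssl x509 -noout -checkend 604800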


Talking about vertical tabs, I was in the middle of studying for an upcoming exam, then when I alt-tabbed back into Firefox, all of my tabs are missing with that unsupported addon error. Fortunately refreshing Firefox gave me back normal tabs, at a cost of uninstalling all of my addons.

The problem is that Tree Style Tabs relies on a userChrome.css edit to hide the tab bar, and when TST is forcibly removed there is no way to access the tabs, because that edited userChrome.css is still there. This is very disruptive. At least with the pre-WebExtension addon, TST itself hid the tab bar, so if TST was removed then the original tab bar came back automatically.


I have it set up so that the tab bar is only displayed if the menu bar is visible, and I can use the Alt key to toggle them together.

https://github.com/eoger/tabcenter-redux/wiki/Custom-CSS-Twe...

  #toolbar-menubar[inactive="true"] + #TabsToolbar {
    visibility: collapse !important;
  }


> Has anyone developed a tool designed specifically to avoid certificate expiry disasters?

Not perfect, but I've added a TLS certificate extraction tool into a DPI that displays all visible certificates ordered by expiry date.

One could then mirror all one's site traffic to it and let it run in the background. Coupled with some alerting tool it would catch most of those cases I guess.

I could polish the tool a bit more if there is some interest, but anyone could do it as well.

See

https://github.com/rixed/junkie

and more specifically the plugin called 'sslogram'.


We scan our codebase for anything that looks like a cert and send emails when it gets close. Might not have helped here if it was an intermediate owned by a CA. There but for the grace of God go I.
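
Even a crude scan goes a long way; something along these lines (it only catches PEM-encoded certs and reads the first cert in each file):

  # list files containing a PEM certificate and print each one's expiry date
  grep -rIl -- "-----BEGIN CERTIFICATE-----" . | while read -r f; do
      printf '%s: ' "$f"
      openssl x509 -in "$f" -noout -enddate
  done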


If you want to get rid of those and they're public certs: odds are they're in Certificate Transparency logs and you can monitor them from there.


Monitoring CT lets you verify that somebody renewed the certificate, but it doesn't verify they actually installed the replacement correctly.

My employer (Kynd.io) currently monitors public web sites for customers so we can flag e.g. "Hey this site cert expires in a week! If it's dead probably just switch it off, otherwise renew the certificate" and we're in the process of integrating CT but mostly so we can say "You already have a newer cert but need to go install it" in our How To Fix instructions.


Why do you have certificates in your code to begin with?


If you have your own CA for whatever reason, it's common to distribute the root and intermediate certs with your code so things can resolve.

You don't ship the signing keys with the certs, as that would be bad. ;)


s/resolve/validate/


You can find lots of programs like this one to monitor certs:

https://pypi.org/project/check-tls-certs/

I run one daily from cron and have it email me a report with the days to expiration for the certs I’m responsible for, even for certs that auto renew. I don’t filter the email. Daily is not too frequent for it to go to my inbox, but frequent enough that I’ll notice if it doesn’t mail me. YMMV.


Discovery of all the certs is what I think is the harder problem.


I agree. What can be done to prevent developers from adding a certificate dependency without monitoring during the move-fast-and-break-things days of early development, which then sits for X years as developers come and go, and nobody notices until it fails?


Whilst I'll say "disclaimer, this is my project", monitoring Certificate Transparency with CT Advisor has helped me find out about certificates marketing people deployed and expected me to maintain without my knowledge.

[0] https://ctadvisor.lolware.net/


Certificate Transparency works pretty darn well for most use cases, we (Latacora) have found while trying to solve exactly this problem (or at least the "figure out which certs exist that aren't being regularly re-issued" part) :-)


Caveats:

Certificates that aren't from the Web PKI almost invariably won't be logged. Most logs explicitly refuse everything except certs from the Web PKI so as not to be burdened with storing garbage. So this won't find certs issued by the custom OpenSSL CA on that one guy's Linux laptop.

Not all Web PKI certs are logged. There is no BR obligation and no root store programme rule that requires logging. The only things in place that strongly encourage logging are the Chrome and Safari policies. For systems that aren't designed to be accessed with a web browser or, much more rarely, enterprises that have persuaded themselves only IE is authorised anyway, the certs might deliberately not be logged. Yes there are (small) CAs doing this in the Web PKI, on purpose, in 2019.


You can tell ACM your CT preference!

(But seriously, sure you’re right but for my audience (which is essentially Latacora’s and HN’s), CT is fine.)


Hook the alerting for expiring certificates into the library that is used for handling certificates, at least in debug builds.


>What can be done to prevent developers from adding a certificate dependency

Discipline? Experience? PIP?


We have an agent that pulls certs from an internal service and stores them on disk where apps can use them. We no longer manually install certificates. This solves discovery, and gives us alerts on services that have stopped refreshing their certs for any reason. The internal service is wired into lets encrypt and a commercial certificate provider. Setup is minimal, and after that completely automated.


> Still, this type of oversight seems all too common even in large companies.

The npm self-signed certificate fiasco of early 2014 springs immediately to mind.


There’s lots of monitoring services out there that do it. A long time ago I worked at place that used a service called site24x7 for cert and API monitoring. That was before Pingdom kinda got better than most API monitoring services, but I don’t know if they monitor cert expiry.

Taking a look around, you’ll find lots of service providers, or tools you could use. But the main issue is all they do is tell a human being to do something, which they can still fail to do. Which is why automating cert rotation (with things like let’s encrypt or ACM) is arguably a better solution than monitoring it.


Maybe we need more browser diversity than just two different teams with two different systems. Both are sitting very close to each other geographically, and both are produced in the same culture (as in silicon valley), so it would seem likely that, while they compete with each other, they will apply very similar answers to problems they face.


>Has anyone developed a tool designed specifically to avoid certificate expiry disasters?

Is anything more needed than a calendar reminder on the phone of someone important enough to shake the Earth and get it fixed For. Certain.? Like, say, the CEO, CTO, and CFO should at a minimum get a notification so they can ask whether the renewal was done when necessary.


Admin people. Often the most senior ones get the title "Personal Assistant (to senior person job title)" but not always. They're lead bureaucrats, and tracking things that need to be done and ensuring they get done, either by doing them themselves or assigning them to reliable underlings is the purpose of their role.

Corporations are often not very good at putting the right people in these roles but good ones are invaluable. Since the Marvel Universe is everywhere, Pepper Potts is the archetype in that setting to give you an idea of why you'd need people like this. Tony Stark would be "too busy" to renew the certificates, but Pepper would make sure it gets done.


I built a tool for checking ssl certs some time ago: https://ismycertexpired.com but I'm not checking intermediate certs...


Systems designed around long TTLs make this problem worse. I love the default of 90 days for Let's Encrypt. It forces some good discipline and hygiene. Wish there were a better solution for short-lived CAs.


I have disabled signature checks in Firefox because otherwise it is impossible to install a private extension without uploading the source code to Mozilla.

This is how you allow unsigned extensions in Firefox on Arch Linux, the same files can be edited on Windows and macOS, restart the browser after changes:

  sudo tee /usr/lib/firefox/defaults/pref/config-prefs.js &>/dev/null <<EOF
  pref("general.config.obscure_value", 0);
  pref("general.config.filename", "config.js");
  pref("general.config.sandbox_enabled", false);
  EOF

  sudo tee /usr/lib/firefox/config.js &>/dev/null <<EOF
  // keep this comment
  try {
    Components.utils
      .import('resource://gre/modules/addons/XPIDatabase.jsm', {})
      .XPIDatabase['SIGNED_TYPES'].clear();
  } catch (ex) {
    Components.utils.reportError(ex.message);
  }
  EOF

This method also works for the stable version of Firefox.


It's very good that it's possible, but it's literally the worst solution for most people.


I don't get why an expiring cert disables the extensions. Shouldn't the browser be checking the cert expiry date against the date the extension was installed, not against current time? As long as there's no way to manipulate the extension installation date that would be fine, wouldn't it?

edit: or even why the browser is checking this at run-time. As long as it checked the cert when the extension was installed, isn't that enough?


If Mozilla somehow lost control of one of these signing certs (as in: it got stolen), they would put it on a revocation list. If certificates don't get re-checked, all installations between "cert got stolen" and "noticed that the cert got stolen" would stay installed & running.


There have been major organizational problems at Mozilla for a long time that precipitated this. Many of us saw something like this coming, saw gaps and unclear responsibilities, reported these gaps and confusions up the chain, and were reprimanded and financially penalized for asking the tough questions. The questions were never answered, and we all quit, were fired, or lost motivation as a result.

This is a tech problem, yes. Cert renewal has bitten everyone in a high-profile way (Apple, Google, and MS have all had renewal-related outages in recent years). But this was preventable at Mozilla. Ask a Mozillian about IT and Cloud Services, and what their respective responsibilities are. Ask Mozilla’s VP of IT: who is responsible for cert renewal? Ask Mozilla leadership: why are people afraid to ask questions?


You basically just described any sufficiently large organization. Complaining is not helping anyone in these situations, the only thing you can do to change things is to go ahead and try to change things. Reporting things up the chain hardly ever works because the chain is too busy with their own issues and politics. You have to make it worth their while.


> the only thing you can do to change things is to go ahead and try to change things.

You are describing taking risks and/or being penalized for little to no potential reward in most organizations.

You are doing something you were not asked to, so any inconvenience or side effect, whatever the cause, is on you. And as it was not marketed internally, few people will be aware you did anything, so accordingly you will get little recognition (financial or otherwise).

It also presumes you already accomplished everything that was under your responsibility, which is basically impossible in any org where objectives or KPIs are set so you hit a 80% target. You’ll then have to explain why you prioritized a seemingly random task, and bothered the other teams to help you do it without consulting your boss or their bosses.

Basically this approach could work for critical issues that obviously should be fixed. But then it should also be obvious to your boss, so getting their clearance is the normal way to do it.

This is I think the reason why people just leave instead of fighting a losing battle to fix issues they care about but the upper ranks don’t prioritize.


Mozilla is an organization of 1000 employees. Executives act like it’s 100,000 employees.


>the only thing you can do to change things is to go ahead and try to change things

how? Reporting up the chain is THE way to change things. When it doesn't work what are you supposed to do?


Take on the responsibility for the gap yourself. If you get assigned something from up the chain that gets in the way of that new found responsibility report up the chain that they need to assign it to someone else first.


If I am slowed down in my day-to-day sprint because I decided to take on a task management wasn't willing to fund/approve, I would be in much deeper trouble


> You have to make it worth their while.

Not having a large outage seems like it should be worth their while...


Outages like this never seem real until they actually happen.


Seems like quite the accusation no? I mean I hate Mozilla as much as the next guy but any kind of proof, even if it's a Twitter thread from some Mozillian, would be nice.


Oh man, remember that thing where firefox just randomly installed that LookingGlass Mr. Robot thing (end of 2017 I think..)?

This was their second chance already...


How did executive/management at Mozilla become like this?

Was it slow rot or some event triggering it?


If you're going to quit anyway, and I'm not saying this is a good idea, what prevents you from literally going and yelling at these idiots, demanding answers as to why they are not doing their jobs?


So many people have done this. Watch videos of past Mozilla All Hands meetings. You see the questions asked, you see the leaders dodge, and you see that soon after the askers are gone and leaders say: you have nothing to be afraid of, you don’t need to ask questions anonymously. “Executives” at 1000-person Mozilla are so distant from the workers, and middle managers listen to executives, not staff. It is so sad because Mozilla staff are smart, caring people who love the web, but they are powerless and afraid in the org today.


Sure, the organisation is far from perfect, and makes mistakes, but as a current Mozilla employee, I don't recognise this description.


Thank you for speaking out here! I hope that mozilla employees manage to free themselves from their leadership and change the organization from within. It's possible, even though most employees who could change something have decided to simply leave over the years. Maybe mozilla needs a stark revenue drop to get humble again?


It seems that to enact a change somebody would have to go public and talk with big journos to bring this dysfunctional stuff to light, much like the recent Bioware scandal.


I feel you. I saw this kind of thing coming back in 2010, when Firefox started copying Chrome blow for blow. I saw Google's influence on Mozilla and knew it could only turn out poorly. By the way, what can you say about that?


Maybe before talking about this he/she should provide any kind of proof of ever having worked at Mozilla at all?


(I work for Mozilla)

I'm guessing the poster is legitimately former staff. It's not hard to find disgruntled people in any organization (especially if you look at those who have left.) And they'll often have legitimate reasons.

But the question is really whether there's a consistent pattern of problems - actual rot, so to speak. I haven't seen much, but then I know that some parts of the org are very different than others. I can say that I have publicly complained about a number of things in the last several years, and never felt any repercussions as a result. That includes comments made directly to the CEO during All-Hands sessions, so I'm not just talking hypothetically.

Yet I have also heard about a handful of cases where people have been treated unfairly as a result of public comments or actions, including a couple of friends of mine. So shit happens here, it's definitely not perfect and the problems aren't all in the past. But overall, I still feel like Mozilla is substantially better than most similar companies.

Just my perspective.


If everyone's add-ons are disabled, I wonder why mine are not. My computer has been running over night (coincidentally, first time in years) and my add-ons are intact. Does it take a browser restart? Or might I have a setting that prevents this from happening? My system time is correct.

Edit: am on Firefox 66, Linux (Debian Buster/testing), using Firefox from Mozilla directly (not through repositories), and my internet/wifi should not have disconnected. System has been up since 2019-05-03T17:30:00Z, suspended before that.


Firefox 66.0.3 here, and this has been the case also for me, i.e. everything is still working. After looking around for a while in bewilderment, I think that what's going on is that they have remotely used the "studies" feature of Firefox to temporarily work around the problem.

Indeed, I see in about:studies,

hotfix-reset-xpi-verification-timestamp-1548973 (Complete): This study sets app.update.lastUpdateTime.xpi-signature-verification to 1556945257

(unfortunately I can not see when it was run in about:studies)

i.e. this "study" has reset the timestamp of the last signature verification to this morning (when I have started Firefox). Since I read around that Firefox performs the check only every 24 hours, I guess that this is reason why we have not been experiencing the problem. We have now another day, after which we will have to reset the timestamp again (if it has not been solved upstream). The field is available/accessible also in about:config.

P.S. To be fairly honest, I was a bit surprised about the "studies" feature, I can not recall when it was introduced, but it is probably my fault for having overlooked it.


Perhaps that's what Mozilla Add-ons was referring to when they tweeted:

https://twitter.com/mozamo/status/1124569680662777856

> We deployed a fix to users who hadn't had their add-ons disabled to make sure they saved that way. You're in that group. :)


Of course, that fix only got to people who didn't disable the "studies" feature after Mozilla abused it to deploy a Mr Robot ad to all their users. Also, enabling it seems to require agreeing to send telemetry information to Mozilla as well, so all the privacy-conscious people who use extensions to protect their privacy will likely have it disabled.


Same here. I've restarted Firefox several times and even manually checked for add-on updates and they are still working. Same on Android. However, I'm not able to install new Add-Ons (I get the "download failed" error). I'm in Germany, maybe that has something to do with it...?

EDIT: Android disabled Add-Ons when I switched from WiFi to mobile connection.


Same here, I opened Firefox 20 minutes ago and discovered this thread, but all my addons seem to work fine. (FF 66, from the Debian repo, I'm in the EU)


I think it only checks at start up, or something? Mine has been running for days, and all my add-ons are working. As such, I've decided not to restart Firefox until news arrive that it's working again.


Mine's been running for days and everything was disabled.


Oh, darn. Thought I found a way to avoid this. Thanks for reporting!

Just a thought, might it be that you suspend (sleep mode in Windows) your system? So not restart the browser, technically, but iirc applications still receive some event when this happens. At the very least, open connections would break. Any TLS connection would re-validate the certificate. In that case, going offline would also trigger it. Have you done any of those (suspend, be offline / switch networks)?


Same thing here, I'm on 66.0.3 and I booted the PC 2h ago but I still have everything working properly.

The only reason I've heard of it is because my dad called me in a "panic"; it seems to affect both his PCs.


Same here. Both on Windows 10 and Kubuntu my Firefox 66.0.3 instances seem to have working add-ons (uBlock Origin) as of now. Both systems have been up for weeks.

Let's see what the day will bring...


And sharp 15:15 local time my add-ons got disabled and now I have a yellow sign saying "One or more installed add-ons cannot be verified and have been disabled".


Same here. Arch Linux, FF 66.0.3 (from Arch repos), had the PC in suspend during the night and all add-ons seem to be active.


Well, now (2 minutes ago) all of a sudden all add-ons were turned off. I have no idea why. I wasn't even doing anything as I was just reading some comments on this thread.


Are you on Mac? It's also still working for me.


Linux (Debian Buster, but Firefox is not from the repositories). Edited in some more info.


Are you on nightly build?


No, same version as the bug report is about: 66.


Interesting coincidence: my Ubuntu 12.04 Let's Encrypt certbot-auto updated in March 2019 to a version which no longer runs on Ubuntu 12.04 without major surgery.

https://community.letsencrypt.org/t/pip-error-with-certbot-a...

I wonder if this bit Mozilla and caused this issue.


I can beat that. I had a Let's Encrypt cert renewal that upgraded python and itself and borked the system it was running on.


In addition to the immediate fix, what needs to happen here, and in general anywhere a certificate is used, is that the browser should display an informational banner that the certificate is due to expire soon. I’d suggest starting to warn at T-7 days.

That way even if the business messed up, they would have a heads up from users to fix it before d-day when everything stops working. This includes website certs and addon signing certs and any intermediaries.


Not sure what kind of business processes are practiced at Mozilla. Some organizations have the notion of recurring tasks as part of their business processes. Recurring tasks are just like bug reports except they are created and assigned automatically to task owners on a schedule, such as every month, every quarter, or every year.

The goal of recurring tasks is to get people's attention to review and perform tasks that happen periodically. It could be as simple as reviewing the task and marking it done. They show up alongside bug reports for the assignees, so they can see in one place all the bugs, feature requests, tasks, and recurring tasks.

Cert renewing would fall under the recurring task category.


Who at Mozilla is responsible for cert renewal? Is it Mozilla IT or is it the Firefox org? That question has never been answered, and those who asked were often reprimanded. And this is far from the first certificate renewal problem.


Reprimanded for raising the question? That's quite dysfunctional. Sounds like that's an ownership problem.


Like a reminder that gets automatically added to the taskboard?

I like it! Will talk about this with my team lead!


I'm not familiar with Firefox extensions (and have pretty much stayed away from the stuff ever since they started making it "mandatory"...) but shouldn't the expiration only mean new signatures won't be valid, yet signatures made before expiration should remain so? At least that's how I understand things like Windows' driver signing works (when that was first introduced, I was quite scared that it would mean perfectly working drivers could just stop working due to the expiration, and asked... but apparently no one at Mozilla asked this question.)

Edit: wow, downvotes? Care to explain what I'm missing?


In short, yes, they should have implemented timestamping for their code signatures like most other code signing systems do.

Without timestamping the expired cert always would have caused problems, even if it was replaced early and correctly: Every add-on would still need to be signed again with the new replacement certificate and shipped to all users. It's not as easy as just replacing the certificate on some server.

Well, this is still what has to happen: replace the certificate, ship that new certificate[1], re-sign every add-on, ship every add-on to every user.

Now, in order to ship new versions of the add-ons, you probably will have to bump the add-on version numbers as well. Which can have further unintended consequences.

[1] Incorrect, see below; it is my understanding that the certificate in question is baked into the browser itself, with no way to push updates just for the certificate remotely other than shipping an entire new Firefox build. Well, 6 new builds: esr, stable, dev, beta, nightly, unbranded. Gonna be a fun night for a lot of Mozilla folks... Well, a night is not gonna be enough...

I might be wrong tho, and misunderstood something.

EDIT I was wrong (https://news.ycombinator.com/item?id=19824520), the expired cert is not baked into the browser, just into the add-on package files. No need for new Firefox binaries, after all. Still, they have to resign all add-ons and ship new versions.


This same behavior is how certs usually work. Stuff with expired certs just does not run after the expiration date; that's because the cert tells you what server to ask for authentication, and if you have an old cert, there's no way to be sure that the original issuer is still the one in control of that domain.


You're referring to things like HTTPS and contacting a remote server, and thus your reasoning there makes sense.

I'm referring to traditional code signing, which I assume Firefox extensions are more similar to --- the goal being to ensure that some data has not changed since it was signed, and only the validity of the certificate at the time the data was signed is meaningful; even after the certificate expires, a signature created when it was valid still asserts that the data it signed has not changed.


Newbie question: why can't they just renew the certificate, like in 5 minutes?


A new certificate can be generated in a couple of minutes if you have to, sure. Of course, you probably defined some procedures to do it properly and securely that require more time.

Then the issue becomes: how to get the new certificate to a few hundreds million users?

If it was a certificate on some server, just replace it there, done. Client software will just pick it up. But not here. A copy of the certificate is shipped in every add-on package file. Oops. Now you have to re-sign all add-ons with the new certificate. And get those resigned files to the users.

Essentially it works like this (which is a slightly modified jar/apk signing mechanism):

- An add-on package is a zip file and other than the actual files there is also a list of known-good hashes of those files in a file called "manifest.mf" in the META-INF folder

- Then there is a file "META-INF/mozilla.sf" giving hashes of "manifest.mf"

- And finally, there is "META-INF/mozilla.rsa", which is a DER-encoded pkcs7 signature and two certificates. The signature verifies "mozilla.sf" was not tampered with and still is the same as when it was signed by mozilla. Which in turn verifies the known-good hashes are still proper.

- The signature is made with a generated certificate, the first one included in "mozilla.rsa". E.g. "CN=uBlock0@raymondhill.net" in case of uBlock.

- The "CN=uBlock0@raymondhill.net" certificate was issued by an intermediate certificate "CN=signingca1.addons.mozilla.org". This "CN=signingca1.addons.mozilla.org" is the second certificate in mozilla.rsa. It says "Validity Not After : May 4 00:09:46 2019 GMT". Oops. This is where the chain breaks now!

- "CN:signingca1.addons.mozilla.org" was issued by "CN=root-ca-production-amo". This root certificate is baked straight into the browser and not part of mozilla.rsa.

Therefore, it is not enough to issue another intermediate certificate (e.g. "CN=signingca-number-two.addons.mozilla.org"), but you have to actually generate a new "CN=uBlock0@raymondhill.net" (or whatever) signed by this new certificate, put those two certificates and a new signature of "mozilla.sf" based on those new certificates into a new mozilla.rsa FOR EACH add-on and ship updated add-on files.

PS:

Try it yourself... Extract some addon package (it's a zip file). Then:

    openssl pkcs7 -in META-INF/mozilla.rsa -inform DER -print
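If you want to see exactly which certificate in the chain has lapsed, here's a rough follow-up sketch (untested; it assumes you are still in the unzipped add-on directory and that the intermediate is the second certificate in the bundle, as described above):

    # Dump the embedded certificates to PEM (also prints subject=/issuer= lines)
    openssl pkcs7 -in META-INF/mozilla.rsa -inform DER -print_certs -out certs.pem
    # Print subject and expiry of the second certificate in the bundle, which
    # should be the intermediate (CN=signingca1.addons.mozilla.org)
    awk '/BEGIN CERT/{n++} n==2' certs.pem | openssl x509 -noout -subject -enddate

On an affected add-on, the notAfter printed for that second certificate should be the May 4 00:09:46 2019 GMT date mentioned above.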


The root certificate is also part of mozilla.rsa. Add-ons have it included; it can be extracted and then imported into the browser under Certificates > Authorities, and it will then be used to validate add-ons, including those which have not been updated.

A sample procedure doing exactly this, and a fix for Firefox <= 56.0.2 can be found here: https://www.velvetbug.com/benb/icfix/

The same procedure can be used on newer versions, but the syntax is a bit different (import cert, go to about:addons, open console):

  Components.utils.import("resource://gre/modules/addons/XPIDatabase.jsm");
  XPIDatabase.verifySignatures();
This only makes sense if you really need to fix it while the browser is running, otherwise you can simply restart it after the new certificate is imported.


They would have to reissue the intermediate certificate, as well as all dependent certificates.


And there are often bootstrapping problems where e.g. the push that distributes the new intermediate cert to clients is rejected because of the same expiry issue.


I like the alias name assigned to it: armagadd-on-2.0

Temporary work around till the cert gets fixed: set "xpinstall.signatures.required" to false


That doesn't work unless you're on dev or nightly.


Guess it only works on the release version if you compile it yourself. (Which is the case for me)


That only works on Nightly and Developer Edition.



Even though that's active and has more votes, it's ranked very low (#39 as of right now, and this post is now #1 on the site).

I think HN penalizes non-link posts (or people are flagging it because they think it's just someone asking for tech support).


Looks like you're right. I saw the other discussion first and it still has more comments for now, but this one is ranked higher. Perhaps the threads could be merged or something.


Thanks, missed this thread.


Honestly, I just have to wonder what the Mozilla devs do. Don't they use extensions? Didn't they notice what was going on? Why does it increasingly feel like a random Google Chrome-Lite paired with Apple's 'best intentions'?


This is a certificate that expired, not some insufficiently-tested change someone deployed. Mozilla devs had their extensions disabled at the same time as everyone else.


Well, most of the actual developers probably run Nightly and have xpinstall.signatures.required = false, and are therefore not impacted quite as severely.


This is a goddamned disaster. I'm just thankful that I use an offline password manager, but even still ...

I like FF, don't get me wrong, but this is going to absolutely fucking destroy user trust in Mozilla. This kind of incompetence, on a browser scale, is breathtaking.


I dunno. I’m a typical Firefox user, and I’d rather jump off a bridge than switch to a different browser because of a fuckup like this. People make mistakes, but Mozilla still stands for things that certain other browser vendors don’t, last time I checked.


That's my thinking too. I went back to Firefox a few months ago and it is back to being a fantastic browser now, and it feels good to use something that is also a force for good. I'm hoping they resolve this quickly and that it all turns out ok.


>and that it all turns out ok.

While I agree with you two assuming the bridge is over water and not too high, there are real consequences that cannot be reversed. I cannot unsee the ads I saw in the past few minutes before switching to nightly.


Sadly, what they "stand for" and what they actually do are two different things. This is exactly the kind of centralization that a company supporting a "free and open internet" (to use their words) should be against on principle, let alone pushing in their only product of note.

This should not be possible.

Worse, had they not taken the paternalistic, nanny-like stance that you can't even disable the signing checks, I could roll out a script that would make this a non-issue for my users. But no, thanks Mozilla for ruining my Monday.

Might not be the most substantive comment I could possibly make in the circumstances, but I'm pissed. The only appropriate response feels like a string of infuriated profanity directed at their incompetence and decision-making.


Once Palemoon went to crap Firefox is the last one that does what I want. Good luck mozilla, I'm sure you guys are losing your minds right now.


True. But they also increasingly stand for things I completely disagree with. Namely, deciding which software is approved for me to run on my computer. The way I see it, extensions shouldn't need to be "approved" anyway.

Luckily, I can still type "make install" without Debian informing me that "random_dangerous_untrusted_code_from_interwebs" is not approved.


Yes, this is truly a gut-punch for everyone who has spent a bunch of time and effort getting their family and friends on a decent, cross-platform password manager (like LastPass, 1Password, or Dashlane).


Seems like an over-reaction. "Destroy user trust in Mozilla?" Really? Because your extensions got disabled for a day?


(Big fat disclaimer: I work for Google. These are my opinions and not my employers. I don't work on browsers. I test my code in Firefox. Etc etc.)

Sadly, I have to agree that this feels like a big blow to user trust.

User trust is not really just about respect or values; it definitely also includes things like performance and reliability. The average user, right now feeling powerless, might even feel anger towards Mozilla for this - after all, they already downloaded the extensions, so why would they all just stop working behind their backs? They don't understand what CAs are or why certificates expire. Frankly, people don't care what place your heart is in when they are angry about something. Perhaps people are being dramatic, but that's normal. People are pretty darn dramatic about Chrome, too.

Meanwhile... I use Firefox everywhere, and I've lost my password manager, adblocking, security-related extensions, etc. all in one go, and the only solutions I'm aware of involve disabling extension signing. Gotta admit, even though I will probably continue using Firefox after this, that it certainly is a bummer.


> this feels like a big blow to user trust.

And yet every other major browser vendor has punched their users with far worse catastrophes of privacy, security, ripping away features, breaking features, and general shitheaddedness.

Switching browsers because of this incident is like ordering a burger at your favourite restaurant and one time it comes out without the meat patty, so in protest you switch to a crappy alternative restaurant that has had a long history of health code violations.


I'm going to skip the analogies and just say this: If tomorrow this is still broken and I have a choice between installing Chromium, and installing Nightly + disabling security features, It's going to be a tough dilemma for me personally.

I'm glad you have software/vendors you feel you can trust. I definitely don't feel that way about most software anymore. I do think you are being a bit hyperbolic regarding other browser vendors, but to each their own, I don't know what trying to argue about that would solve for anyone.


Well, when you are a Google employee and you are testing code in Firefox... you already have Chrome and Chromium installed.

I think the real tough dilemma is being sad about a nonfunctioning adblocker while working for the biggest internet ads company.

So are you working in the Chrome marketing department?


>Well when you are google employee and you are testing code in firefox... you already have chrome and chromium installed.

I have computers other than my work computer(s). I indeed do not have Chrome or Chromium installed on my home boxes running NixOS. I do not use my work devices for personal web browsing. I'm currently posting this message with Firefox 66.0.3 on NixOS 19.03.

>I think the real tough dilemma is being sad about a nonfunctioning adblocker while working for the biggest internet ads company.

>So are you working in the Chrome marketing department?

I'm a software engineer. I'm also over at Github:

https://github.com/jchv

I work at Google because it's an excellent place to work. I'm far from elite; I didn't finish college (couldn't afford it) and I grew up in the suburbs of Detroit, so being able to work at any large SV company is something I don't take for granted. I don't think any single employee can claim to love 100% of the things Google does, and that's fine. Nobody is required to.

As for why I would use an adblocker, practically speaking it's both for reducing annoyances and increasing security. Malware (and 0days!) delivered via ads is not unheard of, sadly.


I know the typical user may have struggles, but FWIW: I installed Nightly, toggled xpinstall.signatures.required to false, installed uBlock and uMatrix, and will live with my pw manager's native application for a day or two. It took about 5 minutes.


In fairness: I don't really want to disable signature checking. I value these security features and I'm hoping that by tomorrow morning Mozilla has a better solution.


Users will drop a product for the slightest reason. For instance, one of our users recently left a negative review. Paraphrasing, "Logging in is difficult".

We checked our warning system (set up to detect suspicious logins; it incidentally also catches any users who've been locked out because they forgot their password), and his last login attempt took a total of two tries.


Honestly, then he is just an idiot, and he will come back when he realizes he has to log in to other services too.


Still beats using Chrome.


Absolutely. 100% true.


On Linux, Dashlane only has a browser extension. Luckily, Bitwarden has both desktop and browser versions on Linux.


>This is a goddamned disaster. I'm just thankful that I use an offline password manager

I'm not sure this cert is used with the PW manager?


I think MrEldritch is referring not to Firefox's built-in password manager, but third-party password managers such as Bitwarden, KeePass, and LastPass. Those rely on add-ons for browser integration.


That's what I meant, yes. (I didn't even know that KeePass could be integrated with the browser, I've just been manually copy-pasting)


They don't have an official add-on, but they do list a few unofficial browser extensions on their download page: https://keepass.info/download.html

Example: https://subdavis.com/Tusk/


Once you have targets mapped correctly, keepass2android is great, and now that I've gotten used to it, I prefer the system UI password filling for everything including the browser. Also, installing the keyboard extension is great and makes for an easy way to quickly access logins.


> I'm not sure this cert is used with the PW manager?

Firefox password storage isn't even encrypted by default, last I checked.


I use firefox and am probably affected by this but don't even really notice atm. This doesn't even register on my user trust spectrum when the only other option is the browser that defines surveillance capitalism.

I think we'll all live. No need for the chicken little act.


I'm not sure the GP is overstating things. For technical folks with technical reasons to be using Firefox: yeah, a mass exodus is unlikely purely because there aren't any good alternatives. What are you going to jump to? Chrome, and knuckle under to the Goog? Unbranded FF forks and be weeks behind on patches? Doubtful.

My concern is around non-technical users (the group, mind you, that Firefox has been spending marketing dosh on courting recently with Quantum and all) who don't have as compelling reasons for not just switching back to Chrome. In the last hour, I've gotten several phone calls from family members asking me why the browser I convinced them to use is broken. I don't have a good answer, because platitudes about surveillance and muh freedoms don't count for shit when your grandma just wants to get rid of the ads on the local newspaper site.

I'm personally going nowhere and deeply appreciate Mozilla for all the work on FF and friends, occasional fuckups aside, but I don't think this is going to be a non-event for a browser that's been desperately fighting to regain market/mind-share.


platitudes about surveillance and muh freedoms don't count for shit when your grandma just wants to get rid of the ads on the local newspaper site

Very much this. People are often too quick to forget who their customers are and what they really want.


I'm using the Chromium-based Edge at work, and it's going to be a great browser. It's the one that most people should switch to if they want to leave Firefox. I've been using Firefox since 2002 and never left, but native browsers like Safari/Edge are appealing for many reasons, including deep integration for the best battery use. I tend to recommend people use native browsers or Firefox but never anything else.


I'd be curious to see if there's a marked increase in net sales during the time that this snafu is ongoing just because uBlock Origin is disabled and users are forced to see more ads.


Maybe, if FF had a bigger market share. On top of that, I think many FF users know not to browse without it.


Patches to workaround the issue were just committed. Those patches were tracked at https://bugzilla.mozilla.org/show_bug.cgi?id=1549010. At the time of writing this comment, that bug is not chained up to the bug linked by this post.


This is highly ungood, of course. The failure to notice an upcoming expiration is terrible in itself, but I suppose we can sort of almost empathize with how such shit may happen. The failure mode is inexcusable. Someone somewhere sometime must once have made a decision, "this will be the right way to fail", or - worse - developed the add-on certification system and decided it didn't need no stinkin' graceful degradation route in case of disaster.

Still, it's no worse a calamity than most everyone else presents to the world from time to time. I shall stick to my Firefox, partly influenced by the complete absence of any viable alternative out there.

By the way: Writing this a few hours after midnight, May the fourth, local time. Every single one of my 33 Firefox addons is active and working just fine. Waiting to see. Breaking out my Waterfox, just in case.

[Edit: May fifth -> fourth]


My extensions are still running. I even restarted Firefox a few moments ago. So it’s not everyone?


My extensions are also still running. Comments on the Bugzilla bug are restricted, so I'm adding details here, figuring that if this is useful someone can forward it to the right people.

    $ date
    Fri May  3 22:45:22 EDT 2019

    $ date --utc
    Sat May  4 02:41:47 UTC 2019

    $ firefox --version # Installed from arch repositories
    Mozilla Firefox 66.0.3
about:config

    xpi.signatures.required true
    app.update.lastUpdateTime.xpi-signature-verification 1556920447
    extension.update.enabled true
1556920447 is unix timestamp Fri May 3 21:54:07 UTC 2019.

Edit: I think I know why. It checks the signatures daily, and the timing works out so it hasn't checked since the cert expired for me. Just luck, it will break within the next 21 hours for everyone. From the source code:

    const XPI_SIGNATURE_CHECK_PERIOD      = 24 * 60 * 60;

    [...]

    timerManager.registerTimer("xpi-signature-verification", () => {
      XPIDatabase.verifySignatures();
    }, XPI_SIGNATURE_CHECK_PERIOD);


Copying a potential workaround here from my lobste.rs comment, not really tested obviously.

If it hasn't broken yet for you, I think (but I'm very much not sure) setting that preference to 1556940100 should keep it working until 24 hours from now. And if you keep updating that value every 23 hours to the output of date '+%s' until it is fixed via a Firefox update, it should keep working forever.

I think you need to restart the browser as well after updating the preference for the above idea to work.
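For what it's worth, here is a minimal shell sketch of that bump (untested; the profile path is a placeholder, and it assumes Firefox is closed so prefs.js isn't overwritten when the browser exits):

    # Bump the last-verification timestamp to "now"; repeat before 24 hours elapse
    PREFS=~/.mozilla/firefox/your-profile.default/prefs.js   # placeholder path
    now=$(date '+%s')
    sed -i "s/\"app\.update\.lastUpdateTime\.xpi-signature-verification\", [0-9]*/\"app.update.lastUpdateTime.xpi-signature-verification\", $now/" "$PREFS"

Doing the same edit through about:config and then restarting, as described above, achieves the same thing without touching the file by hand.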


All three of my devices on three different operating systems still have their add-ons (for now), so it's definitely not universal.


Mine is still working. I set the system time forward 2 days and restarted Firefox and the extensions still work.


Definitely. Everything works on my Debian Buster while Windows machine wasn't so lucky.


Mozilla doesn't seem to have communicated the issue well. I could imagine a lot of unsavvy users have tried some wild things in an attempt to fix the problem, and maybe made a mess in the process. Doesn't Mozilla have a mechanism for blasting out a message to all Firefox browsers? Also I have a Firefox account, why haven't I been inboxed about this?

Otherwise I'm not bothered. I won't be switching as long as this gets resolved within the next few days.


> Doesn't Mozilla have a mechanism for blasting out a message to all Firefox browsers?

The cynical side of me says that it must not have this feature because if it did I'd have seen someone complaining about the browser "phoning home" or "forcing Mozilla's opinions into my eyeballs".


Yeah, I guess I could see people coming in with that take. Personally I'll gladly accept a tether to the mothership if they offer the choice.

That said, I regularly get thinkpieces from Mozilla about the open internet, privacy, requests for donation (which I often oblige), etc., in my inbox. They also drop notifications under the address bar and in a "Message from Firefox" bar at the bottom of the new tab page, which I got the impression they had some live control over.

But maybe not, and/or maybe there's a long runway on preparing correspondence by any of these channels for some reason. If anything, I hope that when they do get around to getting some wide-reaching messaging out there, they indicate that they plan on doing the work to shorten the runway on emergency messaging in future.


They have. How do you think they pushed that Mr. Robot add-on?


...unlike the mothership breaking all addons, and the browser being designed intentionally in a way that prevents me from working around the breakage?

This isn't just a petty snipe borne out of annoyance. Ads being the malware vector that they are, and the degree of tracking and data mining out there, all of those countermeasures being turned off overnight is an exposure that should be treated with the same degree of seriousness as PII breach at a company you have an account with.

God forbid you use FoxyProxy or Tor Browser or something else that masks your connection source - this could have legitimate, real-life consequences if you don't notice the change.


On the one hand, I understand the point you're trying to make.

On the other hand, I'm going to be honest, I have trouble reading your post without thinking things like "If you are literally trusting your life to FoxyProxy, you might want to rethink your entire internet safety strategy." Another favorite was "Defense in depth."

I've had dozens of different experiences where my extensions silently and unexpectedly malfunctioned. Configurations getting erased, new extension releases with breakage or different behaviors, maximum compatible versions in the manifest, new permissions, botched keystrokes in the extensions page, profile corruption, incompatibilities between extensions, internal bugs that cause them to crash-loop without doing anything, the works. Like, I run a ton of addons, some of which are a bit esoteric and a couple that I compile from head every few days, but even taking into account my outlier-sized surface area it's a bit silly.

I'm not going to claim that this isn't a problem. That said, blaming Mozilla for a life-threatening failure of FoxyProxy, of all things, is like blaming Cessna because a journalist flew one of their planes into a combat zone. There just wasn't any way it was going to end well.


If the nature of this problem didn't also break turnkey distributions like Tor Browser (this kills Noscript, which means your identity can be leaked), I'd agree with you.

There's only so much defending you can do against a failure like this (running Tor Browser is already pretty uncommon) and the blame for it has to be laid squarely at the feet of Mozilla for the way they chose to centralize their plugin architecture.

This is one of those low-likelihood/high-impact events that tend to catch everyone by surprise. If you spend all your time as a user thinking about these failure modes (you don't... nobody does), you'd be unable to get much else done. I'd wager the fact that the browser would suddenly gimp itself is not something the average user (even the average Tor Browser user) thinks about or plans for.


Even if there's no way to blast all Firefox users, there are blue links to standard Mozilla-hosted "help" web pages within the Firefox add-ons config, links that a non-insignificant number of confused users will probably click. Those web pages could easily be updated with info about the cert expiry snafu.



lol, both pivotal sources that I'll swiftly consult when something in my browser goes haywire.

Seriously though, when I checked the Mozilla and Firefox twitters (and I don't use twitter, so even going that far was a stretch for me) just before I wrote the parent post, they hadn't gotten around to tweeting a notice about this there, either. The Mozilla Add-ons twitter account is not the highest-level place that should be talking about this situation.


> 429 Too Many Requests

Mozilla's discourse forum is now offline :)


The only extension that failed in Waterfox is Tampermonkey, which stopped working but did not get removed the way it did in Firefox.

As for a system to push messages to Firefox users, is there anything like this in place? A standard? If so, could it notify Waterfox or IceCat etc. users at the same time as Firefox users?

If there is no specification for this, is there a similar FLOSS project that could be forked and molded into what we need?


I imagine it's the weekend, so no one's on hand.


Tomorrow (or whenever this gets fixed), ad companies are going to have some great data about what the world would look like if adblock didn't exist. I really hope someone does a blog post about it. Yikes, I hope it doesn't play out like a shark smelling chum.


Firefox has anti-tracking features that are enough to freak out reCaptcha regardless of adblockers, so that might not be as good for them as it seems. Be careful not to confuse hiding ads from view with preventing user tracking.


Given that Firefox's market share has dropped below 10%, I doubt it will make much of a difference.


Yes but there's enough sample size for them to multiply the change by ten to get what would happen if Chrome lost adblock as well.


I just ran into this. All my extensions were immediately turned off in the middle of a browsing session, from UI conveniences to important safeguards, with no apparent option to turn them back on again.

Warning about something that can't be verified is one thing, but automatically shutting things off -- particularly things that could affect security and privacy -- is a Windows 10 level of unacceptable interference.


This is why users need to be in control of their own computers. Why can't I tell my copy of Firefox to ignore the certificate? Why can't I sign my own extensions?

Mistakes happen, it's okay. But users should be empowered to work around them.


It's always felt somewhat unsettling that, although the Internet is decentralised, a small number of large organisations have a surprisingly large amount of control over the software which the majority of users view sites with. I don't like this concentration of power --- even if you think Mozilla is benevolent, it's not immune to making mistakes; and the more power it has, the bigger the consequences.

Before the forced code signing, before the automatic updates, Mozilla or any other organisation's mistakes would not have such dramatic effects; now, they have the power to basically break almost all their userbase nearly instantly, and that is what worries me the most.


> Why can't I tell my copy of Firefox to ignore the certificate? Why can't I sign my own extensions?

The issue is that if you leave any sort of lever that reduces security, it will be abused by bad actors. This is why browsers offer ever fewer ways to bypass security and gain full access. It is annoying, but at the end of the day, protecting 99.999% of the users trumps what us power users want.


protecting 99.999% of the users

It is horribly paternalistic to advocate for keeping users ignorant, unlearning, and --- dare I say it --- easily manipulated.

I will refrain from mentioning again that infamous Franklin quote. I am frankly very fucking pissed off by this authoritarian walled-garden trend, and vehemently oppose anyone who helps this industry put the nooses around the necks of others as well as their own.


I’ve been in software development and operations for 25 years.

I still don’t want to have to understand everything I ever touch, even if I could.


>I still don’t want to have to understand everything I ever touch

If you don't understand it, don't touch it. The default settings should work for most users. There can even be a warning against touching without understanding, like with Firefox's about:config. The offensive thing is preventing users from touching even if they do understand.


The difficulty is in how to keep them available to end users while keeping them unavailable to malware and bad actors who post "helpful" advice or publish temporarily useful addons that get updated to malware.

I'm not disagreeing with you, but the right mechanism is not straightforward to figure out, and you'll always be in a game of cat and mouse. One that sucks resources from whatever other useful stuff you might be spending your (or Mozilla's) time on.


I'm not understanding the relationship. Of course users aren't going to understand all the underpinnings of how software works.

I do think that in the future, it will be imperative for everyone to have some level of technological literacy above what is currently the average. And I'd like to work to get to that point, instead of taking all the tools away because they're too dangerous.

Also, sensible defaults are good! Hiding dangerous settings is also good! What's not okay is making those settings completely unavailable. At least in Firefox's case you have the option to recompile the source code, but that should not be the only recourse...


"don't run privileged code from people you don't trust." Is both critically important to understand for anyone using a network connected computer and not at all complicated.

If we're going to be authoritarian, I would rather ban anyone who doesn't understand that from connecting to the internet than have a broken walled garden.


> "don't run privileged code from people you don't trust." Is both critically important to understand for anyone using a network connected computer and not at all complicated.

That is absolutely complicated for the vast majority of the world's internet users. No one else in my family would understand what the hell "privileged code" means, and they shouldn't have to.


The statement can be simplified down to "don't run programs downloaded from random websites which ask for your admin password."

Adjust the qualifier at the end depending on your platform. On Windows, it might be apps that present a UAC dialogue—or maybe just remove the qualifier, since Windows doesn't do much sandboxing by default.


The issue here is that this wasn't done in a vacuum. Other software vendors were secretly and deceptively installing extensions that were tracking everything users were doing online.


Most people do not know what a manifest.json is or what sort of permissions they're handing to a random WebExtension.

If you want your freedom from reviewed extensions: fine, get an unbranded Firefox, or Developer edition, and you get that.


The Developer Edition is unstable, so I wouldn't want to use that.

If you can recommend a fork that allows extension sideloading but is kept up to date, please do so; I've been looking...


Okay, figured it out. Thank you for giving me the search term "Unbranded Firefox"—I didn't realize this referred to something specific.

Unbranded Firefox is actually a specific version of Firefox distributed by Mozilla, which allows you to disable extension signing requirements. I am very glad to see that they offer this, and I will be using it from now on.

https://wiki.mozilla.org/Add-ons/Extension_Signing#Unbranded...


Completely agree. Using firefox feels more and more like using an iThing.


Consider the recent news stories about the Boeing 737 Max. Boeing added an automatic system to an airplane, and then didn't give users (the pilots) a way to disable that system. This worked out great while the automatic system is working properly. When the system broke, well, we all know what happened.

If we're going to assume that software is right and the user is wrong 100% of the time, then the software needs to actually be right 100% of the time. Unfortunately, our software isn't that robust, and it never will be.


It doesn't have to actually be right 100% of the time. The balance of downsides and upsides of any chosen solution just has to be more palatable than that of whatever alternate implementation you're considering, with the tradeoff between 100% correctness and the ability to be implemented before the heat death of the universe being one of the axes to consider, as well as the level of benefit provided over your whole user base.

In this case, dropping the extra control/ignoring power users is probably saving a lot of non-power users from shooting themselves in the foot in the vast majority of cases. Pilots (should be) 100% power users. The average operator of a browser is somewhere on the opposite end of the spectrum.

Any real system will have things go horribly wrong for some subset of users on a regular basis. It's impossible to be all things for all people for all situations, so you have to choose your battles.


You mean the 737Max?


Yes, that was stupid. Edited now, thank you.


> The issue is that if you leave any sort of lever that reduces security, it will be abused by bad actors.

As you can see in this discussion, there already are some obtuse ways to disable/ignore the signing. It's just way worse if people have to disable the signing entirely instead of adding trust for their own certificate, so that only Mozilla's and the user's own add-ons are trusted instead of all the malicious garbage out there on the web.


Makes me think of this ticket: no workaround for the error other than closing the browser and editing a file, or modifying the browser and recompiling. No options at all in the advanced settings.

https://bugzilla.mozilla.org/show_bug.cgi?id=1528738

It's stuff like this that makes me unhappy with Mozilla. Users who know what they are doing should be permitted to do so. Warn them "here be dragons" or whatever, but it's ultimately their choice.


It is possible, according to another post by bitbang [1]:

> Temporary work around till the cert gets fixed: set "xpinstall.signatures.required" to false

https://news.ycombinator.com/item?id=19823879


Gratzi!


Because it's hard to tell the difference between "users" and "malicious software running on their computers".


If the malicious software can adjust protected user settings, can't it also just inject into the Firefox process directly?

If there's a privilege level that allows for one but not the other, that sounds like something Mozilla should fix.


Fortunately, it is no longer necessary to run malicious software on user computers. With the latest "advancements" in Firefox security, everyone can publish malware directly to the Firefox add-on center [1]. No review needed!

"We accidentally uploaded all your HTTP requests to our servers, but we will definitely fix that in next addon version!~"

[1]: https://arstechnica.com/?post_type=post&p=1340459


There are multiple versions of Firefox that do that: the Developer edition, and most of the unbranded versions.


You can, though, on both counts? Use the dev version of Firefox and you get the power to ignore add-on certificates, and you can already self-sign your own extensions as much as you like?


On a slightly different note, is there some curated collection of serious incidents like this somewhere?

Something we could refer to when discussing possible pitfalls?


I believe https://github.com/danluu/post-mortems is close to what you want.



I've seen a (then) current version of GlassFish (Java EE server) refuse to start because of an expired internal certificate a couple of years ago.

So here is one example: https://stackoverflow.com/questions/18248020/certificate-has...


It's pathetic to see the attitude demonstrated by Mozilla support on this.

diox commented 4 hours ago

I'm locking this like I did in #851 because no new information is being added. We're aware and we're working on it. This conversation has been locked as spam and limited to collaborators.[1]

Bug 1548973 (armagadd-on-2.0) All extensions disabled due to expiration of intermediate signing cert NEW Unassigned (Needinfo from 3 people)

Kevin Brosnan [:kbrosnan]

We have confirmed this issue. Extra comments about this being broken will not advance this bug to being fixed.[2]

Mozilla just left their entire user base unprotected against ads, trackers, and some hostile code. Then they insult their users.

Undoing the damage is hard. First, they have to update their signing certificate. Then they have to re-sign all the add-ons. Then users have to reload all the addons. Then, something users won't do - remove all the tracking cookies, etc. that slipped in while Firefox was broken.

[1] https://github.com/mozilla/addons/issues/978

[2] https://bugzilla.mozilla.org/show_bug.cgi?id=1548973


Forgive my ignorance but how are they being insulting by locking the issue?

They are working on it, and seeing 1000s of “me too” comments in the issue isn’t going to make things better for anyone. Least of all their customers, who, when they do update the issue with more info, won’t have to wade through pages and pages of noise before they get to the actual update from Mozilla.


Closing a duplicate issue is an insult these days?

Or are you talking about them not putting "Best regards, Tim" at the end of every message?

People these days are really offended by any and every little thing.


> Then users have to reload all the addons

I'm pretty sure Mozilla will implement a fix in a way that users only have to update their browser, not do anything to all their addons.


I'm curious about how an update will be able to differentiate tracking cookies from legit ones?


They should be able to obtain a new certificate based on the same private/public key, in which case I don’t think any add-ons would need to be updated.


The problem with this approach is that the expired certificate is part of the add-on package files (META-INF/mozilla.rsa; DER encoded PKCS7), not something that you can just swap out on some server. You have to replace the certificate in the add-on packages with the new cert, even if the new one reuses the keys of the old one. At which point you need to ship new add-on package files to users anyway, so key reuse or not makes no difference anymore.


Don't exaggerate. It's a critical bug now and they are working on it.

Let them work on it.


You know that for every comment someone posts the developers get an email? It’s not useful if you get 500 emails that state a user is deeply inconvenienced by this bug. That only leads to people filtering mail into the trash.


This is a shocking display of not just incompetence and bad practices but of brazen undisclosed covert control. Why should your local browser depend in this fragile way on some muckup in Mozilla HQ? And people line up to defend this?

Where does it explicitly say Mozilla can disable my addons remotely? When did I give them this power? And this from a so-called 'open source privacy focused' browser. This is a mockery of privacy and open source, and they shouldn't trade on this goodwill to gain users.

There can be no bigger security hole, yet security fear mongers preach exactly this abusive model. This kind of centralized remote power is a far greater security threat than anything they keep harping about; 'good intentions' and 'good faith' are not remotely something anyone should have to depend on. Why should Mozilla babysit my installation? Shouldn't they be using their resources to do something productive?

There is something rotten in SV culture, and we urgently need to think of alternatives that are not infused with this 'know it all' abusive surveillance culture, as even after such an egregious abuse of people's trust and faith, all you will get is hand waving, normalization, apologism, and snarky entitled comments that trivialize people's concerns and the choices they made on the goodwill of open source.


Sure, anyone can make a mistake. We have seen big companies make stupid mistakes too. But this is Mozilla we are talking about. How this slipped by is beyond me.


DAMN! That's quite something!

For a piece of open source software, you really have very little control with Firefox. It really sucks that the alternatives are worse.

This, likely for almost all of their users, creates more of a security problem than signature checking actually solves. For me, NoScript no longer works, which is (IMO) a critically important extension (between Mozilla taking away the disable-JavaScript button and Spectre).


OTOH, it's open source, and you can re-compile it in like 20-30 minutes with whatever changes you desire. So you do have control. You can probably even add some hacked-up support for multiple signing certificates (and add yours there), if you tried.

It's just that it's more work than having what you want implemented and maintained by others.


They don't use cryptographic timestamps with their signatures? The certificate might now be invalid, but the signatures were done at a time when it was valid...


The problem is that "time" is fungible and can be forged. The date on a signature doesn't really mean anything.


This is a very bizarre justification for an obvious bug. Code-signing does not work that way anywhere else — neither in Android, nor on iOS, Windows or any other common platform.

There is a possibility that Mozilla implemented their backwards code-signing model on purpose — for example, it allows them to oust unwanted extensions without explicitly recalling their certificates. But personally I think that they just didn't give the matter enough thought.


Emphasis on cryptographic timestamps.


Certificates have been in common use online for maybe two decades now, if not more. This is a common failure mode and it keeps happening. Is there some fix so we don’t have to keep dealing with spontaneous failures due to expirations? Or are we doomed to suffer with this until the end of time?


Use a certificate monitor https://letsmonitor.org (free)


Is that site legit? Why don't they even have a HTTPS redirect?


another month, another browser vendor that does something inconceivably bone-headed in "the service of users."

first it was deprecating ALSA for pulseaudio, then it was pocket, then tiles and their suggestions, then that weird video/voice chat thing, and then running "studies" as if my use of the browser was some tacit acceptance of my position as a guinea pig of the internet. Today every extension I use to make the internet even remotely usable is just...deactivated?

without my consent or knowledge?

enough. I'm switching to Waterfox. IceCat is even worthwhile at this point. Anything that respects my freedom and intelligence as a user.


This appears to be a mistake.

It's clearly bad that things can break this way. But this happened because a certificate expired.

Mozilla didn't have to require signatures and certificates for extensions. They did so because they want to protect users.

Protecting users with signature schemes increases complexity and, thus, the risk of debacles like this.


People take their privacy seriously. Our addons were disabled. I was browsing for a few minutes and didn't realize what was going on until I saw ads. Unacceptable that the default response to an expiring signature is to disable them completely.


You would have said the opposite if an expired signature had caused your browser to be compromised!!!

The default behavior seems desired to me.

The problem was that the certificate was allowed to expire.

I have a ton of respect for the fact that it was enforced!


So that's why I've been struggling for the last hour. It's crazy that it won't even let me install extensions from an xpi file, even with extension signature checking disabled.


Anyone know when they expect to fix this?

I basically can't (OK, won't) browse anything except HN until they do.


So:

  * Started day with browser, no problems with extensions
  * At some point in the afternoon, all extensions disappeared
  * A couple of hours later (cannot be more specific, I am afraid), all extensions came back
However, I'm noticing the saved preferences of some of the extensions have gone. E.g. the password manager has kept the username, but all "Multi-account containers" are now reset to factory defaults.

Is anybody seeing similar behaviour?



Temporary fix without enabling feedback to Mozilla.

1. Open the address about:config in the Tor Browser address bar

2. At the top of the page, search for xpinstall.signatures.required

3. Set the xpinstall.signatures.required entry to false by double clicking it

Note: This workaround should only be used temporarily, as it disables a security feature. Please remember to set the xpinstall.signatures.required entry back to true again once the Tor Browser security update is applied.


I'm surprised nobody's mentioned addon debugging if you really need an extension working. All of the extension packages are in the <your profile>/extensions folder, and you can load them for the duration of your session by turning addon debugging on in about:debugging and loading them in. This should be fixed fast enough that this fix will be good enough; just don't close your browser.


I'm running the workaround suggested by Reddit user @MeaslyTwerp (the one after the HN solution): https://www.reddit.com/r/firefox/comments/bkcjoa/all_of_my_a...


I've been using Firefox since 2002, always loyal because it works as I expect a browser to work. For users like me, who see Firefox's benefits as the only thing worth giving up the advantages of native browsers (Safari/Edge) for, the only real place we'd move to, given sufficient issues, is those native browsers.

The new Chromium-based Edge[0], for someone like me who would never use Chrome, is the only cross-platform alternative that could pull me away from Firefox. Native power usage advantages on Windows, with portability to other platforms and Chromium's speed advantages, make it a very attractive choice. I hate to see this happen to Firefox, but they don't have a lot of room for error. I've been using "ChrEdge" at work, and it's already a great browser as it is.

[0]https://www.microsoftedgeinsider.com/en-us/download/


Wow this is really bad, breaking millions of people's workflow in one go.

Fixed for now by switching to Firefox Nightly and disabling signing.


My password manager and every other plugin disappeared this morning first on one of my Macs and a few hours later on the other. I could not log into anything so I switched immediately to Chrome because I don't have time to fuss with workarounds. I'll return to Firefox when I hear the problem is definitely fixed.


Well, a little bit of empathy for Mozilla here: I've seen a lot of IT departments that don't have any sort of great system in place for managing certs. A lot of places I worked, I always had a suspicion they were a ticking time bomb. It's not enough work that it's really anyone's full-time job to manage them. Also, at larger companies, you might have divisions that do it in different ways without cohesion. And then a lot of the time certificate expiration is so far in the future that the people who initially set up a certificate might have left the company and forgotten to document it, etc. So that kind of thing can easily fall through the cracks.

Maybe a constructive thing I'm curious about: What is considered best practice for managing certs? How do people do this in a secure way that makes sure they get renewed in a timely way?
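Not a complete answer, but the lowest-tech baseline I've seen is a scheduled job that complains well before anything expires. A minimal sketch (untested; the path and alert address are placeholders, and it assumes a working mail command), run daily from cron:

    #!/bin/sh
    # Warn when a certificate is within WARN_DAYS of its notAfter date.
    # openssl's -checkend exits non-zero if the cert expires within that many seconds.
    WARN_DAYS=30
    CERT=/path/to/intermediate.pem    # placeholder
    if ! openssl x509 -in "$CERT" -noout -checkend $((WARN_DAYS * 86400)); then
        openssl x509 -in "$CERT" -noout -subject -enddate \
            | mail -s "certificate expiring within $WARN_DAYS days" ops@example.com
    fi

Fancier setups feed the expiry dates into whatever monitoring/alerting system is already watching everything else, but even a dumb cron job beats finding out from your users.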


"One or more installed addons cannot be verified and have been disabled" just showed up at the top of all my firefox tabs. #sadtrombone

https://i.imgur.com/hUe7Nyn.png


Still working for me in my main profile in Firefox 60.6.1esr. I could even install an update to HTTPS Everywhere!

When I opened my alt profile, extensions still worked, but I can't install updates—including for HTTPS Everywhere.

What's up with that?


A positive side effect: my 2-year-old FF installation stopped freezing constantly.


I'm interested in a proper root cause analysis and mitigation report.


Still beats using chrome.


Have fun defending this to your friends who switched overnight to Chrome again.


I've had Normandy disabled in my user.js for a while, but this was only after thoroughly perusing documentation and some Firefox "hardening" projects. Point being, no end user should have to do what I did. Sane defaults, and transparency about things that should be opt-in, are sorely needed. Regardless, I still stand by Firefox, and am thankful there are alternatives to Chromium. The web is what it is, though I dream of a simpler one.


My whole setup of 24 containers has been reset... Not only did it forget which domains were assigned to which containers, it also removed my manually created containers.


Advertisements. My eyes, my eyes!


What a nightmare! ---- ALL of my extensions have disappeared and have been disabled this weekend (with ZERO warning) ---- Trying to use my Roboform password program is a nightmare ---- If the problem is not 100% solved by Monday (May 6), I plan to exterminate every trace of Firefox from every computer I own ---- Shame on Firefox for placing users in such a nightmare situation with ZERO warning and ZERO solutions ----


i’ve a strong emotional reaction to this. it feels like a failure on all of us. will be fighting toward a more optimistic interpretation of such a snafu.


I never liked the change that extensions had to be signed by Mozilla.

Why do my personal extensions need to be hooked into some third-party service that can go out at any time?


The developer edition allows that just fine.


Developer edition is effectively aurora/alpha. It is buggy compared to release. And yes, I do mean that. If it were not it'd be 'release'.

Asking people to either give up control of their software (i.e., walled-garden release versions) or use buggy and insecure software (Dev/Nightly/etc.) is not acceptable.

It's why I switched to a freedom respecting Firefox fork as soon as they announced walled garden extension signing in Firefox 37.


Sure: debranded versions ("freedom respecting") will also do that.

You say walled garden, I see what random WebExtensions people install on their work laptops and think "yeah maybe someone policing this thing isn't the worst thing". But most importantly: it sounds like it's not actually a problem for you?


Not happy that Firefox didn't ask before it did this. I am getting really tired of my technology trying to tell me how to do things.


Can someone explain exactly what went wrong? I don't think I quite understand, but 7/9 of my extensions have been disabled.


Firefox requires extensions installed via their "store" be signed with a certificate to make sure they're actually from there. That certificate has an expiry date. It expired, so now all of its signatures are invalid -- and Firefox no longer trusts the associated extensions.


A minor added detail is that it wasn't the leaf certificate that expired. I've heard that that would have been handled properly. It was an intermediate cert, and I guess that possibly wasn't fully taken into account? (This is 3rd hand knowledge and speculation, note.)


That is accurate :)


Amusingly, my surviving extensions are for WAVE, Print Preview and other accessibility tools. Thank you Mozilla, you really do care!


That’s a bad sign on what they do with security


This is absolutely going to tank Firefox's market share and reputation. I have a site full of tech-illiterate users and they're all uninstalling Firefox and searching for alternatives. None of them are interested in the workarounds presented; most of them even struggle to turn their computers on! Mozilla really dropped the ball here.


This was probably an unintended consequence of trying to ban Dissenter. [0] [1]

0. https://dissenter.com/download

1. https://youtube.com/watch?v=f0Cc8RpqH1g



5 years since Heartbleed, yeah? Was this cert one that was pulled into sync due to being reissued?


I was able to temporarily work around this problem by going to "about:debugging" and using the "Load Temporary Add-On" button to load up the disabled add-ons inside of ~/.mozilla/firefox/<my_profile>/extensions/


After learning about the Normandy thing, I am literally now blocking every Mozilla, Firefox, and affiliate domain and their IP addresses. The trust is lost. I rely on the Debian repo, and like others mentioned, Normandy seems to bypass Debian oversight.


For anybody using Firefox on Android and uBlock Origin who now sees ads, it's worth checking out the DuckDuckGo Android privacy browser. So far, it seems to be working beautifully, so something good for me at least has come out of this mess!


My adblocker seems to have stopped working, no idea why. Anyone know how to fix it, or should I just wait?!


This helped me discover Firefox's Content Blocking setting, which is set to Standard by default, but I've now set it to Strict. Works better than an ad blocker!

Preferences > Privacy and Security > Strict


It happened hours ago. Why in the blue blazes of hell isn't this fixed yet?


Waterfox works in Windows, no issues with extensions, and it's supposed to have greater compatibility with legacy extensions.

Fennec works on Android, no issues with extensions, and it has the third-party trackers removed.


What kind of idiot thought that the add-ons I have personally installed in my browser need to have the capability to be remotely disabled despite literally nothing having changed?

This is absolutely inexcusable. I want to see everyone responsible for this "verified add-ons" fiasco fired from the team (after they roll it back, of course).


    remotely disabled
Were they really remotely disabled? That would mean somebody out there pushed a button and made your add-ons go poof.

As I understand it, the browser checks the certificate of add-ons at some point (on startup? on an interval?) and only uses signed ones. And since signatures are date restricted, previously valid signatures can become invalid.

I'm not 100% sure if this really is the mechanism. Would be interesting to hear from someone in the know.


How do you verify that nothing has been changed, if not by checking signatures?

Your call for everyone to be fired is very much in vogue but perhaps not at all useful. This isn’t a great thing to have happened, but the important outcome is, as always, knowledge and process that can prevent similar mistakes in the future.


Standard answer from a person that didn't understand the issue - "who is this bunch of idiots? Fire them all". Sounds like a C*O. :)


Yes. I mean, bugs can happen, but this wouldn't be so bad if they had provided a simple way to allow unsigned add-ons in the first place. You can't even create a simple personal add-on for yourself without uploading it to Mozilla first. That sucks. Please let me use my computer how I want.


Exactly. If I/Firefox verified the signature (or approved the download) when downloading or updating the addon, that should be all that's necessary. Why does Firefox have to check signatures constantly?


I worked on a code signing system a number of years ago and it was surprising the degree to which I had to rearrange the logic and the tool chain in order to get high code coverage on all the failure modes.

IIRC one of the tools wouldn't let me work with expired certs. I can't recall now whether I fixed that or made certs that expire in ten seconds and just waited it out.

Anyway, a number of people weren’t even sure why I was going through the trouble. It’s easier to get something wrong than it should be (super obtuse APIs) and you don’t always get enough support or pushback to get everything absolutely right.


In case it was revoked? Seems like a fairly reasonable approach.


Disabling with no option of user override is not reasonable. It only creates unnecessary dependency on third parties.

When product recalls happen, the manufacturer isn't taking your product away by force.


I disagree. What they probably need is a better monitoring and alerting system that triggers when these certs are about to expire. The software did what it was supposed to do -> prevent MITM attacks, fake extensions, etc. What they could have done better is give users the option to say "keep using the extensions despite the certificate expiration".


The downloaded extensions already passed verification when they were installed before the expiration. Disabling them now makes absolutely no sense. Even if the cert was compromised the moment it expired, previously installed extensions can't have become vulnerable without being updated.


As I'm reading these comments, the Unverified addons warning popped up :) Well, I refuse to use the internet without uBlock Origin. Looks like this is going to be a productive day.


This problem has screwed up Firefox 53 as well so unfortunately reverting to a superior version won't fix the problem either. At the moment Waterfox is looking like the best fix.


This relates to my opinions about encrypted HTTP, which is that it shouldn't be mandatory.

If you have a well-designed system that only works with encryption, then sure, but this idea of using the same mistaken systems as the WWW clearly doesn't work well.

I've never seen a Tor Hidden Service fail because of something expiring.

Much of this nonsense about encrypting everything, without reason or excuse, is to protect advertisements from being modified.

That this hit Tor Browser and disabled NoScript is damning, but I already disable JavaScript in about:config and I'm not even using a version of Firefox this new, anyway.

I can't tell if my opinion of Mozilla is lower or if it can't get lower.


Requiring https is a different situation and much more defensible in my mind. It's way too easy to rewrite the web pages of everyone using library or coffeeshop wifi and thereby hack/phish a lot of people's browsers.


This situation isn't about encryption, it's about code signing.


Does this happen when you open Firefox? Forgetting what I need to re-enable and what I had disabled is going to make me mad. If I just don't open FF, will I be fine?


Maybe everybody should set app.normandy.user_id to "anonymous". I am disappointed that Mozilla still has unique tracking ids embedded in Firefox.


Firefox extensions installed using apt are not affected by this problem AFAICS, so to get a working ad blocker, apt install xul-ext-ublock-origin


Shit, I just switched to LastPass and deleted all my saved passwords in Firefox. Good timing. I'm on Windows, so the config hack is not working.


Wow, that's a bad slip up... I hope users stick with Firefox.

I feel like this sort of thing is a risk when being vigilant about security.


> Steps to reproduce: Wait until it's past midnight on 2019-05-04 UTC.

This has got to be one of my favourite bug reports ever.


So let's say I'm the IT department at my company. I've already got my root cert on every employee's PC (including in Firefox, because they can't browse otherwise). Can I act as the Normandy endpoint and, say, remotely disable the ability to install any extension, including those pesky VPN ones, and do a lot of other such things I'd like? You get my drift. Am I right? Am I right?

Please tell me I'm wrong.


You are wondering if an IT admin can admin machines on its network? Yes, an IT admin can admin machines on its network.


Running Firefox 66 from Ubuntu repositories on Ubuntu 18.04 and all my extensions are enabled.

Does it only occur after a restart, or...?


I believe it happens the next time Firefox goes to check for addon updates. You may want to proactively set xpinstall.signatures.required=false which... I think might work for 66 on Linux? It worked for 60.
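
If you want that to persist across restarts (on builds that actually honour the pref; the thread suggests Linux distro builds do), the standard user.js mechanism should work. A sketch, assuming a user.js file in your profile directory:

  // user.js in the profile directory; only takes effect on builds that
  // honour this pref (e.g. the Linux builds discussed in this thread).
  user_pref("xpinstall.signatures.required", false);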


Ah, checking `app.update.lastUpdateTime.xpi-signature-verification` shows that it hasn't checked since "yesterday"

:/

  jtl@laptop-linux:~$ TZ=UTC date --date="@1556919381"
  Fri May  3 21:36:21 UTC 2019


This almost turned me into a raging monster. How dare you, Mozilla, disable my ad blocker? Now I want an UNSEE function :D


I uninstalled most of my add-ons and/or extensions. How can I return to my previous profile to get them back?


The saddest thing today is realizing that I can't use Firefox without privacy/security extensions.


I'm curious to see what kind of impact on ad revenue we'll see in the next couple of days.


And now, all my plug-in / extension settings / preferences are gone after updating...


Is Tor Browser also affected?


Looks like it. I had the same yellow "One or more installed add-ons cannot be verified and have been disabled" banner in the Tor browser as in Firefox, the NoScript extension icon is missing, and I went to two different "do I have Javascript enabled" sites and they both said JS is enabled.


Is it possible that this wasn't an expired certificate but someone accidentally changed the signing process?

The disabling happened right after the announcement by Mozilla to implement a new policy towards extensions.

Maybe someone didn't realize their mistake, so now everyone thinks it was an old certificate.


So this just happened in my browser...when exactly will this be fixed?


Ah, so I need to not restart Firefox while they're still enabled.


I tell you what: Grammarly is making a mint off this right now.


I guess there was no monitoring on cert expiry.


Mozilla continues the war against its users.


This is huge. This had a huge impact on my work in the last hours. Still not fixed for me. What's going on???


It's been so long since I browsed the web without an ad blocker. Now I remember why.


Because of this my Firefox crashed and I lost all my open tabs. Just fking great...


Will Mozilla IT finally be held accountable? Whether or not the direct cause of the expired cert is IT failure, the fact is that a healthy IT would have prevented this. Take a look at Mozilla IT leadership. Take a look at all the people who have left in the last few months. For a year people tried to bring attention to the IT leadership disaster, and every person who did that was penalized for it and left Mozilla.

A disaster like this has been brewing for a long time, and Moz leaders didn’t listen to the canaries in the coalmine.

It’s a very sad day for all the great people at Mozilla who work so hard only to see Mozilla IT let them down.


When might the fix arrive?


When is the fix coming?


What a spectacular failure... it's hours later, and it's still not working. The only thing Firefox had going for it stopped working. Already one family member and one friend have switched to Chrome.


Just switch for a day. You're making this out to be a much bigger issue than it has the capacity to be.


NoScript stopped working for Tor Browser users. I'd rather see the browser crash on startup than have that happen to me.


NoScript and uBlock Origin have stopped working. This opens up a LOT of attack surface for malicious hackers.

If this isn't a critical security issue, what is?


The Container extension is no longer working. I'm logged out of almost everything. There are a few sites I don't want to open without a container.


Update - Container data is lost after add-on recovery (I installed the Nightly build). This is ridiculous.

My Firefox usage will be so unproductive for a few days. I had around 6-8 containers for different purposes, and I'm in the habit of using a shortcut to launch a container and open whatever I'm supposed to (e.g. I have access to 3 different AWS accounts, and I tend to press a shortcut key to launch the relevant container tab).


Exactly my use case. Already installed chromium-ungoogled and am using a judicious number of profiles to emulate this. Still not as nice as what the Container extension does, unfortunately.


I switched to Brave today and I must say that I am impressed:

- Ad blocker and tracker protection without installing extensions.

- Nice background pictures.

- Caching Cdnjs requests -- for some reason Firefox was always hitting the CDN.

To be honest, despite the good work Mozilla does, I don't want to switch back.


Many of us tech savvy Firefox users have told family and friends to use it as well. Mozilla's made us look like damn fools.


It's not about me. I can fix this for myself. But most people can't, and for them, basically, their computer stopped working right.


This is particularly bad because it's the kind of failure that will make people try Chrome. And when they do and see it's faster there's no way they'll be back with Firefox. It's simply catastrophic. Mark my words when I say there will be a very visible dive in the number of Firefox users as reported by StatCounter.


I wonder how it's possible that these kinds of--I assume--easily preventable problems happen at huge corporations and projects.

I routinely renew certs for clients and it never happened that a website was down because of an expired cert.

I would think that Apple, Microsoft, Mozilla would be more efficient than me in avoiding these kinds of fuckups?

I also bet these companies make huge investments to their infrastructure so that their services have a near-100% uptime, and then they let certs expire.


Well, it was fun whilst it lasted Firefox. Sorry, but this is a deal-breaker for me.

Hello Chromium again...


If they can't get this right, what hope is there of browser security?


That's silly. Browser security is not one function. It's not even one team.


It could be said they decided to make things a bit too secure this time.


You've fucked up, Mozilla. I had no option other than to switch over to a browser that would let me install my ad blocker. The web sucks too much to browse without one. My add-ons are mine, not yours to disable and enable at will.


First they force code signing on everyone without a way to disable it, then they break it. This is an extreme level of incompetence I didn't expect from Mozilla.

They'd better have the best post-mortem ever, possibly with someone being fired.


A mistake of this magnitude cannot be the fault of an individual, because if it was, then the organization lacked adequate safeguards.

What I'd like to see is a post-mortem, followed by an explanation of how they'll prevent the mistake from being made again in future.


> how they'll prevent the mistake from being made again in future

This could have been prevented by someone putting the expiration date on the team's shared calendar with a 60-day alert.
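
A calendar entry works; so does a small check run from cron. A rough sketch, assuming the Python `cryptography` package and a local PEM copy of the cert being watched (the filename and threshold are placeholders, not anything Mozilla actually uses):

  # Rough sketch of a cron-able expiry check. Assumes the `cryptography`
  # package and a local PEM copy of the cert; path and threshold are
  # placeholders.
  import sys
  from datetime import datetime, timedelta
  from cryptography import x509

  THRESHOLD = timedelta(days=60)
  with open("signing-intermediate.pem", "rb") as f:
      cert = x509.load_pem_x509_certificate(f.read())
  remaining = cert.not_valid_after - datetime.utcnow()
  if remaining < THRESHOLD:
      print(f"WARNING: cert expires in {remaining.days} days")
      sys.exit(1)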


I hope you're correct.


> They'd better have the best post-mortem ever, possibly with someone being fired.

Arguably these two goals are incompatible. :)


People are generally not inclined to be truthful about their mistakes if they expect to be punished for them. It's how problems keep getting covered up until they become catastrophes.


I guess it depends on if it was an honest mistake or gross negligence. I don’t think people should be fired for mistakes. I also don’t think everyone should be trusted with important tasks.


Come on, people make mistakes. Things fall through cracks. Shit happens, etc.

No one needs to be fired for a single instance of a particular mistake. If this happened multiple times, then I would be on board with firing someone.


Mistakes of this magnitude are always singular and particular. Hopefully.


Why does someone need to be fired? Does some blood spilled really make it better? Have some compassion.


I'm generally not a fan of firing people for making mistakes. This one is so monumental it may require it though. This breaks most FF installations.


You didn't answer my question. What does firing achieve? You fire a person who learnt their lesson and will never make the mistake again? And then hire someone new?

Or you fire the scapegoat because of a broken system that allowed one person to make a mistake?


If this mistake was due to incompetence then the person should be fired. Incompetence shouldn't be tolerated.

But we're outsiders looking in and don't know what's going on at this point. That's why I used the qualifier "possibly." It's quite possibly it wasn't incompetence.


“quite possible”


> You fire a person who learnt their lesson and will never make the mistake again?

That's true, sort of. How often do you let people make huge mistakes before you decide that maybe they're just not apt for the position they've been promoted to and Peter was right? Once? Twice? An unlimited number of times, as long as it's never the exact same mistake?


Couldn’t disagree more. Do you want to fix the conditions that led to the problem? Or do you view a post mortem as a punitive process?


Can you link to your linkedin profile so we all know who to dox next time you make a mistake? If you have a twitter, please add that as twitter works even better for mobs.

Thanks :)


> Why does someone need to be fired?

That might seem rather extreme, but the fact that this situation was even possible is the consequence of a series of bad decisions, made over an extended period, about the required behaviour of new versions of Firefox, combined with technical failures that betray fundamental weaknesses in the whole system design. Whoever was ultimately responsible for those failings demonstrably isn't competent to run something of this importance, and should probably either implement immediate and dramatic changes to the relevant policies and technical details or consider their position. Anything less is surely going to damage trust, which is something Firefox can ill afford when it's already in danger of being reduced to a niche product rather than a mainstream browser.


Oh relax. A cert expired. An intermediate cert at that...

This has probably happened to every major cloud provider and countless companies at least once. Certs are hard.

Should Mozilla have had monitoring on their cert expiration? Yes. Will they after this? Probably. Is any one person ever at fault for something like this? No.

Firefox is an open source project. You're welcome to contribute and make things better.


> Oh relax. A cert expired. An intermediate cert at that...

Everyone's extensions broke. Including security ones. Including the ones bundled into the Tor Browser. And end users can't fix it. Because Mozilla decided that it was too dangerous to let users choose for themselves which extensions to run. This is an excellent moment to be upset.


Being upset is ok! I'm not particularly happy that I can't just override the certificate check on stable. But demanding someone get fired is just pointlessly punitive.


>Firefox is an open source project. You're welcome to contribute and make things better.

Well, no, because they won't accept a patch that lets us plebs turn off the signed-extension requirement.


The Developer edition allows that just fine.


Alternatively, that person (if they exist) has gotten the best lesson in institutional certificate hygiene that money can buy. Their mistake has potentially been added to hundreds of companies' playbooks so it can be caught.

Honestly that's one of the most successful things you can expect out of a failure of this magnitude.


I was with you up until you said someone should be fired.

The fundamental problem here is the system (code signing). It's a political thing with security being the excuse. They want control of a platform for business reasons.


Hopefully management being fired. This reeks of management not letting the technical team automate something, or some other bad decision that led to this. If one person was in charge of it and they messed it up, that is as much the fault of whoever gave that important task to only one person as it is of the person making the mistake. I don't want the low-level person punished; I want the one who put them in a position to make such a bad mistake, without any sort of redundancy or contingency plan, to be held accountable.


There is exactly one person who isn't going to do that again.


So why is this taking so long to fix? From https://github.com/mozilla/addons/issues/978

> diox commented 2 hours ago

> I'm locking this like I did in #851 because no new information is being added. We're aware and we're working on it.

I mean, two hours? WTF.


I am going to guess it's more than just getting the new cert. They need to get the new cert, they need to re-sign all add-ons, and then they somehow need a way to force everyone's browser to re-enable them. This could go on for quite some time if they have no plan in place to handle this sort of thing.

And if they were good at planning, this would not have happened in the first place. At least give us a button or something for "I don't give a shit if it isn't signed, enable it"


It was Friday night in most parts of the Western hemisphere. I'm guessing it took some time to get the right people back to work and assess the situation.


Yes. I was upset. Sorry. I ought to know to stop when I'm upset.


Nah don't worry. We're all upset and a bit disappointed.


Is it perhaps a good time to remind folks that the same thing could happen to all your "secure" HTTPS websites that are completely unavailable via HTTP, where the only thing served over HTTP is a 301 Moved redirect, even for sites that don't collect any user information at all and only serve static, public content, which really hardly benefit from the mandatory encryption?

Or is HTTPS / LetsEncrypt too big to fail? HTTPS still always a good choice? I see…


But it's easy to override broken HTTPS certificates. Worst case, you have trust-on-first-contact-style security.

This is just plain bad.



