Actually, in normal usage the word "sideloading" means transferring files laterally. That is to say, not in a client-server relationship.
In the world of Android, "sideloading" became parlance for installing applications because historically you used adb on a computer to sideload the app onto the device.
Did this change? Android-wise, I've always interpreted "sideload" as the adb method... just installing an apk in the UI isn't "sideloading", it's "install from apk" vs "install from play store".
The original article only says that a specific, non-standard (to me, at least – probably made more sense for IT departments) method that Mozilla calls ‘sideloading’ will be disabled. Per the article, ‘sideloading’ is defined as putting an extension file in a special folder so that all instances of Firefox on the machine automatically load that extension.
The page yorwba linked shows how to generate a signed XPI file that can be distributed and installed locally in a more conventional manner (e.g. by dragging & dropping the file onto Firefox’s Add-ons page).
Yes, the older method of extension side-loading, supported at some point by Edge, Chrome, and Firefox, was for IT departments who created OS deployment images with software (incl. extensions) burned into them. Usually this was combined with Group Policy / Device Profile settings that made the extensions impossible to deactivate or remove ("force enabled") and potentially blocked any extensions other than those preinstalled.
I believe that all the browser makers have, at this point, reached a consensus that deploying extensions directly as on-disk files this way 1. makes it too easy for malware to just set itself up as seemingly "deployed by Group Policy"; and 2. makes it too hard for these extensions to be updated as often as they might need to be, or disabled if the browser-maker declares incompatibility with an API in them, etc.
What all the browser-makers seem to favor nowadays, for IT departments who want to do OS-image deployment, is an approach where the burned-in Group Policy just lists a set of extension IDs that are to be force-installed and force-enabled; the browser itself then does the work of retrieving and installing them (but still treats them like any other extension as it goes through the install process: vetting them for compatibility, upgrading them through their registered upgrade channels, etc.).
This means that every extension in these browsers now has to live in the browser's extension store—even if it's a private, just-for-your-own-company extension. (Which, honestly, there's not much to be said against; the "enterprise deployment" parts of app-stores don't usually force developers to go through pre-vetting before new versions are published or anything. It's just cloud hosting—with the proviso that, due to having download logs, the app-store can see if your "enterprise extension" has an install profile that looks more like that of a virus rather than an enterprise, and then blacklist it.)
Here's Google's documentation on the Group Policy settings that modern Chrome looks for, for comparison: https://support.google.com/chrome/a/answer/7532015?hl=en. Probably Firefox wants to move toward a similar model. And more power to them, honestly; right now, Firefox's extension ecosystem is far too easy a target for malware authors.
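For illustration, on Windows the force-install policy in those docs ends up as a registry value under the Chrome policy key, in the form `extension_id;update_url`. A fragment might look like this (the 32-character extension ID here is made up; the update URL is the Chrome Web Store's standard one):

```
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Google\Chrome\ExtensionInstallForcelist]
"1"="aaaabbbbccccddddeeeeffffgggghhhh;https://clients2.google.com/service/update2/crx"
```

With that in place, Chrome itself fetches the CRX from the update URL and keeps it updated, rather than loading a file burned into the OS image.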
From what I know (and I've got quite a few self-developed addons running), installing unsigned XPI files has not been allowed for a while. I've been uploading them to my profile in the add-ons developer portal and getting back the signed XPI file. I could be misreading this though, so apologies if I jumped the gun here.
Signing is only required on official release and beta builds -- users on developer edition, nightly, and unbranded builds can opt-out of the signing requirement by flipping a setting in about:config.
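Concretely, on those builds the opt-out is a single pref (flipping it on release or beta builds has no effect, since the signing requirement is compiled in there):

```
// about:config -- Developer Edition, Nightly, and unbranded builds only
xpinstall.signatures.required = false
```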
I worked around this with a script that patches the omni.ja zipfile to change the setting to require signing.
It needs to be unpatched for updates - I don't allow my normal day-to-day Firefox instance to write to its own binary anyway. Firefox tells me when there is an update, and I then restart into my special script that unpatches, runs Firefox with permissions to modify itself and a profile that I don't use for day-to-day browsing, only for updates, and then patches it again.
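A script like the one described boils down to rewriting one member of a zip archive. Here is a minimal sketch; the member path (`modules/AppConstants.jsm`) and the flag (`MOZ_REQUIRE_SIGNING`) are assumptions about where the setting lives in a given Firefox build, not verified against a current omni.ja, and note that rewriting omni.ja with a stock zip library may discard Firefox's jar-layout optimizations (it should still load, just possibly a bit slower). The demo below runs against a throwaway archive standing in for omni.ja:

```python
# Sketch: patch one member of a zip archive by string replacement.
# Assumed Firefox target: omni.ja member modules/AppConstants.jsm,
# changing MOZ_REQUIRE_SIGNING: true -> false. Back up omni.ja first.
import os
import shutil
import tempfile
import zipfile

def patch_zip_member(archive, member, old, new):
    """Rewrite `archive` so that `member` has `old` replaced by `new`."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(archive)))
    os.close(fd)
    with zipfile.ZipFile(archive) as zin, \
         zipfile.ZipFile(tmp, "w", zipfile.ZIP_DEFLATED) as zout:
        for info in zin.infolist():
            data = zin.read(info.filename)
            if info.filename == member:
                data = data.replace(old.encode(), new.encode())
            zout.writestr(info, data)  # preserves per-member metadata
    shutil.move(tmp, archive)  # atomically-ish replace the original

# Demo on a throwaway archive standing in for omni.ja:
with zipfile.ZipFile("demo.ja", "w") as z:
    z.writestr("modules/AppConstants.jsm", "MOZ_REQUIRE_SIGNING: true,")
patch_zip_member("demo.ja", "modules/AppConstants.jsm",
                 "MOZ_REQUIRE_SIGNING: true", "MOZ_REQUIRE_SIGNING: false")
with zipfile.ZipFile("demo.ja") as z:
    print(z.read("modules/AppConstants.jsm").decode())
```

The "unpatch" step is just the same call with `old` and `new` swapped, which matches the update dance described above.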
Developer Edition has updates, but it's a beta. Unbranded builds aren't betas, but they don't get updates. There is no version that's just like normal Firefox except without signing enforcement.
Part of the problem is the very delineation between developer and end user.
How about Firefox just be a powerful open platform that anyone can develop for easily with as few roadblocks as possible? We should all be one step away from being developers.
I am so glad I was on the old Firefox 12 years ago. I'm glad that I was able to quickly whip up extensions and share them with my friends. Wanting to automate bypassing my school's wifi captive portal, or wanting to change how the bookmark menu was displayed encouraged me to experiment.
The fact that Mozilla now sees only an audience of consumers that need to be protected from themselves and marketed to is the problem. Mozilla is afraid of their precious precious brand being sullied.
The point of my post was that there should not be such a hard delineation between regular users and developers. A developer edition is not a helpful solution.
What’s the trade-off here? Browsers are trying to protect users against a metric ton of malware trying to exfiltrate login credentials, and the vast majority of users have no clue what an extension even is.
Last week my dad thought he had a virus, but really it was just a BS spam site that he had accidentally allowed to send him notifications. The screen in Chrome to revoke notification access was like 8 clicks deep.
Browser extensions are a powerful, beautiful, dangerous bit of tech. Is it asking too much to put some guard rails in place that really aren’t too much trouble to follow?
> Is it asking too much to put some guard rails in place that really aren’t too much trouble to follow?
No, but that's not what Mozilla is doing. A confirmation prompt is a guardrail. This is a fence.
> Last week my dad thought he had a virus, but really it was just a BS spam site that he had [...] allowed to send him notifications.
That's his own fault. Not an ideal outcome by any means but a private organization has no right to restrict people's freedom just to protect others from themselves.
This would not have protected your father from any of that. If hostile code can inject an extension into your Firefox profile, it can also install a keylogger or read your unencrypted Firefox password store. There is almost no protection against your credentials being exfiltrated. Neither would it protect you against unwanted notifications. It will however greatly reduce the functionality of Firefox.
This is security theater.
> Browser extensions are a powerful, beautiful, dangerous bit of tech. Is it asking too much to put some guard rails in place that really aren’t too much trouble to follow?
There are many layers of guard rails already. The problem is that now they want to also inspect every extension that I use, even if it is for completely private use and will never be available to the public. And Mozilla does not exactly have a good track record with trust.
What gave you the impression that XPI files would not be supported?