I can subjectively confirm that Firefox feels snappier recently and deeply appreciate and enjoy all improvements.
But I have some nagging concerns about the methodology, with its heavy anonymisation and aggregation:
Couldn't it be that the web itself got faster, and folks have better hardware?
I mean, yes, generally these sloppy script kiddies with bloated frameworks produce less and less efficient code … but I can imagine that when some heavily visited page deploys a new, better-optimized, leaner version (I assume such things happen in reality, don't they?), there is no way to tell that apart from telemetry data showing that everything got slightly faster on average. The same goes for people getting better hardware, or for OS and driver updates, etc.
I've used an older Firefox version from 2020 (why I did is a long story) and only updated after the WebP security issue. The new version felt immediately faster.
As older Firefox versions are still in use, too, it’s possible to check whether the improvements come from updates in Firefox itself or the web in general.
Seems like the easiest path to increased speed is to hold software constant whilst processing and networking continue to improve.
Historically, software has continued to behave like a gas, expanding to fill space and consume resources. The resources generally do not belong to the software authors.
Software isn't eating the world, it's eating your computer's resources and the bandwidth you pay for every month.
I mostly use Firefox, but I have to use Chrome on a regular basis[1], and Chrome feels slower than Firefox despite having only one open tab. On a pretty old Linux computer.
[1] Because Slack still doesn't support WebRTC on Firefox after 8 years…
The web is getting faster, yes. This manifests as a constant (albeit slow) downward trend in global aggregate metrics. We think this trend is mostly due to Google pushing performance metrics being linked to search rankings.
However the data presented in this post shows obvious step changes in performance that correlate with browser version rollout. It would be disingenuous not to attribute this to a concerted effort on performance improvements from the Firefox team.
I bet it's difficult to normalize because performance depends on so many things: CPU cycles, but also RAM speed, swap usage, CPU cache sizes and speeds, background tasks, GPU and GPU drivers, OS, display servers (X11 vs Wayland on Linux), window managers, available fonts, and probably a ton of other factors that play a role.
I like seeing Mozilla actually improving Firefox instead of just shuffling UI components around. I've been a longtime user, and am happy to support the underdog here to try and keep some balance of power on the web. Having Mozilla focus more on their tech, and less on politics is a good thing. Regardless, Firefox is a good piece of software and I have no major qualms with it.
I'm happy for Mozilla to keep shuffling UI components around - if it gets them more users then that's good, and it's vital that Mozilla be willing and permitted to take risks, instead of always being a step behind in every aspect.
For instance, FirefoxOS seems to have been a failure, but most long shots are, and if you scrimp on long shots then you end up like Microsoft, realizing that some kids these days don't even have a laptop and their phone sure as hell isn't running Windows. So I'm fine with more FirefoxOS-like projects.
Honestly, continuing to spend time on Firefox OS would have been productive, and probably also a decent revenue source for Mozilla, rather than abandoning it and letting KaiOS pick it up.
Especially given that it's now become the third largest mobile OS with over 100 million active users worldwide.
Since upgrading to 118 I've had random hangs of Firefox every few days. Not clear why: no CPU or swap activity. All windows just freeze and need a "killall firefox", which works, shutting Firefox down cleanly.
Of course it's impossible to find anything anywhere about this due to search engine spam.
It was fine for years until 118, and it's rare enough that I can't reliably reproduce it (and thus do things like disabling extensions, going through the pain of a new profile, etc.), so I guess I have to live with it, as life's too short.
Firefox developer here. There are a few ways to diagnose this, depending on your setup. If everything is frozen on Linux, it's probably the parent process that is having an issue. The "parent process" is the process that oversees all the tabs, it's the top process in a Firefox process tree, the one that has lots of children.
If our crash reporter is enabled, you can also just kill the parent process with `SIGABRT`, this will produce an entry in `about:crashes` that you can look up on restart, and submit. This will then give a link to our crash reporting front-end.
If you have `gdb` and `debuginfod` on your distro, and you're using your distro's Firefox, attach `gdb` to the parent process of Firefox, let it download symbols, and do `thread apply all bt`; this will dump the stacks of all the threads of the parent process.
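The steps above, sketched as shell commands (the process name, the `pgrep` heuristic for finding the parent, and the `-batch` invocation are assumptions; adjust for your setup):

```shell
# Find the parent Firefox process: pgrep -o picks the oldest matching
# process, which is normally the top of the Firefox process tree.
PARENT=$(pgrep -o firefox || true)
echo "parent pid: ${PARENT:-not running}"

# Crash-reporter route: abort the parent, then look it up in
# about:crashes after restarting and submit it.
#   kill -ABRT "$PARENT"

# gdb route: dump every thread's stack without killing anything.
#   gdb -p "$PARENT" -batch -ex 'thread apply all bt' > firefox-stacks.txt
```

The actual `kill`/`gdb` invocations are left commented out so nothing is aborted by accident; uncomment the one you need.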
If you're using one of our Firefox builds, you can use https://firefox-source-docs.mozilla.org/toolkit/crashreporte..., which integrates with our own symbol servers. This might or might not work with a distro's Firefox (some distros submit their symbols to our infra to help diagnose crashes).
Once you have some info about the crash, you can open a ticket on your distro bug tracker, or https://bugzilla.mozilla.org/enter_bug.cgi?product=Core to reach us directly, attaching the stacks, Firefox version, crash report, about:support, or anything else that you think is relevant.
Oftentimes, those freezes (on Linux) are caused by a bug between Firefox and the desktop environment, or by graphics drivers, that kind of thing, and stacks allow narrowing down the issue greatly, sometimes it's just a package update or a configuration flip away!
Holy crap, this is the most helpful answer I've ever seen from a Firefox developer on HN. When I was having a memory exhaustion issue, I was just told "enable the page file, running without overcommit is a recipe for disaster" (looks at my 40GB of memory). I should figure out if that issue still occurs.
Running without overcommit is a recipe for disaster! Modern JITs make good use of virtual memory for both security and performance reasons.
Firefox on my laptop currently consumes almost 400GiB of virtual address space, while consuming "only" 458MiB of RSS. And that's not a bug; it's simply the browser making good use of the virtual memory system, which provides significant advantages to all users on systems that don't have overcommit disabled (which I'm guessing is 99.99% of people).
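You can see this VSZ-vs-RSS gap on your own machine; a minimal sketch using procps-style `ps` columns (here inspecting the current shell, `$$`; substitute Firefox's parent pid):

```shell
# VSZ = virtual address space reserved, RSS = pages actually resident.
# A large VSZ alongside a modest RSS is the normal, healthy pattern,
# not a leak.
ps -o pid,vsz,rss,comm -p $$
```

Only RSS (plus swap) costs real memory; VSZ is mostly reservations that are never touched.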
The difference in response you're seeing is simply because ta1243 is reporting a Firefox bug and you're reporting user error.
> simply because ta1243 is reporting a Firefox bug and you're reporting user error
Don't tell me Firefox needs 40GB of virtual memory to keep a tab open for longer than a couple weeks. Or that it's the JavaScript engine when closing all tabs or windows didn't free the memory, only restarting the entire browser did. It wasn't the extensions, either.
If Chromium works properly without leaking memory, and Firefox leaks memory and calls it "user error" to not have overcommit enabled, I'm going to use Chromium, simply because it actually respects my computer's resources.
Also, you're citing ways in which Chromium's V8 uses virtual memory, when V8 does not suffer from this problem. Clearly you can use virtual memory in that way without having a memory leak.
Also remember that Windows does not have unlimited overcommit like Linux does, because it has no OOM killer. So if Firefox were to use 400GiB of virtual memory, that would require the page file to take up the remainder of that.
I did have issues with my page file "automatically" growing to 64GiB with Firefox running, so maybe it literally does do this and it actually uses up hundreds of gigabytes of space on Windows machines. But that is not acceptable behavior from Firefox and it is definitely not user error to not want to give up that much space.
My laptop has a mere 32G of ram and 6G swap (currently unused but I did reboot last week)
The advice used to be 1.5x physical RAM for swap, back in the days when I had 4MB of RAM and a 170MB HDD: well under 5% of disk space. Doing that today would be 20% of my disk space, which seems excessive.
Perhaps I could create a 24G swap file as a ram drive, giving me 8G of ram and 24G of swap. Why would that be better than just 32G of ram and no swap? Is it better than 40G of ram and no swap?
Yep. This was a rule of thumb which I don't think still holds with current RAM capacities.
I don't have swap on my 32G laptop, and I'm not sure that's right. I guess I could use a 2G-4G swap partition: when running a few XWiki instances and a Java IDE alongside a browser with several dozen tabs, the IDE frequently gets OOM-killed.
Well, that was just the wrong answer, since their software is supposed to be releasing memory back to the OS properly. Overcommit is just a coping mechanism, you should be addressing the root cause.
I have since enabled the page file for other reasons - LLMs demand up to 50GB of memory sometimes, and my new desktop only has 16.
The change of machine is why I should probably try Firefox again to see if it behaves.
Overcommit is allocating virtual memory without any backing. Swap is backing memory with disk instead of physical RAM.
Overcommit is useful in some cases, for example to preallocate a large heap without immediately making it all resident. Or to allocate 'guard' pages to fight buffer overflows. On Linux, overcommit is commonly assumed and as such disabling it tends to break some programs, as it's not out of the ordinary for something to allocate 100s of GBs of virtual memory.
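On Linux you can inspect the policy directly, and see how cheap an untouched allocation is (a sketch; the mode meanings are from `proc(5)`, and the one-liner assumes `python3` is available):

```shell
# vm.overcommit_memory: 0 = heuristic (the common default),
#                       1 = always allow, 2 = never overcommit.
if [ -r /proc/sys/vm/overcommit_memory ]; then
    echo "overcommit policy: $(cat /proc/sys/vm/overcommit_memory)"
fi

# Reserving address space is nearly free until pages are touched:
# a 1 GiB anonymous mapping adds almost nothing to RSS.
python3 -c '
import mmap
m = mmap.mmap(-1, 1 << 30)   # 1 GiB of virtual address space
print("mapped", len(m), "bytes")
m.close()'
```

With overcommit disabled (mode 2), that same untouched mapping counts fully against the commit limit, which is why large preallocations start failing.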
Read up on Windows. Windows does not do overcommit at all, unless a page file is enabled, in which case it only allows overcommit up to the size of the page file.
Overcommit cannot be enabled without a page file, period. This differs from Linux, which can have overcommit enabled without any swap.
Yes, Windows doesn't have overcommit. (Also not with swap, since overcommit is unbacked virtual memory, which Windows still doesn't allow. The only thing it allows is disk-backed virtual memory).
But as a user, I don't care (except that I don't have to worry about an OOM killer because an allocation will just fail). The only real difference is that application developers need to be careful with allocating memory without using it, unlike on Unix-likes.
Because software on Linux runs on the assumption of overcommit, you shouldn't disable it, even though the lack of overcommit on Windows is not problematic.
It takes up disk space that I thought I could save. I have it enabled now, as a side effect of working with LLMs on my new computer that only has 16GB of memory (working on fixing that).
I still wouldn't be comfy letting Firefox fill it up, because no matter how large the page file is, your system will always crash when Firefox fills the entire thing. I do not know if my new computer will have this problem, though.
If the memory is never released back to the OS after it is done being used (or when it is not going to be used), then the browser has failed to release it back to the OS.
Are you using anything other than a Linux distro’s stable, extended-support release? Most of the complaints on HN about Firefox instability seem to be either Windows users, or Linux users upgrading to the latest and greatest, often outside of their distro’s packaging. Meanwhile, I have been running Debian stable’s Firefox for many years and simply have never encountered the bugginess that gets described on these HN threads.
I use Debian's Firefox ESR, and for at least five major (ESR) versions there's been a bug where sometimes if multiple tabs try to do a web push notification at once, the entire browser process hangs maxing out a core and nothing I do will recover from that. (I use X11 with PulseAudio.) One of these days, I'll get around to reliably reproducing it. (I haven't tried `pulseaudio -k`, which might fix it if it's an issue playing the sound: videos hang kinda like that, though more recoverably, if the sound isn't working right.)
Same here, I also use Firefox ESR from Debian packages and I've observed Firefox to lock up hard a few times. Often to the point it affects the entire desktop so not quite sure if it's a bug in Firefox itself or possibly elsewhere e.g., Wayland.
I have not: I don't know how to do that. (I guess you can do it with a Python script?) The X11 utils reckon that the notification windows belong to the same client as Firefox, though, so I doubt this is the issue.
And I on the other hand use Mozilla's latest & greatest all the time on Debian, and "simply have never encountered the bugginess that gets described on these HN threads" either.
Does it happen while typing? Might be https://bugzilla.mozilla.org/show_bug.cgi?id=1844505 I'm affected by that bug, but I hadn't considered to search for a ticket until now. Guess I'll have to start using plain Mozilla binaries to reproduce...
Are you on Linux? For a while I've been running into issues with FF where it will randomly halt repainting after switching tabs. The process is clearly still responding to keystrokes and mouse events, but the UI is mostly frozen.
I've also been running into an annoying issue where hover menus randomly stop working; it seems like the mouseout event fires before the click event is handled.
This is good news, and makes me want to start using it again.
Was a user from the Phoenix/Firebird days with the "unzip it somewhere" fancy installation process, used it for almost 20 years, and then switched to a Chromium-based browser due to perceivable speed differences.
I wasn't running benchmarks to confirm my suspicions or that performing some action was x milliseconds slower in Firefox or anything. But when you're using something all day, every day including as part of your job, it is easy to get a feel for.
With these multiple recent reports of performance work, along with real-world metrics ... I'm thinking of trying it again for a few days. If I don't notice a difference, or it actually seems faster, I'll switch back.
I want to switch back, because we need to ensure there are alternate browser engines, and I don't want standards committees to turn into "this year, Google deems we shall be doing the following" sort of events.
The user.js file is a very underrated feature as well. I have a few computers, and some of them are dual boot, which means I have a lot of Firefox installs, so it's great just being able to drop in the user.js file and have everything set up the way I like it.
Here's a few general ones. On my Linux computers I have a bash script that installs Firefox, then sets up a symlink of the user.js file from my own config folder into the profile folder. On Windows I just add it manually.
user_pref("browser.aboutConfig.showWarning", false); // disable about:config warning
user_pref("browser.startup.page", 3); // restore previous session
user_pref("browser.ctrlTab.sortByRecentlyUsed", true); // cycle tabs in recently-used order
user_pref("signon.rememberSignons", false); // don't ask to save passwords
user_pref("browser.search.suggest.enabled", false); // disable search suggestions
user_pref("toolkit.legacyUserProfileCustomizations.stylesheets", true); // enable custom CSS
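The symlink setup mentioned above can be sketched like this (the config path and the profile glob are assumptions; adjust for your own layout):

```shell
# link_userjs: symlink a shared user.js into every Firefox profile
# found under the given home directory.
link_userjs() {
    config="$1/config/firefox/user.js"
    for profile in "$1"/.mozilla/firefox/*.default*; do
        [ -d "$profile" ] || continue   # skip if the glob matched nothing
        ln -sf "$config" "$profile/user.js"
        echo "linked $profile/user.js"
    done
}
# Usage: link_userjs "$HOME"
```

Using `ln -sf` means re-running the script after edits is a no-op, and every profile picks up changes to the one shared file.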
The killer app for Firefox is it still allows some flexibility in tab management. Sidebery and TreeStyleTab have been my anchors in the FF ecosystem. The experience is so vastly superior for tab hoarders and tab-todo methodology that I really can't imagine using something else. I also use FF on android because it has ublock origin and dark reader addons. This make browsing the web on mobile far better. I actually hold little allegiance to FF as whole, I just haven't found any Chromium based browser that works as well for me.
> I just haven't found any Chromium based browser that works as well for me.
The native vertical tabs in Edge are also pretty good. Not nearly as feature-rich, basically just vertical tabs with automatic unloading and tab groups; but in return it's incredibly stable and bug-free.
Is there a reason Chrome hasn't adopted this? Tabtree is the only reason I'm using FF (not that I'm unhappy with FF I just use Chrome for work cuz the devtools is better).
Regarding TreeStyleTab: did you get rid of the top tabs, or do you have both on your screen? I feel like I lost quite a lot of real estate on smaller screens with both.
Gotta say I’ve been using Firefox for years as my daily driver and it has been great. Maybe it is because I’m running on higher end machines for the most part, but I’ve had no complaints with it across Linux, windows, and Mac.
I've also been using it for several years and almost completely agree with your sentiment. The only areas that have given me trouble are in the dev tools. On my machines the debugger is significantly slowed down when opening very large JS files, source maps compound the debugger slow down, and I can't always inspect variables' values when using source maps (possibly a build tool config problem).
In Speedometer 2, Firefox is much slower than Chrome, which is a bit slower than Edge. But who cares! In real life, this makes no real difference. I respect Mozilla and trust them on privacy.
Features like containers and the ability to run Ublock and Tampermonkey on mobile are priceless!
It does still matter for people using low-end computers. My MacBook's screen cracked while on vacation recently and I had to buy an emergency laptop. I picked up a device with a Celeron 6305 and 4GB of RAM. I loaded up Firefox like I normally do, and it was so slow and laggy to the point it was unusable.
I then switched over to Edge and it performed significantly faster, while using less of the 4GB of RAM. I was surprised at how significant the difference was, but there was no denying it. Edge performs much better than other browsers on low-end PCs.
Chromium-based new Edge. I hate all of the junk Microsoft has added to it, but if you are willing to take the time to turn it all off, it is a much more efficient browser than Chrome or Firefox. I'm not sure what Microsoft has done to optimize it, but on a low-end system it is very noticeable.
I trust the Firefox team knows what they're doing and came to the right conclusion re: performance.
That said, I have a small nit. I would have liked to see how performance changed relative to the same time last year, to control for seasonal effects. Image 1 shows changes on the scale of 1/100 of a second. I'm not an expert in the field, but when you're looking at a signal of that magnitude, controlling for noise becomes more relevant.
For me there seems to be a clear winner in every category, but it is not always the same browser; they seem to perform better in different areas. However, Chrome seems to be better in more areas.
I’m curious what seasonal effects you hypothesize might be in play here. I don’t know enough about the browser performance space to understand what might change: do people really consume a different mix of web technologies through their browsers seasonally?
It looks to my naive eyes like the trajectories here are consistent and that the "signal" looks bigger than the variance. The measurements seem consistent with a bunch of smart engineers working to optimize for them during the time period they were measured.
At the same time you’re right: that big August improvement might well have something to do with Northern-hemisphere people switching to lighter vacation reading rather than just the Firefox 116 release happening to drop August 1. I’d share your interest in the longer timescale.
To the first question: Yes. An example would be travel sites and shopping sites getting way more use during holidays. A big one is always December: end of the year holidays mess with usage metrics a ton given how many people take off work or spend time away from their computers, to the point that most companies I've worked at tend to ignore data from that month or try to be very careful about only comparing it to data from December in other years.
Please, anybody still clinging to Chromium-based browsers on the perception that speed is an issue: at least give Firefox a try. I haven't looked back and really enjoy it these days, and it's really important to the future of the web that browser diversity remains strong.
I'm using Brave with rewards, VPN, and wallet disabled, and around 20 flags modified. It works faster for me, with better extension support, fewer bugs, and less memory usage when fewer tabs are open. I chose Brave because it can sync.
I'm impressed with Firefox and tried a fork called Floorp. It has some useful additions, and I find it better than regular Firefox while still supporting sync; I was using ungoogled-chromium before.
Unfortunately, some websites still work better with Chromium browsers in my experience. This is anecdotal, but I also feel some negative fingerprinting from Google-owned websites on Firefox, and more memory leaks as well. It's only a matter of time before I change to Firefox (forced Manifest V3), but I will stick with Brave for now.
I really couldn't tell if it is better or not. But I've had 2 problems that seem to never get resolved:
- hardware-accelerated video decoding is bizarre if the video uses VP9: even after disabling it or forcing AV1, it stutters from time to time;
- watching a video in the background while playing a game at the same time, with hybrid graphics, freezes/hangs KWin; then the video that was playing in the browser turns green and everything that uses either the iGPU or dGPU starts to struggle and stutter a lot, forcing me to log out and back in, i.e. restart my session manually;
The last one may be because of Nvidia (yes, my fault for having this card) prime offload in Wayland, but even so, I did not find any topic related to this, that has not otherwise been solved in bugzilla, and honestly I am kind of afraid to report this and be met with judging questions; I'd like to debug some more, but I don't have the patience.
These are the two everlasting headaches that I have with Firefox. None of this bothers me enough to leave Firefox, though.
Anyway, my congratulations to the team. Today, I have successfully converted 5 people to use Firefox.
Hardware decoding is sadly somewhat of a mess on Linux... I'm guessing that with most of those issues, Firefox is interfacing with the various video coding APIs correctly, and some driver or something is just buggy. If you're on a desktop, turning off hardware video decoding altogether might be worth it.
Pro Tip: There is a Firefox extension that forces YouTube to use H.264. Even though my machine does hardware accelerated VP9 I've found it to be unreliable. Switching to H.264 makes YouTube videos play smoothly. All other videos sites use H.264 by default so this only really needs to be done for YouTube.
That second point is something that drives me CRAZY. For years I would keep YouTube or Twitch streams open on the second monitor. Now I deal with constant fps loss, video stutters or freezing. Absolutely bizarre regression in performance.
While I'm glad the data is trending in the right direction, isn't this what you'd expect to see from these metrics as people adopt faster hardware and better internet connectivity?
I know it's hard to control for those things while maintaining anonymity and doing aggregate analysis, but this would be a much stronger argument controlling for at least some level of available compute.
You mean only displaying and discussing the 95th percentile in a blog post?
There's no reason to assume that what is brought up in a blog post is going to match what engineers are looking at. In this case, I'm a developer at Mozilla, and I would say that I agree that it's worthwhile to look at the 99th percentile as well. And the median (50th percentile). And other platforms. And segregated by website, but we don't collect that, or by country; although I think we might be collecting that, we avoid correlating too many things before discarding the non-aggregate data. There are too many German users to worry about identifying one by knowing they're German, but there aren't that many German users with >100 tabs running on Windows 7 on slower hardware, etc.
I wouldn't want to pile up any more data than necessary in a blog post, though. The point would get buried. Man Bites Dog Whose Litter Mate Once Skipped A Veterinary Visit Because Owner Was Out Of Town At A Wedding Between Two People Whose Names Start With D
(Yes, we do consider many different percentiles when making decisions. We kind of have to come up with arguments for what matters, given a change we made or are contemplating. Some things improve the 50th and regress the 95th, for example, and that's a useful clue. Telemetry tracks half a dozen different percentiles.)
I'm sure internally folks are looking at more stuff, but blog post wise, it makes it look like you're intentionally omitting it (see https://youtu.be/lJ8ydIuPFeU?t=1239)
But what is the reason behind the choice to show only the 95th percentile and not the mean or median, which are undoubtedly more understandable to a broad audience?
p100 would include those where the uplink is so slow that the page takes minutes to render completely. Those users are useful to see, but the browser is much less of a bottleneck for them, and the data should be broken out into a separate bin.
I just found out today that Firefox does not yet support the CSS :has() selector, while Chrome has supported it for over a year now. I get that there are a lot of reasons for this, but it surprised me, I guess.
I use Firefox because its fullscreen mode is far better than all the other browsers I've tried on Linux, but it does seem like it's clearly slower to perform and slower to be able to adopt new web 'quasi'-standards.
In the last five years, my experience with Firefox has always been far superior in terms of performance than when using Chrome. I just can't bring myself to justify using a Google product for something as important as browsing. I guess it's ideological, but I just want Firefox to succeed.
Unfortunately Firefox still has a lot of issues when it comes to font rendering on macOS. For example, it's been two years and it still renders San Francisco incorrectly: https://bugzilla.mozilla.org/show_bug.cgi?id=1721612
Not to mention all fonts appearing bolder than they should compared to Chrome and Safari. It might seem like a small thing but I'm a stickler for typography and it really stands out on a Mac.
I hope they get around to fixing these issues but two years is a long time for such an obvious bug. What confuses me is how few people have noticed given that it seems to affect all Apple silicon devices.
Interesting, must be only happening on some native MBP displays. I'm on an M1 MBP and haven't tested it on an external display but it's definitely noticeable on mine, although not as big of a deal as the letter spacing thing
I confirmed the letter spacing is still an issue by comparing the "How I digitize books" paragraph on Chrome and FF. I had to override the CSS as the site has switched to "Newsreader, serif".
I don't see the fonts appearing bolder difference (on my 4k screen). Actually that's not true, I have seen differences on lower DPI displays. I get around that using BetterDisplay to enable HiDPI on the external. I've also never noticed San Francisco font rendering improperly which I only would if it didn't fit in a clipped box.
Now that performance is largely improved, maybe they'll get around to fixing this one.
Really cool to see the real world performance as well as a new Speedometer. I think anyone who's against the telemetry collected is really nuts, as long as it's done responsibly - the idea that Firefox could compete without that data is just a fantasy.
I switched to Firefox from Chrome for personal browsing several months ago, after the Manifest v3 debacle. So far, it's been good!
Very few sites only work on Chrome. Firefox is fast, the auto tab discard works better for me than Chrome's equivalent (or the Marvelous Suspender), I really like container tabs, and the whole experience feels pretty snappy. There are some product affordances I don't get with Firefox (e.g. being able to run math expressions in the address bar), but it's small potatoes.
I started using Firefox as my main driver a year ago, as I was interfacing with Linux distros a lot more often. It's the browser that is usually installed from the get-go, and it offers the best experience after logging into my Firefox account.
They also support some other really good tools, such as Firefox Relay, which is basically a proxy email address for your main email, so you can post an address on the airwaves without compromising your main inbox.
And Pocket, for saving articles you read on the internet.
Sounds wonderful, but they've been claiming speed increases for years, causing me to go and check whether they're true, and I've been deeply disappointed. I don't use powerful laptops with fancy hardware for my browsing, but how stupid can a browser be if it only works well under heavy use when your hardware is great? Not everyone has that luxury, even if they'd like a bit of privacy.
As much as I detest and avoid all things Google in so many ways, Chrome remains preferable because at least it can deliver 50+ open tabs on a completely ordinary laptop without completely freezing everything to shit.
You've had so many years to get this basic thing right Mozilla, yet you keep failing, and for a company that claims to track less than the others, then what the fuck is your browser doing to keep being so goddam slow?
Firefox is definitely on par with Chrome these days, so if you’re seeing issues like this, I would first suspect that it’s the web sites you have loaded in those 50+ open tabs instead of the browser itself. So many sites use bad JavaScript and that’s often what causes the problems.
> So many sites use bad JavaScript and that’s often what causes the problems
Then Firefox needs to handle that better. As an end user, I don't care about the quality of the JS of the sites I want to browse - I just want it to work, and if it works in Chrome but not Firefox then I guess I'll switch to Chrome.
Exactly this. Often with complaints about FF performance someone comes along and says something like the above, about JS or my laptop's hardware, or etc, but if Chrome, with all its shitty tracking, can make those same sites load just fine on nearly any machine, then FF should be able to do the same if it wants more users. I don't care about your browser's problems and exceptions to decent performance. I shouldn't be expected to filter the sorts of sites I visit for the sake of making sure they don't harm my delicate little FF instance. I just want the damn thing to work as other browsers apparently can.
> Firefox is definitely on par with Chrome these days,
The GP is likely going to be triggered by those words, because... I've seen this repeated every X months/years... "Oh... yeah, you had problems in the past, but it's so much better now"... and... it rarely is. GP will just need to try out again at their own schedule and make their own determination.
I had this standard issue with desktop/laptop Linux for years... "xyz doesn't work well"... "Oh yeah, it's so much better now, no problems at all". Try the new recommendation: still broken.
I had a f2f with someone at a local meetup, and we got in to "xyz is subpar/broken on linux" (some gnome thing possibly). This happened to be a linux user group, and someone challenged me with "no, you're wrong, it's fine now". So... I pulled out the laptop and fired it up and showed the irritation/bug. The response - after showing that what I was saying was a bug - was a shrug and "Oh, I don't care about that - doesn't affect me". The verbal equivalent of "WONTFIX". In person.
tldr: telling people who've been burned - often for years - that things are "better now" is generally not all that productive. Most of the time, people who have specific/legit issues will figure out if/when they're fixed or tolerable enough.
This is valid but in both directions: people will say "Firefox is slow" because they don't have the time/interest to say "Firefox is slow on this specific webapp that is very important to me on my specific laptop". Of course the responses to the first statement are going to interpret it generally rather than specifically.
Has anyone at Firefox used the iOS app? I want you to track the workflow to access a password in the app. Tap tap tap left side, right side, can't get out of the settings to check the website while doing so. It's maddening. Besides that... it's lovely.
In iOS you can add Firefox as a password manager auto fill option. Then when you tap in a password field, the relevant passwords should prompt to auto fill.
I haven’t tested this with Firefox’s password manager, but I did test in Firefox, and Bitwarden is auto-filling correctly, and in iOS settings I see Firefox is listed as a password manager.
It’s settings->passwords->password options. I don’t think apps can automatically enable auto fill, you might have to manually enable this.
Yeah, it is hidden away on Android too. They tried to mitigate this by adding an awkwardly named "Passwords shortcut" app action (long press the app or add it to your home screen) but this just opens the password settings, still another click to actually get to the passwords list.
There's not much harm as long as they make sure users are aware of the telemetry and can disable it if they choose to. Opt-in is best, but collecting data with consent isn't a problem.
Been using Firefox back since version 2 way back when.
After all these years, it still does what I need, how I need it done. That's the best recommendation of a product I think I can give.
Just about my only complaint is I wish there was a way to more quickly cancel saving a duplicate file. When the prompt appears with "this file already exists, do you want to overwrite?" it's much faster to just click "yes" than to click "no" and then drag the mouse down to "cancel" to close the window. Do this once it's no biggy. Do it hundreds or thousands of times and it gets obnoxious. Waste of bandwidth and disk writes, or waste of time.
I know that sounds dumb, but it's one of the only pain points I have.
I've made the switch. The difference is so minimal at this point it's unnoticeable. And regardless of telemetry and who sees what, I simply cannot stomach touching anything even remotely related to Google anymore.
I regularly have 100+ tabs open depending on what I'm doing, and sometimes kill tabs in groups of 10+ without any lag. But then I'm on a 5950X CPU + Linux so maybe that plays some role. What hardware are you using?
Are you swapping out? I think a lot of people don't realize how heavy web pages are, often taking up hundreds of MB of RAM (at least) for each tab. I highly recommend the Auto Tab Discard plugin. I'm terrible with leaving tabs around, and tab suspension plugins let me continue that bad habit.
So Firefox was ahead of the pack in the add-ons/extensions game, but then didn't push it or support it after a while. Now, after years of Chrome having a few integrations or add-ons that make my life easier, I have a hard time switching back to Firefox even though Chrome is such a giant memory hog. These aggregator moats that Stratechery talks about are real.
Great for them. The mobile app, at least for me, is getting unusable. I have a crappy phone, and reopening a page takes 3x the time that opening a new tab and typing the URL does.
Even then it's still significantly slower than chrome, even with ads.
I think it's time I find a different mobile browser with adblocking.
It's a real pity that speed keeps being the golden metric by which browsers measure themselves. I'd much rather have Mozilla focus on making Firefox usable on many popular websites (including HN) than on shaving another millisecond.
Personally I just find it the least aggravating browser. It doesn't suffer from the performance issues and site incompatibilities of Firefox, the bugs of Brave, nor the incessant nagging of Edge. It just gets out of my way and lets me browse.
I don't know if it's the fault of Firefox or Cloudflare WARP but I've been getting a lot of "PR_END_OF_FILE" errors (something like that) when browsing.
For normal use it does feel the same speed as Safari. But the font rendering seems a little off on macOS compared to Safari/Chrome, anyone else notice this?
Firefox's analytics allow Mozilla to analyze changes in the real world for Firefox users, testing things like startup time and load times for actual users as opposed to synthetic benchmarks.
I don't know what their definition of a "real user" is. But everyone I've met who uses Firefox does so for philanthropic or political reasons. I have a hard time believing some non-techie in Minnesota has noticed performance improvements. Idk, the data looks very pseudo-sciencey, just feels like another marketing ploy from the blue haired people.
Generally we say "real users" internally as a comparison to benchmarks, as in "this improves JetStream by 5%, but I don't think it will make any difference to real users".
When more and more parts of FF were being written in Rust, I thought FF would get a lot faster. But for the past year, I can only see marginal improvements w.r.t. speed. And occasionally FF hangs when I have 20 tabs open.
> And occasionally FF hangs when I have 20 tabs opened.
?
That's really surprising to read when, on my low-end laptop from 2012, Firefox runs with 20 open tabs without ever facing any issue (I have something like 30 tabs open right now).
They use the same compiler backend and you can generate pretty much any LLVM IR you want with Rust. You can argue about the performance characteristics of the kind of code the language encourages you to write, but a blanket statement like that isn't true.
Eh, no. A rewrite is pretty much always better. You now know the problem you’re actually facing. Drop all the little wrong assumptions. And in general are better equipped than the last time.
I vaguely remember reading here about a company that slapped a backend together in Python. When they took off, the compute costs stung, so they rewrote in Go. Saved most of the bill. Some time later that started creaking, so they rewrote again. In Python, but managed to save most of the bill again.
What about memory usage? For some reason Chrome gets a lot of flak for this but Firefox is significantly worse in my experience. I'm considering switching just because of that. :/
I think it's meant to be that, due to the differing ways tabs are handled, Firefox uses more memory with only a couple of tabs, but once you increase the number it uses less than Chromium.
I have been trying Firefox for the last 1-2 months. It was running great, but for the last week or so it has suddenly been freezing my whole computer. This happens both on Windows and on a different Linux PC.
Synthetic benchmarks or tests. For example, improving performance of an empty loop is useful for starting somewhere [1], but does not show how fast the browser is on a real page with complex JavaScript such as a vue/react component used to render a tree-based select control with searchable items.
I was a consistent Firefox user, probably for the past 15 years, but I recently switched to Brave and Edge after too much frustration with power consumption. Firefox makes my laptop shriek, while the other 2 normally don't cause the fan to start at all.
Does anyone actually care about loading speeds (apart from the necessary ad blocking)?
> Does anyone actually care about loading speeds (apart from the necessary ad blocking)?
It's kinda like money imo: You think you don't care about it until you don't have it. The only time I've ever cared about loading speeds was when I stayed on LTS Firefox for 2 years to avoid switching to WebExtensions, because I wanted to keep using several addons that were discontinued in Quantum. And near the end...yeah I cared about loading time. Felt amazing when I finally did upgrade (although I'm still pissed af that they never re-added the ability for extensions to access tab-specific history & will never get over this unless they do one day add it).
So I do think it's important to pay attention to even when no one cares, just to make sure that no one starts caring.
In MouseGestures extension, you can bind right-click scroll-up & right-click scroll-down to whatever you want. I bound this to back & forward (within this tab), respectively. So you'd get a little context menu at your cursor to be able to navigate however many ticks you wanted. Insanely useful, especially when websites bullshit autoredirected you once, so navigating away from them requires either super quick reflexes to go back twice with hotkey or mouse gesture, or actually moving your cursor all the way to the back button @@
I still actively get mad that I can't use this shortcut every single time I have to go to the physical buttons.
With laptops having taken over computing aside from a small sliver of professionals and enthusiasts and browsers being one of the most consistently used categories of software, one would think that energy efficiency would be a bigger priority for browser devs… I mean who likes their laptop’s fans screaming and finite battery cycles being torn through?
It’s rarely ever mentioned though and not something that significant gains are regularly made in, which is a bit strange to me. Browsers across the board have crossed the threshold of diminishing returns when it comes to speed, I’d personally rather they shift focus towards battery friendliness.
A lot of laptops don’t even have fans, and as a Firefox user who has owned ultrabooks for the last several years, I can’t say that this choice of browser has had any noticeable detrimental impact; I still get the many hours of battery life that ultrabooks are known for. I suspect that the devs, faced with limited resources and the need to prioritize, would consider this a matter for the hardware and OS to deal with.
I'm mostly working on a desktop computer so I don't care that much about energy consumption, however I do notice that a lot of websites have some timer going off at very high frequency, as part of some framework or whatnot.
Personally I don't see why Firefox can't just stop JavaScript execution on tabs that aren't visible after say 5 seconds, unless the user has enabled background execution.
All browsers already heavily throttle JS timer execution in background tabs. You also have specific APIs like requestAnimationFrame to decouple UI timers from non-UI timers like setInterval.
Stopping it entirely and then suddenly restarting on foreground would be probably too much of a breaking change for the websites' developers. I mean, I guess it could be possible but would require properly thinking it through. Some browsers aggressively freeze background tabs but AFAIR they do full reload on unfreeze.
The breakage area of shipping that kind of change into the web ecosystem of a major browser is huge.
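For page authors reading this sub-thread: rather than relying on the browser's background clamping, a page can cooperate with it via the Page Visibility API. A minimal sketch — the helper name and the 1000ms floor are illustrative (most browsers clamp background `setInterval` to roughly one second, but the exact value is not specified here):

```typescript
// Pick a polling interval based on tab visibility. Background tabs get
// their timers clamped by the browser anyway, so requesting anything
// faster than ~1000ms while hidden is wasted effort.
function pollingInterval(visibilityState: string, baseMs: number): number {
  return visibilityState === "visible" ? baseMs : Math.max(baseMs, 1000);
}

// In a browser you would wire it up roughly like this (sketch):
//
//   let timer = setInterval(tick, pollingInterval(document.visibilityState, 200));
//   document.addEventListener("visibilitychange", () => {
//     clearInterval(timer);
//     timer = setInterval(tick, pollingInterval(document.visibilityState, 200));
//   });
```

The design point is the one made upthread: decouple UI work (use `requestAnimationFrame`, which pauses in hidden tabs) from background work, instead of hammering `setInterval` everywhere.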
I had to switch from Firefox to Safari on my M1 Max 16" MacBook Pro because the battery drains in a few hours. I experimented a lot with turning different plugins on and off, but the power draw remained almost constant.
Interesting. Up until recently I was on an Intel Mac, and Firefox was always in the list of "Apps Using Significant Energy". I don't know if it coincided with some Firefox improvement, but after switching to an M2 it has never shown up since.
Fortunately it never forced me to switch to another browser, since I'm plugged in 95% of the time.
I'm absolutely serious. Apart from using my work laptop to visit an ad-ridden page, doing stupid timed sign-up activities like going to Disney World, or using my older work desktop that was so old and slow that it spent 10 minutes every morning at 100% disk utilization before calming down, I can't think of a time when I've experienced noticeably slow webpage loading in years, maybe a decade.
> We’ve been motivated by the improvements we’re seeing in our telemetry data, and we’re convinced that our efforts this year are having a positive effect on Firefox users.
Mozilla gets a lot of flak (especially around here!) for their sometimes heavy-handed usage analytics, but it's nice to see that used for its stated purpose! Great use of data here.
I'm not a fan of telemetry in any browser (I love Lynx because of this), but Mozilla is definitely more trustworthy than Google or Microsoft.
Edit: I'm not saying that Lynx should be a daily driver or that it's more secure, but it's a neat little project that avoids some of the bad patterns in modern browsers.
Taking stock of the connected devices and software that I am familiar with, I'd say there is a strong correlation between detailed user tracking and worse UX. It seems weird at first glance but I think there are some solid explanations for why that might be.
Data analysis is difficult to perform and understand well. It is easy to draw mistaken conclusions or to twist results to show the conclusion a person wants, and using detailed numbers can lead to a false sense of confidence in the results.
Companies are first and foremost optimizing for their benefit, not the user. Detailed tracking can uncover interesting ways for a company to make more money at the expense of the user.
Others have answered this, but I just wanted to point out that software devs managed to understand how their products are used, for improvement purposes, since long before telemetry was a realistic possibility.
Telemetry doesn't make it possible, it makes it less expensive.
Do you think asking a small subset of users is the same as having info on all the users? I work as a Product Manager, and trust me, it is not the same.
How are companies that aren't software vendors and aren't able to spy on their customers able to do it? Did software companies not have good ways to do this before spying on their users?
1 and 2 are problematic because it's very hard to get representative data from either one. The people who have time for user studies or post on your forums are not representative users.
Only listening to data from 1 & 2 results in the sort of angry posts you frequently see on HN complaining that devs aren't listening to "real users" or have the wrong priorities.
You end up needing data from additional sources, telemetry being one of them.
You do not need it. This is a really weird attitude. Until like the late '00s "telemetry" was, full stop, spyware (still is, for those of us who didn't shift our attitudes with the prevailing winds). I wouldn't say that responsiveness to user needs and desires has improved since then, in software design.
But what is the problem? That I can know that you press the print button? That you chose the Edit menu? I really don't see the problem. Please, explain, I really want to understand.
You don't see the problem of someone recording the actions you take using your own computer in your own home or office? It's like having a stranger sitting over your shoulder watching you. It's creepy and weird, and it's gross that people try to do it at all.
It's one thing to argue over whether basic user facing software like an image compressor or a text editor should have telemetry, but a web browser is one of the least controversial scenarios for telemetry I can imagine. It is constantly sending and receiving data on your behalf with hundreds or thousands of servers spread across the internet as a user agent. Your usage patterns - i.e. is it crashing, is the feature you're trying to use failing to work for some reason, is it rendering at a good framerate, is it running out of memory, are you having trouble finding the information you're looking for - are going to be incredibly complex and specific to you.
Significant bugs can affect only 1% or 0.1% of a browser's userbase but at Chrome scale or even Firefox scale that's like a million people. If you don't have telemetry it is REALLY hard to hear from those people about their problems and understand them. There simply are not alternative solutions that work half as well as opt-in (or opt-out) telemetry. People who say web browsers don't need telemetry are simply ignorant of what it's like to ship one and try to keep it working in the face of a constantly shifting environment - broken drivers, broken VPNs, malicious websites, malicious extensions, broken hardware, and users who are confused or tired or simply just bad at using software. No one is speaking on their behalf, you have to dig their suffering out of the data by looking at crash reports and performance metrics.
Shipping a web browser used by a million (or a billion) users means that you have a responsibility to do a good job. If your browser is not well engineered and reliable and responsive to users' needs that can result in data breaches or third-party server outages when your browser misbehaves or incorrectly channels user intent.
I'm personally a fan of making usage telemetry opt-in instead of opt-out, but browsers are a case where I don't opt out because I know how important the data is for browser vendors to make informed decisions.
This is of course different from sending your browsing history to Google, Microsoft, or any other company. I encourage people not to opt in to that stuff and not to sync their history/bookmarks/etc to those companies.
> It's one thing to argue over whether basic user facing software like an image compressor or a text editor should have telemetry, but a web browser is one of the least controversial scenarios for telemetry I can imagine. It is constantly sending and receiving data on your behalf with hundreds or thousands of servers spread across the internet as a user agent.
It's probably no accident that spying on users got popular just as this became the case. Constant network traffic while web browsing didn't start to become the norm until late in the '00s, either. If you weren't clicking links, you could often open Wireshark or sniff with Netcat and see nothing. Not from your browser, not from anything. Certainly ~nobody was collecting heatmaps of where you move your mouse, or firing a network request if you selected text. Or recording entire user sessions for playback, or so you can watch them live (god, those tools are creepy as hell)
The prevalence of "every app you use is a web browser now" is absolutely a catastrophe for user privacy and software reliability for this reason, IMO. Every tiny component now has a thousand moving parts that can spy on you.
> But what is the problem? That I can know that you press the print button?
When the internet was young, and most people were using dial up connections, just collecting the dates and times that a person was online and using a program was (and still is) a massive violation of privacy. Software "phoning home", even just to check for updates (collecting IP addresses, timestamps, and version numbers) was enough to get your software branded as spyware.
No software company needs to know which hours I'm awake, when I'm using my computer, which hours I work, which hours I use their program, how long I use their program, how long it's been since I last used their program, etc. It's intrusive, entirely none of their business, and it's insane that they all feel entitled to that kind of information.
Whether or not I print something, and what the things I print are, is also none of their business. Neither is what I'm printing it for, where I put the printout after I take it from the printer tray, or whether I use tape or a thumb tack to secure it in place - but you can bet that if software could easily collect that data it would, and somehow it would be considered impossible to write good software without that information.
From a privacy standpoint telemetry is always invasive, which is why I disable it any way that I can. Even without the privacy aspect telemetry is a bad idea. I don't want program updates that remove features just because I (and others) don't use them very often. I don't want updates that constantly shuffle the UI around according to how they think "most" people have been using it this week. I don't want my workflow disrupted every few months because it's uncommon. I don't want the way I choose to use the software on my device to influence how other people are expected to use it either.
Telemetry is much better when it's limited to reporting errors and bugs, but even that should be opt-in only.
You don't need spyware just to improve a product. Dev teams were able to produce great software before we were constantly online.
If a team is so unfamiliar with their product and customer base that it cannot take action without telemetry, maybe they're not the right team to make that product. Statistics are not a substitute for domain knowledge.
By reading up on those decades-old bugs in the issue tracker, by making said issue tracker easier to vote on and pleasant to look at, by making other easy feedback submission mechanisms that don't become black holes themselves, by many other options mentioned elsewhere
Mozilla, the legally registered non-profit foundation with a mission statement[0], for sure is more trustworthy than a for-profit data behemoth whose sole revenue comes from collecting as much data as possible, or a for-profit tech company with a history of corporate abuse and user-hostile behavior.
That's the Mozilla Foundation, the Mozilla Corporation is the for-profit developer of Firefox that's owned by the Foundation. If Mozilla never established the Corporation I'd give them more slack, but from a "it's nonprofit" perspective it's on the same level as IKEA, which is also owned by a nonprofit foundation.
Technically, Google doesn't sell people's data. It uses data to train AIs to predict people's behaviour, modify that behaviour, modify attitudes/beliefs (it's an ad company), and eventually replace people.
Thanks, I updated my original post because how they profit from the data is immaterial to the fact that they want it and they coax people into letting them collect it.
I'm not trying to be a contrarian, but Google paid Mozilla lots of money to force Google as the default search. Likely an offer they could refuse only at their own peril. I really liked how my search engine settings used to persist when I reinstalled; now it defaults to Google.
There's also a ton of promoted garbage on your homepage and privacy switches that need to be toggled off by default. Those settings don't carry-over when you sync your account settings.
I still prefer Firefox, but they are not immune to the encroaching enshittification.
I agree they're not immune whatsoever. In fact I hold them to a higher standard than the others because it's their mission to do it, so their failures sting much harder.
But I hold the others to zero standard. There is less than zero trust there. I expect to be abused by them because their mandate requires them to ignore my wishes. It's not a failure but a success to them.
Well wait, I don't think jeffbee was saying it's bad to enjoy things, but rather that the person they were responding to was implying something, namely "Lynx is (in some way) better than Firefox because it doesn't take telemetry data."
Lynx definitely takes less telemetry data than Firefox, but it also gets substantially fewer updates, including security updates. I think text-based browsing is pretty fun but I don't really use it in no small part because of the infrequency of updates.
I can see how the post could be interpreted that way. I've added an edit at the bottom to clarify that I'm not suggesting people actually use it as their main one.
Yeah, right after I hit post it occurred to me that assorted media codecs (pictures, video, audio) were probably the next largest attack surface that lynx would also necessarily be immune to :)
I don't know about Lynx, but terminal browsers can display images. w3m is able to do it on virtual terminals and terminal emulators that support it if you install the right packages (w3m-img on Debian for instance).
I don't know anything about Lynx, except that I always wanted to write a CLI web browser that did support all web features like JavaScript, just to see if it'd work.
This advice mainly applies to people using old OSes or who don't update their browsers.
I just went through Lynx's <20 CVEs over the last 20 years and couldn't find any that haven't been fixed. The same cannot be said for Chrome or Firefox, which have dozens every year.
> Pseudonymous user so concerned about privacy that they use the browser with by far the greatest density of exploitable flaws.
"I love Lynx" is different from "I use Lynx for security-sensitive browsing," and "greatest density of publicly documented exploitable flaws" is, even if true (I don't know), not the same as "greatest density of exploitable flaws."
While you're probably right and we should be concerned, I'd say what is more concerning than the quantity is the content.
Whenever I hear that an app is collecting telemetry I feel conflicted between leaving it on for maintainers to gain a better understanding of performance and potential issues, or off so that it's not used to profile me.
It would be nice if telemetry was somehow simply differentiated through some app options.
Chromium is open source. And unsurprisingly all the data collection bits are open source too. (They call it UMA metrics in the codebase.) Search in the codebase for things like UMA_HISTOGRAM_ENUMERATION or SCOPED_UMA_HISTOGRAM_TIMER and with a free afternoon you'll have a pretty good idea what kind of telemetry Google really collects.
Chrome is closed-source though. There’s no way to make a reproducible build of Chrome (the Google binary adds DRM and could be adding more).
I’m mentioning this, because this open-closed ambiguity is a typical Google strategy. Similarly, Android in the AOSP flavor is open, but the OS that actually ships on phones is different.
(Sorry I edited my comment before I saw your reply.)
That doesn't seem very useful for the metrics shown in that article. For hard to find bugs sure, for 95th percentile calculations and so on you can just buy a few computers at a retail store and get the same information.
New computers don't behave like old computers, and it's not worth trying to guess why that might be. Could be anything running in the background, old NAND, old battery, low disk space, satellite internet…
Once you do have a model of badness I agree it's better to try to set that up yourself.
That can get you 95th percentile calculations for brand new computers that you bought from the store in 2023 that are running Firefox alone, but that doesn't help you understand what your performance will look like when you're running on a 10-year-old machine running Windows 7 while the user is also running Microsoft Word, Excel, and Outlook at the same time. Your P95 numbers aren't especially meaningful if you've only tested ~10 different PC configurations.
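For concreteness, the P95 numbers being argued about here are just a percentile over collected samples; the math is trivial. A nearest-rank sketch (the function name is mine, not Mozilla's pipeline):

```typescript
// Nearest-rank percentile: sort the samples and take the value at rank
// ceil(p * n). With telemetry, "samples" would be e.g. page-load times
// reported by many differently configured machines in the wild.
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil(p * sorted.length);
  return sorted[Math.max(0, rank - 1)];
}
```

Which supports the point above: what telemetry buys is not the calculation but the breadth of the sample - ten store-bought machines cannot stand in for the long tail of aging hardware and busy systems.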
Maybe you get the same result, but with the real user data you can confidently say the performance has improved without a disclaimer saying the data was collected in-house.
What's great about it? They can't tie it to a specific website, and the data is dirty with other factors (as they acknowledge themselves), so what's the benefit vs. just testing on a sample of actual websites to see what is slow?
Happy to see the top comment on a Mozilla/Firefox article not being somebody grinding their axe with Mozilla (and I say that as somebody definitely having a few) :)
I just don't like being bullshitted. Constant marketing about privacy while they're phoning home a bunch of data when you start and stop the browser. I did at some point find a doc page with a zillion steps to disable all of it but that doesn't remediate the hypocrisy IMHO.
What telemetry are you objecting to? Telemetry has good and bad uses. For example, sending in automatic crash reports helps companies find bugs. It can also expose sensitive information which was in ram at the time of the crash.
Another example: usage telemetry tells developers what parts of the app are being used, and can help them focus on popular features or work to let people know about useful but under-used portions of the app.
My main complaint about people who dislike telemetry is they never acknowledge its good uses and they never state what telemetry is objectionable.
> My main complaint about people who dislike telemetry is they never acknowledge its good uses and they never state what telemetry is objectionable.
There's a good reason for that: it is an asymmetric relationship.
The person who enabled telemetry isn't necessarily the user of the software, i.e. it can be mandated or turned on by a sysadmin (even by mistake), without the user's say. On top of that, the user of the software and/or the sysadmin are unable to assess whether they want to share the data, because they cannot analyze the data beforehand. They lack the expertise to do so.
Meanwhile I have to disable telemetry every friggin' time I use Mozilla Firefox. It gets old, having to say 'no' all the time, ya know? I now realize how it feels being a young woman on the market. Geez, I feel sorry for my daughter. The shit she'll have to endure, sayin' 'no' all the time.
Religion seems like a needlessly incendiary example that is going to bring up some strong rhetoric.
But I mean I’m an atheist and I think religion is, on net, bad. But we’ve allowed a sort of less dangerous version of it to persist in most advanced countries, in the form of separation of church and state. If it was really just all bad, I suppose we’d ban it altogether.
I think people can generally see that there are some pros to things they don’t like. Not engaging with the aspects of something that are inconvenient to your case puts you in the realm of propaganda and rhetoric, not good faith discussion.
Probably because there's little disagreement about the existence of the benefits of it or what they are. That's not the issue.
For me, the issue (as with all things like this) is about consent. Opt-in telemetry? I have no issue with it. Opt-out telemetry? Very sketchy, but at least you can opt out. Undisclosed or mandatory telemetry? Completely unacceptable.
That's some of it but not all of it. If you uncheck those and proxy FF when it's starting you'll see the chatter. I have the doc page I'm talking about somewhere but I have no idea where it is. Fully disabling it is a long complex process involving about:config.
* Found this: https://github.com/K3V1991/Disable-Firefox-Telemetry-and-Dat... I haven't compared their list to the one I've used before but it's along the same lines and explains the discrepancy between the config settings and Firefox's actual behavior.
> Telemetry data is stored locally by default. As long as the relevant options in the settings' UI are unchecked, or datareporting.healthreport.uploadEnabled is set to false in about:config, this data won't be sent. <https://medium.com/georg-fritzsche/data-preference-changes-i...>
There's likely to still be some non-telemetry chatter, like checking for available Firefox/plugin updates etc.
If there is one thing we should have learned over the past decade, it should be that if the data is collected, it will be sent.
I followed the argument and I understand what you are saying. What I am saying is that it was not that long ago that FF decided to disable plugins remotely ( I think we even discussed it on HN[1]). What makes you think they won't one day push an update to just upload that local data?
Click the menu button and select Settings.
Select the Privacy & Security panel.
Scroll to the Firefox Data Collection and Use section.
Deselect the Allow Firefox to send technical and interaction data to Mozilla checkbox.
You can get directly there by copying into the url bar
I don't understand why this was downvoted. Why should a person be forced to allow updates with no opt-out or pause option? Everyone complains about it with Windows and its bullshit, but FF gets a free pass?
> In order to measure the user experience, Firefox collects a wide range of anonymized timing metrics related to page load, responsiveness, startup and other aspects of browser performance.
So, they collect lots and lots of telemetry and what do they do with it?
> Let’s start with page load. First Contentful Paint (FCP) is a better metric for felt performance than the `onload` event.
Misinterpret and misuse it. Better, my a**.
I can't describe how much I HATE pages that are rendered before they are completely loaded.
I type an address, I hit enter, and promptly I see the page content. I try to click a link or a button on the page, but guess what? Just before I click it, an image loads above the link. The link moves down from under the cursor, and now I click the image instead of the link. And the image IS AN AD!!!
It's like FF devs are actually TRYING to make me click ads!
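For anyone unfamiliar with the metric being argued about: FCP is simply the timestamp of the first paint entry named "first-contentful-paint" that the browser records. The selection logic itself is trivial; a sketch, with a pared-down `PaintEntry` shape and a hypothetical `report()` standing in for whatever a telemetry client would do:

```typescript
// Minimal shape of a paint timing entry for this sketch.
interface PaintEntry { name: string; startTime: number }

// FCP is the startTime of the "first-contentful-paint" entry, or
// undefined if the page hasn't painted any content yet.
function firstContentfulPaint(entries: PaintEntry[]): number | undefined {
  return entries.find(e => e.name === "first-contentful-paint")?.startTime;
}

// In a browser you'd feed it from a PerformanceObserver (sketch):
//
//   new PerformanceObserver(list => {
//     const fcp = firstContentfulPaint(list.getEntries());
//     if (fcp !== undefined) report(fcp);  // report() is hypothetical
//   }).observe({ type: "paint", buffered: true });
```

Note that FCP measures when content first paints, not when the layout stops shifting - the layout-jumping complaint above is a different metric's territory.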