Well, that's one way to make me feel young again. Part of my PhD [1] (2001) was devising robust ways of linking into Web pages, with the aim of building hypermedia structures on top of existing Web sites without requiring write access to those sites.
Robust location specifiers are an old problem within what used to be the open hypermedia systems research community (open as in capable of integrating existing software applications into a hypermedia system). It is certainly possible to create heuristics (such as this system here) that work for some use cases, and you can make them more robust by storing multiple ways of specifying the location, e.g., the text selection itself, character offsets from the beginning and end of the document, and DOM traversals from the root or the nearest element with an id. But the author is wise to make universal applicability a non-goal: for that to work, continued control and tracking of the surrounding document would be necessary, and for better or for worse that is not the way hypermedia systems evolved.
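To make the fallback idea concrete, here is a minimal sketch of redundant specifiers in TypeScript. The `Anchor` shape and helper names are hypothetical (not the author's format); the point is the resolution order and that each strategy is verified against the stored quote, so a stale specifier degrades into a search rather than silently pointing at the wrong place:

```typescript
// Hypothetical redundant location specifier: several independent ways
// of describing the same selection, tried from most to least precise.
interface Anchor {
  quote: string;       // the selected text itself
  startOffset: number; // character offset into the flattened document text
  domPath?: string;    // CSS path from the root or nearest id'd element
}

function resolveAnchor(doc: Document, a: Anchor): Range | null {
  return byDomPath(doc, a) ?? byOffset(doc, a) ?? byQuote(doc, a);
}

// Strategy 1: follow the stored DOM path, then confirm the quote is there.
function byDomPath(doc: Document, a: Anchor): Range | null {
  if (!a.domPath) return null;
  const el = doc.querySelector(a.domPath);
  if (!el || !el.textContent?.includes(a.quote)) return null;
  return rangeForQuote(doc, el, a.quote);
}

// Strategy 2: check the character offset against the flattened text;
// if the document has shifted, this fails and we fall through to search.
function byOffset(doc: Document, a: Anchor): Range | null {
  const text = doc.body.textContent ?? "";
  const at = text.slice(a.startOffset, a.startOffset + a.quote.length);
  return at === a.quote ? rangeForQuote(doc, doc.body, a.quote) : null;
}

// Strategy 3: plain text search over the whole body, the last resort.
function byQuote(doc: Document, a: Anchor): Range | null {
  return rangeForQuote(doc, doc.body, a.quote);
}

// Walk the text nodes under `root` and return a Range over the first
// occurrence of `quote`. Simplification: the quote must sit inside a
// single text node; real systems also handle matches spanning nodes.
function rangeForQuote(doc: Document, root: Node, quote: string): Range | null {
  const walker = doc.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  for (let n = walker.nextNode(); n; n = walker.nextNode()) {
    const i = n.textContent?.indexOf(quote) ?? -1;
    if (i >= 0) {
      const r = doc.createRange();
      r.setStart(n, i);
      r.setEnd(n, i + quote.length);
      return r;
    }
  }
  return null;
}
```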
Maybe OP would be more useful as a polyfill for Firefox and other browsers that don't support that standard. Then people could use text fragment links without worrying.
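For what it's worth, the core of such a polyfill is small. A rough sketch (function and variable names are mine, not from any shipped library), handling only the simple `#:~:text=foo` form; the full spec also allows `prefix-,start,end,-suffix` parts:

```typescript
function polyfillTextFragment(): void {
  // Browsers with native support expose document.fragmentDirective and
  // strip the directive from the hash, so there is nothing to do there.
  if ("fragmentDirective" in document) return;

  const m = location.hash.match(/:~:text=([^&]+)/);
  if (!m) return;
  const needle = decodeURIComponent(m[1]);

  // Walk text nodes until the target text is found (single-node match only).
  const walker = document.createTreeWalker(document.body, NodeFilter.SHOW_TEXT);
  for (let n = walker.nextNode(); n; n = walker.nextNode()) {
    const i = n.textContent?.indexOf(needle) ?? -1;
    if (i < 0) continue;
    const range = document.createRange();
    range.setStart(n, i);
    range.setEnd(n, i + needle.length);
    // Select and scroll to the match as a stand-in for the native
    // highlight; the CSS Custom Highlight API would be a nicer fit.
    const sel = window.getSelection();
    sel?.removeAllRanges();
    sel?.addRange(range);
    n.parentElement?.scrollIntoView({ block: "center" });
    return;
  }
}

document.addEventListener("DOMContentLoaded", polyfillTextFragment);
```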
It's obsolete not because it lacks nuanced discussion, but because the world has moved on and left it behind. The Firefox team can argue all they want about whether they want to play catch-up, but they're always playing catch-up and falling further behind every year.
There's no such thing as web standards anymore, only what Chrome and Safari do.
Edit: I think that discussion is a good example. Reminds me of the Ents' council in LOTR that endlessly debated what to do while the battles were fought without them. Firefox is just living in its own little bubble. The rest of the world keeps lurching forward under the Apple-Google duopoly, and Firefox gets more irrelevant every year. It was once a glorious challenger, now it's just a forgotten has-been celebrated only by ideologues and purists. I wish they'd give up on Gecko and just officially work on Chromium to make that better instead.
I think the issue is that even if Mozilla did do that, then to achieve the same level of configurability for privacy and security that user.js and all its options offer, and to act on the many objections they've had to new standards, they would end up forking Chromium to the point where it becomes debatable whether the effort was worth it in the first place.
I don't think enough users care about a lot of the values Firefox offers; most simply use whatever is the default or whatever the people they know use. If antitrust laws force vendors to present users with a browser choice, that could hopefully give Firefox more of a fighting chance. I think it's important that options like this exist, even if they are small, because they serve the needs of, for example, Tor Browser users: no other browser matches the anti-fingerprinting features that Firefox offers and that Tor users require.
> I think the issue is that even if Mozilla did do that, then to achieve the same level of configurability for privacy and security that user.js and all its options offer, and to act on the many objections they've had to new standards, they would end up forking Chromium to the point where it becomes debatable whether the effort was worth it in the first place.
Yeah. That's basically what happened with Blink/WebKit. But that chasm is narrower than the one between Gecko and anything else, since at least Blink and WebKit share a common heritage. And with Chromium, we already have Chromium/Brave/Opera/Edge/Samsung/Silk/Vivaldi/etc., all of which have fewer problems than the entirely separate Gecko, to say nothing of SpiderMonkey vs V8 & Node.
A browser is more than just the renderer, as those alternatives have shown. The world might need an underdog nonprofit browser, but it doesn't need an alternative renderer.
At least at our university, it is the default browser on every machine, and all of them have uBlock Origin installed. I wonder how that affects browser metrics.
I think that says more about an idiosyncratic IT department than about Firefox's secret popularity... a single university, or even a whole school system, wouldn't meaningfully affect the statistics.
If anything, Chrome and Safari (or at least their webviews) have a huge edge because of default mobile installations. Gecko will never be 100% compatible with Blink & WebKit, regardless of UA masking.
> Until their followers leave, they won't change anything. Greedy execs.
That's already happened, and they still haven't changed anything. Even Microsoft was wise enough to jump ship to Chromium, and arguably their last in-house Edge was a superior product to Firefox -- lean, fast, and well thought out -- but it still failed because it wasn't Blink. Firefox doesn't even have that, yet they stay the course because of what, stubbornness? Ideological purity? Poor leadership? I dunno.
Firefox just isn't a meaningful contender on any front anymore. Its UX is slightly worse than other major browsers', its privacy isn't significantly better than Safari's or Brave's, syncing is more complex than Apple's or Google's because the Mozilla ecosystem is browser-only, its ads are more intrusive because they're bundled as built-in adware, and so on. It doesn't do anything better anymore; it's just coasting on its reputation from the 2000s, like Opera (which also jumped to Chromium).
If Mozilla wasn't a nonprofit sponsored by Google (oh, the irony), it would've been bought out by a no-name Chinese venture firm a long time ago.
[1] https://cs.au.dk/~bouvin/Arakne/thesis.pdf