I understand (and applaud) the mission to make a cross-browser compatible, predictable format for extensions. What I don't understand is why Firefox is apparently introducing some of Chrome's weirder limitations along with it:
E.g., XUL add-ons were free to add as many toolbar buttons and menu items as necessary; WebExtensions are restricted to a single button.
Likewise, Firefox had a well-functioning way for extensions to access page JS. This is now replaced by Chrome's awkward approach of injecting script elements, sending messages and praying that no collisions occur.
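For anyone who hasn't written a Chrome-style extension, here is a rough sketch of the pattern being described - assuming a content script, with "page-script.js" as a hypothetical file declared under web_accessible_resources, and made-up message "type" strings:

    // Sketch of the Chrome-style approach: inject a real <script> element so the
    // code runs in the page's JS context, then shuttle data back and forth with
    // window.postMessage and hope nothing else on the page uses the same messages.
    declare const chrome: any; // extension API global provided by the browser

    const script = document.createElement("script");
    script.src = chrome.runtime.getURL("page-script.js"); // hypothetical file
    (document.head || document.documentElement).appendChild(script);
    script.onload = () => script.remove();

    window.addEventListener("message", (event: MessageEvent) => {
      if (event.source !== window) return; // only accept messages from this page
      if (event.data && event.data.type === "MY_EXTENSION_RESULT") {
        console.log("page script replied:", event.data.payload);
      }
    });

    // Ask the injected page script to do something on our behalf.
    window.postMessage({ type: "MY_EXTENSION_QUERY", payload: "ping" }, "*");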
I believe the injecting script part is to support having the chrome and the content in different processes.
I'm sad that, on the UI modification front, the new system doesn't seem nearly as powerful as XUL overlays. I worry that when Servo comes into production use, all that extensibility will be dropped (with Chrome not having it as the justification) and we'll be left without a highly customizable mainstream browser.
I understand, and even agree with, a lot of the goals behind the WebExtensions API. I'm honestly still worried Firefox isn't going to encourage plugins like Pterosaur, Pentadactyl, etc. going forward.
Will modifying the UI and the DOM essentially be verboten?
Native.js is interesting. I would wonder how it plans to handle things like callbacks for various events though. As an example, a keypress in a textbox (e.g. pterosaur). I haven't looked at the WebExtensions API in detail yet, so it may already be a solved problem.
More specifically, editing text seems oddly neglected, which is disappointing considering webmail, blogs, social media, Stack Overflow, etc. I'm glad NoScript is being given special consideration.
Alongside the XUL migration, the move to requiring extension signatures has provoked controversy.
I wonder, though. Since WebExtensions have fine-grained permissions, perhaps you could permit unsigned extensions which don't do potentially nefarious things?
Is it possible to write a DownThemAll!-like extension?
I'm still a bit confused on why I should continue to use Firefox when all the features of extensions are stripped to be compatible with Chrome. What am I missing?
Not too relevant, but Firefox has gotten significantly slower on my computer over the past few versions, to the point that I will most likely switch back to Chrome. I barely have any extensions, and it's never been slower than it is now.
I've felt the same, especially with the developer tools, but I started using Firefox developer edition, and it seems much, much faster, so I'm sure improvements are coming.
Thank you for the tip. I installed it and it does indeed seem faster. I suppose the plugin compatibility is not too bad.
EDIT: Seems like my lovely Vimperator doesn't work too well.
Completely agree with this. The difference in resources used by Firefox 30 compared to Firefox 40 is just staggering. A netbook that used to run versions of Firefox 30 and below just fine now bogs down so much in newer versions of Firefox that the netbook is regularly left unusable for minutes at a time. It’s gotten so bad that I try to avoid opening Firefox as much as possible now and just use qutebrowser¹ instead. It’s extremely liberating to be able to open a browser and not have to wait a minute before being able to type anything in the address bar.
I do remember reading about qutebrowser; in fact, I did try it around three or four months ago. Might give it another go for a couple of weeks and see how much I like it. Thank you for the tip.
So WebExtensions is Mozilla's name for a subset of the Chrome Extension API they'll be implementing?
It's really weird to see a project named WebThing that doesn't seem to exist outside of a particular vendor's blog/wiki. I'd expect to find a public mailing list/GitHub repo/landing page without much searching. They don't even own webextensions.org.
Firefox is more relevant now than it was before, I welcome their efforts to change with the times.
They are taking massive risks now: simultaneously developing an entirely new programming language (Rust) and building an entirely new browser engine on it (Servo) is ambitious. It is self-evident that it has paid off.
There is nothing wrong with node.js programmers; your implication that they're inferior is a sweeping and baseless generalization.
The grey hairs are most certainly still around, and they're most definitely well-respected. Rust's C++ roots are evidence of this.
Mozilla is flourishing, that cannot be denied.
Firefox is a normal open source project; Mozilla receives contributions from around the globe.
The hacker ethic isn't gone, it matured into something better.
'Declining in market share': I assume you count 'Google throws crappy internal ads at every internet user and encourages mum and pop to "upgrade your browser"', right?
I find it difficult to see how that can _not_ damage a free market.
Mozilla seems to flourish as far as I am concerned: Servo, Rust and Firefox are surprising me time and time again and seem very well received in all my social circles.
> 'Declining in market share': I assume you count 'Google throws crappy internal ads at every internet user and encourages mum and pop to "upgrade your browser"', right?
Of course, Firefox numbers are going down, no? The reason is irrelevant.
I've seen varying sources and while they agree that Chrome leads (see above, unsurprisingly a wildly successful ad company is able to promote their own product), I don't believe that users are actively leaving Firefox.
Back to flourishing: Recently an article on this very site explained that Mozilla is in a very stable financial situation. So, they seem to be independent (money) and successful (reach - I guess this is where we might not agree?).
I just cannot understand the reasons behind a lot of 'Mozilla is basically dead and irrelevant' style posts on this very site.
Not what was said. He said that the browser market share for Firefox is declining. I know there are plenty of Mozilla fanboys here, but the fact that Firefox is losing market share is not really disputable: https://en.wikipedia.org/wiki/Usage_share_of_web_browsers
Mozilla's success depends hugely on Firefox, which means, even though they are in a stable financial situation now, if they cannot stop the slide, they will be in trouble in the future.
Firefox's experimentation has been a little bit worrying, but I'm sticking with them and recommending them everywhere I go, so long as they remain committed to the principles that make Firefox stand out.
Chrome's propagation strategy is obviously very effective, but I know many people who eventually switched to Firefox and haven't gone back. It's a long race, and I'm happy that Mozilla is not giving up.
> - Servo: How many years do they need to build a new browser with the experience they already have?
Browsers are big. There's not really been a from-scratch browser written since the 90s, so there's not really any comparison point as to how long it takes.
It's also worthwhile remembering that Gecko has over ten times the number of developers that Servo does—yet Servo has to catch up with much of what Gecko has already done!
Servo is certainly getting there (and honestly, the progress that's been made is incredibly impressive, both in terms of the man-hours involved per-feature and in absolute progress: many features are being implemented far quicker than they were in other browsers), and it seems plausible that this time next year there could be something relatively usable.
And nobody has "rushed" to any language introduced in the last decade. The "conquer the world" mentality introduced by Java is over. The big programming language success stories—Scala, node.js, Go, etc.—have all flourished with devoted communities using the languages in the niches they're strongest in. That's what Rust is aiming for, and the resulting community has been amazing so far.
Hi pcwalton, I just wish Servo had clearer goals. In some ways the recent faster-browser initiatives have all come up empty in my opinion, as they have nothing to truly show for the work.
Flexbox is a great example. We were told the problems with DOM updates were caused by the old layouts and things like floats. So Flexbox was supposed to fix those problems. Where's the proof?
I find myself frustrated that now 8 years after the iPhone the web is still incapable of so many things. I wish the browser community would make some very specific goals and say "we'll do whatever it takes to reach these".
Here's a very simple goal: A swipeable sidebar. Here are the requirements:
* It must retain 60fps no matter how fast you swipe.
* It must be easy to create; a virtual DOM shouldn't be a requirement to make up for slow DOM updates. An elaborate library that correctly times everything shouldn't be necessary.
That's it. Make that happen. Do whatever it takes. Create another new layout if necessary. Create a new touch spec that's faster, if necessary. Whatever. it. takes. And then when the work is done you can point to this new widget and say that the initiative paid off.
> Hi pcwalton, I just wish Servo had clearer goals.
What is not clear about "parallelize all parts of the pipeline", "get rendering GPU bound", and "write a memory safe browser"?
> Flexbox is a great example. We were told the problems with DOM updates were caused by the old layouts and things like floats.
Nobody said that.
> So Flexbox was supposed to fix those problems.
No, it wasn't.
> I find myself frustrated that now 8 years after the iPhone the web is still incapable of so many things.
Me too.
> Here's a very simple goal: A swipeable sidebar.
You can do it already at 60 FPS. Make an overflow:scroll div and allow the user to scroll it into view. Or even just do it manually with the DOM. It will easily be 60 FPS in existing engines if you animate transforms.
> That's it. Make that happen. Do whatever it takes. Create another new layout if necessary. Create a new touch spec that's faster, if necessary. Whatever. it. takes. And then when the work is done you can point to this new widget and say that the initiative paid off.
There's nothing to do here, as you've described it.
But your comment is actually illustrative of the real problem (at least, what I think it is). We have a ton of ways to do things on the Web. Some of them are fast, and some of them are not. Web developers don't realize what the fast paths are (which is not entirely their fault). I think that all of the paths should be fast, and I think that many of Servo's technologies (such as WebRender) go a long way toward making that happen.
What's frustrating is comments like yours that describe things that the Web can clearly already do and blame browsers for supposedly being unable to do simple things. I used to think the way you did—that the Web is incapable of rendering a 60 FPS scrollable view—and then quickly discovered that, if optimized properly, there is actually no problem. I think the problem with Web slowness is that Web authors don't know what the fast paths are, and they use things like jQuery.animate() that hit all the slow paths. That is a problem that I think we are well equipped to solve in Servo due to things like parallel style recalculation, off-main-thread layout, and WebRender, but it's a lot more nuanced than what you describe.
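To make that fast-path/slow-path contrast concrete, here is a minimal sketch (the element id is made up; assume each function is called once per animation frame):

    const panel = document.getElementById("panel") as HTMLElement;

    // Slow path: 'left' is a layout property, so every frame re-runs style,
    // layout, and paint on the main thread (roughly what animating 'left'
    // via jQuery.animate() ends up doing).
    function slowStep(x: number): void {
      panel.style.left = `${x}px`;
    }

    // Fast path: promote the element to its own compositor layer and animate a
    // transform, so layout and paint are skipped and only cached pixels move.
    panel.style.willChange = "transform";
    function fastStep(x: number): void {
      panel.style.transform = `translateX(${x}px)`;
    }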
> What's frustrating is comments like yours that describe things that the Web can clearly already do and blame browsers for supposedly being unable to do simple things. I used to think the way you did—that the Web is incapable of rendering a 60 FPS scrollable view—and then quickly discovered that, if optimized properly, there is actually no problem.
I think you misunderstood the widget I'm describing. It is swipeable meaning I swipe to pull the sidebar into view and swipe to push it out of view. If I stop half way the sidebar stops where my finger stops.
Meaning the sidebar has to follow my finger, at 60fps, as I quickly swipe in and out. If such a thing is as easy as you claim then a quick demo should be easy enough. I know of no such sidebars on the web that fit this requirement, but am happy to hear all of those devs (and myself!) are just terrible at their (my) jobs.
> Meaning the sidebar has to follow my finger, at 60fps, as I quickly swipe in and out. If such a thing is as easy as you claim then a quick demo should be easy enough.
So what you want to do is to install a touch event handler and adjust a transform. Assume that the sidebar is pre-rasterized to a layer via will-change.
In this case, the DOM manipulation is going to be 1µs or so. Style recalculation, 1ms at most (this is a huge overestimate). Layout is skipped since you are animating a transform. Paint is skipped since you're animating a transform, the element is layerized, and all tile contents are cached. Compositing, 3ms at most (also probably a huge overestimate). Even with the most pessimistic possible numbers, I can't get anywhere near 16ms for this workload.
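A minimal sketch of that recipe (the element id and the 250px width are made up; error handling omitted):

    // Follow the finger by only updating a transform on a pre-rasterized layer,
    // so each touchmove skips layout and paint entirely.
    const sidebar = document.getElementById("sidebar") as HTMLElement;
    sidebar.style.willChange = "transform"; // request a compositor layer up front

    const SIDEBAR_WIDTH = 250; // px, illustrative
    let startX = 0;

    window.addEventListener("touchstart", (e) => {
      startX = e.touches[0].clientX;
    });

    window.addEventListener("touchmove", (e) => {
      e.preventDefault(); // keep the page itself from scrolling while dragging
      // Clamp the drag distance to the sidebar width and update only the transform.
      const dx = Math.max(0, Math.min(SIDEBAR_WIDTH, e.touches[0].clientX - startX));
      sidebar.style.transform = `translateX(${dx}px)`;
    });

With transform-only updates, each touchmove stays comfortably inside the ~16ms frame budget broken down above.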
You need to use a smartphone for that demo though (i.e. something that can produce touches and has a small enough screen), but it was the first thing I found by googling "hammerjs sidebar".
I'm sure that if you look at a few demos from hammer.js (which is a very popular touch library - maybe the most popular) you'll see that everything you always wanted has already been solved, years ago - just not in a very easy way.
But it seems that didn't stop the world from writing libraries that make swiping, panning, and fluid 60fps interactions easy to achieve. Just like with any other language, framework, etc., you just need to learn it well enough...
The example is broken in Chrome 47 on Android 6.0.1: once I touch the menu button, the sidebar immediately extends fully. Then I need to swipe right-to-left multiple times to retract it again.
We also have various real-world performance demos showing that, for certain workloads, parallelism results in measurable FPS wins. I haven't wanted to get on stage and publicize them at this point, because it's early. But we do have them.
> What, specifically, were the performance goals then?
I wasn't involved with the design of flexbox, but I didn't see performance as the main goal. Rather, the goal was to make "springs and struts"-style UI layout easy to use, because simulating it in CSS2 had bad ergonomics. Performance wins, if anything, would be relative to doing all of the layout manually in JavaScript, not relative to the CSS2 primitives.
Historically, relatively little attention has been paid to layout performance at all. The main performance wars have concerned DOM performance and JavaScript. (Look at what Dromaeo tests, for example.)
> Link?
I don't have time to create an actual demo for the specific thing you're asking for, but I've done many similar tests. Animating transforms will achieve 60FPS.
That's not what I'm asking for. You're pointing to low-level metrics and I'm asking for high-level examples of things not achievable today that Servo will enable.
> I don't have time to create an actual demo for the specific thing you're asking for, but I've done many similar tests. Animating transforms will achieve 60FPS.
Understandable. Well, I know there are many such widgets on the web and I've never come across one that fits my requirements. Maybe they are all doing things as slowly as possible, but I doubt it.
> That's not what I'm asking for. You're pointing to low-level metrics and I'm asking for high-level examples of things not achievable today that Servo will enable.
I don't think there are many such things as stated, because the Web is technically capable of almost anything you want to do already. Even if we were to say that CSS is too slow to do the things you need to do (which I don't believe at all), you could always do the famo.us thing, rebuild your interface in WebGL (and even maybe throw out JS in favor of asm.js), and do anything you might possibly want to do in existing engines.
What I think Servo will do is to make it easy to make Web apps fast, by eliminating the huge distinction between the slow paths and the fast paths. In my view this is the biggest performance win we can possibly get on the Web.
To be sure, I'm not saying that Web developers are bad and that it's their fault :) The browsers have created this situation, after all. It's not that the Web is incapable of being fast. It's just that performance on the Web is a minefield. I think the biggest thing Servo can do is to eliminate that minefield.
I don't ask this as a challenge or anything like that, I'm really just not very familiar with either so I'm just curious.
The one thing I remember is that a few months back I saw a Servo benchmark page posted here on HN, and I think it was claimed that those initial numbers were looking very promising.
They've paid off in the sense that they're clearly on a good path for the future. Both Rust and Servo have earned praise for their approach in their respective fields, and should give Mozilla an edge with the next generation of web browsers.
> There is nothing wrong with node.js programmers; your implication that they're inferior is a sweeping and baseless generalization.
Not the point I was making.
The point is there was a systemic cleansing of senior architects in favor of younger node.js programmers.
Mozilla had the creator of JS there. It's probably one of the few offices in the world where the grayhairs could co-exist with the newer crowd.
I thank you for your reply. I don't agree with everything said above (Servo has paid off, hacker ethic matured into something better), but will say Mozilla does have a diverse array of new projects.
I'm hoping for your success. I'm happy and encouraged by your optimism. I hope it doesn't come at the expense of insensitively ignoring those who were pushed out wrongly for reasons that had nothing to do with merit.
"The hard truth is internal and external politics in the organization were so toxic senior-level programmers who were more risk averse were pushed out."
I think, if you really want to say something, it'd be more appropriate to do it under your real name and as its own blog post, rather than trying to rain on the parade of some not-management-level Mozillians releasing a project.
I want to just program without all the backstabbing and treachery.
In this case, the drama at this organization in particular is so intense it manifests itself in the news. The creator of javascript, a pillar of the internet, was publicly humiliated and crucified.
Why? They knew his name. They looked up his voting records. That's witchhunting. It's letting a bunch of people in the peanut gallery dictate what goes on - that's not integrity or leadership.
I already suffered enough due to office politics, I feel no need to compromise my identity to voice my opinion.
> I want to just program without all the backstabbing and treachery.
I mean, for someone who just wants to program you're creating an awful lot of drama.
I don't really buy your complaints, which come off as mostly baseless ranting. If I were to read a blog post that raised some of those issues in a constructive manner, I might have more confidence in your legitimacy. As it is, though… I don't really believe you.
No comment on their specific claims or validity - but the overarching issue does exist. /r/GitInAction exists specifically to highlight what they are talking about in regards to Open Source projects and, more specifically, GitHub projects where this exact issue is occurring.
A bunch of non-programmer busy-bodies opening issues about wording or grammar without so much as submitting a pull-request to fix it (likely because they don't even know how to open a pull-request, because they don't use Git, because they don't actually code using GitHub). It isn't a stretch of the imagination to think a corporation as large as Mozilla has a few of those busy-body types in it.
It's turned into internal politics, raising non-issues, and creating a fuss around any wrongthinkers to get them fired or scrutinized. Rather than programming.
> here Mozilla is making the Rust language, which is merely a crutch for people not good enough to use C++.
There is, in my experience, zero correlation between programmer skill and frequency of vulnerabilities introduced through memory management problems. In fact, it's often the best programmers who introduce use-after-frees and such, possibly because they're more productive and write more code.
Writing more code doesn't make you productive or a better programmer. Also, I'm not sure how you get a use-after-free in C++ these days given the libraries and tooling available.
> Writing more code doesn't make you productive or a better programmer.
True, but also irrelevant. I have never seen a reasonable metric of "good programmer" that correlates negatively with the number of exploitable memory management bugs they introduce.
> Also, I'm not sure how you get a use-after-free in C++ these days given the libraries and tooling available.
Search the Chromium and Firefox bug trackers.
Whether use after free actually happens in large C++ codebases isn't something that can be debated. It's reasonable to have different opinions about how to fix the problem, and it's even reasonable to say that it's not worth fixing. But it's not reasonable to deny the data.
Take the latest use-after-free in Chrome, namely CVE-2015-6765, from the top of [0]. The fix for that was [1], which adds a memory-safe standard container (std::set) to keep hold of some objects and change their lifetimes. I've looked at it for about 3 minutes and I can tell immediately that the entire implementation of the "appcache update" functionality is a complete hack job. There are variables and calls everywhere keeping track of, and querying, state. There are multiple containers holding objects in different states being manipulated all in the same routine, and this fix adds another. If a "job" can't outlive its "fetchers", then why don't the fetchers own them?
Raw pointers everywhere, this pointers being passed around... "DeleteSoon"... excessive assertions... it's amazing how much ugly, poorly structured, and scary-looking code can be packed into one file.
I would argue this is a data point in favour of a correlation between programmer skill and frequency of vulnerabilities. Whoever wrote this clearly knows C++, but they don't know how to write good code.
It's very easy to beat up on vulnerable code and say it's "bad C++", but it's also irrelevant. Of course it was bad C++: it was vulnerable code. The problem is empirical and practical. In practice, nobody has managed to scale large C++ codebases up to hundreds of developers without introducing vulnerabilities. You're essentially telling Google "just hire better programmers". That hasn't worked for decades, and it's not going to start working now.
The idea of Rust in this regard is quite simple and obvious. Instead of trying to hire better programmers, just have the computer check to make sure these vulnerabilities don't happen. In contrast to the "just hire better programmers" suggestion, which has failed again and again for decades, this works very well in practice, as memory-safe languages have shown.
At the end of the day, yes, it's theoretically possible to hire hundreds of great programmers who always write perfect C++ with no memory safety problems or undefined behavior. But in practice it never works. Google is in fact in a better position than perhaps any other organization to do this; if it could possibly work for anybody, it surely would have worked for Google!
Rust is a choice grounded in pragmatism. Memory safety enforced by the compiler is the only way that we know of to scale up programs to large codebases with hundreds of programmers without introducing mistakes. "Hire better programmers" may work in theory. In practice, it has been tried again and again and has always failed.
(Also, std::set is not memory safe. It is vulnerable to UAF due to iterator invalidation: iterate over a set and remove elements, for example.)
> It's very easy to beat up on vulnerable code and say it's "bad C++", but it's also irrelevant. Of course it was bad C++: it was vulnerable code.
I'm not just beating it up as "bad C++" per se, I'm beating up the overall thoughtlessness of it all. Introducing a red-black tree to plug an ownership flaw is a sign that things are dire.
> Also, std::set is not memory safe. It is vulnerable to UAF due to iterator invalidation: iterate over a set and remove elements, for example.
If you want to do that, then you shouldn't be using an ordered set. Doing so, for some arbitrary predicate, is O(n). Language is irrelevant.
Why should the library make doing dumb things easy or safe? (Even though it did make a slight concession in C++11, at zero cost, by returning an iterator from std::set::erase.) If someone's first thought, when they find that a library happens to not make something easy, is to shrug it off and write unsafe code, instead of spending 30 seconds re-evaluating their own decisions (whether wrt their own design, or using the library), then I'd say they're not a particularly good programmer.
Now if you want to do something sane, like removing a range or an equal subset, then it's trivial of course:
// remove 6 through 10, inclusive.
s.erase(s.lower_bound(6), s.upper_bound(10));
// remove all the 42s
s.erase(42);
> Why should the library make doing dumb things easy or safe?
Because people empirically do a lot of "dumb things", and it's nice when computers check to make sure that they don't before deploying that code into production.
> If someone's first thought, when they find a library doesn't make something easy, is to shrug and write unsafe code, instead of spending 30 seconds re-evaluating their decisions (whether wrt their own design, or using the library), then I'd say they're not a particularly good programmer.
OK. Let's say you're right and Google (and every other large company, because Google is no worse than any other) is full of "bad programmers" who have no idea what they're doing. What is your practical solution?
> I said the good programmers would eventually be pushed out by webdevs who were not as good, and here Mozilla is making the Rust language, which is merely a crutch for people not good enough to use C++.
And C++ is just a crutch for people not good enough to use assembly.
It would have been funny if you had used C, but you didn't, so now it just sounds weird. I suggest you edit it very quickly so you don't embarrass yourself.
This. Firefox NEEDS a fork before the whole plugin ecosystem is destroyed. There NEEDS to exist at least one browser for hardcore users that supports actual plugins and not glorified Greasemonkey scripts. Otherwise we are screwed.
"What makes Waterfox so fast? It's built with Intel's C++ compiler. One of the most powerful compilers out there. This enables us to make the fastest possible web browser for all the code changes we make. This potent combination makes for an unparalleled browsing experience."
So basically a build of Firefox with a different compiler, with binaries available only for Mac and Windows.
For extra fun, it's based on an older version of Firefox, and there is no reason to believe it has devs backporting fixes from newer versions, nor that there is anybody with the chops to build anything useful on top of it. Meaning it:
- Is virtually guaranteed to have security vulnerabilities and bugs that have already been fixed elsewhere
- Will either end up with the same flaws you don't like about Firefox, or be forever stuck at an old version while the bad guys find more and more exploits.
If you have a new account and want to comment here, you're welcome to email us at hn@ycombinator.com.