> If NaCl is capable of reaching 1.1x performance while asm.js is at 2x (and it's not obvious to me within the confines of JS if you can actually do better than 1.5x), people would switch to a browser that supports NaCl (first the hardcore gamers, then the rest of us).
NaCl is not portable. asm.js is. You should compare PNaCl and asm.js.
> and it's not obvious to me within the confines of JS if you can actually do better than 1.5x
> NaCl is not portable. asm.js is. You should compare PNaCl and asm.js.
NaCl covers the only architectures that are likely to matter for phones and desktops for the next 5+ years: ARM and x86. In doing so, it can support NEON, SSE, and other architecture-specific hardware features that make substantial differences in performance.
So why shouldn't you compare NaCl to asm.js?
PNaCl is there to cover the case where we move past ARM and x86, in which case it provides the ability to target new architectures from existing deployed code. In an ideal universe, sites would serve fat x86/ARM/PNaCl binaries, optimized for a given client.
> NaCl covers the only architectures that are likely to matter for phones and desktops for the next 5+ years: ARM and x86.
This is misleading. NaCl exists for both these platforms, but it doesn't support running the same code on each; NaCl lets you run native x86-compiled code on x86 and native ARM-compiled code on ARM.
> PNaCl is there to cover the case where we move past ARM and x86
That, too, but more importantly, it exists to let you run the same code (delivered as LLVM bitcode) on supported platforms, including x86 and ARM, instead of having to deliver separately-compiled code for each supported platform, as is the case for NaCl. That's what the "P" ("Portable") is about -- the code delivered runs on any PNaCl engine, unlike the case with NaCl where NaCl x86 code runs only on x86 and NaCl ARM code only on ARM.
You seem to have confused what PNaCl does now (an experimental toolchain inside the NaCl SDK) with what the point of PNaCl is: to move the .pexe -> .nexe bitcode-to-native-code step of the toolchain from the SDK side to the client side, so all you deliver to the client is the platform-agnostic .pexe rather than the platform-specific .nexe.
> This is misleading. NaCl exists for both these platforms, but it doesn't support running the same code on each; NaCl lets you run native x86-compiled code on x86 and native ARM-compiled code on ARM.
I didn't intend for this to be misleading -- the option to deploy code specific to each architecture is a benefit.
It means you can take advantage of the architecture's specific features (such as NEON and SSE).
> I didn't intend for this to be misleading -- the option to deploy code specific to each architecture is a benefit.
Permitting it may be a benefit in some cases; being limited to it, as NaCl is until client-side PNaCl is ready, is a disadvantage.
Which is why Google is not enabling NaCl by default on the web (i.e., outside of the Chrome Web Store) until client-side PNaCl is ready -- they don't want needless proliferation of non-portable NaCl code on the web. And, I suspect, they want to make the .pexe-to-.nexe translation as efficient, in terms of code generation, as possible, so as to minimize the cases where platform-specific targeting is necessary.
The phase the web camp is currently in reminds me of a brief phase with Java in the '90s when everything was going to be rewritten on top of the JVM and all heterogeneity in computing stacks was going to disappear overnight.
We've already moved past ARM and x86: there's a bunch of MIPS Android tablets and phones on the market (just not in the West), and MIPS is extremely popular in home appliances (e.g. most LCD TVs).
But this is beside the point. The real question is what the web will look like even 10 years from now, with a large installed base of arch-specific binaries everywhere. I don't want to have to explain to my kids why their browser needs the equivalent of a PDP-11 emulator installed to watch certain classic movies and games.
We haven't moved past ARM and x86. Those two cover almost the entire mobile/tablet/desktop market.
If there's a real MIPS resurgence outside of WiFi routers (e.g., extremely cheap/low-quality tablets), then they can use PNaCl just fine until a native NaCl target is implemented.
However, given the market weight behind ARM and x86, this seems extraordinarily unlikely outside of some extreme edge players. IIRC, MIPS support isn't even mainlined into Android.
Right, so by 2040 we can't distribute an application without rebuilding it for 5 or 6 different architectures, one of which is a bitcode format that doesn't even run at native speeds, despite the name.
I really love where this is going.
[Edit]
You're basically demonstrating the entire problem with the Native Client camp -- you're so myopically focused on the immediate needs and the immediate environment (ARM! x86! games! performance this year!) that you have absolutely no appreciation for the trail of destruction being left for future generations. You have Pepper, a huge bastardized, specialized clone of all the WebGL/HTML5 audio/media APIs; you have a totally separate offline manifest format that doesn't work like a regular web app; and then you have 20 varieties of dumb formats pushed by a single company to advance its private for-profit agenda -- when we already had all the vendor-neutral components openly specified, in place, and deployed, needing only incremental improvement.
"The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs." -Alan Kay
The problem with your camp is that you're so averse to learning anything but the technologies you already know that you're completely blind to their deficiencies. JavaScript is a mediocre-at-best, terribly-flawed-by-design technology that is pushed forward only by its sheer ubiquity. The sooner we replace it with something better, the sooner the entire web benefits.
I'd love it if JS received a rewrite. But when you get down to it, JS isn't as horrible as you make it out to be. In fact, it is a very elegant language burdened by its initial design constraints (i.e., forced to look like Java, a tight time budget). So I don't understand all the hate.
Except we don't need an elegant language (and JavaScript is hardly elegant, by the way). We need a virtual machine, and that's essentially what Mozilla is trying to define as a DSL in JavaScript with asm.js. Instead, why not create something better from scratch?
Look past surface (asm.js or JS syntax) to substance. Browsers (not just Firefox) have a VM, with multiple implementations and amazing, continuously improving performance.
Any from-scratch VM could be great, and I hope someone skilled (not a big company big-team effort) does one. Short of a new market-dominating browser, good luck getting such a new VM adopted and then standardized with two or more interoperable implementations cross-browser.
Shortest-path evolution can get stuck, as some here fear, but JS is nowhere near stuck.
Indeed JS is evolving before our eyes into a multi-language VM. asm.js and Source Maps are just a start, and we could have a bytecode syntax (or AST encoding standard, alternatively) down the road.
But any clean slate work along those surface-bytecode-language lines, not to say a whole new VM, is high-risk and low-adoption (at first, and probably for a while) by definition, compared to small evolutionary jumps for JS syntax and optimized parsers and JIT'ing (and with asm.js, AOT'ing) runtimes already in the field and auto-updated in modern browsers.
I'm fine with the democratization of the VM as long as the VMs are standards-based (and compliant). If you look at web framework performance benchmarks, frameworks tend to group around their VM, even when compared cross-VM or when built on top of other frameworks that share a base VM (e.g., vert.x is about as capable as Servlets in the Java VM world; ExpressJS is about as capable as Node.js in the Google V8 world).
It seems to me that the VM is very important. One of JavaScript's latest, greatest triumphs has been its standardization. I care much more about -- for example -- Backbone.js's impact on code organization/standardization within an application than I do about its capability as a framework.
I hope that ECMAScript 6 -- while it brings awesome new functionality to the language -- will also bring with it more of the backwards compatibility and standardization that these frameworks currently provide (in a somewhat fragmented, yet digestible way).
And I hope the same for the democratization of the VM.
> Right, so by 2040 we can't distribute an application without rebuilding it for 5 or 6 different architectures
5 or 6? 2040? All we've seen over the past 20 years is the gradual reduction of in-use architectures down to a few winning players. We're even seeing it in the SoC/MCU space now, where custom architectures are being replaced with ARM.
I expect to continue to see evolution and simplification, not revolution. If there is a revolution, well, we'll want to take advantage of the power of that fancy new hardware, I suppose.
> one of which is a bitcode format that doesn't even run at native speeds, despite the name.
PNaCl generates native code from the bitcode.
[response to your edit]
> You're basically demonstrating the entire problem with the Native Client camp -- you're so myopically focused on the immediate needs and the immediate environment (ARM! x86! games! performance this year!) that you have absolutely no appreciation for the trail of destruction being left for future generations.
I would say that you're demonstrating why the web camp continues to fail so horribly to move past simple applications: you're so myopically focused on the ideology of tomorrow over providing a great user experience today.
In fact, I dare say you're leaving a far worse trail of destruction, because web applications are the ultimate expression of always-on DRM. When the servers for your "apps" disappear in a few years' time, the applications will disappear right along with them -- forever.
How is the Internet Archive going to archive your webapp?
> You have Pepper which is a huge bastardized, specialized clone of all WebGL/HTML 5 audio/media APIs
That's an overstatement of Pepper. It's quite simple -- far more so than the massive browser stack that WebGL+HTML5 brings with it.
In fact, Pepper is simple and constrained enough that someone other than an existing browser vendor or major corporation could actually implement and support a standalone implementation of it. By contrast, producing a compatible browser is such an enormous undertaking that even companies like Opera are giving up and tracking WebKit.
>> You're so myopically focused on the ideology of tomorrow over providing a great user experience today.
> Through ActiveX Technologies, today's static Web pages come alive with a new generation of active content, including animation, 3-D virtual reality, video and other multimedia content. ActiveX Technologies embrace Internet standards and will be delivered on multiple platforms, giving users a rich, open framework for innovation while taking full advantage of their investments in applications, tools and source code. "ActiveX brings together the best of the Internet and the best of the PC," said Paul Maritz, group vice president of the platforms group at Microsoft. "Users want the richest, most compelling content delivered through the ubiquitous Internet."
— Microsoft press release, March 1996.
We've seen this all before, and it was as much of a land grab using similar "user experience" excuses then as it is now. And despite it all, everything from this press release is already possible with open technologies today. Now we come to address the performance issue, and platform proponents (let's drop the pretence -- platform shills) continue to fiercely reject it, because providing a good user experience was never the original agenda.
I'm deeply reminded of the leaked internal Dart e-mail ( https://gist.github.com/paulmillr/1208618 ), where as much ink was expended on market share as on technical issues.
ActiveX was technologically flawed: it was specific to Microsoft Windows, and it was insecure.
NaCl is secure and portable, without sacrificing the user experience today.
Indirectly referring to me as a platform shill is unwarranted. The reason that I argue so strenuously for maximizing performance is simply that performance improves user experience.
> I'm deeply reminded of the leaked internal Dart e-mail ( https://gist.github.com/paulmillr/1208618 ), where as much ink was expended on market share as on technical issues.
I've read the whole memo, and I don't understand what your complaint is at all. Google seems to be focused on actually improving the web as an application development target, and making it competitive with modern platforms.
By your own words Native Client is technically flawed: it's specific to x86 and ARM (and whatever other vaporware platforms they promise "next week" -- PNaCl has been due for years already), and, if you remove the VM, it's also insecure. So basically Native Client is ActiveX inside a VM.
How exactly is that by my own words? You also seem to be confusing quite a few technical topics.
First, you're equating ARM and x86 with Microsoft Windows. Those are all "platforms", but they exist at very different levels of the technology stack, and the impact of relying on them is quite different.
Samsung and Apple can both ship compatible ARM devices. Only Microsoft could ship an ActiveX runtime.
Second, you're throwing out "security" as a red herring without actually explaining what is insecure. Exactly what are you talking about?
Lastly, if PNaCl is not viable, then NaCl is not viable. Thus, Google is holding off on NaCl until PNaCl is viable. In the meantime, I'm targeting mobile and desktop applications -- today -- which is entirely consistent with my earlier stated position.
"ActiveX is definitely going cross-platform," Willis says. "To be a viable technology for the Internet, we need to be able to support more platforms than Microsoft Windows. You cannot predict what kind of platforms will be on the other end."
Which part of ActiveX wasn't portable? COM is definitely portable. Shipping signed executables that implement COM interfaces is as "portable" as NaCl is, so no problems there.
If your complaint is that ActiveX controls used the Win32 API, then you're missing the point. A Mac or Linux ActiveX control would use OS-native APIs, but still use COM and be an ActiveX control.
Furthermore, if you had even looked up the Wikipedia page for ActiveX, you'd see that:
"On 17 October 1996, Microsoft announced availability of the beta release of the Microsoft® ActiveX Software Development Kit (SDK) for the Macintosh."
Yep, so non-portable that they released a Mac SDK back when Macs were PowerPC. Super non-portable. The non-portablest.
ActiveX was portable the way that the SYSV/SVR4 binaries were portable -- almost entirely in theory.
"It was introduced in 1996 and is commonly used in its Windows operating system. In principle it is not dependent on Microsoft Windows, but in practice, most ActiveX controls require either Microsoft Windows or a Windows emulator. Most also require the client to be running on Intel x86 hardware, because they contain compiled code."
What rational comparison do you have between NaCl's constrained syscall interface + Pepper, and ActiveX?
> Yep, so non-portable that they released a Mac SDK back when Macs were PowerPC. Super non-portable. The non-portablest.
Which produced binaries that couldn't run on any PPC OS other than Mac OS, and provided a runtime that couldn't run the ActiveX controls being deployed for Windows clients, even if it had somehow been magically possible to run the x86 code.
It was explicitly advertised as being non-portable, as a feature: "Developers can create ActiveX Controls for the Mac that integrate with other components and leverage the full functionality of the Mac operating system, taking advantage of features such as Apple QuickTime."
Not due to technical constraints, unlike ActiveX. It's a self-fulfilling prophecy -- Mozilla's refusal to adopt NaCl will, indeed, stall the adoption of NaCl.
Things might get more interesting if NaCl provides a higher-performance deployment target for Chrome users than asm.js does. Then you can just cross-compile for both.
Last I checked, Pepper was full of underdefined stuff that depended on details of Chrome internals. So no, it's not "simple and constrained" enough that you can implement it without reverse-engineering said internals....
A simple example: What happens if negative values are passed for the top_left argument of Graphics2D::PaintImageData? The API documentation doesn't say.
I just picked a class and function at random.... That sort of thing is all over the API.
I think there will be a MIPS resurgence -- maybe not at ARM's popularity level, or even x86's, but MIPS has been acquired by Imagination and I'm sure they have pretty big plans for it.
I agree that it's wise to default to a portable implementation. I also think it's wise to also have the option of serving up binaries that are optimized for the devices your users actually have, today.
Curious: is the performance of PNaCl much worse than regular NaCl? I actually like NaCl, but asm.js sounds like a better-suited option in the long term, since the other browsers are much less likely to adopt NaCl anyway, unless they develop sandbox technology similar to Google's, which I think is pretty unlikely right now. So I see the others adopting asm.js much faster than NaCl.
Numbers are IEEE 754 double-precision floats, and no currently accepted spec provides native 64-bit integers with arithmetic.
Even modest 32-bit integer multiplication is tricky (you only have 53 bits of mantissa, and a 32x32-bit product can need up to 64). Mozilla proposed the `Math.imul` extension for this.
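To make that concrete, here's a short sketch you can paste into any modern JS console (the specific values are just illustrative):

```js
// The true product of two large 32-bit ints needs up to 64 bits,
// but a double carries only 53 bits of mantissa, so the rounded
// product has garbage in its low bits.
var a = 0x7fffffff;           // 2^31 - 1
var b = 0x7fffffff;

(a * b) | 0;                  // 0 -- low 32 bits lost to rounding
Math.imul(a, b);              // 1 -- correct C-style 32-bit multiply
```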
goog.math.Long provides a mechanism using a pair of 32-bit values, and Emscripten does something similar (but can also use the hack if you are sure there won't be overflow issues).
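For illustration, here's a minimal sketch of 64-bit addition built from two 32-bit halves, in the spirit of goog.math.Long (`add64` is a hypothetical helper; the real class also handles negation, multiplication, comparison, and so on):

```js
// Hypothetical helper: add two 64-bit ints represented as
// {low, high} pairs of signed 32-bit values. Splitting into
// 16-bit limbs keeps every intermediate sum exactly representable.
function add64(a, b) {
  var a00 = a.low & 0xffff, a16 = a.low >>> 16;
  var b00 = b.low & 0xffff, b16 = b.low >>> 16;

  var c00 = a00 + b00;                           // low 16 bits + carry
  var c16 = a16 + b16 + (c00 >>> 16);            // next 16 bits + carry
  var low = ((c16 << 16) | (c00 & 0xffff)) | 0;
  var high = (a.high + b.high + (c16 >>> 16)) | 0;
  return { low: low, high: high };
}

// 0x00000001_FFFFFFFF + 1 = 0x00000002_00000000
add64({ low: -1, high: 1 }, { low: 1, high: 0 });  // { low: 0, high: 2 }
```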
Just like pcwalton said, you can trick JS engines into doing real 32-bit arithmetic. `((x|0) + (y|0))|0` tells the engine that it's 32-bit integers all the way through, and it optimizes accordingly and never touches floats.
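Concretely, a minimal asm.js-style module using that idiom (the `AsmModule`/`addMul` names are just for illustration):

```js
function AsmModule(stdlib) {
  "use asm";
  var imul = stdlib.Math.imul;

  function addMul(x, y) {
    x = x | 0;                // parameter annotation: x is int32
    y = y | 0;
    var sum = 0;
    sum = (x + y) | 0;        // int32 addition, wraps like C
    return imul(sum, y) | 0;  // int32 multiplication via Math.imul
  }

  return { addMul: addMul };
}

// Pass the global object as the stdlib (window in a browser).
var mod = AsmModule(this);
mod.addMul(3, 4);             // (3 + 4) * 4 = 28
```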
64 bits is a problem, yes, but again, as pcwalton said, it is being worked on for the post-ES6 era. Obviously, if the Unreal Engine works without it, it's not critical or a show-stopper, but it is needed for certain things.
Edit: Math.imul is needed because the product of two 32-bit integers can become large enough that the double holding it loses the low bits, so converting back to a 32-bit integer loses precision. It's the only basic arithmetic operation that needs fixing.
But then you get back into the game of asking browser vendors to make changes, which undercuts one of the original selling points of asm.js (that browser vendors didn't have to make changes).
JS is changing, ES6 is coming. Browser vendors want changes and are on board. They're also under competitive pressure.
But in what scenario is the much greater work to add Pepper and PNaCl support going to prevail? You have to assume JS dies or freezes, and there's exogenous pressure for something new. Instead, we see demand for JS to evolve (Mozilla serves this, we aren't just ginning it up, believe me!).
So browser vendors would have to do two big jobs: tend the web standards and evolve them, and put in a new native code runtime alongside. That is simply not going to happen outside of Google, which can afford endless impossible-to-standardize boondoggles. It is both uneconomic, and technically inferior to a more minimal evolutionary path-finding solution.
Sure, it's unfortunate, but it's one obscure edge case that they can implement instantly. They're pushed to implement asm.js anyway, so that comes with it. And if they don't, things like BananaBread still run, so it's not required.
> Numbers are IEEE 754 double-precision floats, and no currently accepted spec provides native 64-bit integers with arithmetic.
No, in asm.js numbers are not always floats. There is a type system that classifies numbers as integers or doubles. Native 64-bit integers are being worked on as an extension.
32-bit integers can be represented exactly in a Number, but 64-bit integers cannot. And the typed array system provides Int32Array and Uint32Array, but no 64-bit equivalents.
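Both limits are easy to demonstrate in a console:

```js
// Integers are exact in a double only up to 2^53.
var max = Math.pow(2, 53);
max === max + 1;              // true -- 2^53 + 1 rounds back to 2^53

// Typed arrays top out at 32-bit integer elements (no Int64Array).
var i32 = new Int32Array(1);
i32[0] = 0x80000000;          // 2^31 stored into a signed 32-bit lane
i32[0];                       // -2147483648 -- wrapped, as in C
```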
Downvoters: which part of the statement is factually incorrect? Read the ES3 and ES5 specs.
Implementing 64-bit integers on top of JavaScript is going to happen. It's far less effort to implement 64-bit integers than to adopt Native Client, and JavaScript needs 64-bit ints anyway -- the Node community has been asking for them for a long time, for example.
> and it's not obvious to me within the confines of JS if you can actually do better than 1.5x
Why not? Is there a technical reason?