Mozilla and Epic Announce Unreal Engine for the Web (blog.mozilla.org)
504 points by metajack on March 27, 2013 | 214 comments



For everyone saying that the claim that asm.js runs everywhere is invalid because it will only be feasible in browsers that implement the optimization: remember that most games are built to scale performance. In browsers that don't implement the asm.js optimizations, a game could feasibly turn off visual effects and scale itself down so that it still runs, just not as pretty or realistic.

There's a huge difference between "this game only runs in this one browser that implements this full VM" and "this game runs everywhere, but only with maximum graphics & physics in browsers that implement the asm.js optimizations".


Because it's still JavaScript, I'm not particularly worried; Unreal support will likely be a powerful incentive to improve performance on the asm.js subset even if asm.js itself doesn't catch on.

However, in many typical graphics-heavy cases, your statements aren't really true. Many of the heavy visual effects are offloaded to the GPU if at all possible (especially in the case of Unreal Engine's OpenGL ES path, which otherwise has to rely on the speed of mobile CPUs), and data is preprocessed wherever possible so that it doesn't have to be done at runtime. This takes the CPU out of many of the bottlenecks where asm.js code would otherwise have a large advantage over regular JITed JS. If you're using Emscripten with typed arrays at all, you're going to get a huge reduction in GC pressure as well.
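
To make the GC-pressure point concrete, here's a sketch (not Emscripten's actual output): per-frame work against preallocated typed arrays allocates nothing, so the collector has nothing to interrupt the game loop for.

    // Allocate once at startup; mutate in place every frame.
    var pos = new Float32Array(3 * 10000); // 10k entities, xyz
    var vel = new Float32Array(3 * 10000);
    function integrate(dt) {
      for (var i = 0; i < pos.length; i++) {
        pos[i] += vel[i] * dt; // zero per-frame allocations, zero GC work
      }
    }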

You will be able to scale things like physics (since something like PhysX doesn't run on mobile GPUs anyway), but any gameplay-affecting physical interactions won't be relying on actual CPU-based physics simulation.

In any case, Emscripten is a powerful tool even without asm.js, and I will bet that these demos will still run pretty well in browsers without it. Sufficient incentive (like a popular WebGL game) would be enough to get framerates as rock-steady as what we're seeing in Firefox with support.

The big news here is that Mozilla has convinced Epic to bet on the web (where they were previously throwing all their weight behind their Flash-based engine), and they've demonstrated with a major, non-trivial piece of software how easy the porting process can be. Moreover, because it's Mozilla, we know that they'll make sure that cross-browser support is baked in, so that if or when other browsers are fast enough to play these games, they'll be able to.

Serious kudos to Mozilla for this win.


You're right that a lot of the scaling probably happens on the GPU, and that Emscripten-generated code is still the most performant JavaScript possible (no GC, consistently typed for inference, etc.).

I think there are places where the CPU is the bottleneck though, since we're talking about crazy complex 3d games. Managing a large scene graph and updating physics are two examples. Simply reducing the viewable distance is a huge help when you are CPU-bottlenecked for those reasons (most likely physics).

Agreed that no matter what, the Unreal port will help optimize the typed array/no GC code path! Exciting times.


Modern game engines also leverage SIMD and multi-core SMP. JavaScript is inherently single-threaded. While there is hope of retrofitting SIMD (RiverTrail's ParallelArrays or Dart's recent work with new SIMD typed arrays), I don't see any path to supporting multithreading: JS is inherently async and single-threaded in its memory model.

WebWorkers are not really a multithreading model. Game engines could be written to leverage message-passing-based isolates, but doing that in an optimal form isn't exactly developer friendly; one of the pains of developing for the PS3 SPEs was that they didn't share system memory and you had to DMA stuff around.
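
For anyone unfamiliar with that model, here's a minimal sketch (the worker script name and message shape are made up): a typed array is copied or transferred between the main thread and a worker, never shared, which is what forces the DMA-style architecture.

    // main.js: hand a physics job to a worker. The transfer list moves
    // ownership of the buffer (zero-copy), so nothing is ever shared.
    var worker = new Worker("physics.js"); // hypothetical worker script
    var positions = new Float32Array(3 * 1024);
    worker.postMessage({ cmd: "step", buf: positions.buffer },
                       [positions.buffer]);
    worker.onmessage = function (e) {
      positions = new Float32Array(e.data.buf); // results come back the same way
    };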


Perhaps this is the path you weren't seeing:

http://smallcultfollowing.com/babysteps/blog/2013/03/20/para...


I already mentioned RiverTrail stuff in another response. I don't really think it is a replacement for SSE/AVX/NEON/AltiVec.


Agreed. So we are getting together with John McCutchan, who did the Dart SIMD intrinsics, to do the same for JS via the Ecma TC39 Harmony process and rapid prototyping in SpiderMonkey -- and I hope in V8 as well.

/be


+1 to adding the SIMD Dart types into JS. This is how I like to see Google and Mozilla working together to improve the web. (WebRTC is another fabulous cooperation)


'There's a huge difference between "this game only runs in this one browser that implements this full VM" and "this game runs everywhere, but only with maximum graphics & physics in browsers that implement the asm.js optimizations".'

That's quite a bit of marketing here.

If NaCl is capable of reaching 1.1x performance while asm.js is at 2x (and it's not obvious to me within the confines of JS if you can actually do better than 1.5x), people would switch to a browser that supports NaCl (first the hardcore gamers, then the rest of us). Your argument only makes sense if asm.js and NaCl are at performance parity.

Note that people still write native iOS games, even though they could run everywhere if they did it in JS.


> If NaCl is capable of reaching 1.1x performance while asm.js is at 2x (and it's not obvious to me within the confines of JS if you can actually do better than 1.5x), people would switch to a browser that supports NaCl (first the hardcore gamers, then the rest of us).

NaCl is not portable. asm.js is. You should compare PNaCl and asm.js.

> and it's not obvious to me within the confines of JS if you can actually do better than 1.5x

Why not? Is there a technical reason?


> NaCl is not portable. asm.js is. You should compare PNaCl and asm.js.

NaCL covers the only architectures that are likely to matter for phones and desktops for the next 5+ years: ARM and x86. In doing so, it can support NEON, SSE, and other architecture-specific hardware features that make substantial differences in performance.

So why shouldn't you compare NaCL to asm.js?

PNaCL is there to cover the case where we move past ARM and x86, in which case it provides the ability to target new architectures from existing deployed code. In an ideal universe, sites would serve fat x86/arm/pnacl binaries, optimized for a given client.


> NaCL covers the only architectures that are likely to matter for phones and desktops for the next 5+ years: ARM and x86.

This is misleading. NaCl exists for both these platforms, but it doesn't support running the same code on each; NaCl lets you run native x86-compiled code on x86 and native ARM-compiled code on ARM.

> PNaCL is there to cover the case where we move past ARM and x86

That, too, but more importantly, it exists to let you run the same code (delivered as LLVM bitcode) on supported platforms, including x86 and ARM, instead of having to deliver separately-compiled code for each supported platform, as is the case for NaCl. That's what the "P" ("Portable") is about -- the code delivered runs on any PNaCl engine, unlike the case with NaCl where NaCl x86 code runs only on x86 and NaCl ARM code only on ARM.

You seem to have confused what PNaCl does now (an experimental toolchain inside the NaCl SDK) with what the point of PNaCl is (to move the .pexe -> .nexe bitcode-to-native-code piece of the toolchain from the SDK side into the client side, so all you deliver to the client is the platform-agnostic .pexe rather than the platform-specific .nexe).


> This is misleading. NaCl exists for both these platforms, but it doesn't support running the same code on each; NaCl lets you run native x86-compiled code on x86 and native ARM-compiled code on ARM.

I didn't intend for this to be misleading -- the option to deploy code specific to each architecture is a benefit.

It means you can take advantage of the architecture's specific features (such as NEON and SSE).


> I didn't intend for this to be misleading -- running code specific to each architecture is a benefit.

Permitting it may be a benefit in some cases.

Being limited to it, as NaCl is until client-side PNaCl is ready, is a disadvantage.

Which is why Google is not enabling NaCl by default on the web (i.e., outside of the Chrome Web Store) until client-side PNaCl is ready -- they don't want needless proliferation of non-portable NaCl code on the web. And, I suspect, they want to make the .pexe-to-.nexe process as efficient, in terms of code generation, as possible, so as to minimize the cases where platform-specific targeting is necessary.


The phase that the web camp is in currently reminds me of a brief phase with Java in the 90's where everything was going to be rewritten on top of the JVM and all heterogeneity in computing stacks was going to disappear overnight.


Sure, that's why all those NaCL-based apps in the Chrome store work so well on my ARM-based Chromebook.


We've already moved past ARM and x86: there's a bunch of MIPS Android tablets and phones on the market (just not in the West), and MIPS is extremely popular in home appliances (e.g. most LCD TVs).

But this is beside the point. The real question is what the web will look like even 10 years from now, with a large installed base of arch-specific binaries everywhere. I don't want to have to explain to my kids why their browser requires the equivalent of a PDP-11 emulator installed to watch certain classic movies and games.


We haven't moved past ARM and x86. Those two cover almost the entire mobile/tablet/desktop market.

If there's a real MIPS resurgence outside of WiFi routers (eg, extremely cheap/low-quality tablets), then they can use PNaCL just fine until a native NaCL target is implemented.

However, given the market weight behind ARM and x86, this seems extraordinarily unlikely to be used outside of some extreme edge players. IIRC, MIPS support isn't even mainlined into Android.


Right, so by 2040 we can't distribute an application without rebuilding it for 5 or 6 different architectures, one of which is a bitcode format that doesn't even run at native speeds, despite the name.

I really love where this is going.

[Edit]

You're basically demonstrating the entire problem with the Native Client camp: you're so myopically focused on the immediate needs and the immediate environment (ARM! x86! games! performance this year!) that you have absolutely no appreciation for the trail of destruction being left for future generations. You have Pepper, a huge bastardized, specialized clone of all the WebGL/HTML5 audio/media APIs; you have a totally separate offline manifest format that doesn't work like a regular web app; and then you have 20 varieties of dumb formats pushed by a single company to advance its private for-profit agenda, when we already had all the vendor-neutral components openly specified, in place, and deployed, needing only incremental improvement.


"The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs." -Alan Kay

The problem with your camp is that you're so averse to learning anything but the technologies you already know that you're completely blind to their deficiencies. Javascript is a mediocre-at-best, terribly-flawed-by-design technology that is pushed forward only by its sheer ubiquity. The sooner we replace it with something better, the sooner the entire web benefits.


I'd love it if JS received a rewrite. But when you get down to it, JS isn't as horrible as you make it out to be. In fact it is a very elegant language burdened by its initial design constraints (i.e. being forced to look like Java, a tight time budget). So I don't understand all the hate.


Except, we don't need an elegant language (and Javascript is hardly elegant, by the way). We need a virtual machine, and that's essentially what Mozilla is trying to define as a DSL in Javascript with asm.js. Instead, why not create something better from scratch?


Look past surface (asm.js or JS syntax) to substance. Browsers (not just Firefox) have a VM, with multiple implementations and amazing, continuously improving performance.

Any from-scratch VM could be great, and I hope someone skilled (not a big company big-team effort) does one. Short of a new market-dominating browser, good luck getting such a new VM adopted and then standardized with two or more interoperable implementations cross-browser.

Shortest-path evolution can get stuck, as some here fear, but JS is nowhere near stuck.

Indeed JS is evolving before our eyes into a multi-language VM. asm.js and Source Maps are just a start, and we could have a bytecode syntax (or AST encoding standard, alternatively) down the road.

But any clean slate work along those surface-bytecode-language lines, not to say a whole new VM, is high-risk and low-adoption (at first, and probably for a while) by definition, compared to small evolutionary jumps for JS syntax and optimized parsers and JIT'ing (and with asm.js, AOT'ing) runtimes already in the field and auto-updated in modern browsers.

/be


I'm fine with the democratization of the VM as long as the VMs are standards-based (and compliant). If you look at comparisons of web framework performance benchmarks, frameworks tend to group by VM even when compared cross-VM, or when comparing frameworks built on top of other frameworks that share a base VM (e.g. vert.x is about as capable as servlets in the JVM world; ExpressJS is about as capable as plain Node.js in the Google V8 world).

It seems to me that the VM is very important. One of JavaScript's latest, greatest triumphs has been its standardization. I care much more about -- for example -- Backbone.js's impact on code organization/standardization within an application than I do about its capability as a framework.

I hope that ECMAScript 6 -- while it brings awesome new functionality to the language -- will also bring with it more backwards compatibility and standardization that these frameworks currently provide (in a somewhat fragmented, yet digestible way).

And I hope the same for the democratization of the VM.


With you all the way, although JS's standardization success story has been going on since 1996. I have the scars...

/be


JS is flawed, but I'm not sure NaCl really provides anything better.


> Right, so by 2040 we can't distribute an application without rebuilding it for 5 or 6 different architectures

5 or 6? 2040? All we've seen over the past 20 years is the gradual reduction of in-use architectures down to a few winning players. We're even seeing it in the SoC/MCU space now, where custom architectures are being replaced with ARM.

I expect to continue to see evolution and simplification, not revolution. If there is a revolution, well, we'll want to take advantage of the power of that fancy new hardware, I suppose.

> one of which is a bitcode format that doesn't even run at native speeds, despite the name.

PNaCL generates native code from the bitcode.

[response to your edit]

> You're basically demonstrating the entire problem with the Native Client camp - you're so myopically focused on the immediate needs and the immediate environment (arm! x86! games! performance this year!) that you have absolutely no appreciation for the trail of destruction being left for future generations.

I would say that you're demonstrating why the web camp continues to fail so horribly to move past simple applications: you're so myopically focused on the ideology of tomorrow over providing a great user experience today.

In fact, I dare say you're leaving a far worse trail of destruction, because web applications are the ultimate expression of always-on DRM. When the servers for your "apps" disappear in a few years' time, the applications will disappear right along with them -- forever.

How is the Internet Archive going to archive your webapp?

> You have Pepper which is a huge bastardized, specialized clone of all WebGL/HTML 5 audio/media APIs

That's an overstatement of Pepper. It's quite simple -- far more so than the massive browser stack that WebGL+HTML5 brings with it.

In fact, Pepper is simple and constrained enough that someone other than an existing browser vendor or major corporation could actually implement and support a standalone implementation of it. By contrast, producing a compatible browser is such an enormous undertaking that even companies like Opera are giving up and tracking WebKit.


>> You're so myopically focused on the ideology of tomorrow over providing a great user experience today.

> Through ActiveX Technologies, today's static Web pages come alive with a new generation of active content, including animation, 3-D virtual reality, video and other multimedia content. ActiveX Technologies embrace Internet standards and will be delivered on multiple platforms, giving users a rich, open framework for innovation while taking full advantage of their investments in applications, tools and source code. "ActiveX brings together the best of the Internet and the best of the PC," said Paul Maritz, group vice president of the platforms group at Microsoft. "Users want the richest, most compelling content delivered through the ubiquitous Internet."

— Microsoft press release, March 1996.

We've seen this all before, and it was as much of a land grab using similar "user experience" excuses then as it is now. And despite it all, everything from this press release is possible with open technologies today already. Now we come to address the performance issue, and platform proponents (let's drop the pretence – platform shills) continue to fiercely reject it, because providing a good user experience was never the original agenda.

I'm deeply reminded of the leaked internal Dart e-mail ( https://gist.github.com/paulmillr/1208618 ) where as much ink was expended on market share as on technical issues.


This is a false comparison.

ActiveX was technologically flawed: it was specific to Microsoft Windows, and it was insecure.

NaCL is secure and portable, without sacrificing the user experience today.

Indirectly referring to me as a platform shill is unwarranted. The reason that I argue so strenuously for maximizing performance is simply that performance improves user experience.

> I'm deeply reminded of the leaked internal Dart e-mail ( https://gist.github.com/paulmillr/1208618 ) where as much ink was expended on market share as on technical issues.

I've read the whole memo, and I don't understand what your complaint is at all. Google seems to be focused on actually improving the web as an application development target, and making it competitive with modern platforms.


By your own words Native Client is technically flawed: it's specific to x86 and ARM (and whatever other vaporware platforms they promise "next week" – PNaCl has been due for years already), and removing the VM, it's also insecure. So basically Native Client is ActiveX inside a VM.


How exactly is that by my own words? You also seem to be confusing quite a few technical topics.

First, you're equating ARM and x86 with Microsoft Windows. Those are all "platforms", but they exist at very different levels of the technology stack, and the impact of relying on them is quite different.

Samsung and Apple can both ship compatible ARM devices. Only Microsoft could ship an ActiveX runtime.

Second, you're throwing out "security" as a red herring without actually explaining what is insecure. Exactly what are you talking about?

Lastly, if PNaCL is not viable, then NaCL is not viable. Thus, Google is holding off on NaCL until PNaCL is viable. In the meantime, I'm targeting mobile and desktop applications -- today -- which is entirely consistent with my earlier stated position.


"ActiveX is definitely going cross-platform," Willis says. "To be a viable technology for the Internet, we need to be able to support more platforms than Microsoft Windows. You cannot predict what kind of platforms will be on the other end."

http://books.google.com/books?id=97W638tj_NsC&pg=PA38...


ActiveX was, at its very core, not portable.

Are you making an objective technical argument, or a solipsistic one?


Which part of ActiveX wasn't portable? COM is definitely portable. Shipping signed executables that implement COM interfaces is as "portable" as NaCL is, so no problems there.

If your complaint is that ActiveX controls used the Win32 API, then you're missing the point. A Mac or Linux ActiveX control would use OS-native APIs, but still use COM and be an ActiveX control.

Furthermore, if you had even looked up the Wikipedia page for ActiveX, you'd see that:

"On 17 October 1996, Microsoft announced availability of the beta release of the Microsoft® ActiveX Software Development Kit (SDK) for the Macintosh."

Yep, so non-portable that they released a Mac SDK back when Macs were PowerPC. Super non-portable. The non-portablest.


ActiveX was portable the way that the SYSV/SVR4 binaries were portable -- almost entirely in theory.

"It was introduced in 1996 and is commonly used in its Windows operating system. In principle it is not dependent on Microsoft Windows, but in practice, most ActiveX controls require either Microsoft Windows or a Windows emulator. Most also require the client to be running on Intel x86 hardware, because they contain compiled code."

What rational comparison do you have between NaCL's constrained syscall interface + Pepper, and ActiveX?

> Yep, so non-portable that they released a Mac SDK back when Macs were PowerPC. Super non-portable. The non-portablest.

Which produced binaries that couldn't run on any PPC OS other than Mac OS, and provided a runtime that couldn't run the ActiveX controls being deployed for Windows clients, even if it were somehow magically possible to run x86 code.

It was explicitly advertised as being non-portable, as a feature: "Developers can create ActiveX Controls for the Mac that integrate with other components and leverage the full functionality of the Mac operating system, taking advantage of features such as Apple QuickTime."

http://www.microsoft.com/en-us/news/press/1996/oct96/macpr.a...

So, yes, "non-portablest".

What exactly is the equivalence to (P)NaCL?


The fact that NaCL only works in Chromium, and will only ever work in Chromium.


Here's an implementation of NaCl that has nothing to do with Chromium: http://zerovm.org/


Not due to technical constraints, unlike ActiveX. It's a self-fulfilling prophecy -- Mozilla's refusal to adopt NaCL will, indeed, stall the adoption of NaCL.

Things might get more interesting if NaCL provides a higher performance deployment target for Chrome users than asm.js does. Then you can just cross-compile for both.


Last I checked, Pepper was full of underdefined stuff that depended on details of Chrome internals. So no, it's not "simple and constrained" enough that you can implement it without reverse-engineering said internals....


Anything in particular? I eyeballed the API docs recently and didn't see anything so frightening, but you may have a more in-depth evaluation:

    https://developers.google.com/native-client/peppercpp/inherits

Google has also stated that they're open to re-evaluating and reworking the API.


A simple example: What happens if negative values are passed for the top_left argument of Graphics2D::PaintImageData? The API documentation doesn't say.

I just picked a class and function at random.... That sort of thing is all over the API.


I think there will be [a MIPS resurgence] - maybe not at ARM's popularity level, or even x86's, but MIPS has been acquired by Imagination and I'm sure they have pretty big plans for it.


> why shouldn't you compare NaCL to asm.js?

The portability limitations have made Google wait for PNaCl before it flips the switch to authorize NaCl apps on the Web by default.

Note that I believe this choice is wise. I quote them:

> This will… avoid the spread of instruction set architecture dependent apps on the web.


I agree that it's wise to default to a portable implementation. I also think it's wise to have the option of serving up binaries that are optimized for the devices your users actually have, today.


Curious: is the performance of PNaCl much worse than regular NaCl? I actually like NaCl, but asm.js sounds like the more suitable option in the long term, since the other browsers are probably much less likely to adopt NaCl anyway, unless they develop sandboxing technology similar to Google's, which I think is pretty unlikely right now. So because of that I see the others adopting asm.js much faster than NaCl.


Numbers are IEEE 754 doubles, and no currently accepted spec provides native 64-bit integers with arithmetic.

Even modest 32-bit integer multiplication is tricky (you only have 52-53 bits of mantissa). Mozilla proposed the `Math.imul` extension.

goog.math.Long provides a mechanism using a pair of 32-bit halves, and Emscripten does something similar (but it can also use the plain-double hack if you are sure there won't be overflow issues)
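
A quick demonstration of why plain multiplication isn't enough (a sketch; results assume a correct Math.imul implementation):

    // (2^31 - 1)^2 needs 62 bits, so the double-precision product rounds
    // and the low 32 bits come out wrong.
    var a = 0x7fffffff, b = 0x7fffffff;
    (a * b) | 0;      // 0 -- low bits lost to rounding
    Math.imul(a, b);  // 1 -- the correct wrapped 32-bit result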


Just like pcwalton said, you can trick JS engines into doing real 32-bit arithmetic. ((x|0) + (y|0))|0 tells the engine that it's 32 bits all the way through, and it optimizes appropriately and never touches floats.

64 bits is a problem, yes, but again, like pcwalton said, it is being worked on for the post-ES6 era. Obviously, if the Unreal Engine is working, it's not necessarily critical or a show-stopper, but it is needed for certain things.

Edit: Math.imul is needed because the product can become large enough that the engine converts it to a double and back, and you lose precision (I think that's how it works). It's the only arithmetic operation that needs fixing.


"(x|0 + y|0)|0 tells the engine that it's 32-bits all the way through"

You conveniently skipped the complaint: "Even the modest 32-bit integer multiplication is tricky (you only have 52-53 bits of mantissa)"


@niggler I did skip it at first, but I added a quick edit after I posted. It's not a hard thing to fix.


"It's not a hard thing to fix."

But then you get back into the game of asking browser vendors to make changes, which was one of the original selling points of asm.js (that browser vendors didn't have to make changes).


JS is changing, ES6 is coming. Browser vendors want changes and are on board. They're also under competitive pressure.

But in what scenario is the much greater work to add Pepper and PNaCl support going to prevail? You have to assume JS dies or freezes, and there's exogenous pressure for something new. Instead, we see demand for JS to evolve (Mozilla serves this, we aren't just ginning it up, believe me!).

So browser vendors would have to do two big jobs: tend the web standards and evolve them, and put in a new native code runtime alongside. That is simply not going to happen outside of Google, which can afford endless impossible-to-standardize boondoggles. It is both uneconomic, and technically inferior to a more minimal evolutionary path-finding solution.

/be


ok


Sure, it's unfortunate, but it's one obscure edge case that they can implement instantly. They are being pushed to implement asm.js anyway, so that just comes with it. If they don't, things like BananaBread still run, so it's not strictly required.


> Numbers are IEEE754 floats and no currently accepted spec provides native 64 bit integers with arithmetic.

No, in asm.js, numbers are not always floats. There is a type system that can classify numbers as integers. Native 64-bit integers are being worked on as an extension.
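
For readers who haven't seen it, a minimal sketch of what those type annotations look like (module shape simplified, not quoted from the spec):

    function AsmModule() {
      "use asm";
      function intAdd(x, y) {
        x = x | 0;          // parameter declared as 32-bit int
        y = y | 0;
        return (x + y) | 0; // result coerced back to int
      }
      function dblHalf(x) {
        x = +x;             // parameter declared as double
        return +(x / 2.0);
      }
      return { intAdd: intAdd, dblHalf: dblHalf };
    }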


32-bit integers can be represented exactly in a Number, but 64-bit integers cannot. And the typed array system provides Int32Array and Uint32Array but no 64-bit variants.

Downvoters: which part of the statement is factually incorrect? Read the ES3 and ES5 specs
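
The standard workaround, as a sketch (illustrative names, not the actual goog.math.Long API): split each 64-bit value into two 32-bit halves and propagate carries by hand.

    // 64-bit unsigned add from 32-bit halves. The low-half sum fits in
    // 2^33 < 2^53, so doubles compute it exactly.
    function add64(aLo, aHi, bLo, bHi) {
      var lo = (aLo >>> 0) + (bLo >>> 0);
      var carry = lo > 0xffffffff ? 1 : 0;
      return { lo: lo >>> 0, hi: (aHi + bHi + carry) >>> 0 };
    }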


Implementing 64-bit integers on top of JavaScript is going to happen. It's far less effort to implement 64-bit integers than to adopt Native Client, and JavaScript needs 64-bit ints anyway—the Node community has been asking for it for a long time, for example.


NaCl gets a >2x slowdown in some cases, as the restrictions enforced by proof-carrying code get in the way (dynamic code generation, instruction set filtering).

There are faster, more efficient ways of isolating a process - and better provided at the OS level (see Xax for instance).

PCC makes sense where you need to inline externally delivered code in a process - inline driver filters and so on.


Right, NaCl's early work had beautiful in-process SFI. But that went away soon, and out-of-process isolation was layered on top. I never got a straight story as to why.

/be


The reason people "write native iOS games" is that today, it is the shortest and easiest path between them and $$$.

But that is changing; it may still be true tomorrow, but maybe not next year.


I feel like an old fart for not buying into the push for the browser to be an OS or game system.

In my mind, no matter how optimized the browser becomes, it will always be slower than natively-compiled code. And in the 3d gaming world, where the faster code always has the greatest potential for the best game engines, the browser will always lose.

My instinct tells me that there will be a huge push for webgl games, some initial adoption and frameworks, etc. The result will be that web developers will realize how fucking complex graphics engines are (I'm writing one now...nothing in webdev compares to it) and will give up. And I can't see graphics developers in the game industry incentivized enough to fully make the jump to the browser (why would they want to write game code that is distributed to everyone in raw source form?)


It does not need to be the absolute fastest possible for it to be fun. If the framerate is acceptable and joining a game with my friends is as simple as visiting a URL, this can be a huge competitive advantage over traditional games.


I understand what you are saying. The tipping point for me in terms of "Oh, that will never happen because of X" was Minecraft. An incredible game written in Java. Sure, it's not pushing extreme graphics, but that doesn't matter. It's the game that matters.

So I look at games on the web the same way. Heck, I've been playing games on the web for years. No, they aren't in the same realm as AAA games coming from Blizzard, Valve, or EA. But I had fun, native wouldn't have gained them anything, and I spent more time playing them (and more money) than some of those AAA games.


Also, remember the original Serious Sam. Its graphics were a generation behind, but it was successful because it was fun. I can see a Quake Live-type game without a plugin being relatively successful.


Quake Live without a plugin would be awesome.

Quake Live in its current form, and the same goes for Battlefield Heroes, is just silly. OMG, a game that runs in your browser!!1 You just have to install this ActiveX plugin first... That totally misses the point of browser games. I might as well install a full game.

In the case of Quake Live, installing the plugin was actually MORE hassle than installing a normal program. I counted the steps: with a normal game I usually just select the install dir and press Next two times, done. For Quake Live I had to jump through tons of hoops; the installer had several steps, and in the end I also had to restart my browser twice. Sooo convenient that it runs in the browser...


> why would they want to write game code that is distributed to everyone in raw source form?

It's not. Asm.js is pretty much just assembly.


> It's not. Asm.js is pretty much just assembly.

No, it's not. Have you even read http://asmjs.org/spec/latest/ ?

It's a rather high-level bytecode, expressed by constraining JavaScript to a specific subset of the language, interpreted according to rules that are considerably more specific than those specified in the JavaScript spec.


It essentially exposes a load-store architecture, exactly as the assembly language of RISC processors does. There are some minor differences, such as the control structures and an infinite register file, but those aren't really relevant to reverse engineering. (IDA has been able to figure out control structures and local variables for a long time now.)


Would you call Java's or Dalvik's bytecode assembly?


Yes. But note that, even so, there is a large distinction between Java or Dalvik bytecode and asm.js: the memory model of Java and Dalvik bytecode is defined in terms of high-level class types, object instances, and properties, while the memory model of asm.js is a large array, 8 bits wide by X MB long. Naturally, the latter is much lower-level and more difficult to decompile…
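
Concretely, a sketch of that flat memory model (sizes illustrative): one ArrayBuffer plays the role of RAM, with typed views aliasing the same bytes.

    var buffer = new ArrayBuffer(16 * 1024 * 1024); // the program's "RAM"
    var HEAP8  = new Int8Array(buffer);  // byte-granular view
    var HEAP32 = new Int32Array(buffer); // word view of the same bytes
    HEAP32[8 >> 2] = 42; // "store a 32-bit int at byte address 8"
    HEAP8[8];            // low byte of that store: 42 on little-endian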


In that case, I think we just have some differing definitions.


Given who he is, I imagine pcwalton is far more familiar with it than you are. Either way, you're wrong. Bytecode is not human readable which, as even your definition states, asm.js clearly is.


>In my mind, no matter how optimized the browser becomes, it will always be slower than natively-compiled code.

So what? Phones will always be slower than consoles and desktops. Consoles will always be slower than desktops. The performance level showcased in the Mozilla videos is good enough to create very deep and engaging games.

>The result will be that web developers will realize how fucking complex graphics engines are

I agree, which is why nobody will code in raw asm.js and raw webGL. You'll code in a framework (e.g. Unity) which will compile down to asm.js/webGL.

>why would they want to write game code that is distributed to everyone in raw source form?

Flash games could always be decompiled and that didn't stop anyone from using Flash to write games. There's a world of difference between decompiled code (or minified code as the case will be for JS) and properly organized source code. Client code also doesn't matter for a class of games that have a heavy server component.


> (why would they want to write game code that is distributed to everyone in raw source form?)

You would distribute the asm.js code, which is your object code.


I don't buy the performance argument (covered by other replies), but I too am getting sick of the "reinvent the OS inside the browser" mentality.

If the issue is one of security, the underlying OS should implement some kind of jail, which the browser can then call with whatever crazy native code the website serves. This would also end JavaScript's reign as the ONLY allowed language on the web (for purely arbitrary reasons).

But who am I kidding - this is never going to happen as long as Windows holds the majority OS share. And people are far too ingrained into the current ways. Long live JS as the new machine code, I guess.


I expect most serious developers will stick with engines like Unity, which let you run games in a browser with a plugin, but also compile and distribute them in standard ways on every popular desktop and mobile platform.

Unity Web Player is easy to install, and it's an increasingly common requirement for a lot of games. It's a barrier, but only a small one.


>I expect most serious developers will stick with engines like Unity

They should stick with engines and frameworks like Unity. Coding raw JS and WebGL is ugly. But there's no reason why Unity cannot compile down to something like asm.js and webGL. In fact, I believe Unity already lets you compile to a swf, which gives you great performance and doesn't require any extra plugins (besides Flash).

>Unity Web Player is easy to install, and it's an increasingly common requirement for a lot of games. It's a barrier, but only a small one.

It's an insurmountable barrier.


"...It's an insurmountable barrier..."

Why do you say so? Why is it a small barrier or an insurmountable barrier?

Genuinely interested in the answers there.

Just seems to me that there are a lot of Flash games, Unity games and Java games. I mean, if Minecraft or BSG Online is any indication... technology doesn't really stop your average gamer. But maybe I'm wrong???


I have developed a couple of Unity-based browser games, and I can tell from experience that you lose a substantial amount of users at the plugin installation step, especially for games that target a more casual market. Unity compiling to asm.js would be huge!


I'm sure we'll see exporting to WebGL from Unity once more browsers start to support it. It already has Flash exporting which makes the Unity plugin dead-on-arrival (well a few years after arrival anyway).


Unfortunately it is a MASSIVE BARRIER in many institutions; we developed software for the military, and anything we made had to run in the browser, no additional plugins.


(on scaling) Games usually scale based on GPU power, but not so much on the core game logic, AI, etc. Any CPU processing that actually affects gameplay outcome (e.g. collision detection) cannot really be disabled and scaled. Otherwise, you end up playing a different game based on your machine, not a game that merely looks uglier or plays choppier.

This demo looks great, but all I see are environments being rendered. I didn't see any firefights between lots of player models or vehicles, no Havok/PhysX/Bullet, and no enemy AI running. I did see some frame-rate hitching. It's not the first time someone's had environments (sans game) running; see for example Quake3 (http://media.tojicode.com/q3bsp/). Even the Rage engine got an environment renderer ported to the Web.

There's a danger in overselling stuff. When I helped port Quake2 to the Web (GwtQuake), I didn't have any illusions about being able to run something like Crysis or the Call of Duty series in the browser in JS, or about matching their resource requirements. I mean, I thought it would be possible, but a hideous waste of system resources.

There was a time when Sun was marketing Java and pushing heavily on the idea that it would one day meet or beat C/C++ performance, but it never quite got there after two decades (yes, in some circumstances it comes close; in others, it fares badly). I see the same parallels today, pushing the idea that a dynamically typed language with some extra 'hints' is going to match a language that gives full control over memory, SIMD, layout, threading, etc.

This is not to say that asm.js couldn't do most of the 'casual' games you see today (e.g. Ridiculous Fishing on iOS, Angry Birds, etc), and most of the Flash games, I'm totally onboard with using it for that, but I think it's a mistake to push this as a real solution for bringing triple-A FPSes to the web.

And what of mobile? Can this even run Infinity Blade on a recent smartphone? What would be the difference in battery life between an iOS version and an asm.js version?


> It's not the first time someone's had environments (sans game) running, see for example Quake3 (http://media.tojicode.com/q3bsp/).

Quake 3 was released in 1999. Unreal Engine 3 is a current and top-of-the-line engine for AAA games.

> There was a time when Sun was marketing Java and pushing heavily on the idea that it would one day meet or beat C/C++ performance, but it never quite got there after two decades (yes, in some circumstances it comes close; in others, it fares badly).

This is totally different. asm.js is just a backwards compatible encoding of a different, much lower-level language. In fact, that lower-level language is in direct correspondence with the intermediate representation of a production-quality C++ compiler (clang)!

Sun never tried something like asm.js for Java; rather they claimed that the full language would be as fast as C and C++.

> memory, SIMD, layout, threading

asm.js gives full control over memory and full control over layout, with the exception of stuff like mmap() and mprotect() and self-modifying code.

SIMD is something that RiverTrail/ParallelArray can do. It should be added to asm.js.

As for threading, I'm personally in favor of an extension to allow a typed array to be shared between workers, races and all, with some primitives to allow synchronization. Not everybody is OK with this, but I personally believe we aren't likely to get to native performance without it. (Again, this is just my personal opinion; many disagree on whether a racy typed array is necessary.)


You can polyfill SIMD, but you can't polyfill threads. Once you add shared memory between workers, you've effectively violated the premise that asm.js will run everywhere, even places that don't understand it.

In that case, you've got a portable intermediate representation of opcodes in a bloated legacy format that is now incompatible with legacy runtimes. Why use the JS format then?
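
The polyfill half of that claim is easy to see in a sketch (float32x4Add is a hypothetical name, not a real API): scalar code gives identical results with none of the speedup, so it degrades gracefully. Shared mutable memory has no such scalar fallback, which is the asymmetry being pointed at.

    // Scalar stand-in for a 4-lane SIMD add: same semantics, no speedup.
    function float32x4Add(a, b, out) {
      out[0] = a[0] + b[0];
      out[1] = a[1] + b[1];
      out[2] = a[2] + b[2];
      out[3] = a[3] + b[3];
      return out;
    }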


Apps could feature detect threading and fall back to single-threaded mode for compatibility with non-asm browsers if it's not supported.

Regarding the claim that asm.js is "bloated": http://mozakai.blogspot.com/2011/11/code-size-when-compiling...
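
A sketch of the feature detection mentioned above (startThreaded/startSingle are hypothetical entry points):

    // Pick an engine build at startup; fall back when workers are absent.
    if (typeof Worker !== "undefined") {
      // hardwareConcurrency may be absent in older browsers; guess 4.
      startThreaded(navigator.hardwareConcurrency || 4);
    } else {
      startSingle();
    }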


SIMD is coming to JS; it's as short a path as adding it to Dart (and Dart is not going cross-browser, so...).

Why JS format? Because another syntax that's both fast to lex+parse and not future-hostile (as JVML was) is a tall order and we'll get to it only when we really need it. Adding SIMD to JS does not require adding a whole new syntax!

/be


Have you seen BananaBread? It is a fully runnable 3d game that does all the AI, physics, etc. It's compiled with emscripten (I'm not sure if the online version uses asm.js yet, but it just needs to be recompiled). https://developer.mozilla.org/en-US/demos/detail/bananabread


Yes, I've seen it, but "physics" in this case means ragdoll physics; it's not like Havok/PhysX or anything that modern game engines use. These things are impressive for Web demos, but the question you need to be asking is: what would someone who visited GDC or E3 to see console or desktop games say about it? Would they pay $39 for it, or would they think it looks like a toy compared to what the other vendors are showing?

End users don't care what something is implemented in, how open it is, or all the other stuff that hackers care about. All they see is a game, and they will compare it to other games that are implemented with different technologies. You won't get bonus points because you managed to implement it in a difficult environment.


This is all early stuff, yes. We aren't quite at the point yet where people can sell AAA games for $40 on the web. But I wouldn't be surprised if it happens, and what we are showing is that it's certainly possible with some more work (not a huge amount, either).

End users could benefit greatly; just click on a URL and immediately start playing a game! The game needs to be on par with native, yes, but if we can pull this off there are huge benefits for distributing over the web.

Note that the Bullet physics engine was compiled with emscripten into a library called "ammo.js" and is somewhat of a standard for 3d games on the web.


Since WebGL uses the same graphics API as mobile devices right now, you're only going to see mobile apps being ported at first. It just gives developers one more platform to target.


This is huge! I do PR in the gaming industry and have been waiting a long time to see a platform that could bring serious game development to the browser. I was expecting it to be Unity, but I'm convinced this is a bigger milestone. Epic is an awesome company with awesome tech and partnering directly with Mozilla is a stroke of genius.

Can't wait to see the games that will come out of this marriage.


Native-like games in browsers with little or no use of plugins may be game-changing for the game industry.


Given the size of game assets these days, that could be a big UX problem

(in places which don't have Google Fiber)


Presumably, game developers would serve up the game to the browser via localStorage. This would make the experience very similar to Steam today, where you download the game, then play it.


You could even design the game to be downloaded in chunks. Download the first level, and while playing, download the second in the background. This reduces downloads and load times.


LocalStorage is 5-10 MB per domain (currently there is a bug in most browsers that allows you to bypass this by using subdomains, but I wouldn't count on that being the case for much longer).
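
For the curious, that limit can be probed empirically; a crude sketch (quota accounting varies by browser, so treat the result as approximate):

    // Write ~100 KB chunks until the browser throws a quota error.
    function probeQuotaKB() {
      var chunk = new Array(100 * 1024 + 1).join("x");
      var i = 0;
      try {
        for (;; i++) localStorage.setItem("probe" + i, chunk);
      } catch (e) { /* quota exceeded */ }
      for (var j = 0; j < i; j++) localStorage.removeItem("probe" + j);
      return i * 100;
    }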


Yep, like you can put a gig of data in local storage.


The amount of game code that I can debug, sweet!


Disclaimer: I worked on the original Flash version of this same demo two years ago.

Big congratulations to Mozilla for pulling this off. Feels good to have such a great demo, doesn't it? :) But a lot of the credit also needs to go to Epic for having such an amazingly scalable and portable engine and providing the AAA assets!

One issue is that the game experience does not translate as easily as the wow factor. You will need quite a bit more to get to the first shipping and profitable game built with this tech. Don't rest now :) With Flash + 3D we actually ended up seeing that the most successful games were not straight native ports but were built from scratch with the new environment in mind. Things like caching, startup time, and wide reach are crucial to an in-browser game but are not needed for a demo like this.

Still really amazing! Keep up the good work and the fight for the open web!


I was wondering what the end goal of asm.js was - and the ability to run games in your browser at near-desktop speeds is _very_ impressive.

Now, how long until Chrome supports it...


It occurs to me that this also makes the web much more plausible as an application-delivery system.

And who cares what OS you're running - if you can compile your application to asm.js, which runs at only half the speed of C[1], then you can make it available to everyone.

[1] See: the image here: http://www.extremetech.com/computing/151403-firefox-sticks-i...


Assuming the browser vendor cares to support it...


... and at comparable speeds...

Sadly this is one area where plugins still rule and advance innovation. Much of that innovation has made it into the web via HTML5, but there is still a way to go before 3D is stable without a plugin. Three.js, fallbacks to canvas, etc. are awesome and work today (modulo hardware limitations on mobile and browser limitations like IE having no WebGL). But intense 3d games need hardware acceleration that is consistent. I am working on a bunch of canvas mobile apps right now and the speeds are problematic with many sprites; I would love more hardware access.

If Mozilla can truly get 3d mobile browser gaming going with this (for which there is currently no solution other than hardware-accelerated canvas) then that opens up interesting areas. It also opens up new security issues. In terms of business/product use that isn't highlighting new tech, it is still a bit away.


I just played the demo game in Chrome and it ran at 30-60 fps on "high resolution" on a 2-year-old MacBook Pro, so... maybe it doesn't matter that much in the short term.


As soon as they find a way to put ads in it...


yes, because chrome is so full of ads...


on the loading screens!


Chrome has supported NaCl for a long time, which is quite a bit saner than asm.js


You show up on every NaCl-related thread claiming it is awesome and asm.js is crap, but that's as far as it goes - you never explain why. You're basically wasting time because what you say is never actionable. Please defend your opinions - inform us!


(P)NaCl exposes pretty much the same environment and tool chain devs get writing native apps. Predictable performance, decent debugging, etc. One can even decide to use platform-specific features if strictly necessary.

The problem it has is that it is largely opaque to the JITs in browser engines (even PNaCl, since JS JITs have nothing to do with LLVM). One way to fix that would be to define a simple bytecode as a compilation target and have people compile to that, in addition to compiling to JS with Emscripten. The end result would, in my opinion, be much cleaner than the ugly hacks in asm.js.

Furthermore, by design, asm.js will always need guards in the generated code, thus some overhead that can never be fully removed (although JITs will help with this). In particular, one can write "x = +x" in JS, but then change the type of x, thus invalidating the invariant. A guard will be needed to guarantee correctness, since asm.js is a subset of JS, but the outer set still exists.

It seems to me that web technologies are getting more and more crazy. To me it appeared obvious that asm.js is not a nice approach, but I can see many people disagree.


You are mistaken, there are no type guards within asm.js-verified/compiled/linked code. Read more at http://asmjs.org/.

/be


You're too early for April Fools, but you almost got me.


Unreal? Bah, real gamers play Quake! (As in the 199X flame wars.)

https://github.com/SiPlus/WebQuake was released some weeks ago. It is in a whole different league in terms of technological features, but it is a free, open-source engine. You can play it at http://webquake.quaddicted.com/

There is also a port of Sauerbraten, another free engine, https://developer.mozilla.org/en-US/demos/detail/bananabread


The thing that I think people are getting confused about is that even though asm.js is faster, you actually don't need asm.js to get native code running in the browser. The new engines in Chrome and Firefox already compile to native code.

Asm.js is just a way to ensure that your code always gets optimized as much as possible.

I really don't think that asm.js is necessary to bring 3d gaming to the web. WebGL performs very well as long as you take advantage of the graphics hardware. I'm not a fan of asm.js actually, since if I wanted to write low-level code I would just stick with C++. I'm on the web to take advantage of things like JavaScript and CoffeeScript.

I think the way to get to the sweet spot for 3d gaming and applications in the browser is to 1) make sure that WebGL is well supported on as many platforms as possible and 2) take advantage of the built-in networking, instant-on factor, openness, general webiness, and scripting capabilities of the web platform. By 'instant-on factor' I mean that games can load up very quickly after hitting a URL. By 'general webiness' I mean, for example, you can start a game at the end of the work day, later the evening continue from your house -- you aren't tied to a particular machine. Also the web has all of the HTML5 features which could really come in handy for innovative browser-based operating systems or 3d applications.

Also, if you take advantage of things like procedurally generated textures and geometry, people won't even have to wait for stuff like loading textures.

Asm.js is cool but it's not really necessary and not my thing. We just need good WebGL support.

Another feature that 3d browser-based games and applications could take advantage of is WebRTC.

The point being that if you take advantage of easy-to-use web APIs and libraries like WebWorkers, WebRTC, Three.js, CopperLicht, web-based languages like CoffeeScript, etc. then the web platform can offer game developers productivity advantages.
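
On the "make sure WebGL is well supported" point, the client side of that is at least easy to detect; a minimal sketch (showFallback is a hypothetical handler):

    var canvas = document.createElement("canvas");
    var gl = canvas.getContext("webgl") ||
             canvas.getContext("experimental-webgl"); // older prefix
    if (!gl) {
      showFallback(); // e.g. a canvas-2D or static version of the app
    }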


>I really don't think that asm.js is necessary to bring 3d gaming to the web.

It helps ... a lot. WebGL gives you GPU access which will largely drive your graphics engine, but there's still tons of places where CPU performance matters (e.g. physics and AI). Asm.js and WebGL are complementary.

>I'm on the web to take advantage of things like JavaScript and CoffeeScript.

And asm.js prevents this how? If you're coding in CoffeeScript, you already have a necessary compile step to do. Are you telling me you wouldn't want the CoffeeScript compiler to give you extra performance by including asm.js heuristics?


Yeah there are tons of places that CPU performance matters, but my point is that Mozilla and WebKit's engines are doing a good job of native compilation already.

If you could make a CoffeeScript compiler that would output asm.js, then great. But I'm not sure that's possible. I think you either have to go from C/C++ code through Emscripten, or write directly against the asm.js subset in JavaScript.



I worked at a company that developed 3D interactive language and culture software with speech recognition, virtual humans, etc - one of our biggest issues was that customers aren't interested in installing desktop software anymore. We were able to make pretty decent web products using Unity, although our desktop products were still far superior. Good to see 3D on the web edging ever closer to becoming standard.


Are you sure this is even correct? The github project for that 3d demo (https://github.com/kripken/BananaBread/) says the 3d engine is "Cube 2/Sauerbraten 3D", not Unreal.

It seems like the Unreal Engine Facebook page just posted a link to that video and people are thinking it is Unreal Engine.


BananaBread is a different 3d engine; it's what most of the development work on emscripten has been focused on. The Unreal Engine demo is a separate demo, done by compiling the Unreal Engine codebase to asm.js using emscripten.


Wow, I must be really dumb to have not noticed the Unreal Engine demo video. I only saw the BananaBread one. Thanks.


We will need a really good caching/disk space management system before this could really be used for much. Is anyone working on that?


Conveniently, this email [1] has been sent today by Jonas Sicking from Mozilla. It basically pinpoints all the problems developers using appcache experience, and proposes a starting point to go towards a good caching system for the web. Rest assured people are working on that actively.

[1]: http://lists.w3.org/Archives/Public/public-webapps/2013JanMa...


This was the problem with early web-based MMOs. Unity has a caching license now that adds a bigger cache (http://docs.unity3d.com/Documentation/ScriptReference/Cachin...); it was developed for Cartoon Network, but they haven't made it part of the main release due to security concerns, and I guess it's also a revenue generator for them. I'd love to see this launch now that they have some competition.

Flash had something similar that was available to everyone, with up to unlimited space by user setting (Flash cookies are stored there, so this is part of the security problem).


HTML5 offline appcache lets you download the entire application and cache it locally, including assets. It can do incremental upgrades and show progress info while downloading. For large bundles user approval is necessary, but that shouldn't be an issue. It's currently in all shipping browsers, including IE10.
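
The JS side of that is a small event surface; a sketch assuming a manifest is already declared via <html manifest="game.appcache"> (updateBar is a hypothetical progress callback):

    var ac = window.applicationCache;
    ac.addEventListener("progress", function (e) {
      if (e.lengthComputable) updateBar(e.loaded / e.total);
    });
    ac.addEventListener("updateready", function () {
      ac.swapCache(); // switch to the freshly downloaded version
    });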


I'm curious what the performance is like in Chrome or Safari (which do not have asm.js support.)

If performance is acceptable in Chrome/Safari then that helps sell the idea that the web is ready for 3D games now.

If the performance isn't acceptable then that helps to make the case for asm.js.


Mozilla has already shown that asm.js code runs 3-6x slower in Chrome, which doesn't support it right now.


Yeah they showed a benchmark was 3-6x slower. That's all well and good. I wanna see what the real world implications are. Here's this cool Unreal demo running with asm.js support, it looks great.

Now let me run it in Chrome or Safari and see if it's even remotely passable or not.


JavaScript is usually compiled to native code by WebKit and can have very good performance. 3d WebGL gaming performance both on Chrome and Firefox and Safari is usually a factor of WebGL support in the browser, drivers, etc., rather than JavaScript performance.


You haven't read http://asmjs.org/ yet, I gather.

/be


What happens if NaCL performance is acceptable? By the reasoning you've applied here, doesn't that make the case for NaCL?


Come on, why ask a question like that? We're not discussing NaCl here.

If you want to discuss NaCl that's a whole different issue. It's a totally different standard and quite a bit more complex than asm.js.


Let's not forget mobile devices as well.


We pitched this to Epic in 2007 when our platform helped bring id Software's Quake online... on any browser.

Included an "app garden" for the mod community around Unreal engine, clans, ladders, and real-time tournaments with automatic server propagation.

We shut down due to high infrastructure costs and the notorious "2 week" => 2 year game delays.


What was the actual reason Mozilla could not implement NaCL?

Why can't multiple technologies compete among users on their merits?


If I recall correctly, I read somewhere that it would take a lot of work to get Firefox to where it would need to be to support NaCL. The whole point of making asm.js is that it is easy to implement AND produces tremendous results. Oh, I just found what I was referring to. This is from the asm.js FAQ: http://asmjs.org/faq.html

> Q. Why not NaCl or PNaCl instead? Are you just being stubborn about JavaScript?

> A. The principal benefit of asm.js over whole new technologies like NaCl and PNaCl is that it works today: existing JavaScript engines already optimize this style of code quite well. This means that developers can ship asm.js today and it'll simply get faster over time. Another important benefit is that it's far simpler to implement, requiring very little additional machinery on top of existing JavaScript engines and no API compatibility layer.

Also for reference, this is the documentation on implementing Native Client: http://www.chromium.org/nativeclient


There are several reasons; read this post and Brendan's corresponding comments about it: https://news.ycombinator.com/item?id=4630057

Practically speaking, it's quite a large engineering effort, duplicates maintenance work, and introduces a whole host of new problems (backwards compatibility, etc) itself. Not to mention the politics involved in getting all the browsers on-board and agreeing to a whole new completely different standard.


Mozilla doesn't want to implement NaCL because most of the people involved in the project feel that the hardware-architecture-dependence this would introduce to the web is a really bad idea.

Mozilla could not implement PNaCL (which would not have that drawback) because it doesn't exist yet.


> Mozilla could not implement PNaCL (which would not have that drawback) because it doesn't exist yet.

PNaCl doesn't exist yet? That's funny, because it's in the current Native Client SDK and has been in Chrome stable (disabled by default) since, well, I'm not sure when, but at least since the last major version, which is the first time I noticed the setting.


Except it still requires you to serve up architecture-specific things for now, as far as I can tell.


> Except it still requires you to serve up architecture-specific things for now, as far as I can tell.

PNaCl doesn't require that. The fact that PNaCl is not enabled by default on the client makes architecture-specific binaries a practical requirement for now, but "is not enabled by default in Chrome" isn't the same thing as "does not exist". In particular, PNaCl being disabled by default in Chrome, unlike the claimed fact that it doesn't exist, is no barrier to Mozilla incorporating it.


Uh... if it's not "existing" enough to enable in Chrome, what makes you think it's "existing" enough to enable in Mozilla?

But also, see the Pepper discussions through these various threads. "Existing" for PNaCl also includes being specified enough to actually be implementable, which runs straight into Pepper.


> if it's not "existing" enough to enable in Chrome?

It exists, is included in Chrome, and users can enable it. It's not enabled by default, but it definitely exists and is deployed in the stable version of Chrome (and not just the most recent stable version).

> what makes you think it's "existing" enough to enable in Mozilla?

There's a difference between "Mozilla can't use PNaCl because it doesn't exist" and "Mozilla doesn't want to use PNaCl because it hasn't reached the maturity point where Google is ready to enable it by default in Chrome".

The first relies on a false premise. The second doesn't (though it's still a false statement; Mozilla doesn't want to use PNaCl because they are committed to JavaScript-as-the-engine-of-behavior for the browser; their primary objection to every other alternative is that it isn't JavaScript. This is a perfectly legitimate preference, and there is no need to confuse the issue with false explanations.)

> "Existing" for PNaCl also includes being specified enough to actually be implementable, which runs straight into Pepper.

No, that's not what "existing" means.

That's not to say that issues with how well the Pepper API is specified aren't potentially the basis for legitimate opposition to using Pepper+PNaCl in browsers; it's just nonsense to say that PNaCl doesn't exist (when it manifestly does and is widely distributed). If you want to argue that Pepper is underspecified, then argue that Pepper is underspecified, not that PNaCl doesn't exist. The first may or may not be a legitimate argument, but the second is just nonsense.


Here's Mozilla's CTO on why he's not supporting NaCl: https://news.ycombinator.com/item?id=4632410


I wish NaCl were available across browsers - at the very least it offers completely different tradeoffs that can be useful as escape hatches for certain kinds of applications.

Also, since none of Google's bread-and-butter products depend on it, I am a bit wary that it might be mothballed at some point. If anyone in this thread has more clarity about the NaCl roadmap, I would be very interested in hearing about it.


NaCl already feels like a dead man walking. It's a technology completely controlled by Google, and none of the other big players (Apple, Microsoft, Mozilla) want anything to do with it.


That's a shame - I've been on the fence about writing a data analysis environment using NaCl. I have a couple of small demos running but want to learn more about the roadmap before investing more time into it. This is the main reason I'm going to Google I/O this year: to get some feel for what the roadmap / traction looks like for NaCl going forward.


It's not going to be supported by anybody except Chrome, so whatever app you build is going to cut out three-quarters of the market (plus all the Chrome users who don't download Chrome apps). Most likely Google will keep it around for a while longer, since it underpins not only the Chrome browser but also Chrome OS.

You're essentially in the ActiveX / Java Web Start world. It's fine if it suits your needs, but it's not the web.


Why would they implement NaCl? If someone wants to write a plugin for it, they're welcome to - but Mozilla is solidly behind supporting JavaScript as a standard and doesn't see any point in adding other technologies to the mix.


Because that would at least get software like Photoshop / 3D Studio Max, etc., into the browser as a delivery platform. Then, pragmatically, parts of those apps can be transitioned to things like WebGL / canvas as those technologies mature.


Emscripten will probably do more for that than NaCl ever will.


The future is definitely going to be interesting - it would be nice if the browser makers started shifting their own native codebases through Emscripten / asm.js / NaCl. Have a minimal bootstrap environment and then implement your entire stack in terms of the primitives that you expect the rest of the world to use.


That is farther in the future. Shorter term, we could NaCl (not PNaCl) parts of our native codebases, and Chrome VIPs have spoken of this to the press. But it has not happened yet, and some performance-critical pieces will probably remain unsafe.

As noted elsewhere here, Mozilla with Servo is investigating zerovm-ish NaCl uses, _sans_ Pepper. Good stuff IMHO.

/be


Do a search for BrendanEich on hnsearch.com; he's discussed it many times.

I'm curious why you're asking about Mozilla and not Apple or Microsoft, though; they aren't implementing NaCl either.


One reason could be that there are higher expectations of Mozilla when it comes to making decisions on technical merit.


No one has made the case that PNaCl/Pepper has more technical merit than Emscripten (or Mandreel, e.g.) / asm.js / Web APIs. Assuming so is a mistake.

/be


It's of course risky to discuss something that hasn't shipped, though extrapolating PNaCl's behavior from its architecture isn't that hard. (It's just NaCl except that LLVM is compiled to NaCl on load. And LLVM itself runs within the NaCl sandbox too.)

Off the top of my head, if PNaCl existed today, its advantages would be: access to SIMD units, true threads, 64-bit integer operations, 32-bit floats, debuggability with nacl-gdb, a growable heap, support for exceptions/setjmp, and no ridiculousness like: https://mail.mozilla.org/pipermail/es-discuss/2013-March/029...
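(For context, and assuming the linked thread is the int32-multiply discussion: JS numbers are doubles, so (a * b) | 0 can silently lose low bits once the true product exceeds 2^53, and pre-Math.imul compiled code had to split operands into 16-bit halves. A sketch of the standard trick:)

    // Illustrative polyfill-style int32 multiply; ES6's Math.imul
    // replaces all of this with a single call.
    function imul32(a, b) {
      var ah = (a >>> 16) & 0xffff, al = a & 0xffff;
      var bh = (b >>> 16) & 0xffff, bl = b & 0xffff;
      // al * bl fits exactly in a double; the cross terms only matter
      // mod 2^32, so shifting them into the high half is safe.
      return ((al * bl) + (((ah * bl + al * bh) << 16) >>> 0)) | 0;
    }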

As a browser implementer, you are fully justified in rejecting Pepper. :) Permanent APIs should go through some kind of standardization process. (Or they should HAVE, cough XHR)

I gave a presentation at GDC this morning* on Emscripten and asm.js and agree that asm.js has better survivability characteristics and is likely to succeed in the market. Nonetheless, the proposed design of PNaCl has several technical merits over asm.js as it exists today.

In the meantime, please add 64-bit integers, threads, and SIMD to asm.js :)

* It occupied the same time slot as the Mozilla presentation! arg!

Edit: forgot exceptions, setjmp, and a growable heap, none of which work in asm.js today. I know, I know, asm.js is early, but the point I'm making is that we're comparing two theoretical technologies at this point.


I'm personally hacking int64 and uint64 into SpiderMonkey, rebasing the patch till my fingers bleed. See https://bugzilla.mozilla.org/show_bug.cgi?id=749786.

SIMD is coming to standard JS, and quickly in prototype form in SpiderMonkey. The idea that we need Dart (a whole new language and VM) or PNaCL + Pepper to get SIMD in JS is silly. Just add SIMD to JS, less work!

Some of what you rightly want is thus pure JS extension, not "asm.js". Other things (sbrk/mmap) are asm.js. Both can make progress, and cross-browser. Pepper cannot. That demerit trumps any SIMD-in-NaCl merit.

/be


I believe 64-bit integers are in the works already. https://bugzilla.mozilla.org/show_bug.cgi?id=749786


Right, but until they're in the language spec and implemented across browsers, you can't use 64-bit integer value objects in asm.js code unless you can polyfill or generate asm.js code that only runs in the latest Mozilla or whatever.
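(To make "polyfill" concrete: lacking int64, a compiler can fall back to pairs of unsigned 32-bit halves, roughly what Emscripten's i64 legalization does. The scheme below is illustrative, not Emscripten's actual output:)

    // Illustrative 64-bit add over {lo, hi} uint32 halves.
    function addI64(aLo, aHi, bLo, bHi) {
      var lo = (aLo >>> 0) + (bLo >>> 0);   // exact: fits in a double
      var carry = lo > 0xFFFFFFFF ? 1 : 0;  // did the low word overflow?
      var hi = (aHi + bHi + carry) >>> 0;   // wrap the high word mod 2^32
      return { lo: lo >>> 0, hi: hi };
    }

It works, but every 64-bit op pays this tax, which is why native int64 matters.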


So? You can't use PNaCl at all yet, and you can use NaCl only in Chrome and only loading content from the CWS. Please use the same yardstick!

The cross-browser evolving standards story is the winner, not only long-term but shorter term. There is no serious argument to the contrary.

/be


I AM using the same yardstick (neither asm.js nor PNaCl is usable today if you want the things I listed, but I predict PNaCl will exist in some form sooner), and I'm asking you to at least talk about the merits of PNaCl. Instead of constantly beating the JavaScript drum, you could have instead said "We have tried to work with Google to standardize PNaCl without Pepper and they are not open to discussion."

I know you have objections to Pepper, and I don't disagree, but personally I would be happy with a PNaCl minus Pepper too. Make WebIDL accessible from PNaCl and we can ignore all of the API issues.

I have a great deal of respect for you, Brendan, and Mozilla, and this is purely a communication tone issue, but as a game developer, it frequently sounds like Mozilla is more interested in the purity of the web than actually solving developer use cases. In fact, I had a couple discussions with game developers just yesterday about the fact that it feels like Google cares more.

HOWEVER, asm.js, parts of ES6/7, WebGL, etc. go a long way towards meeting the needs of game developers, so I know you actually do care about solving use cases. But you could at least be more respectful of Google's direct approach.


Personally, as someone who works with LLVM IR on a daily basis, the idea of standardizing LLVM IR as it exists today worries me a lot for several reasons:

* Most importantly: LLVM IR is a compiler IR. http://lists.cs.uiuc.edu/pipermail/llvmdev/2011-October/0437...

* LLVM IR is really big compared to asm.js. http://mozakai.blogspot.com/2011/11/code-size-when-compiling...

* LLVM IR wasn't really designed to be portable. clang bakes lots of architecture-specific stuff into the IR that it generates. This can be hacked around the way PNaCl does, by trying to define a "generic" architecture, but that stretches the IR beyond what it was intended to do.

* LLVM IR is very unstable and continually changes from version to version.

If we don't care about backwards compatibility, I would much rather see a bytecode standard done from scratch than PNaCl's approach of trying to use LLVM IR. I know the reason they did it is to use LLVM's code generation backends, but I feel like that could be done without using a large underspecified compiler IR as the transport format.


Sorry, it was you, not me, who shifted the topic from "merit" to "doesn't work cross-browser". I called that out because I'm pretty sure int64 and uint64, along with SIMD intrinsics, are coming to cross-browser JS -- while NaCl and PNaCl and in particular Pepper are not going cross-browser.

If you want to stick to merit-only arguments, please don't knock int64 as Mozilla-only or we'll circle back to the relative odds of JS standards evolving vs. infeasibility of standardizing Pepper + *NaCl.

On tone, I hope the above makes clear why I responded to your "not in other browsers" comment.

On purity, Mozilla has been faulted for being impure forever (not GPL, not XML/XHTML, not forsaking the average mass-market user for elite developers, etc. etc.). Our asm.js work is fully pragmatic and our talk at GDC (sorry it overlapped yours) had exactly the right tone. We are here to help game devs, without any purity tax.

On NaCl (not PNaCl) minus Pepper, I think we agree. We're looking at zerovm and similar uses of SFI in the context of the Servo project, for safer native code. That is where NaCl really shines, IMHO: safer native and OS/CPU-targeted code in native apps. It would be a shame if NaCl doesn't cater to this better use-case and instead tilts at the PNaCl cross-browser windmill.

/be


How about evaluating both (asm.js and PNaCl)? It looks like we are stuck in the "monoculture" you said you dislike. And this IS the reason I have never loved web standards: I don't like monocultures either.

I feel you are biased toward preserving JavaScript because you are the creator of JavaScript. IMO, asm.js might beat Flash, but it faces a clear dead end trying to beat native. We are all in a local-optimum trap, I think.


Do we all buy two significantly different and expensive cars to "evaluate" both? Most people can't afford to do any such thing.

Nor do we all buy the same make and model of car. That's implementation monoculture.

Monoculture of implementation (WebKit, which I pointed out is fragmented in practice) is not the same as a single vocabulary for each kind of language at the primitive level of the web standards layer cake.

Yes, we are stuck with the web platform bottoming out at a certain set of languages. That's not monoculture, rather: termination without too much redundancy in something not so complex that multiple browsers can hope to implement interoperably. The platform has to stand on certain primitives.

Remember XHTML2 vs. HTML5 back in 2005? It was going to be one or the other, even if browsers such as Firefox parsed both (but XHTML2 slowly and without incremental layout).

Maybe a Microsoft or Google could have "done both" well (but IE didn't, except by treating text/xhtml as error-corrected HTML!).

Without a single dominant browser vendor with enough cash and willpower to pull off more than the minimum set of semantically sufficient/complementary primitive languages, we were likely stuck with HTML/CSS/JS. SVG and MathML as HTML5 polyglot languages eventually were rationalized into the stack but still see too little use to be optimized well (and didn't Chrome turn off MathML recently?).

So the primitives have to be minimized against competing browsers' ability to implement them all interoperably. Browsers weren't going to do equally well at both the 2005-era XML languages (XHTML/SVG/XSLT) and the competing incremental plan (HTML5).

Arbitrary native code has been falling out as a part of the primitive layer for a while. Plugins such as Flash and Silverlight are in decline. You could say we had a plugin polyculture, but it was a security nightmare and also bad for users just in terms of updates and cross-browser/plugin UX consistency.

Pepper is an attempt to mediate plugins' access to OS and browser APIs, but too tied to chromium and too large for Google to standardize or for other vendors to swallow.

Consider also how Dart support (2nd VM, in particular GC barriers required for cross-VM-heap cycle collection) bounced off WebKit:

https://lists.webkit.org/pipermail/webkit-dev/2011-December/...

I'm mostly being descriptive so far, but let me prescribe:

Based on the hard multi-vendor specification and software engineering problems of doing well with a roughly minimized set of primitives that we have today, I do not believe it's "better" for anyone, especially for web developers, to over-extend the primitive set with things like NaCl and Pepper.

That applies even if one vendor could, e.g., add SIMD primitives quickly. That just fragments the picture for developers, and misdirects some energy that could be spent on adding SIMD to JS.

Same goes for Dart. It was so important to do as a "replacement" (see leaked Dash memo) and not wait for the standards bodies, but that started in 2010 (or earlier? I knew only for certain in spring 2010). It's 2013 now and Dart is not going cross browser. Things it wanted, e.g. bignums, could have been put in ES6 by now and implemented in leading JS engines.

I do not agree we are in a local optimum trap, not with Unreal 3 running within 2x of native unsafe code speed.

/be


> Pepper is an attempt to mediate plugins' access to OS and browser APIs, but too tied to chromium and too large for Google to standardize or for other vendors to swallow.

You keep repeating this, but I don't see how you can be remotely serious about it.

Compared to JS, DOM, HTML, and CSS, Pepper is downright simple:

https://developers.google.com/native-client/peppercpp/inheri...

As an external developer, Pepper is something I could actually implement independently for a reasonable amount of money ($2M or less) in a reasonable amount of time (1 year or less), if I could rely on Google's existing (P)NaCl toolchain.

Would there be ambiguity? Yes. Google has already said they're willing to work with implementors to resolve any design or ambiguity issues.

On the other hand, I can't even imagine trying to write a full browser stack, including HTML(5), CSS, JS, et al, in one year for <= $2M? Not a chance.

Yet you call Pepper "too big" and "too complex". Meanwhile, the enormous weight of the mess that is the web stack creates a huge barrier to entry for anyone that might want to meaningfully contribute new ideas to the space.

We're already seeing interesting development occur around (the much more approachable) NaCl/Pepper:

http://zerovm.org/

> That just fragments the picture for developers, and misdirects some energy that could be spent on adding SIMD to JS.

This is rich coming from the CTO who's refused to collaborate with Google and is instead going off and "misdirecting" energy on alternative approaches.

> It's 2013 now and Dart is not going cross browser.

You know, you might have something to do with that? You're not exactly a powerless or disinterested third party in this dialog.


> As an external developer, Pepper is something I could actually implement independently for a reasonable amount of money ($2M or less) in a reasonable amount of time (1 year or less), if I could rely on Google's existing (P)NaCL toolchain.

But what's the point? You'd be stuck running some subset of "things that can run on the web", and the vast expanse of existing HTML content would be inaccessible. If you want to use NaCl/PNaCl for non-web applications, nobody is stopping you. Writing a "browser" that doesn't implement HTML+JS is just nonsense.

> Would there be ambiguity? Yes. Google has already said they're willing to work with implementors to resolve any design or ambiguity issues.

We have no basis to know whether "specify away all the ambiguity in the Pepper API, using the Chromium implementation as the standard" is harder than "implement the already-fairly-well-specified set of web technologies". We know web technologies can be implemented fairly interoperably; we have multiple interoperable implementations.

> You know, you might have something to do with that? You're not exactly a powerless or disinterested third party in this dialog.

And yet we're not the only game in town, either. None of the other browser manufacturers has expressed any interest, so it's not like we're alone on this. Why should we have to carry Google's banner? We've got enough stuff on our plate.


> But what's the point? You'd be stuck running some subset of "things that can run on the web", and the vast expanse of existing HTML content would be inaccessible. If you want to use NaCL/PNaCl for non-web applications nobody is stopping you. Writing a "browser" that doesn't implement HTML+JS is just nonsense.

Unless you consider the possibility that:

- Web apps and web documents are two very different things.

- This could be a way to escape the mess and massive size of the current web stack by standardizing on a much smaller base.

- Simplifying the core stack could open the door to new competition and innovation in the field.

Personally, I'd love to see WebKit and/or Gecko and all the browser chrome implemented on top of NaCl and/or asm.js, with only the most minimal runtime, such that you're not sitting in a privileged position of running native code while the rest of us are stuck in JS/HTML/CSS hell.

> We know web technologies can be implemented fairly interoperably, we have multiple interoperable implementations.

We also know that it's ridiculously expensive and difficult, in no small part due to the resistance to enforcing things like formal validation of HTML in the browser itself, creating a problem that's as much art as it is science.

> And yet we're not the only game in town, either. None of the other browser manufacturers has expressed any interest, so it's not like we're alone on this. Why should we have to carry Google's banner? We've got enough stuff on our plate.

Apple and Microsoft have no interest in undercutting their own vertically integrated platforms.

Only you and Google have a strong interest in furthering the web, and yet you refuse to work with Google, despite their expending considerable time and money on furthering the web while sharing their work freely.


We at Mozilla don't have $2M and a year to spare, but you're wrong: we've assessed doing Pepper with high fidelity (not porting chromium and duplicating all the DOM and other browser API implementations). It's more like $10M and multiple elapsed years to get to Chrome parity, assuming Chrome doesn't keep evolving and put us on a treadmill.

But then, I'm just the guy managing engineering and worrying about budget at Mozilla. Maybe you have greater skills. Where do you work?

Anyway, Pepper is big, with over a thousand methods among all the interfaces, "specified" only by C++ implementation code we cannot port. We have a DOM implementation already, for example. So you cannot escape the fact that Pepper is "and also", not "instead of" -- there are no savings; it is purely added cost, and significant cost.

I'm almost the only guy who will say this on HN, but as far as I can tell, Microsoft and Apple are on the same page. Maciej Stachowiak of Apple has agreed on HN, for what it's worth:

https://news.ycombinator.com/item?id=4648045

Enough whining about Mozilla not doing Pepper. Let's get back to asm.js.

It looks like V8 will implement the optimizations for "use asm": http://code.google.com/p/v8/issues/detail?id=2599.

Also, Epic is just the first big game publisher we are working with. Epic folks have confirmed again just this week that they won't do NaCl; there's no benefit to something CWS-only. But they are super-excited about cross-browser, web-wide reach with asm.js and browser-hosted C++-sourced games. That's the PNaCl output format and runtime for Google to collaborate on.

/be


> It's more like $10M and multiple elapsed years to get to Chrome parity, assuming Chrome doesn't keep evolving and put us on a treadmill.

So you don't believe Google's offers to work with external implementors?

> But then, I'm just the guy managing engineering and worrying about budget at Mozilla. Maybe you have greater skills. Where do you work?

I run a systems programming consulting shop. We focus on low-level OS/driver and VM projects, with application development making up the remainder of our work.

In theory, reliable estimates are my livelihood, and I have a very vested interest in the future of the platforms that we will have to develop for. I'm very concerned about the market requiring us to develop apps for a Mozilla platform.

> Microsoft and Apple are on the same page

What reason do they have to roll out something that helps Google undercut their vertical platform/app markets?

If NaCl is "too good", then it threatens their business. Take that for what it means in terms of asm.js optimization adoption.

> Epic folks have confirmed again just this week that they won't do NaCl, there's no benefit to something CWS-only.

CWS-only and Chrome-only are, again, oddly, something you have a hand in. Google pushed out beta PNaCl tools, and CWS-only has primarily been (rightfully) pending PNaCl.


> So you don't believe Google's offers to work with external implementors?

I wasn't born yesterday.

Microsoft is patching WebKit to support Pointer Events. Strange days, but even this "help" does not mean Pointer Events should win (or lose -- it's neutral in standards terms, at most helpful to show implementation feasibility and quality, if possible).

Why do you distrust Apple and Microsoft and ascribe bad motives to them based on their commercial interests, but fault Mozilla for not jumping on Google's Pepper treadmill like a good little-brother caricature? Google has commercial interests too, and they have spent >$1B a year on Chrome -- including advertising and bundling that directly targets Firefox users. (It's amazing we are still alive, and possibly even growing desktop share.)

You are surely right that some amount of competitive pressure will be required to get asm.js optimized in all engines. Ditto for WebGL in IE, and enabled in Safari. But cooperation between Chrome and Firefox is already happening on both asm.js and WebGL, so between these "coopetition" pincers, I bet we'll prevail.

A hopeful sign regarding WebGL in IE11, assuming it is legit: http://fremycompany.com/BG/2013/Internet-Explorer-11-rsquo-s...

Anyway, let's assume everyone has equally good motives, since Google is not any less commercial than Microsoft or Apple these days. Pepper is still too costly for others to swallow even with offers of "help", and therefore losing -- it is not even in the race.

asm.js and evolved Web APIs (which developers need anyway) are winning.

Evil-me on my Throne of Skulls didn't ordain this outcome. It is unfolding before our eyes across browsers, developers, and very pragmatic third parties including big game/game-engine publishers.

(But I am laughing a Dr. Evil laugh, on my Skull island. :-P)

/be


> Why do you distrust Apple and Microsoft and ascribe bad motives to them based on their commercial interests, but fault Mozilla for not jumping on Google's Pepper treadmill like a good little-brother caricature?

I don't distrust them, or ascribe bad motives. I ascribe motives that align with their economic interests -- it doesn't matter if they're bad or good.

> Google has commercial interests too, and they have spent >$1B a year on Chrome -- including advertising and bundling that directly targets Firefox users. (It's amazing we are still alive, and possibly even growing desktop share.)

How much does Google spend on Firefox's search deal? What do you see as Google's commercial motive with Chrome?

My reading (in the context that I have some family that works on Chrome) is that their primary interest was ensuring that they could help push the web forward, and not be beholden to external interests in doing so. They have an interest in open platforms insofar as it gives them leverage over the much more strongly established platform players (eg, Microsoft, Apple).

> asm.js and evolved Web APIs (which developers need anyway) are winning.

The interesting thing is that if asm.js has sufficient adoption, the 'web' part of the equation may very well not matter at all.

If Google maintains their efforts on (P)NaCl and can provide better performance, end implementors may target both asm.js and NaCl, with shims for platform-specific functionality (not unlike most game development today). Once you're doing that, you can also target other native platforms with the same code and tooling.

So I don't think asm.js is all bad. Ironically, it could very well be the escape route from JS/HTML/CSS/DOM that we've all been praying for ever since the idea of the web app came into being. Not because it's the best technical solution, but since when has technical correctness outweighed market forces?

> Evil-me on my Throne of Skulls didn't ordain this outcome.

Of course not. But "evil you" certainly has a hand in it. There are only 4 real players in the browser market, and you're one of them.

In fact, you're the one who decided to ignore what Google has been doing and invest in asm.js to begin with, so disclaiming any role in asm.js's possible ascendancy is a bit of a stretch.


My use of "bad" and "good" about your relative judgments of Microsoft, Apple, and Google could be "purple" and "shiny" for all it matters.

The point is you seem to treat Google as more benign and non-commercial. That's not credible based on many pieces of evidence, including their public company performance scrutiny by Wall Street. I admire Google for some things they do, and believe they do not over-optimize their share price, but they cannot ignore it, either. Also, they are a house divided, with many conflicting agendas not directly related to business goals.

> How much does Google spend on Firefox's search deal?

That's not something I can comment on, per our contract stipulated by Google, but the rumors are online and if you believe them, they show a commercial partnership, nothing more.

Numerate folks have estimated the value and cost of various search deals, see e.g. Jeremy Wagstaff of Reuters. I won't comment, except to say that Mozilla was underpaid for a long time, something we chose early on in order to avoid being greedy and triggering a bad reaction from search partners.

> What do you see as Google's commercial motive with Chrome?

Lots of motives, some mixed. It's complex, and the "make the web better" motive is still there and all to the good. Some shift away from standardization toward "works in Chrome/CWS" -- and not due to anything I did -- is evident lately, and disturbing. At the limit, it's Microsoft-y.

Again, if Google as a whole were to standardize early and often, just to take one example as we've done with the missing device and sensor APIs for mobile via Firefox OS (with Samsung patching WebKit for Tizen to match), we'd have a better web, faster. Some of the delays there can be blamed on Android, but not all.

> The interesting thing is that if asm.js has sufficient adoption, the 'web' part of the equation may very well not matter at all.

No, for Emscripten/asm.js you still need a C or C++ runtime: not just libc/stdio/stdlib stuff but various graphics, audio, and other APIs.

> In fact, you're the one that decided to ignore what Google has been doing and invest in asm.js to begin with, so divesting any claim in the role of asm.js's possible ascendency is a bit of a stretch.

Here you show your bias. I didn't "ignore" what Google has been doing, I estimated it as too costly to risk.

Now you tell me why the shoe isn't on the other foot. Why did Google "ignore" what we've been doing to advance JS for the last two years, and move all its Aarhus talent onto Dart, at some cost to V8? And at the cost of Epic, a "sale" we won that Google could have, had it only invested a bit more in JS.

You really do have a pro-Google, anti-Mozilla animus -- good/bad or purple/shiny, I don't care.

/be


> The point is you seem to treat Google as more benign and non-commercial. That's not credible based on many pieces of evidence, including their public company performance scrutiny by Wall Street. I admire Google for some things they do, and believe they do not over-optimize their share price, but they cannot ignore it, either. Also, they are a house divided, with many conflicting agendas not directly related to business goals.

Absolutely not; I consider Google to be an organization whose commercial interests are ultimately far more aligned with my own than Apple's or Microsoft's. It has nothing to do with being benign or non-commercial -- my interests are in no small part commercial.

> No, for Emscripten/ASM.js, you still need a C or C++ runtime, not just libc/stdio/stdlib stuff but various graphics, audio, and other APIs.

Sure, but we (the systems/games development community) are well equipped to create common runtime libraries, especially on a platform that provides an anemic set of platform standards and components, in which case custom libraries won't suffer from Qt or Swing's uncanny-valley problem.

> Now you tell me why the shoe isn't on the other foot. Why did Google "ignore" what we've been doing to advance JS for the last two years, and move all its Aarhus talent onto Dart, at some cost to V8?

It seems to me that Google made a rational decision that JavaScript is not a technologically sound foundation upon which to build, and made the first moves to find alternatives.

> You really do have a pro-Google, anti-Mozilla animus -- good/bad or purple/shiny, I don't care.

It's actually an anti-JavaScript animus. The idea that JavaScript ought to be the foundation of the next generation of modern computing platforms runs counter to everything I know about language and runtime design.

It seems to be an entirely market-driven solution, not a technologically-driven one -- but those market circumstances are something you have far more influence over than I do.


> It's actually an anti-JavaScript animus.

You just wrote that I "ignored" Google's NaCl and Pepper work. That is not true, and yet you didn't hold Google to account anywhere on this thread (or others) for "ignoring" opportunities such as asm.js which were latent in Emscripten (in spite of their commercial support for Mandreel-compiled CWS games).

There's more to what you wrote than just anti-JS animus. But let's not get stuck on this dispute; let's be unstuck like JS :-P.

> It seems to be an entirely market-driven solution, not a technologically-driven one -- but those market circumstances are something you have far more influence over than I do.

Look deeper. The market is responding to cost structures ultimately based on how hard it is to do PNaCl + Pepper vs. Emscripten + asm.js + (Web APIs that devs need anyway).

This is not about "best tech" or "marketing" or "market power". (Firefox doesn't have Google's marketing budget, and we definitely do not have decisive market power, as Netscape and then IE did -- but neither does Chrome.)

This is about "shortest path" or "worse is better" -- who gets adoption first with something "good enough" wins, and can keep winning if the good-enough thing keeps evolving and doesn't get stuck.

JS is "good enough" and manifestly capable of evolving cross-browser via the Ecma TC39 standards process and some coopetition to be even better.

Anyway, at the asm.js level, apart from syntax aesthetics, why do you care how an int32 cast or type annotation is encoded? The semantics are sufficient, as proven by UE3 in Firefox. They only get richer as we do SIMD, task ||ism, etc.

/be


> This is about "shortest path" or "worse is better ...

I think we're in agreement there.

> JS is "good enough" and manifestly capable of evolving cross-browser via the Ecma TC39 standards process and some coopetition to be even better.

Perhaps in disagreement there, especially when you're competing against Apple's user-first focus, and especially when I'm on the receiving end of being forced to use a JS-based stack as a developer. I do mean the full stack, too -- instruction level profiling, debugging, failure reporting, consistent performance profiles, the whole nine yards.

> Anyway, at the asm.js level, apart from syntax, why do you care how an int32 cast or type annotation is encoded? The semantics are sufficient, as proven by UE3 in Firefox. They only get richer as we do SIMD, task ||ism, etc.

Well, as an implementor of systems software, I can't help but be wary that asm.js brings along with it the entire JS language and (in practice) the web stack, too. It means the likelihood that "web" applications will ever be free of the enormous web stack is lowered, though alternative implementations (e.g., NaCl) reduce some of that risk.

I guess we'll see how it goes.


Thanks for the thoughtful reply.

I share some concern about Apple, but less so in this "post Apple" era (Google, Samsung, up-and-comers from China).

On asm.js, the subset is verified at parse time and also as needed at link time, so there's no taint of "full JS" -- yet. We do expect the heaps to cross-connect, and want that. It won't make the parts of JS you dislike live much longer, from what I can tell.

To be perfectly candid, my goal is to evolve standardized JS, and therefore its implemented and widely distributed VMs, to be a good multi-language target language (and runtime).

This is eminently doable and closer than many people think. If it succeeds, then JS-the-handcoded-source-language can live or die on its (as evolved by then) language merits, and I won't cry if it dies.

But I wouldn't bet on its fan base dropping it even if they can choose other languages.

/be


I don't really see the point of NaCl when something like asm.js gives us our cake and lets us eat it too.


I don't fully understand: is this equivalent to NaCl, or to WebGL/canvas?


WebGL/canvas. Standard HTML5/JavaScript, made possible by improvements in the Emscripten C++->JavaScript compiler and faster JavaScript engines with type inference.

No native code or complicated, non-standard Chrome-only APIs here.
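(The entire "platform surface" a port like this needs beyond fast JS is a canvas and a GL context; a minimal sketch, with the 2013-era vendor fallback:)

    var canvas = document.createElement('canvas');
    var gl = canvas.getContext('webgl') ||
             canvas.getContext('experimental-webgl'); // older/prefixed browsers
    if (gl) {
      gl.clearColor(0.0, 0.0, 0.0, 1.0); // standard OpenGL ES 2.0-style calls
      gl.clear(gl.COLOR_BUFFER_BIT);
    }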


This Unreal Engine 3 port uses WebGL and asm.js.


asm.js is paying dividends already; performance is always useful. But the larger problem is cross-platform gaming, so hopefully WebKit and others get something like this. The problem is that Chrome has NaCl, Microsoft surely has its own designs, and we still need WebGL support, which I'm sure isn't coming.

For now, Unity, Flash, and general HTML5/canvas are still better cross-platform solutions. If this leads to mobile browser games that run at native speeds, though, then we have something.


asm.js is just a subset of JavaScript, so it works in all browsers already, unlike those "alternatives" you mentioned.
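(A toy example, illustrative rather than spec-perfect: to an engine with asm.js support, the "use asm" hint triggers ahead-of-time compilation; to every other engine it's ordinary JS that runs anyway:)

    function MyModule(stdlib, foreign, heap) {
      "use asm";
      function add(x, y) {
        x = x | 0;             // parameter annotations: int32
        y = y | 0;
        return (x + y) | 0;    // result coerced back to int32
      }
      return { add: add };
    }

    // Works in any browser today:
    var add = MyModule(window, null, new ArrayBuffer(0x10000)).add;
    add(2, 3); // 5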


Those "alternatives" already work with Emscripten, with the advantage of running at full speed (at least in the case of NaCl) in the optimal case.


Does it work at full speed without OdinMonkey?


No (well, more accurately, "it depends on how well TI works without the restrictions"). It won't matter for most games; see also points made elsewhere in this thread.


> But the larger problem is cross-platform gaming, so hopefully WebKit and others get something like this. The problem is that Chrome has NaCl, Microsoft surely has its own designs, and we still need WebGL support, which I'm sure isn't coming.

I'd be very surprised if V8 (and so, Chrome and Node and everything else that uses it) didn't get asm.js support. Sure, Google's got NaCl and Dart and... but pursuing multiple paths to improving web performance isn't exactly new for Google.


Nice!! I've been planning on implementing the UT99 equivalent of QuakeLive at http://www.globalunreal.com for quite some time now. I've already made a new ping compensation mod, so all that's left is automating pretty much everything via the web. Although, I highly doubt I'll go 100% in the browser using WebGL. It'll most likely just be a wrapper that makes UT appear to be controlled by the browser when it's actually just running natively. This is all very cool stuff though. I could see it being quite powerful/useful in the future.


I might be able to aid you in your efforts if you're interested. I have been involved with the Unreal Engine modding & hacking communities for the past 13 years. I was one of the lead developers of the full-conversion mod Unreal Tournament Revolution, which ported UT99 gameplay to the Unreal 2.5 engine. I've worked on various anticheat teams and released the first Unreal Engine hook PoC back in 2000. So if you're interested, reply to this with an email; I would love to help you in any way that I can.


Ha! It's Aco. Small world! I actually wasn't around UT99 back when you were, but things are chugging along better now than they have in a looong time since I made newnet. If I recall, you were actually the guy who told everyone newnet wasn't possible for UT99!

At any rate, hit me up on irc.globalgamers.net #iPug and #uPug (#tdmpickups and #mlut are dead).


Soon all these tricks with CSS and HTML will be obsolete; everything will be replaced by canvas, with frameworks, engines, SDKs, etc.


I'm waiting for someone to implement a web browser in canvas. Wait, now that I think about it, it has probably been done already.


This was an April Fools' joke from last year: http://badassjs.com/post/20294238453/webkit-js-yes-it-has-fi...

In the non-joke category, Mozilla created a JavaScript interpreter written in JavaScript (to test new language features): https://github.com/mozilla/narcissus/


We've been working on it ;-) Zoom in on the render of Wikipedia.org on page 8: http://eecs.berkeley.edu/~lmeyerov/projects/pbrowser/pubfile...

The real story is that we're trying to get CSS etc. running on a GPU so you can use high-level web langs to script big visualizations. In this sense, OpenGL and ThreeJS are very low-level frameworks. Some other fun stuff as well, if we reach it.


One of the interesting projects that Nortel was developing before they went into bankruptcy protection was web.alive, a 3D collaboration platform built on the Unreal engine. The biggest problem was that it only supported Windows and required a pretty big plugin that needed to be updated regularly. Avaya still sells it, and I think this could really make a big difference for customer adoption. http://avayalive.com/Engage/


I am tempted to say that 'this does not indicate the death of native-platform games.'

By the time we see UE4 on the web, developers will be trying to achieve VR graphics in UE5...

Look at your surroundings, people. We are far from achieving that kind of graphics and environment even on our native systems; we are far from emulating the environment we live in. For that we will always need a great native system plus very fast native programming languages (like C/C++).


> For that we will always need a great native system plus very fast native programming languages (like C/C++)

When you're talking about graphics, this really isn't true at all. 99% of what matters there happens on the GPU, not CPU, and that's not going down, it's only going up as GPGPU stuff gets bigger. The days of CPU-bound games are, by and large, behind us.


'optimise the GPU for max performance'

I bet you will hear this kind of statement in the future, as more general-purpose work is done on the GPU. We already have C++ AMP, and I bet that, in the future, CPUs will become GPU-like.


I often imagine that offloading everything to the GPU is the realistic way to work around the mess of standards...


Top YouTube comment: "There's nothing about the Unreal Engine 3 on both emscripten and hacks.mozilla" (https://www.youtube.com/watch?v=XsyogXtyU9o)

Even though the major legwork is in emscripten, somehow the focus is on asm.js. Feels like really heavy marketing.


It's natural that newer things get more attention.

Emscripten's been well known for months.

asm.js has been well known for weeks.

A working implementation of asm.js (OdinMonkey) has been well known for days.

A demo showing Unreal running on OdinMonkey has been well known for hours.


I have a question about asm.js. My understanding is that it's only useful for (statically typed language) -> asm.js; is that correct? It doesn't make sense for (dynamic language) -> asm.js, because there's no type information to optimize with.


Am I the only one who finds Unreal-powered games kinda slow and sluggish, with "weird" movement? The latest two titles I've tried - Dishonored and especially Borderlands 2 - feel, hm, crappy compared to Rage... on a 660 Ti.


The one thing I don't like about the low-level compiled JS stuff is the complete absence of threads. There is no way to get real game-like performance out of a modern multi-core CPU in a single-threaded language.


Is the idea with asm.js to write in C/C++ and convert it to JavaScript? It sounds like I should relearn C++, because it's the one thing that will run everywhere: Android, iOS, and now the web.


The idea is to have an LLVM backend targeting a weird subset of JavaScript that happens to get optimized really well. As far as I've heard, there are already other languages that target Emscripten via LLVM, and I'm sure there will be more when asm.js gets big. I don't think you'll be stuck with C++.
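(Hand-written illustration of the style of output, not actual compiler output: the C heap becomes one big typed array and pointers become byte offsets. For C like int sum(int *arr, int n), Emscripten emits something in this spirit:)

    var buffer = new ArrayBuffer(0x10000);
    var HEAP32 = new Int32Array(buffer);   // the "memory" of the C program

    function _sum(ptr, n) {
      ptr = ptr | 0; n = n | 0;
      var i = 0, total = 0;
      for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
        // arr[i] lives at byte offset ptr + 4*i, hence the >> 2 index
        total = (total + (HEAP32[(ptr + (i << 2)) >> 2] | 0)) | 0;
      }
      return total | 0;
    }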


Is this using WebGL? If so, then I guess games will be limited to the OpenGL ES 2.0 API until WebGL gets updated to a more advanced graphics API level.


OpenGL ES 2.0 is still the safest bet, as it is what most mobile devices support. Though devices with GLES 3.0 are appearing, driver support is still subpar. I agree it may be time to look at bumping the API level.
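(In the meantime, features beyond the ES 2.0 core are exposed through WebGL's extension mechanism; a sketch, assuming gl is an existing WebGL context:)

    var available = gl.getSupportedExtensions(); // array of extension names
    var aniso = gl.getExtension('EXT_texture_filter_anisotropic'); // null if unsupported
    if (aniso) {
      // safe to use anisotropic filtering parameters here
    }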


Does this mean that the source will be available for anyone playing the game, even if obfuscated?

It should open interesting possibilities.


I mean the assembly has always been available. I'm not sure asm.js would be any more readable.


It would have to be available, but it's pretty much meaningless gobbledygook once you minify.


This might go even beyond gaming: first- and third-person interfaces for the web.


"but can it run Crysis?"


It should be theoretically possible. Crysis is written in C/C++, IIRC (well, you program the engine in those two languages).



