> To what? What scenario do you envision where Apple, Google, Microsoft and Mozilla are all willing to implement a single new language for the web?
Well, hopefully we wouldn't make the same mistake again by standardizing on a language; instead we would implement a VM on which JS itself is implemented. It would be much easier to validate the correctness of a VM anyway, we could achieve better speed than was possible before, and we wouldn't tie people down to a (painful) language. On a personal note, I would actually start to view the web browser as a platform instead of a bunch of tangled strings, cans, and tape.
EDIT: I didn't answer your question. But, as a developer, I will only make the web my default programming platform when I have bytecode or assembly to look at, not some convoluted scripting language. Hopefully asm.js is a minor, minor bump on the way toward the browser as a usable platform for arbitrary development.
> Well hopefully we wouldn't make the same mistake again by going with a language and we would just implement a VM on which JS is implemented.
A VM still needs some kind of well-defined input format, and that's a language. Whether it's human-readable text or binary doesn't make that big of a difference. Either way, designing language semantics is hard.
> Whether it's human-readable text or binary doesn't make that big of a difference.
To people like the grandparent, that is the only thing that matters at all. It's not a technical problem they have but an emotional one. They do not want the code they write to be a second-class citizen to the code someone else writes, and they will absolutely be ecstatic to see us throw away the last 15 years of progress and start from scratch in order to alleviate that problem they have.
Funny you should mention that, given that I just this week had to use NEON intrinsics to eke out better user-visible wall-time performance in a native app.
I guess it depends on the program, but in 99% of cases you should not have to. In the few cases where you do need it, it's usually already part of some library you can use.
I'm not a fan of a future in which the only people that can do interesting things (including the use of SIMD intrinsics) are the platform vendors (eg, Mozilla), while the rest of us live in a JavaScript sandbox.
Maybe Mozilla should try writing their entire browser (VM included) in JavaScript/asm.js and let us know how that goes.
Large parts of the Firefox browser (as distinct from Gecko and SpiderMonkey) are written in JS. Have a look at the code some time. Or, just open chrome://browser/content/browser.xul in Firefox to get a taste.
>I'm not a fan of a future in which the only people that can do interesting things (including the use of SIMD intrinsics) are the platform vendors (eg, Mozilla), while the rest of us live in a JavaScript sandbox.
What you describe as bleak is a much better future than what we have now. At least with Mozilla's proposal we will have a well-defined, low-level, optimizable JavaScript "assembly", whereas now we just have JavaScript itself.
We never had access to the use of SIMD intrinsics in browsers in the first place, anyway.
>Yes, exactly. I want it all: native performance, security, open platform.
The problem is by trying to have "all" we might get less than what we have now.
NaCl, for example, is a horrible "standard", as far as specifications go.
And if companies are allowed to build whole native closed-source castles in the web browser, we might return to the era of ActiveX and Flash. Maybe not in the sense of less security (a common ActiveX issue), but surely in the sense of less interoperability, transparency and end-user control.
You would basically just be running native apps in the browser. Why not do it in the desktop or mobile and let the internet be the open, not opaque, platform that it mostly is?
> A VM still needs some kind of well-defined input format, and that's a language.
I don't really care about the input format, that's incidental. I'm protesting using javascript semantics to implement operations that don't at all require javascript semantics (and would be better off without them).
Don't forget Flash, which did actually work relatively well, and pretty much powered the development of the web as a video and gaming platform... but which we've eventually ditched for HTML/CSS/JavaScript + native web APIs anyway.
That is fair enough. Though that wasn't really Flash's original ambition. And to be further fair, the fact that it wasn't their explicit ambition is possibly what saved it, while Java was doomed by Microsoft's embrace-and-extend strategy.
Flash was fine as a format for vector images (like SVG). Then it was fine for animations. Then simple interactive graphics. Then it got a scripting language. And as it became more and more sophisticated, and more used as an application platform (something Flash was not originally designed to do), it took on more and more of Java's shortcomings: binary blobs, long load times, serious security holes, slow as molasses.
But Flash always had one good thing going for it: anti-aliasing.
"Java applets were introduced in the first version of the Java language in 1995"
Remember why JavaScript is called JavaScript: so it wouldn't be seen as competing with Java as "the language of the web".
Java and its VM were designed for the web, to run in a browser. It was given every conceivable chance, and it utterly failed because ultimately it's a fundamentally terrible idea. But somehow fellows like you are too stubborn to see history for what it is, and you are doomed to repeat it.
I remember when most developers would be surprised if you told them you could make programs in Java that are not applets. And Java had a reputation for being "that really shitty slow painful web language".
September 18, 1995: Netscape 2.0 released with Java and JavaScript support. This had DOM Level 0. Java is marketed as the language for "big boy apps", and JavaScript is merely the scripting "glue" that lets Java access DOM Level 0, which is limited to reading form data and changing the .src attribute on some images. [1]
1998: "The initial DOM standard, known as "DOM Level 1," was recommended by W3C in late 1998. About the same time, Internet Explorer 5.0 shipped with limited support for DOM Level 1. DOM Level 1 provided a complete model for an entire HTML or XML document, including means to change any portion of the document. Non-conformant browsers such as Internet Explorer 4.x and Netscape 4.x were still widely used as late as 2000." [2]
August 2001: Internet Explorer 6 is released with still only partial support of DOM Level 1. [3]
In the same month (August 2001), Netscape 6.1 is released. [1]
Netscape 6 was the first Netscape browser based on the "Mozilla Application Suite", which is what Firefox is based on today. It's hard to find detailed information on this, but I would also guess that Netscape 6 had partial support for DOM Level 1.
February 6, 2002 (6 months later): Java's J2SE 1.4 runtime is released, and again has partial DOM Level 1 support, along with the ability to directly manipulate the page that an applet is on without using JavaScript as "glue".
Which of course is the whole point of the DOM, as a "language agnostic" API that needed to be used from not just Javascript, but Java, C++, and VBScript and whatever other language.
It was, to be fair, shitty, but we are talking about a time, 2002, when people were still using IE6 and Netscape 4. Browser support for the DOM matured around 2006, and so did Java and its applets, keeping pace right along with the browsers. People generally don't really understand that Java and JavaScript are two separate languages. It's all just "Java, that really shitty slow web language".
I'm not sure what you're trying to prove; javascript runtimes are VMs too.
Java was designed poorly, and it performed poorly. It just so happens that its design was well-suited to long-running servers, however, so that's where it's used.
It is not a VM that accepts binary bytecode as input, which is what the person I am replying to wanted. Context matters. And you could have read that yourself.
Edit to respond to ninja edit:
One could argue quite successfully that one of the chief reasons Java (applets) in a browser is a bad design is its "standardised" bytecode format, which is what everyone in this discussion thread is screaming for. My point is: look, we've already done this. Twice, in fact, because Flash works the same way, and does it much better than Java ever did. And yet it's still a failed concept in both cases. Flash was able to get by better by virtue of having a monopoly instead of a standard, and thus had the freedom to change its SWF format and bytecode format.
> It is not a VM that accepts binary bytecode as input, which is what the person I am replying to wanted. Context matters. And you could have read that yourself.
Er, so?
> One could argue quite successfully that one of the chief reasons Java (applets) in a browser is a bad design is because of its "standardised" bytecode format, which is what everyone in this discussion thread is screaming for.
Then please, reasonably argue it. I don't understand how the argument applies.
Java applets perform poorly in the browser for a number of reasons, none of which have anything to do with bytecode:
- Java's generational GC is designed around reserving a very large chunk of RAM, and performs poorly if insufficient RAM is reserved. This is a terrible idea for desktop software.
- Java's sandboxing model is broken and insecure, as it exposes an enormous amount of code as an attack surface. A bug in just about any piece of code in the trusted base libraries can result in a total sandbox compromise.
- Java is slow to start and slow to warm up, and applets more so. It historically ran single-threaded in the browser and blocked the page while it started up.
- Swing doesn't look native, and doesn't look like the web page, either. Applets can't actually interface with the remainder of the DOM in any integrated fashion (eg, you can't have a Java applet provide a DOM element or directly interface with JS/DOM except through bridging), so applets are the odd men out for both the platform and the website they're on.
> Flash was able to get by better by virtue of having a monopoly instead of a standard, and thus, has the freedom to change its swf format and bytecode format.
That doesn't even make sense. Flash was better because it didn't lock up your browser when an applet started, and didn't consume huge amounts of RAM due to a GC architecture that was poorly suited to running on user desktops.
Flash sucked because of its extremely poor implementation and runtime library design.
Actually, I missed this before. JavaScript runtimes are /not/ VMs. I don't think I've ever seen a JavaScript engine use a virtual machine. Ever. Do you have evidence of this? (Unless you mean in the sense that asm.js treats the JavaScript runtime as though it were a VM.)
Basically it comes down to this: It's easier and way more efficient to secure an untrusted program using a language grammar rather than a "bytecode verifier"
and a few other things.
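A rough sketch of that idea in JavaScript (the function name here is hypothetical, and `new Function` merely stands in for an engine's parser front end): malformed source is rejected by the grammar at parse time, before anything executes, which is roughly the role a bytecode verifier plays for a binary format.

```javascript
// Sketch: the parser itself rejects malformed input up front,
// playing roughly the role a bytecode verifier plays for bytecode.
// loadUntrusted is a hypothetical name, not a real API.
function loadUntrusted(src) {
  try {
    // Parsing happens here; a syntax error throws before any code runs.
    return new Function('"use strict"; return (' + src + ');');
  } catch (e) {
    return null; // rejected by the grammar, nothing ever executed
  }
}

console.log(loadUntrusted('1 + 2')()); // 3
console.log(loadUntrusted('1 + ;'));   // null: SyntaxError at parse time
```

Of course a real engine does far more than this (scoping checks, strict-mode rules, and so on), but the gatekeeping all happens at parse time rather than in a separate verification pass.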
I saw your post before you deleted it. I didn't get a chance to respond before. I just wanted to say that you probably know a lot more about VMs than I do, and I'll concede that. I don't really know for sure whether switching to a bytecode VM would be great or not for the browser. I know that making /any/ change to "the web" is a huge uphill battle, and so the ECMAScript committee has to make a lot of compromises for pragmatism. In any case, worse is better, and in the real world we can't have the perfect computer system. Haven't you seen Tron: Legacy?
The beauty of this approach is that browser vendors don't have to execute these quirky expressions optimally. They are standard JS, so they will already work.
Yeah, but that's not hard. And there's a spec that makes it extremely clear which tricks are necessary -- no need to reverse-engineer someone else's implementation, and no risk of tricks that work today breaking tomorrow.
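For what it's worth, the "quirky expressions" in question are ordinary JS coercions that double as type annotations in asm.js. A minimal sketch (this is not a complete, valid asm.js module, which would also take stdlib/foreign/heap parameters; it's just an illustration of the `|0` int and unary `+` double annotations, and it runs as plain JS in any engine):

```javascript
function AsmModule() {
  "use asm"; // hint: engines that support asm.js may validate this block
  function intAdd(a, b) {
    a = a | 0;           // parameter annotation: a is int32
    b = b | 0;           // parameter annotation: b is int32
    return (a + b) | 0;  // return annotation: result is int32
  }
  function halve(x) {
    x = +x;              // parameter annotation: x is double
    return +(x / 2.0);   // return annotation: result is double
  }
  return { intAdd: intAdd, halve: halve };
}

var m = AsmModule();
console.log(m.intAdd(3, 4)); // 7
console.log(m.halve(5.0));   // 2.5
```

Since `x | 0` and `+x` already have exactly these truncating/coercing semantics in standard JavaScript, an engine with no asm.js support still computes the same results; a validating engine can use the annotations to compile the module ahead of time.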