
> It is not a VM that accepts binary bytecode as input, which is what the person I am replying to wanted. Context matters. And you could have read that yourself.

Er, so?

> One could argue quite successfully that one of the chief reasons Java (applets) in a browser is a bad design is because of its "standardised" bytecode format, which is what everyone in this discussion thread is screaming for.

Then please, reasonably argue it. I don't understand how the argument applies.

Java applets perform poorly in the browser for a number of reasons, none of which have anything to do with bytecode:

- Java's generational GC is designed around reserving a very large chunk of RAM, and performs poorly if insufficient RAM is reserved. This is a terrible idea for desktop software.

- Java's sandboxing model is broken and insecure, as it exposes an enormous amount of code as an attack surface. A bug in just about any piece of code in the trusted base libraries can result in a total sandbox compromise.

- Java is slow to start and slow to warm up, and applets more so. It historically ran single-threaded in the browser and blocked page execution while it started up.

- Swing doesn't look native, and doesn't look like the web page, either. Applets can't actually interface with the rest of the DOM in any integrated fashion (e.g., you can't have a Java applet provide a DOM element or directly interact with JS/DOM except through a bridging layer), so applets are odd men out on both the platform and the website they're embedded in.

> Flash was able to get by better by virtue of having a monopoly instead of a standard, and thus, has the freedom to change its swf format and bytecode format.

That doesn't even make sense. Flash was better because it didn't lock up your browser when an applet started, and didn't consume huge amounts of RAM due to a GC architecture that was poorly suited to running on user desktops.

Flash sucked because of its extremely poor implementation and runtime library design.




Actually, I missed this before. JavaScript runtimes are /not/ VMs; I don't think I've ever seen a JavaScript engine use a virtual machine. Ever. Do you have evidence of this? (Unless you mean in the sense that asm.js treats the JavaScript runtime as though it were a VM.)

As for arguments against bytecode VMs, how about:

http://www.dartlang.org/articles/why-not-bytecode/

http://www.aminutewithbrendan.com/pages/20101122

http://www.hanselman.com/blog/JavaScriptIsAssemblyLanguageFo...

Basically it comes down to this: it's easier and far more efficient to secure an untrusted program by validating it against a language grammar than by running a "bytecode verifier" and a few other checks after the fact.
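To make the idea concrete, here's a toy sketch (my own illustration, not taken from any of the articles above) of grammar-based securing: the untrusted source is checked against a deliberately tiny syntactic subset /before/ it is ever evaluated, in the spirit of source-subsetting approaches like ADsafe or Caja. The `SAFE_EXPR` subset and `evalSafe` name are hypothetical.

```javascript
// A (crude) grammar: only digits, whitespace, parentheses, and arithmetic
// operators are allowed. This subset cannot name variables, reach `window`,
// or call functions, so anything that passes is inert arithmetic.
const SAFE_EXPR = /^[\d\s+\-*/().]+$/;

function evalSafe(source) {
  if (!SAFE_EXPR.test(source)) {
    throw new Error("rejected by grammar: " + source);
  }
  // Only reached for strings already proven to be in the safe subset.
  return Function('"use strict"; return (' + source + ');')();
}

console.log(evalSafe("(2 + 3) * 4")); // 20
try {
  evalSafe("alert(document.cookie)"); // rejected before any evaluation
} catch (e) {
  console.log(e.message);
}
```

The point isn't the regex (a real system would use a proper parser); it's that rejection happens at the source-language level, before execution, whereas a bytecode verifier has to reason about an already-compiled artifact.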

Your comments about Java and the DOM are demonstrably untrue: http://docs.oracle.com/javase/tutorial/deployment/applet/man...


I saw your post before you deleted it; I didn't get a chance to respond. I just wanted to say that you probably know a lot more about VMs than I do, and I'll concede that. I don't really know for sure whether switching to a bytecode VM would be good for the browser or not. I do know that making /any/ change to "the web" is a huge uphill battle, so the ECMAScript committee has to make a lot of compromises for pragmatism. In any case, worse is better, and in the real world we can't have the perfect computer system. Haven't you seen Tron: Legacy?



