
Browser JavaScript suffers from not producing a single "binary" as its build system output. You do get one bundle, but none of the parts know anything about each other, which limits the ability to optimize. (It is a lot more like the C preprocessor than a C compiler.) Methods that can't possibly ever be called end up being sent to every user, for no reason other than "well, someone could write an eval() that needs that". (I was surprised by how poorly tree-shaking works in practice, and how even "light" frameworks like Svelte ship a huge runtime to every user, unrelated to which features the code actually uses.)
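A hedged sketch of why that happens (the module and class names are invented for illustration): a bundler can only drop what it can prove is unreachable, and dynamic property access - or even the possibility of it - defeats that proof.

    // utils.ts - hypothetical module, for illustration only
    export class Formatter {
      short(d: Date) { return d.toDateString(); }
      long(d: Date) { return d.toISOString(); }   // app code never names this directly
    }

    // app.ts
    import { Formatter } from "./utils";

    const f = new Formatter();
    const style = new URLSearchParams(location.search).get("style") ?? "short";

    // Because of this dynamic lookup, the bundler can't prove `long` (or anything
    // it pulls in) is dead, so every method ships to every user. In practice most
    // bundlers won't strip unused class methods even without the lookup.
    console.log((f as any)[style](new Date()));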

Meanwhile, WebAssembly suffers from there not being a good programming language that compiles to it natively. I think the Go support is great, but there's no library support for building the web apps that people get paid to build (React, Apollo, etc. all exist for a reason, however much you hate that reason). AssemblyScript was supposed to be that language, but it's just a TypeScript-inspired language; it's not TypeScript plus the npm ecosystem.

I don't know the details of this implementation, but the direction I'd like to see web development move in is a statically typed language that compiles to VM bytecode. That sounds like the first step toward removing JavaScript as an intermediary for TypeScript. Once that happens, modern programming language features can be bolted on, people can add good compiler optimizations that minimize the bytes users have to download, the module system can be made stricter (like "go mod"), etc. JavaScript is just a little more dynamic than anyone really wants, and that makes the build/deployment tooling complicated and, frankly, kind of bad.




Tree shaking works poorly, but it is also thankfully not that important, unless you're terrible at managing dependencies.

Why? Because the initial parse is relatively cheap. This makes all of the code available, even if a stray eval() calls something unexpected.

The real resource expense comes when it's time to JIT a hotspot. But a hotspot is by definition code you use for sure. And code you use a lot.

JS engines use profile-guided, tiered JITs. Runtime profiling lets the compiler see how code actually runs in practice, and compile THAT to machine code. How it's organized on the file system etc. is entirely irrelevant.

So basically, in a tiered JIT system, code that isn't hot is interpreted, and code that is hot is JIT-ted (and code that's very hot gets JIT-ted with higher optimization).

Interpreting is the new "tree shaking". It saves the compiler a lot of work, and unlike bad tree shaking, it won't crash your app.
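A hedged sketch of that trade-off (the function names are invented): a bundler has to decide statically what survives, while the engine can wait and see what actually gets hot.

    // Hypothetical code, purely for illustration.
    function hotPath(xs: number[]) {
      let sum = 0;
      for (const x of xs) sum += x;
      return sum;
    }

    function rareAdminTool() {
      console.log("rebuilding index…");
    }

    // rareAdminTool has no static call site; it's only reachable if someone
    // passes ?cmd=rareAdminTool(). Dropping it ("bad tree shaking") turns that
    // into a runtime error. The engine's answer is cheaper: parse it, leave it
    // interpreted, and never spend JIT effort on it.
    const cmd = new URLSearchParams(location.search).get("cmd");
    if (cmd === "rareAdminTool()") eval(cmd);

    // hotPath, by contrast, runs constantly, so the profiler marks it hot and
    // compiles this exact path to optimized machine code.
    for (let i = 0; i < 1_000_000; i++) hotPath([1, 2, 3]);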


> the direction I'd like to see web development move is having a statically typed language that compiles to VM bytecode.

Java… you want Java. And so the cycle continues!


You could add some kind of security manager to sandbox it and have little apps running in the browser. Maybe call them "applets".


That sounds amazing! What if they came with some kind of built-in canvas?



That's a really good point. What we wanted all along was the JVM with a DOM API.

We started with Java. It didn't gain any traction for browser scripting. They took the "Java" out and called it JavaScript. Now we take the "script" out and call it WebAssembly. If someone in 1995 had made Java a DOM-editing thing instead of an applet thing, we could have saved 25 years of running around in circles :)

(Going back even further, we all connected to a mainframe with a dumb terminal. The desktop revolution happened... and now we're back to dumb terminals attached to a mainframe. But we call it "The Cloud" instead, and the dumb terminal fits in your pocket.)

Maybe the past wasn't as dumb as we think it was. We just weren't smart enough to understand it at the time.


Hey, applets could manipulate JavaScript data and the DOM, so there is a bit of bark to the "Java script" bite. It was very slow, though: every read or write of a value had to go through the JVM process and over to the browser process (and the plugin that did this was, I think, in its own process - my memory is hazy there).

Overall it's hard to say how Sun could have won here. Perhaps by acquiring Macromedia and using the Flash technology to ship a browser that ran those cool portals out of the box - and hey, scripted first-class in Java too, btw ;-)


And we’ve increased latency an order of magnitude each time. We have computers that are orders of magnitude faster, yet using a spreadsheet in a browser today is less responsive than it was in DOS decades ago. But the developer experience is so much better now, so it’s worth it?


What I’ve always wanted is scheme in the browser.


There was a fork in the road of history where that could have been the future we live in...

> In 1995, Netscape hired Brendan Eich with the promise of letting him implement Scheme (a Lisp dialect) in the browser.

How JavaScript Was Created - http://speakingjs.com/es5/ch04.html

---

> Whether that language should be Scheme was an open question, but Scheme was the bait I went for in joining Netscape. Previously, at SGI, Nick Thompson had turned me on to SICP.

> ..The diktat from upper engineering management was that the language must “look like Java”. That ruled out Perl, Python, and Tcl, along with Scheme.

https://brendaneich.com/2008/04/popularity/


Yeah, I know this history well.



No, like

    <script type="text/scheme">

I also wish browsers had built out the ability to plug in interpreters for processing script tags.
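You can approximate this today, since browsers ignore script tags with unknown types but still expose their text. A minimal sketch, with the caveat that registerScriptType is our own helper (not a browser API) and the Scheme evaluator you'd pass in is left as a stub:

    // Minimal sketch: wire an interpreter up to a custom script type.
    function registerScriptType(type: string, run: (source: string) => void) {
      window.addEventListener("DOMContentLoaded", () => {
        // Browsers skip script tags whose type they don't recognize, but the
        // elements still land in the DOM, so their source text is readable.
        document
          .querySelectorAll<HTMLScriptElement>(`script[type="${type}"]`)
          .forEach(tag => run(tag.textContent ?? ""));
      });
    }

    // Usage: plug in whatever Scheme-in-JS/Wasm interpreter you actually ship.
    registerScriptType("text/scheme", src => {
      console.log("would evaluate as Scheme:", src);
    });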



The past was pretty dumb, because it wasn't built on open standards and interoperability.


Or preferably, F#.


> WebAssembly suffers from there not being a good programming language that compiles to it natively

Is Grain insufficient for some reason?



