WebContainers: Run Node.js natively in the browser (stackblitz.com)
387 points by bpierre on May 20, 2021 | hide | past | favorite | 230 comments



Everyone thought WASM would enable developers to write code in any language they wanted (with type checking and higher performance) and deploy it on the web.

JS Developers come in: so we can compile node.js to WASM and run that in the browser?! Yay!! Now we can have backend JS running in the browser alongside browser JS!

Jokes aside: this is actually quite interesting, and the demo is very impressive (except for the "only works in Chromium-based browsers" message on the next.js demo, where the preview should be - despite Mozilla being one of the main WASM backers and Firefox having arguably the best WASM engine).


> Everyone thought WASM would enable developers to write code in any language they wanted (with type checking and higher performance) and deploy it on the web.

https://developer.mozilla.org/en-US/docs/WebAssembly/Concept...

> WebAssembly is a different language from JavaScript, but it is not intended as a replacement. Instead, it is designed to complement and work alongside JavaScript, allowing web developers to take advantage of both languages' strong points

"Not intended as a replacement"

WASM is a math coprocessor for JS. WASM can't even talk to the DOM API directly (which is the API you use to build web pages in JS); you have to write JavaScript glue code for any/all I/O in WASM.
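Roughly, the glue looks like this (a minimal sketch; the module name, export names, and import names are made up):

    // Minimal sketch -- all I/O the wasm code does has to route through JS functions like this.
    let memory; // filled in after instantiation
    const imports = {
      env: {
        // wasm hands us a pointer + length; the JS "glue" does the actual DOM work
        set_title: (ptr, len) => {
          const bytes = new Uint8Array(memory.buffer, ptr, len);
          document.title = new TextDecoder().decode(bytes);
        },
      },
    };
    const { instance } = await WebAssembly.instantiateStreaming(
      fetch("module.wasm"), imports);
    memory = instance.exports.memory;
    instance.exports.run(); // any "DOM access" inside wasm calls back into set_title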


> WASM can't even talk to the DOM API directly

For now. It's planned.


For existing code bases that don’t already do DOM manipulation (so, all of them) other parts of the Web API might be more useful, like the canvas, the video API, the RTC API etc.


The issue in general is how to bridge these JS OOP APIs to something that's basically assembler.

If WASM gains a foreign object model calling convention or whatever, we're in business.


Wasm is not basically assembler, it's (going to be) very high level:

- https://github.com/WebAssembly/reference-types

- https://github.com/webassembly/threads

- https://github.com/WebAssembly/exception-handling

- https://github.com/WebAssembly/function-references

- https://github.com/WebAssembly/gc

Though it is true that the original MVP was very barebones and could be compared to assembler. However, the reference types proposal has been shipped in all major browsers and is considered "finished".


That was the idea, but the world decided otherwise, based on the Flash and Silverlight experience versus the clunky Web.

https://dotnet.microsoft.com/apps/aspnet/web-apps/blazor

https://platform.uno/

https://leaningtech.com/

https://flutter.dev/

https://wiki.qt.io/Qt_for_WebAssembly


>JS Developers come in: so we can compile node.js to WASM and run that in the browser?! Yay!! Now we can have backend JS running in the browser alongside browser JS!

I remember back when SOAP still had the upper hand against REST (but just barely), the W3C published a document showing how you could implement HTTP over SOAP, to basically mass ridicule. Unfortunately I can't find it anymore, but it was pretty funny.


searching "HTTP over SOAP" finds a few leads:

Tim Bray, 2004, "HTTP over SOAP!?!?!?": https://www.tbray.org/ongoing/When/200x/2004/05/01/SRRH

Erik Wilde, 2008, "HTTP over SOAP over HTTP": https://dret.typepad.com/dretblog/2008/11/http-over-soap-ove...

William Vambenepe, 2008, "WS Resource Access working group starting at W3C": http://stage.vambenepe.com/archives/436

Here are some of the specs:

Web Services Transfer (WS-Transfer). This specification describes a general SOAP-based protocol for accessing XML representations of Web service-based resources. https://www.w3.org/Submission/WS-Transfer/

Web Services Resource Transfer (WS-RT). This specification defines extensions to WS-Transfer. While its initial design focuses on management resource access its use is not necessarily limited to those situations. https://www.w3.org/Submission/WSRT/


it was the document referred to by Tim Bray, 2004, "HTTP over SOAP!?!?!?", I also remember the Tim Bray article as one of the top pieces of ridicule.


Typically when you see stuff like this be Chromium-only, it's because Chromium has a bunch of non-standard/one-vendor-only stuff enabled in production builds and neither Firefox nor Safari has it enabled (or implemented at all). A few examples include Bluetooth, MIDI and USB.


I don't understand why you are downvoted. Filesystem access is another big one! Chrome has the technological advantage and sports an enormous kitchen sink of features. One has to make a hard decision: work on a concept that will only work in one browser, try to cut corners with graceful (or crippling) degradation, or wait potentially a long time until a spec is standardized and implemented. The latter may result in others hitting the market sooner, so it is understandable why we end up with web apps that function in a single browser only. Hopefully this improves in the future.


The downside is that some of these technologies never get standardised and eventually get dropped. Like NaCl and PNaCl.


Or for an example that caused a lot more havoc comparatively: WebSQL.


Browser Javascript suffers from not producing a single "binary" as its build system output. You do get one bundle, but none of the parts know anything about each other, and it limits the ability to optimize. (It is a lot more like the C preprocessor, rather than a C compiler.) Methods that can't possibly ever be called end up being sent to each user, for no reason other than "well someone could write an eval() that needs that". (I was surprised how poorly tree-shaking works in practice, and how even "light" frameworks like Svelte ship a huge runtime to every user, unrelated to what features the code actually uses.)

Meanwhile, WebAssembly suffers from there not being a good programming language that compiles to it natively. I think the Go support is great, but there's no library support for building the webapps that people get paid to build (React, Apollo, etc. all exist for a reason, however much you hate that reason). AssemblyScript was supposed to be this language, but it's just a Typescript-inspired programming language, it's not Typescript and the npm ecosystem.

I don't know what the details of this implementation are, but the direction I'd like to see web development move is having a statically typed language that compiles to VM bytecode. This sounds like the first step for removing Javascript as a Typescript intermediary. Once that happens, then modern programming language features can be bolted on, people can add good compiler optimizations that result in minimal bytes output for users to download, the module system can be made stricter (like "go mod"), etc. Javascript is just a little bit more dynamic than anyone really wants, and it makes the build/deployment tooling complicated and, frankly, kind of bad.
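(A minimal illustration of the eval() point above: once the bundler sees a dynamic reference like this, it can no longer prove an export is dead, so everything ships anyway.)

    // utils.js
    export function used() { return 1; }
    export function neverCalled() { return 2; }

    // main.js
    import * as utils from "./utils.js";
    console.log(utils.used());
    // A direct eval can see the module scope, so the bundler has to assume
    // any export of `utils` might be reached and keeps them all:
    eval("console.log(utils.neverCalled())");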


Tree shaking works poorly, but is also thankfully not that important, unless you're terrible with managing dependencies.

Why? Because the initial parser is relatively cheap. This makes all of the code available, even if a stray eval() calls something unexpected.

The real resource expense comes when it's time to JIT a hotspot. But a hotspot is by definition code you use for sure. And code you use a lot.

JS engines use tracing JIT. Tracing allows JIT compilers to see how code runs in practice, and compile THAT to machine language. How it's organized on the file system etc is entirely irrelevant.

So basically in a trace JIT system, code that isn't hot is interpreted, and code that is hot is JIT-ted (and code that's very hot, gets JIT-ted with higher optimization).

Interpreting is the new "tree shaking". It saves the compiler a lot of work, but also it won't crash your app (unlike bad tree shaking).


> the direction I'd like to see web development move is having a statically typed language that compiles to VM bytecode.

Java… you want Java. And so the cycle continues!


You could add some kind of security manager to sand box it and have little apps running in the browser. Maybe call them "applets".


That sounds amazing! What if they came with some kind of built-in canvas?



That's a really good point. What we wanted all along was the JVM with a DOM API.

We started with Java. It didn't gain any traction for browser scripting. They took "Java" out and called it JavaScript. Now we take the "script" out and call it WebAssembly. If someone in 1995 had made Java a DOM-editing thing instead of an applet thing, we could have saved 25 years of running around in circles :)

(Going back even further, we all connected to a mainframe with a dumb terminal. The desktop revolution happened... and now we're back to dumb terminals attached to a mainframe. But we call it "The Cloud" instead, and the dumb terminal fits in your pocket.)

Maybe the past wasn't as dumb as we think it was. We just weren't smart enough to understand it at the time.


Hey, applets could manipulate JavaScript data/the DOM. There is a bit of bark to the 'Java script' bite. It was very slow, though, going through the JVM process and to the browser process each time you read or wrote a value (and the plugin that did this was, I think, in its own process - memory is hazy there).

Overall it's hard to say how Sun could have won here; perhaps by acquiring Macromedia and using Flash technology to make a browser that ran those cool portals out of the box, and hey, scripted first-class in Java too btw ;-)


And we’ve increased latency an order of magnitude each time. We have computers that are orders of magnitude faster, yet using a spreadsheet in a browser today is less responsive than it was in DOS decades ago. But the developer experience is so much better now, so it’s worth it?


What I’ve always wanted is scheme in the browser.


There was a fork in the road of history, where that could have been the future we live in..

> In 1995, Netscape hired Brendan Eich with the promise of letting him implement Scheme (a Lisp dialect) in the browser.

How JavaScript Was Created - http://speakingjs.com/es5/ch04.html

---

> Whether that language should be Scheme was an open question, but Scheme was the bait I went for in joining Netscape. Previously, at SGI, Nick Thompson had turned me on to SICP.

> ..The diktat from upper engineering management was that the language must “look like Java”. That ruled out Perl, Python, and Tcl, along with Scheme.

https://brendaneich.com/2008/04/popularity/


Yeah, I know this history well.



No, like

    <script type="text/scheme">
I also wish browsers had built out the ability to plug in interpreters for processing script tags.
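You can approximate it in userland today; a rough sketch (evalScheme() is hypothetical, e.g. a Scheme-in-JS interpreter you bundle yourself):

    // Pick up non-JS script tags the browser ignores and hand them to your own interpreter.
    document.querySelectorAll('script[type="text/scheme"]').forEach(async (tag) => {
      const source = tag.src ? await (await fetch(tag.src)).text() : tag.textContent;
      evalScheme(source); // hypothetical interpreter function
    });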



The past was pretty dumb, because it wasn't an open standard based on interoperability.


Or preferably, F#.


> WebAssembly suffers from there not being a good programming language that compiles to it natively

Is Grain insufficient for some reason?


I think the issue isn't with Wasm support, it's the filesystem API. They could've included a polyfill though... But as that API is new it might not be available yet.


It's the "File System Access API," the one that allows the browser to read/write files on your actual computer. It's an I/O feature, impossible to polyfill.

The similarly named "FileSystem API" gives you access to a sandboxed "virtual drive" in the browser. Firefox has supported that for years, and polyfills are possible, but it's irrelevant in this case.

https://caniuse.com/?search=file%20system
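For reference, the Chromium-only one looks like this (minimal sketch; it touches real files on the user's machine, which is exactly why it can't be polyfilled):

    // Chromium-only for now; throws in Firefox/Safari.
    const [handle] = await window.showOpenFilePicker();
    const file = await handle.getFile();
    console.log(await file.text());

    const writable = await handle.createWritable(); // triggers a write permission prompt
    await writable.write("updated contents");
    await writable.close();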


I believe it still counts as a polyfill, even if it doesn't do what it says on the can (but it has a compatible interface, so your app runs).


I wonder if this would work in electron?


"StackBlitz v2 Beta currently works in Chrome and Chromium-based browsers. We’re hoping to add support for more browsers as they implement the necessary Web Platform features."

Pretty useless, then. What features are missing? As far as I know Firefox was one of the major contributors to pushing WebASM to the web and has a superior WASM engine in many aspects.

I'd be more interested in a post about what WebASM features are missing than an announcement that someone hacked Javascript into a browser.


Copy+pasting my answer from existing comment:

It works in Firefox today (minor issues keeping it feature flagged for now) and Safari is close to shipping WASM Threads, so this will likely work on all major browsers by EOY.


It looks like the problem isn't WASM features, but web I/O APIs, like the File System Access API.


Many of the more "advanced APIs" are unlikely to be shipped in either Firefox or Safari because of security and privacy concerns.

For example, Chrome released File System Access by default in Chrome 86, on October 6, 2020.

And yet, as late as August 2020 they were saying that they'd just got the spec in shape: https://github.com/mozilla/standards-positions/issues/154#is.... And there are still unresolved issues around actual security of the thing (that Chrome happily ignores).

The "spec" itself lists 13 issues: https://wicg.github.io/file-system-access/

And Safari won't implement it: https://lists.webkit.org/pipermail/webkit-dev/2020-August/03...


If business PCs are running Microsoft Teams + Outlook PWAs on Edge (Chromium-based), and the majority of personal PCs are running Chrome/Chromium... I can't see why Apple wouldn't come to a compromise to treat PWAs with respect.

It's not just about privacy of web APIs...it's about Apple's chokehold on iPhone users.


> It's not just about privacy of web APIs...it's about Apple's chokehold on iPhone users.

Funny how you fully ignore Firefox's position


They're unable to keep anything close to security parity with Chromium — even with hundreds of millions of dollars.

You're right — frankly, I don't look to Mozilla as the leader or voice of web development anymore.


> They're unable to keep anything close to security parity with Chromium

Funny how you went from "Apple's chokehold on iPhone users" to "unable to keep anything close to security parity".

I wonder in which respects Safari "can't keep up"?

> I don't look to Mozilla as the leader or voice of web development anymore.

- There are only two other independent browser engines of any importance, and their input is increasingly ignored by Chrome

- Who cares, go Chrome

This is not as good a take as you think it is.


> 13 issues

These are mostly a to-do list...which makes sense at this stage?

I'm not sure what you're getting at.

Do you think that Apple or Mozilla can stop webapps from taking their rightful place?


> These are mostly a to-do list...which makes sense at this stage

Which stage is it? Chrome is already shipping this, enabled by default.

And the "todo list" covers important issues around permissions, restricting file system access, etc.

> Do you think that Apple or Mozilla can stop webapps from taking their rightful place?

I think that Chrome plays fast and loose with web standards, and has essentially replaced web standards with Chrome-designed and Chrome-specific APIs, while fully ignoring any concerns from other browsers.


> Which stage is it?

The stage where a majority of professional web devs (y'know, the people who aren't here!) still haven't even heard of, or fully assimilated, the term "PWA" yet.

> the "todo list" covers important issues around permissions, restricting file system access, etc.

They've proven very capable of managing this in AOSP.

> I think that Chrome plays fast and loose with web standards, and has essentially replaced web standards with Chrome-designed and Chrome-specific APIs, while fully ignoring any concerns from other browsers.

"Chrome-specific" implies DRM etc which I don't find fair here, as a very regular Google critic.

If the concerns are always "We can't keep up", I don't blame Chromium team for pushing on ahead, and letting other browsers catch up.

Frankly: Mozilla isn't leading the way on web standards anymore. (And they're not doing as good a job with security+privacy, really, either, compared to properly configured Chromium.)


> (And they're [Mozilla] not doing as good a job with security+privacy, really, either, compared to properly configured Chromium.)

Anecdotally, I feel that the quickly growing number of websites that complain that Firefox is an "ad blocker" with an unconfigured, out of the box default install seems to imply Mozilla is doing a great job with respect to security+privacy.

> Mozilla isn't leading the way on web standards anymore.

All the way back to the IE3-6 era, Mozilla has never been about "leading" web standards. IE3-6 led web standards. Mozilla's role in web standards has almost always been about curtailing the excesses of whoever is currently leading, and keeping standards fair, safe, "quality" over quantity.

It's not Mozilla's job to keep up with ~IE6~ Chromium, it's their job to keep Google from spinning web standards out of control. To keep Chromium from being IE6 2.0. To keep Google from just defining all the standards in ways that make Google happy but maybe aren't great long term for the health/safety of the web. It's Mozilla's job to slow Chromium down, and the fact that people are complaining "Firefox isn't keeping up" with Chrome's non-standards and in general that they are losing that battle in mainstream browser usage isn't at all great for the web. (The last time something like this happened was IE6. Whether or not Chromium is the "New IE6" will take a while to play out, but we're deep into the IE6 playbook at this point and anyone not seeing parallels either doesn't remember the early years of IE6 well or has too high of an opinion of Google.)


Can you give a few examples to back up your claim that Chrome is ahead of Firefox in terms of security and privacy?



> If the concerns are always "We can't keep up"

Safari and Mozilla: here are the security concerns, here are privacy concerns, here are concerns that the "specs" reflect internal APIs and are not actual specs.

HN: nah, concerns are always "we can't keep up"

No, those are not "we can't keep up".


> compared to properly configured Chromium.

Ah yes. Because that's exactly what you want from a browser: spend time properly configuring it to make it secure.


The configuration is for privacy.

The security is inherent.


So, FF doesn't need configuration to provide privacy, Chrome has to be "properly configured".

Chrome is more secure.

So, it's not as simple as the blanket statement you provided.


Yes.

Which is why I recommend Ungoogled Chromium.

Or just Chromium.


Hi all- CEO of StackBlitz/author of this blog post here. Happy to answer any questions! (Also, thanks for posting this bpierre!)


> Faster than your local environment. Builds complete up to 20% faster and package installs complete >= 5x faster than yarn/npm.

Above is a quote from the post. I feel like this is a stupid question, but how can running yarn/npm in a browser on my machine be faster than running yarn/npm on my machine? Particularly when each page load runs a fresh npm/yarn install?

Nonetheless, this is an incredible piece of software that I'll be following closely


I've also asked myself this question. I think the comparison is not with a local environment but rather with running node/yarn/npm on a remote virtual machine/CI. This is my hypothesis; I have no proof that's what they meant.


I'm wondering if there is more happening in-memory: even if the project files and modules themselves are on disk (accessed via the FileSystem API), the vscode/node/other bits, including temporary files, may be held in RAM once initially loaded and never swapped out.

If that is the reason, then if your machine becomes memory constrained, the performance will drop through the floor quicker than with full-local installs.

It could be a mix of this and your suggestion (lower network latency related bottlenecks than experienced with non-local deployment).


I was wondering the same thing


I'm also curious about this


Is this all open source?

I'm fascinated in learning more about how you run a web server with this - the article says "WebContainers include a virtualized TCP network stack that's mapped to your browser's ServiceWorker API" but I'd love to understand a bit more about how that works.

Since this is all done with WebAssembly are you planning on targeting other stacks such as Python? The amount of time I lose helping other people getting their Python development environments to work remains horrifying.


We'll be open sourcing core parts of WebContainer as we move towards GA. We have some additional technical info in our core WG repo: https://github.com/stackblitz/webcontainer-core

We also did an hour long podcast a few months back that goes into pretty deep detail: https://www.youtube.com/watch?v=5F9qH-ea5Qk


First, congratulations! This is an astonishing advancement.

I found this on the GH repo: "Is this open source? Today no, but the API will be open source to developers at GA."

I'm confused by what "the API" is, exactly. Is WebContainers a technology you plan to make available so others can run Node, in the browser, in their own apps?


> "Is this open source? Today no, but the API will be open source to developers at GA."

If they make it Open Source, Microsoft will add it to VScode and eat their lunch.

If they don't, developers might be afraid of lock in.

Best exit for them is to just grow users and not make a decision either way until they get acquired by Github (aka Microsoft) and folded into VScode.


> If they make it Open Source, Microsoft will add it to VScode and eat their lunch.

Put the code under the GPL so no one can use it to build proprietary software. Visual Studio Code contains proprietary components, so this would prevent that.


Hi, CTO of LeaningTech here. Congratulations on the highly polished demo. I think you might find this one we did months ago interesting: https://repl.leaningtech.com/?nodejs


Is node/v8 compiled to Wasm entirely, or are you using the browser's JS engine with polyfills for node's APIs? Or something in between?


Why do this at all?

With all the security and design problems of Node.js, when there are actually secure and reliable ways (harder ways, yes - security and reliability are hard) to do all of this, why?


You mention being able to use the debugger. How do I do this? The docs are very poor.

I don't see any way to set breakpoints. When I use a debugger statement and open the inspector, it breaks on transpiled/packaged code (it's different from the actual file). This is a deal breaker for me at present. Am I doing something wrong?

(Using the http server template)


Can you expand on the capabilities of the terminal? What sort of operations can be done from the terminal?

It looks like the shell is JSH. Is this an in-house shell? Is there any more information on it?


Could you point us to an introduction to how operating system concepts get mapped to browser concepts?

It sounds like each native binary gets compiled to a WebAssembly binary, but how do they communicate? How are the network and file system implemented? Are parts of the file system persistent? How does a system call work?


Great questions- we have some additional technical info in our core WG repo: https://github.com/stackblitz/webcontainer-core

We also did an hour long podcast a few months back that goes into pretty deep detail: https://www.youtube.com/watch?v=5F9qH-ea5Qk


How much of this is JavaScript specific and how much could be expanded to “any” programming language?


This is pretty cool stuff. I am wondering about another use case for this type of thing.

We have a CLI tool built with node.js. Could we build one of these containers with the CLI tool set up and expose just a terminal with access to the local file system?


You’re welcome :)

Any plan to work towards supporting Firefox & other browsers?


Yes definitely! Firefox compatibility is close to done and will ship soon. Waiting on Safari to ship some additional features. They don’t share timelines so it’s unclear at this point.


That’s great to hear! Looking forward to it.


Hi, can you share how it works under the hood?


Hey, very nice work!

Do you think that a docker.js running containers in the browser is something that could become a reality soon if we follow your line of work?


That isn't a blog post. That's an ad.

Can you link to the blog post where you actually discuss literally any of the technical stack in any depth further than one sentence of wow more pizzazz much fast?


I used to teach web development and was always frustrated with just how much effort went into setting up a machine. It would be really nice if a developer could just include a script tag in their project that set up a TypeScript compiler inside a service worker. I managed to get the basics down, even fixing relative import statements. Requests like localhost:3000/main.tsx would compile and cache on the fly.

There were a few native file system issues I encountered that this would be perfect for!
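The core of the service worker idea is surprisingly small; roughly (a sketch, assuming the TypeScript compiler bundle can be loaded into the worker; caching omitted):

    // sw.js -- rough sketch: compile .tsx requests on the fly.
    importScripts("https://unpkg.com/typescript/lib/typescript.js"); // defines global `ts` (illustrative CDN path)

    self.addEventListener("fetch", (event) => {
      const url = new URL(event.request.url);
      if (!url.pathname.endsWith(".tsx")) return;
      event.respondWith((async () => {
        const source = await (await fetch(event.request)).text();
        const { outputText } = ts.transpileModule(source, {
          compilerOptions: { module: ts.ModuleKind.ESNext, jsx: ts.JsxEmit.React },
        });
        return new Response(outputText, {
          headers: { "Content-Type": "application/javascript" },
        });
      })());
    });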


I wonder if that is the ideal path, though. I was an instructor for a few subjects at my university, and sometimes the biggest detractor for students was the lack of familiarity with the toolset used by the language.

For example, people sometimes did not know how to freeze their dependencies and use Python virtual environments when working with Python projects, which led to problems in collaboration. Handing students "magic" prefabricated environments could lead them to believe that that's all there is to it, and explaining that it is not, to a student hitting an edge case, might end up being less productive than understanding this process from the get-go.

You could say that the _effort_ to set up a development machine should be part of the learning experience; at least that's what I think.

I remember seeing my classmates getting their first freelance jobs and editing minified CSS/JavaScript directly because they did not know anything about the transpiling that goes on or the toolset surrounding JavaScript and web development.


It’s even funnier when those same devs who struggle with packaging and distribution in their best language go on to work at Fortune 500 companies and require constant devops assistance to actually make what they wrote work in even a test env. “But it works on my laptop!”


“Yes, but we are not shipping your laptop!”


I taught web development at a university last summer with CodeSandbox remotely.

Worked like a charm.



VS Code seems to allow this with Dev Containers. I can download a repo with a `.devcontainer.json` and be up and running in virtually no time, in a container.


Just wait till you try vscode devcontainers with GitHub's new Codespaces.


How did you end up doing this? Do you mind sharing?


Excuse my mess but feel free to poke around:

https://github.com/nirrius/my-server


Yes, web development is so hard...

sudo apt install apache2 php libapache2-mod-php; vim /var/www/html/index.php


This comment is hilarious. It's like saying "being a doctor is so hard... pulls out an amputation saw"


That's like saying that driving a car is easy. Foot on gas. Hands on wheel. Drive to destination.

Most things can be simplified to the raw essentials, doesn't mean you're going to achieve a result that people want or need.


I see you haven't done anything in 10 years


PHP is actually pretty great these days, you should have a look at what modern PHP has become if you’ve been out of the loop for a while. Have a look at https://phptherightway.com/ if you’re interested.


Cool, hello-world is easy, now give me a basic bar chart.


As far as I understand, the JS code is being run natively in the browser. Node has been hooked up to the browser's native engine, so Node's own V8 isn't involved.

At LeaningTech we have for several months been able to run nodejs, but also python and ruby, using our Wasm-based x86 VM (CheerpX); the demo is available here: https://repl.leaningtech.com/?nodejs

With our solution the full Linux x86 binary of nodejs is running including the whole of V8 and its JIT engine. Of course fully client side.


It looks like the difference is you went for full x86 system-call-level emulation to handle existing binaries, whereas StackBlitz decided to build a container for things like Node compiled directly to WASM with no static assumptions about a base system. The latter approach, with one less level of indirection, should be much more efficient, as is evident from the startup times of the two demos. Not to mention the ability to hook into things above the syscall level.


> We are working on CheerpX, but it’s not yet available generally. If you are curious and want to know more, stay tuned, or get in touch!

How close is CheerpX to getting (for example) an app that uses a PyTorch model running in the browser? :)


Not an expert on PyTorch in particular, but the Python + native (if any) components would work right now. If GPU acceleration is required, some way to bridge into WebGL or (much better) WebGPU would need to be engineered; certainly doable but not existing right now.


Except for figuring out PyTorch's BLAS, the rest should be pretty straightforward to use CPU-only. I'd love to use that (and would even pay) to compile some side projects into wasm apps in the browser :)


Wow, the PWA is really impressive. Honestly it's starting to get hard to tell the difference between this and Electron-based editors like VS Code. I'm really excited for integration with the File System Access API; that's the final puzzle piece I'd need to use something like this for serious work.


You can check out www.StackBlitz.com/local and give it a try! Still in Alpha, but should generally work for you :)


So it's 8 dollars for the astronaut plan. That's what I pay for Robux for my kids, per week. It's jaw-dropping to me how such advanced tech is made available at a single-digit price. Thanks for sharing.


> Honestly it's starting to get hard to tell the difference between this and Electron-based editors like VS Code

Since Electron is just Chromium, I'm not sure why you would expect a difference in the first place? The differences that previously existed between Electron & the Web haven't changed - Electron's only real purpose was to basically turn off the browser permission model & let the "web app" do whatever it wants. That's still true with WebContainers: it still has its hands & features tied to whatever Chrome & browsers in general are comfortable letting websites ask permission to do.


Me too and I was surprised to find that Chrome lists this as "ready" or "released" (I don't recall the exact term). I tried the example web based editor and found that the file system access API worked in Firefox too.

I'm working on a note taking project right now that I think would be great as a web app, but I'd like to store the result in a flat text file and the file system access API looks like it will let me do exactly that.


> ..found that the file system access API worked in Firefox too

For now, File System Access API seems to be available only in desktop Chromium browsers: Edge, Chrome, Opera. I believe Firefox has not implemented it yet.

https://caniuse.com/native-filesystem-api

---

Mozilla's current position on the specs:

> The ability to read and write from the filesystem is potentially very dangerous. We will need to carefully consider any solution in light of the security and privacy implications. We recognize that the spec authors take this issue seriously, but we are concerned that any solution will increase the risk of security incidents more than we are willing to tolerate.

> Right now, there isn't enough detail in the specification to make an assessment of these risks, so we will defer our decision until we have more information.

https://mozilla.github.io/standards-positions/#native-file-s...


They said it worked in Firefox...


I think very few people will realize the systemic effects that a demo like this entails. Very, very good work StackBlitz team!


Funny you say that. I'm usually really skeptical of the next big thing but I felt differently about this, just can't put my finger on it.


This really means a lot. I'm glad our work & writing came across as genuine.


Thank you! Our entire team has definitely averaged 3hrs of sleep every night the past week to make this launch happen :) Incredible how much work goes into launching something like this.


If their base is node, that implies we could start to see many Electron apps being ported to the browser


ELI5... why would you want to run an Electon app in the browser? Isn't that just a website at that point?

EDIT: Upon closer inspection, it looks like this is an online IDE meant to mimic a desktop environment. I don't know if there's any demand for that, but sure, why not. I'm sure someone will come up w/ a reason to run WebAssembly in a jQuery plugin, too.


I guess that, running in the browser, it doesn't need to be installed on the machine.

Or a desktop-only Electron App would be usable from a mobile phone browser etc.


Backwards compatibility with existing Electron apps I guess? Otherwise yeah, a native PWA would be better.


What "systemic effects" do you mean?


> Faster than your local environment. Builds complete up to 20% faster and package installs complete >= 5x faster than yarn/npm.

How? Isn't this using npm?


It's using a custom npm client we've built called Turbo. You can read more about the v1 version of Turbo here: https://medium.com/stackblitz-blog/introducing-turbo-5x-fast...

We haven't released any more info on Turbo v2 yet but will soon.


I was kinda hoping it was something crazy like nodejs with v8 engine compiled to wasm lol


Did you read the post? You might be pleasantly surprised :)


Can you please go into more detail as to what exactly is being compiled to wasm? The post does not describe it. In particular there seems to be a lot of confusion about whether the JS engine itself is running in wasm or not.


The largest wasm file I see downloaded on the website is 440K. Disassembling it, it seems to contain a bunch of libc-like filesystem code, and was built using the rust toolchain. So it doesn't look like v8 has been compiled to wasm here.

Rather, I assume they compiled a library or two, but otherwise they use the JS VM in the browser, and they've ported the Node.js runtime scripts to that environment.

That doesn't take anything away from the achievement here - it's really impressive! It's better to use the VM in the browser, if you can get those runtime scripts portable enough, which from other comments it seems like the answer is (or will be) yes.


This makes me want to quit webdev and never look back. Why the fuck are we recreating operating systems in the browser? We already have operating systems. We already have containers. We already have native code execution.


Because operating systems don‘t support seamless and secure distribution of cross platform software. Web browsers are currently the only secure sandbox we have.


Cross platform, right. So we ignore the perfectly adequate operating systems with their mature GUI frameworks, system calls, filesystems and programming language support, because having separate teams for different platforms is apparently too much, and instead we over-engineer the fuck out of web technologies (both front and back end) to the point people are estimating their RAM needs using the number of Chrome tabs they use.


Functionality is often largely independent of platform specific features. Spending limited resources on re-implementing stuff five times over can result in a worse experience for users than focusing on a single high quality cross-platform solution that is easy to distribute and keep up-to-date.

It completely depends on the specifics of the app. Some apps benefit from tight device integration. For others it's a waste of resources that could be spent on features.


Secure sandbox? Are you kidding?


What other platform allows arbitrary code execution like the Web does? We have billions of Internet users yet actual security issues are few and far between. It's remarkably safe and secure given the scale


Being safer than other platforms doesn't mean it's safe. And because Chrome is dominant, it's a huge target. And at the moment, browser access to the underlying infrastructure is limited.


>To be safer than other platforms doesn't mean it's safe

Not automatically, but it can mean that, and in my opinion it is true in this particular case. Safety is not defined as the complete absence of risk.


Lol it seems that these days, any new significant web technology makes people want to ragequit their job for some reason https://news.ycombinator.com/item?id=26960946


Thank you, this was the comment I was looking for, hah. Although I'm not quitting; I'm continuing to use the already awesome tools we get for free..


Why is this a problem? Why can't they try out new things?!


My problem with it is that most of the new stuff ends up in production and widely used for cargo-culting reasons.


For fun?

Dude, you live in such a black-and-white world


So... Two thoughts: this probably means we can run node in deno?

Second: looks like JS is catching up with Smalltalk. Just a shame the debugger is still rather crappy, and the browser/inspector only applies to a subset of widgets/controls (the HTML/CSS stuff, not the tab bar, address line, bookmarks menu etc.).


Web development is a big gigantic shit storm. Why can't we use a real statically typed language without this shitty JS overhead, generate a compiled binary and send that to users? I personally hate writing JavaScript


Have you not heard of WASM? And while it's not exactly what you're asking for, what about Typescript?


To demonstrate the potential, this needs a demo of a "middle-tier" service implemented in NodeJS. The service needs to talk to a MySQL database and expose a REST API to the web application.

If that works, it means many intranet applications can be serverless. Both the JavaScript app and the "middle-tier" run in the browser, and the middle-tier talks to the MySQL server, which is also on the intranet. All you need is a file server to serve the static JS files.


If the middle-tier and the app are both written in JavaScript, and you're developing them both simultaneously in the browser, does it make sense for them to be communicating via REST any more?

Wouldn't it make more sense for the app to send SQL prepared statements to the database (via a tiny standardised wrapper on the server to enforce per-user permissions based on the authenticated session)?

It probably depends on whether other apps are requesting data from the same database, in which case it might make sense to convert from the relational model to JSON objects in a tier which exposes the objects over REST.
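Concretely, the kind of wrapper I mean (a sketch; the /sql endpoint is hypothetical, and the server side would check the session and per-user grants before executing anything):

    // Browser side: send a parameterized statement, never string-concatenated SQL.
    const rows = await fetch("/sql", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      credentials: "include", // the wrapper ties the query to the authenticated session
      body: JSON.stringify({
        sql: "SELECT id, title FROM tickets WHERE status = ?",
        params: ["open"],
      }),
    }).then((r) => r.json());
    console.log(rows);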


> via a tiny standardised wrapper on the server

That defeats the point of not having a server.

The middle-tier should be written in Nodejs because it has drivers for relational databases such as MySQL.


Wow. How? Node.JS needs a "full-blown" OS to compile and run... right? And that's not something you could emulate efficiently using wasm? Right?...?


Great question :) We have some additional technical info in our core WG repo: https://github.com/stackblitz/webcontainer-core

We also did an hour long podcast a few months back that goes into pretty deep detail: https://www.youtube.com/watch?v=5F9qH-ea5Qk


node.js, like any other native Linux/Windows/Mac application, talks to the OS via a system API. You can emulate that API for WASM code, mapping I/O to browser APIs, which is why they mention the network is implemented via service workers, for example. File I/O can be mapped to IndexedDB... threads can be mapped to web workers... and so on. (Toy sketch after the links below.)

emscripten[1] did this a long time ago for C, which is how a bunch of native applications have already been ported to run on the browser.

webcontainers[2] seem to do a similar thing but focused on exposing the browser API to the native apps in a way that integrates well with the JS environment.

[1] https://emscripten.org/

[2] https://github.com/stackblitz/webcontainer-core
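A toy sketch of the idea (not the real WebContainer internals): the "OS" the runtime sees is just a JS object backed by in-memory or browser storage.

    // Toy sketch -- a fake "filesystem" a wasm-hosted runtime could call instead of real syscalls.
    const files = new Map([["/app/index.js", "console.log('hi')"]]);

    const syscalls = {
      read_file(path) {
        if (!files.has(path)) throw new Error("ENOENT: " + path);
        return files.get(path);
      },
      write_file(path, contents) {
        files.set(path, contents); // could persist to IndexedDB instead
      },
    };

    // `syscalls` gets injected into the runtime (e.g. via a wasm import object),
    // so an fs.readFileSync() ends up here instead of at a real OS.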


People have made Linux run on the browser, so... Although I don't think this is what they did here.

Man, yet another layer between code and the CPU makes me uncomfortable. Imagine finding a bug and going through 20 layers to troubleshoot it.

Also: https://www.theregister.com/2017/03/23/cursor_devours_cpu_cy...


"You want it to be a mini operating system"

I don't think this is universally true.


Isn't it awful? Now each smart ass web developer will start sending down scripts with nodejs to enable 'rich' user experience


I think that wouldn't be a problem, as long as people get used to containerizing what they write


Well, it's called WebContainers. One would hope everything is containerised


Oh no! The user experience must surely not be enriched! Damn those smart ass web developers who aren't like the rest of us that care about code quality and engineering principles!


This is super cool! I think pairing this with the upcoming native support for modules and a CDN like skypack could lead to a truly powerful browser development experience.

No bundlers and a full IDE? Maybe this could even mean that there's a chance for a some of the 'just fiddle with the page' experience to come back to the web! (/rainbows and unicorns)


If it really runs inside the browser: how did they implement adapters for networking and crypto/TLS?

I mean, a console is nice and all, but I'm highly sceptical as to how the filesystem and networking work, because fetch cannot be used to get dgram or tls/net sockets. And that is literally the primary reason nodejs exists.


Part of the answer to your question:

> WebContainers include a virtualized TCP network stack that's mapped to your browser's ServiceWorker API, enabling you to instantly create live Node.js servers on-demand that continue to work even when you go offline.
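Presumably something along these lines (a toy sketch, not their actual code): the service worker intercepts requests to the preview origin and answers them from the in-page "server".

    // sw.js -- toy sketch: requests to the preview origin never hit the network.
    self.addEventListener("fetch", (event) => {
      const url = new URL(event.request.url);
      if (url.hostname.endsWith(".local.webcontainer.io")) {
        // handleVirtualRequest() is hypothetical: it would forward the request
        // (via postMessage/BroadcastChannel) to the tab running the Node code
        // and build a Response from whatever that "server" answers.
        event.respondWith(handleVirtualRequest(event.request));
      }
    });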


I was trying to refer to the simulated iframe that's the live preview... which probably uses the service worker's responses to emulate the behaviour rather than, say, a real end-to-end HTTP server pipeline.

This sounds very much like http module injection, so that the nodejs module isn't really doing what it thinks it does, but is using an injected API behind the scenes.

Raw TCP is only available for Chrome Extensions, on Chrome OS, iirc.

(Setting aside the lack of TLS or a real crypto module that isn't using the Browser's Web Crypto API approach)


John Blow is going to lose his mind at this.


Title needs fixing. This is chrome only.


For now. It works in Firefox today (minor issues keeping it feature flagged for now) and Safari is close to shipping WASM Threads, so this will likely work on all major browsers by EOY.


I'm trying to get my head around how the networking of it works. Is local.webcontainer.io being overridden somehow on the browser like a hosts file might do? Trying to figure out how I'm hitting that URL from another tab


There is a service worker registered on the domain name, for example https://nextjs-agaw8i--3000.local.webcontainer.io/; when you visit this URL in another tab, it basically shares the service worker from the StackBlitz tab.


Must be some sort of tunneling, because it works outside the browser too


If by "works" you mean "serves up an error page". It's a standard domain, not real localhost, and it needs the service worker from the editor page running to work properly.


What about this is "native"? The page says it is "running natively" but also says "All code execution happens inside the browser's security sandbox".

Is wasm in a browser somehow considered native now?


I interpreted it as “native to the browser”, which is a bit silly when you think about it, but makes sense when opposed to “running remotely”.


Isn't the opposite to "running remotely" "running locally"? Meanwhile if pressed for the opposite of "running native" I would answer "running virtualized".


The web devs where I work have started calling pretty much everything "native" as long as it interacts with the host in some way since people assume it means fast.


WASM can be compiled to native code.


Tried this out with a basic set of React deps in package.json

Reloading now spends 10+s "installing dependencies" - pretty quick, and the UI is still interactive.

Also, Chrome has two "Google Chrome helper" processes each sitting at 550MB+ RSS. I'm not sure if "no node_modules black hole on disk" is quite as compelling if the black hole has merely been relocated to RAM.


Does this mean we can now have an SOA in the browser? True separation of concerns: render code deals with writing to the DOM; when data is needed, it calls a node API proxy server which in turn interfaces with REST/gRPC/GraphQL servers; then there's also an auth service, and a local (literally local) cache service. All in the browser. Damn.


This is a very cool project!

I noticed that outgoing HTTP requests don't seem to work: there's no `curl`, and after installing a CLI tool I work on via npm (which worked like a charm!), the outgoing API calls it makes were blocked. I assume that's intentional at this stage. Is that changing in the future?


Their repo implies that network requests are subject to the same-origin policy, since it's in a browser after all. If you open up devtools you will probably see complaints along those lines. In short: the endpoints you request need CORS enabled.


I assume you still can't use UDP or bind to TCP sockets that are accessible outside of the browser?


Quoting https://github.com/stackblitz/webcontainer-core: > We're limited by the browser's ability to make network requests, so connecting to processes like MongoDB, Redis, PostgreSQL, etc. are not currently possible



Does this mean we can use sqlite natively on the browser now?


SQLite has been ported to JS (and wasm) for some time; a minimal usage sketch follows the links below.

https://sql.js.org/#/

https://github.com/sql-js/sql.js/
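Roughly, per the sql.js docs (the CDN path is the one their examples use; adjust to wherever you host the wasm):

    // sql.js: SQLite compiled to wasm, running entirely in the page.
    const SQL = await initSqlJs({
      locateFile: (file) => `https://sql.js.org/dist/${file}`,
    });
    const db = new SQL.Database();
    db.run("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)");
    db.run("INSERT INTO notes (body) VALUES (?)", ["hello from the browser"]);
    console.log(db.exec("SELECT * FROM notes"));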


Yes. One of our engineers has gotten sqlite to run natively in WebContainer. Will be releasing as OSS in the next few weeks!


Great! Are there any tickets on this we can follow?


That's pure orgasm. Thank you for this!



Far out! But. Is there no end to our collective madness!


I'd be super interested in seeing how much of this is really wasmified. I'd love to see if this could get running in a DFINITY Canister.


I am definitely going to try this first thing tomorrow


Someday, I'd like to see Wasi-based webassembly containers. It'd be awesome to spin up a sandboxed shell with file access in a webpage.


You can actually do this with WebContainers today :) Try it out here: https://stackblitz.com/fork/node

Full WASI support in WebContainer is landing in the next 1-2 months.


why do you want to spin up a shell with file access in a webpage?


So you can open up your local project directory in your browser-based IDE and have it work like you expect.


My wish was actually to be able to spin up a shell locally on a github repo that pulls stuff over lazily.


totally lost newbie here: but what is the difference between running javascript on the browser and running node.js on the browser?


This is going full inception mode. Nodejs is compiled to wasm (like assembly for your browser), and then loaded inside the browser, which can then run javascript.

So a full JS engine is loaded, completely separate from the built-in one.


node uses v8. Is the idea that v8 is being compiled to wasm? That seems implausible though it would be amazing if true. Or is there another engine?

My assumption was that the node APIs are simply being exposed to v8.


yeah that's what I gathered: javascript already runs on the browser, then we got node.js because people liked javascript and wanted to run it natively, and then finally we got this which allows us to run node.js on the browser again? it looks to me like we've gone full circle with extra overhead :D but I guess people have reasons to run node.js on the browser and not javascript so idk


I guess the idea is to access some APIs that are not present in the browser, like fs. This was mentioned in a question to StackBlitz's CEO, but he seems to be an AI that just pastes the same text over and over again.


> I guess the idea is to access some Apis that are not present in the browser, like fs.

That's why electron exists. But this can't do that, it's still limited by the APIs that are present in the browser. It's seemingly just full-overhead for... some... reason?


Basically, node exposes all these other APIs, like reading a file from disk. The browser also has APIs, like detecting when the page is fully loaded, that wouldn't be available in Node. What JavaScript syntax is supported can also vary between Node and browsers.


WASM cannot magically load from disk, there is browser protection, right?


It can't access the file system, but it could implement an API that emulates a file system inside the browser sandbox. Presumably that's what they're doing here.


But you can do that same API emulation in regular browser JavaScript, too. What's the WASM intermediate buying here?


I think it is more complicated to emulate those APIs in a normal browser environment than to emulate some OS APIs for running node.js.


Why? It's literally JavaScript on both sides - what's complicated here?


Because many node.js APIs are written in C/C++, it might be easier to implement OS features instead of rewriting them in JS. This is just my assumption.


Server-to-server code instead of client-to-server code, exposing a DB API to the "client"... lots to think about.


I will certainly try this, but besides the debugger connection it seems more like a PR stunt, to be honest.


Alright, so you can create an IDE in the browser, but, what else? I'm just not seeing the value in running servers in your browser. Presumably, most sites will still need to connect to a remote back-end services to get at data and whatnot, and that back-end isn't going to be running locally on a customer's web browser.


I don't know about your dev environment, but in mine we run server code separate from web code. So you could toggle your local WebContainer to access either localhost:3001 or a remote branch, staging or prod environment for whatever server you wanted to test against.


Are there any demo applications? What is memory usage like for a simple application?


Yep- if you go to https://stackblitz.com you can start a Next.js, GraphQL, etc dev environment in one click.


... in chrome.


I'm not sure what the complaint is here. Isn't it something like 90% of the browser market is chromium-based, now that edge is on the bandwagon?

Supporting chrome/chromium doesn't seem like a bad place to start with a prototype with that kind of potential adoption.


You are right. In Firefox I get "StackBlitz v2 Beta currently works in Chrome and Chromium-based browsers. We’re hoping to add support for more browsers as they implement the necessary Web Platform features."


Mind-blowing. The env spins up fast.

Maybe one day, smartphones and tablets will just be browsers in hand.


Do I, as a user, have any possibility of protection against things like ads or cookie/tracking abuse with this? I'm really ignorant about this, so I don't even know if I'm asking the right questions.

It looks just like a canvas; there's "no" DOM.


The future is going to be wild. You could bundle V8 compiled to WASM with a graphics toolkit that uses a syscall-like interface to access a canvas backed by WebGPU?

At that point you could replace the browser with ZINE (Zine is not Electron) and run webapps natively, just like WINE works.


I dig Web IDEs. I have been using Cloud9 (AWS) for the past 5-6 years.


"Your scientists", etc


It's stuck on "Running start command" for me.

Anyone having the same issue?


Can we please call it NodeJS and stop calling it Node.js? It's not a .js file, and it's not written in JavaScript.


My personal pet peeve is golang. No language is named like that. The language name is Go. Calling it golang because of the domain name of the project is like saying typescriptlang instead of TypeScript.


Names don't have any such requirements - but we should use them correctly. You don't rename Windows even though it's not a hole in the wall stuffed with glass.


It's just irksome. I don't think Windows is a good analogy.

More like if I called the OS "Windows.jpg".

If you're going to use something that looks like a filename as a product name, then make sure that file actually exists and is in the expected format.

D3.js --> good

Angular.js --> good

Node.js --> bad

python3.py --> bad

Windows.jpg --> bad

Windows.exe --> okay (I don't use windows but I assume there's something of the sort)


I agree it's not a great name - but IMHO using names incorrectly is even worse than bad naming.


If you are going to get this pedantic, then why not also include *.com as bad, because that was a file type in DOS.


JavaScript is not a scripting language for Java.


This is a really bad idea.

Building on the really good work of the WASM folks, running Node.js, with all its design problems, over the top of it.

I am forever astounded anew by the hubris and naivety of the Node.js crew. It is an example of "It is easier to write than read, easier to talk than listen, easier to build than design"


You didn't say anything substantial here about why it's a bad idea. "Design problems." What design problems do you have in mind?


Yawn.

It isn't open source. It has a name, WebContainers, that implies it's based on the web, but it's designed around Node.js which isn't built on browser technologies as much as something like Skypack or Deno.

It also seems like it's going to be a memory, disk, and CPU hog, and that it's going to be pretty complex. I like where Skypack is headed and this seems like the opposite direction.

It's pretty cool but it seems like they're trying to create a lot of hype around it.


> I like where Skypack is headed and this seems like the opposite direction.

I did a quick google search because I had never heard of Skypack.

Skypack seems to be a far cry from what WebContainers is claiming to do. Like, they're completely different products. I guess I don't understand why you're even comparing the two.


I think he means deno+skypack


Soothe your heart



