Everyone thought WASM would enable developers to write code in any language they wanted (with type checking and higher performance) and deploy it on the web.
JS Developers come in: so we can compile node.js to WASM and run that in the browser?! Yay!! Now we can have backend JS running in the browser alongside browser JS!
Jokes aside: this is actually quite interesting, and the demo is very impressive (except for the "only works in Chromium-based browsers" message on the Next.js demo where the preview should be, despite Mozilla being one of the main WASM backers and Firefox having arguably the best WASM engine).
> Everyone thought WASM would enable developers to write code in any language they wanted (with type checking and higher performance) and deploy it on the web.
> WebAssembly is a different language from JavaScript, but it is not intended as a replacement. Instead, it is designed to complement and work alongside JavaScript, allowing web developers to take advantage of both languages' strong points
"Not intended as a replacement"
WASM is a math coprocessor for JS. WASM can't even talk to the DOM API directly (which is the API you use to build web pages in JS); you have to write JavaScript glue code for any/all I/O in WASM.
For existing code bases that don’t already do DOM manipulation (so, all of them), other parts of the Web API might be more useful, like the canvas, the video API, the RTC API, etc.
It is true that the original MVP was very barebones and could be compared to assembly. However, the reference types proposal has shipped in all major browsers and is considered "finished".
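To make the "JS glue" point above concrete, here's a minimal sketch (the module URL, import name, and exported run() are all invented for illustration): the WASM module has no I/O of its own, so every host interaction is a JS function handed in as an import.

```js
// Hypothetical glue: the only way this module can "print" is by calling
// an imported JS function; WASM itself has no console, DOM, or sockets.
const imports = {
  env: {
    log_int: (value) => console.log('wasm says:', value),
  },
};

const { instance } = await WebAssembly.instantiateStreaming(
  fetch('module.wasm'), // hypothetical module URL
  imports
);
instance.exports.run(); // any output from run() bounces through log_int
```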
>JS Developers come in: so we can compile node.js to WASM and run that in the browser?! Yay!! Now we can have backend JS running in the browser alongside browser JS!
I remember back when SOAP still had the upper hand over REST (but just barely), the W3C published a document showing how you could implement HTTP over SOAP, basically to mass ridicule. Unfortunately I can't find it anymore, but it was pretty funny.
Web Services Transfer (WS-Transfer). This specification describes a general SOAP-based protocol for accessing XML representations of Web service-based resources. https://www.w3.org/Submission/WS-Transfer/
Web Services Resource Transfer (WS-RT). This specification defines extensions to WS-Transfer. While its initial design focuses on management resource access its use is not necessarily limited to those situations. https://www.w3.org/Submission/WSRT/
Typically when you see stuff like this be Chromium-only, it's because Chromium has a bunch of non-standard/one-vendor-only stuff enabled in production builds and neither Firefox or Safari have it enabled (or implemented at all). A few examples include Bluetooth, MIDI and USB.
I don't understand why you are downvoted. Filesystem access is another big one! Chrome has the technological advantage and sports an enormous kitchen sink of features. One has to make a hard decision: build something that works in only one browser, try to cut corners with graceful (or crippling) degradation, or wait potentially a long time until a spec is standardized and implemented everywhere. The latter may mean others hit the market sooner, so it is understandable why we end up with web apps that function in a single browser only. Hopefully this improves in the future.
Browser Javascript suffers from its build system not producing a single "binary" as output. You do get one bundle, but none of the parts know anything about each other, which limits the ability to optimize. (It is a lot more like the C preprocessor than a C compiler.) Methods that can't possibly ever be called end up being sent to each user, for no reason other than "well, someone could write an eval() that needs that". (I was surprised how poorly tree-shaking works in practice, and how even "light" frameworks like Svelte ship a huge runtime to every user, unrelated to what features the code actually uses.)
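As a hedged illustration of why tree shaking is so conservative (the names here are invented): any dynamic lookup, and eval() in particular, hides the call graph from static analysis, so the bundler has to keep everything reachable.

```js
// The bundler cannot prove neverCalled() is dead, because the property
// name is computed at runtime; eval() has the same effect, only worse.
const utils = {
  used() { return 1; },
  neverCalled() { return 2; }, // shipped to every user anyway
};

const method = window.location.hash.slice(1) || 'used';
utils[method](); // which method? only known at runtime
```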
Meanwhile, WebAssembly suffers from the lack of a good programming language that compiles to it natively. I think the Go support is great, but there's no library support for building the webapps that people get paid to build (React, Apollo, etc. all exist for a reason, however much you hate that reason). AssemblyScript was supposed to be this language, but it's just a Typescript-inspired programming language; it's not Typescript and the npm ecosystem.
I don't know what the details of this implementation are, but the direction I'd like to see web development move is having a statically typed language that compiles to VM bytecode. This sounds like the first step for removing Javascript as a Typescript intermediary. Once that happens, then modern programming language features can be bolted on, people can add good compiler optimizations that result in minimal bytes output for users to download, the module system can be made stricter (like "go mod"), etc. Javascript is just a little bit more dynamic than anyone really wants, and it makes the build/deployment tooling complicated and, frankly, kind of bad.
Tree shaking works poorly, but is also thankfully not that important, unless you're terrible with managing dependencies.
Why? Because the initial parse is relatively cheap. Shipping everything keeps all of the code available, even if a stray eval() calls something unexpected.
The real resource expense comes when it's time to JIT a hotspot. But a hotspot is by definition code you use for sure. And code you use a lot.
JS engines use profile-guided JITs. Profiling lets the JIT see how code actually runs in practice, and compile THAT to machine code. How the source is organized on the file system is entirely irrelevant.
So basically, code that isn't hot is interpreted, and code that is hot gets JIT-ted (and code that's very hot gets JIT-ted at a higher optimization tier).
Interpreting is the new "tree shaking": it saves the compiler a lot of work, but unlike bad tree shaking, it won't crash your app.
That's a really good point. What we wanted all along was the JVM with a DOM API.
We started with Java. It didn't gain any traction for browser scripting. They took "Java" out and called it JavaScript. Now we take the "script" out and call it WebAssembly. If someone in 1995 had made Java a DOM-editing thing instead of an applet thing, we could have saved 25 years of running around in circles :)
(Going back even further, we all connected to a mainframe with a dumb terminal. The desktop revolution happened... and now we're back to dumb terminals attached to a mainframe. But we call it "The Cloud" instead, and the dumb terminal fits in your pocket.)
Maybe the past wasn't as dumb as we think it was. We just weren't smart enough to understand it at the time.
Hey, applets could manipulate JavaScript data/the DOM. There is a bit of bark to the 'Java script' bite. It was very slow, though: each read or write of a value went through the JVM process and over to the browser process (and the plugin that did this was, I think, in its own process; memory is hazy there).
Overall it's hard to say how Sun could have won here. Perhaps by acquiring Macromedia and using Flash technology to make a browser that ran those cool portals out of the box, and hey, scripted first-class in Java too btw ;-)
And we’ve increased latency an order of magnitude each time. We have computers that are orders of magnitude faster, yet using a spreadsheet in a browser today is less responsive than it was in DOS decades ago. But the developer experience is so much better now, so it’s worth it?
> Whether that language should be Scheme was an open question, but Scheme was the bait I went for in joining Netscape. Previously, at SGI, Nick Thompson had turned me on to SICP.
> ..The diktat from upper engineering management was that the language must “look like Java”. That ruled out Perl, Python, and Tcl, along with Scheme.
I think the issue isn't with Wasm support; it's the filesystem API. They could've included a polyfill, though... but as that API is new, one might not be available yet.
It's the "File System Access API," the one that allows the browser to read/write files on your actual computer. It's an I/O feature, impossible to polyfill.
The similarly named "FileSystem API" gives you access to a sandboxed "virtual drive" in the browser. Firefox has supported that for years, and polyfills are possible, but it's irrelevant in this case.
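For reference, using the File System Access API looks roughly like this (Chromium-only at the time of writing, and it must be triggered by a user gesture on a secure origin):

```js
// Ask the user to pick a real file on disk, read it, then overwrite it.
const [handle] = await window.showOpenFilePicker();
const file = await handle.getFile();
console.log(await file.text());

const writable = await handle.createWritable();
await writable.write('updated contents');
await writable.close(); // the change lands on the real filesystem here
```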
"StackBlitz v2 Beta currently works in Chrome and Chromium-based browsers. We’re hoping to add support for more browsers as they implement the necessary Web Platform features."
Pretty useless, then. What features are missing? As far as I know, Firefox was one of the major contributors to pushing WebAssembly to the web, and it has a superior WASM engine in many respects.
I'd be more interested in a post about what WebAssembly features are missing than an announcement that someone hacked Javascript into a browser.
It works in Firefox today (minor issues keeping it feature flagged for now) and Safari is close to shipping WASM Threads, so this will likely work on all major browsers by EOY.
Many of the more "advanced APIs" are unlikely to be shipped in either Firefox or Safari because of security and privacy concerns.
For example, Chrome released File System Access by default in Chrome 86, on October 6, 2020.
And yet, as late as August 2020 they were saying that they'd just got the spec in shape: https://github.com/mozilla/standards-positions/issues/154#is.... And there are still unresolved issues around actual security of the thing (that Chrome happily ignores).
If business PCs are running Microsoft Teams + Outlook PWAs on Edge (Chromium-based), and the majority of personal PCs are running Chrome/Chromium... I can't see why Apple wouldn't come to a compromise and treat PWAs with respect.
It's not just about privacy of web APIs...it's about Apple's chokehold on iPhone users.
> These are mostly a to-do list...which makes sense at this stage
Which stage is it? Chrome is already shipping this, enabled by default.
And the "todo list" covers important issues around permissions, restricting file system access we etc.
> Do you think that Apple or Mozilla can stop webapps from taking their rightful place?
I think that Chrome plays fast and loose with web standards, and has essentially replaced web standards with Chrome-designed and Chrome-specific APIs, while fully ignoring any concerns from other browsers.
The stage where a majority of professional web devs (y'know, the people who aren't here!) still haven't even heard of, or fully assimilated, the term "PWA" yet.
> the "todo list" covers important issues around permissions, restricting file system access we etc.
They've proven very capable of managing this in AOSP.
> I think that Chrome plays fast and loose with web standards, and has essentially replaced web standards with Chrome-designed and Chrome-specific APIs, while fully ignoring any concerns from other browsers.
"Chrome-specific" implies DRM etc which I don't find fair here, as a very regular Google critic.
If the concerns are always "We can't keep up", I don't blame Chromium team for pushing on ahead, and letting other browsers catch up.
Frankly: Mozilla isn't leading the way on web standards anymore. (And they're not doing as good a job with security+privacy, really, either, compared to properly configured Chromium.)
> (And they're [Mozilla] not doing as good a job with security+privacy, really, either, compared to properly configured Chromium.)
Anecdotally, I feel that the quickly growing number of websites that complain that Firefox is an "ad blocker" with an unconfigured, out of the box default install seems to imply Mozilla is doing a great job with respect to security+privacy.
> Mozilla isn't leading the way on web standards anymore.
All the way back to the IE3-6 era, Mozilla has never been about "leading" web standards. IE3-6 led web standards. Mozilla's role in web standards has almost always been about curtailing the excesses of whoever is currently leading, and keeping standards fair, safe, "quality" over quantity.
It's not Mozilla's job to keep up with ~IE6~ Chromium, it's their job to keep Google from spinning web standards out of control. To keep Chromium from being IE6 2.0. To keep Google from just defining all the standards in ways that make Google happy but maybe aren't great long term for the health/safety of the web. It's Mozilla's job to slow Chromium down, and the fact that people are complaining "Firefox isn't keeping up" with Chrome's non-standards and in general that they are losing that battle in mainstream browser usage isn't at all great for the web. (The last time something like this happened was IE6. Whether or not Chromium is the "New IE6" will take a while to play out, but we're deep into the IE6 playbook at this point and anyone not seeing parallels either doesn't remember the early years of IE6 well or has too high of an opinion of Google.)
Safari and Mozilla: here are the security concerns, here are privacy concerns, here are concerns that the "specs" reflect internal APIs and are not actual specs.
> Faster than your local environment. Builds complete up to 20% faster and package installs complete >= 5x faster than yarn/npm.
Above is a quote from the post. I feel like this is a stupid question, but how can running yarn/npm in a browser on my machine be faster than running yarn/npm directly on my machine? Particularly when each page load runs a fresh npm/yarn install?
Nonetheless, this is an incredible piece of software that I'll be following closely.
I've also asked myself this question.
I think the comparison is not with a local environment but rather with running node/yarn/npm on a remote virtual machine/ci. This is my hypothesis, I have no proof that's what they meant.
I'm wondering if there is more happening in-memory: even if the project files and modules themselves are on disk (accessed via the FileSystem API), the vscode/node/other bits, including temporary files, may be held in RAM once initially loaded and never swapped out.
If that is the reason, then if your machine becomes memory constrained, the performance will drop through the floor quicker than with full-local installs.
It could be a mix of this and your suggestion (fewer network-latency bottlenecks than with a non-local deployment).
I'm fascinated in learning more about how you run a web server with this - the article says "WebContainers include a virtualized TCP network stack that's mapped to your browser's ServiceWorker API" but I'd love to understand a bit more about how that works.
Since this is all done with WebAssembly, are you planning on targeting other stacks, such as Python? The amount of time I lose helping other people get their Python development environments working remains horrifying.
First, congratulations! This is an astonishing advancement.
I found this on the GH repo:
"Is this open source? Today no, but the API will be open source to developers at GA."
I'm confused by what "the API" is, exactly. Is WebContainers a technology you plan to make available for others to use node, in the browser, in their own apps?
> If they make it Open Source, Microsoft will add it to VScode and eat their lunch.
Put the code under the GPL so no one can use it to build proprietary software. Visual Studio Code contains proprietary components, so this would prevent that.
Hi, CTO of LeaningTech here. Congratulations on the highly polished demo. I think you might find this one we did months ago interesting: https://repl.leaningtech.com/?nodejs
With all the security and design problems of Node.js, when there are actually secure and reliable ways (harder ways, yes; security and reliability are hard) to do all of this: why?
You mention being able to use the debugger. How do I do this? The docs are very poor.
I don't see any way to set breakpoints. When I use a debugger statement and open the inspector, it breaks on transpiled/packaged code (it's different from the actual file). This is a deal breaker for me at present. Am I doing something wrong?
Could you point us to an introduction to how operating system concepts get mapped to browser concepts?
It sounds like each native binary gets compiled to a WebAssembly binary, but how do they communicate? How are the network and file system implemented? Are parts of the file system persistent? How does a system call work?
This is pretty cool stuff.
I am wondering about another use case for this type of thing.
We have a CLI tool built with node.js; could we build one of these containers with the CLI tool set up and expose just a terminal with access to the local file system?
Yes definitely! Firefox compatibility is close to done and will ship soon. Waiting on Safari to ship some additional features. They don’t share timelines so it’s unclear at this point.
Can you link to a blog post where you actually discuss any of the technical stack in more depth than one sentence of "wow more pizzazz much fast"?
I used to teach web development and was always frustrated by just how much effort went into setting up a machine. It would be really nice if a developer could just include a script tag in their project that set up a TypeScript compiler inside a service worker. I managed to get the basics down, even fixing relative import statements. Requests like localhost:3000/main.tsx would compile and cache on the fly.
There were a few native file system issues I encountered that this would be perfect for!
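A rough sketch of that service-worker transpiler idea, assuming the TypeScript compiler is loaded from a CDN (the URL and compiler options are just one plausible setup, not a reference implementation):

```js
// sw.js: intercept .ts/.tsx requests and serve transpiled JavaScript.
importScripts('https://unpkg.com/typescript/lib/typescript.js');

self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);
  if (!/\.tsx?$/.test(url.pathname)) return; // let everything else through

  event.respondWith(
    fetch(event.request)
      .then((res) => res.text())
      .then((source) => {
        const { outputText } = ts.transpileModule(source, {
          compilerOptions: {
            jsx: ts.JsxEmit.React,
            module: ts.ModuleKind.ESNext,
          },
        });
        return new Response(outputText, {
          headers: { 'Content-Type': 'application/javascript' },
        });
      })
  );
});
```

Caching the compiled output (e.g. in the Cache API) is what would make repeat loads fast; it's omitted here for brevity.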
I wonder if that is the ideal path, though. I was an instructor for a few subjects at my university, and sometimes the biggest obstacle for students was a lack of familiarity with the toolset used by the language.
For example, people sometimes did not know how to freeze their dependencies or use virtual environments when working with Python projects, which led to problems in collaboration. Handing students "magic" prefabricated environments could lead them to believe that's all there is to it, and explaining that it is not, when a student hits an edge case, might end up being less productive than understanding the process from the get-go.
You could say that the _effort_ to set up a development machine should be part of the learning experience; at least that's what I think.
I remember seeing my classmates having their first freelance jobs and editing minified css/javascript directly because they did not know anything about the transpiling that goes on or the toolset surrounding javascript and web development
It’s even funnier when those same devs who struggle with packaging and distribution in their best language go on to work at Fortune 500 companies and require constant devops assistance to actually make what they wrote work in even a test env. “But it works on my laptop!”
VS Code seems to allow this with Dev Containers. I can download a repo with a `.devcontainer.json` and be up and running in virtually no time, in a container.
PHP is actually pretty great these days, you should have a look at what modern PHP has become if you’ve been out of the loop for a while. Have a look at https://phptherightway.com/ if you’re interested.
As far as I understand, the JS code is being run natively by the browser's own engine. Node has been hooked up to that built-in engine, so no separate (WASM-compiled) V8 is involved.
At LeaningTech we have for several months been able to run Node.js, but also Python and Ruby, using our WASM-based x86 VM (CheerpX); the demo is available here: https://repl.leaningtech.com/?nodejs
With our solution, the full Linux x86 binary of Node.js runs, including the whole of V8 and its JIT engine, fully client side of course.
It looks like the difference is you went for full x86 system-call-level emulation to handle existing binaries, whereas StackBlitz decided to build a container for things like Node compiled directly to WASM, with no static assumptions about a base system. The latter approach, with one less level of indirection, should be much more efficient, as is evident from the startup times of the two demos. Not to mention the ability to hook into things above the syscall level.
I'm not an expert on PyTorch in particular, but the Python + native (if any) components would work right now. If GPU acceleration is required, some way to bridge into WebGL or (much better) WebGPU would need to be engineered; certainly doable, but not existing right now.
Except for figuring out PyTorch's BLAS, the rest should be pretty straightforward to use CPU-only. I'd love to use that (and would even pay) to compile some side projects into WASM apps in the browser :)
Wow, the PWA is really impressive. Honestly it's starting to get hard to tell the difference between this and Electron-based editors like VS Code. I'm really excited for integration with the File System Access API; that's the final puzzle piece I'd need to use something like this for serious work.
So it's 8 dollars for the Astronaut plan. That's what I pay in Robux for my kids, per week. It's jaw-dropping to me that such advanced tech is made available at a single-digit price. Thanks for sharing.
> Honestly it's starting to get hard to tell the difference between this and Electron-based editors like VS Code
Since Electron is just Chromium, I'm not sure why you would expect a difference in the first place. The differences that previously existed between Electron and the Web haven't changed: Electron's only real purpose was basically to turn off the browser permission model and let the "web app" do whatever it wants. That's still true with WebContainers; it still has its hands and features tied to whatever Chrome, and browsers in general, are comfortable letting websites ask permission to do.
Me too and I was surprised to find that Chrome lists this as "ready" or "released" (I don't recall the exact term). I tried the example web based editor and found that the file system access API worked in Firefox too.
I'm working on a note taking project right now that I think would be great as a web app, but I'd like to store the result in a flat text file and the file system access API looks like it will let me do exactly that.
> ..found that the file system access API worked in Firefox too
For now, File System Access API seems to be available only in desktop Chromium browsers: Edge, Chrome, Opera. I believe Firefox has not implemented it yet.
> The ability to read and write from the filesystem is potentially very dangerous. We will need to carefully consider any solution in light of the security and privacy implications. We recognize that the spec authors take this issue seriously, but we are concerned that any solution will increase the risk of security incidents more than we are willing to tolerate.
> Right now, there isn't enough detail in the specification to make an assessment of these risks, so we will defer our decision until we have more information.
Thank you! Our entire team has definitely averaged 3hrs of sleep every night the past week to make this launch happen :) Incredible how much work goes into launching something like this.
ELI5... why would you want to run an Electron app in the browser? Isn't that just a website at that point?
EDIT: Upon closer inspection, it looks like this is an online IDE meant to mimic a desktop environment. I don't know if there's any demand for that, but sure, why not. I'm sure someone will come up w/ a reason to run WebAssembly in a jQuery plugin, too.
Can you please go into more detail as to what exactly is being compiled to wasm? The post does not describe it. In particular there seems to be a lot of confusion about whether the JS engine itself is running in wasm or not.
The largest wasm file I see downloaded on the website is 440K. Disassembling it, it seems to contain a bunch of libc-like filesystem code, and was built using the rust toolchain. So it doesn't look like v8 has been compiled to wasm here.
Rather, I assume they compiled a library or two, but otherwise they use the JS VM in the browser, and they've ported the Node.js runtime scripts to that environment.
That doesn't take anything away from the achievement here - it's really impressive! It's better to use the VM in the browser, if you can get those runtime scripts portable enough, which from other comments it seems like the answer is (or will be) yes.
This makes me want to quit webdev and never look back. Why the fuck are we recreating operating systems in the browser? We already have operating systems. We already have containers. We already have native code execution.
Because operating systems don't support seamless and secure distribution of cross-platform software. Web browsers are currently the only secure sandbox we have.
Cross platform, right. So we ignore the perfectly adequate operating systems with their mature GUI frameworks, system calls, filesystems and programming language support, because having separate teams for different platforms is apparently too much, and instead we over-engineer the fuck out of web technologies (both front and back end) to the point people are estimating their RAM needs using the number of Chrome tabs they use.
Functionality is often largely independent of platform specific features. Spending limited resources on re-implementing stuff five times over can result in a worse experience for users than focusing on a single high quality cross-platform solution that is easy to distribute and keep up-to-date.
It completely depends on the specifics of the app. Some apps benefit from tight device integration. For others it's a waste of resources that could be spent on features.
What other platform allows arbitrary code execution like the Web does? We have billions of Internet users, yet actual security issues are few and far between. It's remarkably safe and secure given the scale.
To be safer than other platforms doesn't mean it's safe.
And because Chrome is dominating it's a huge target.
And at the moment the browser access to the underlying infrastructure is limited.
So... two thoughts. First: this probably means we can run node in deno?
Second: looks like js is catching up with smalltalk. Just a shame the debugger is still rather crappy, and the browser/inspector only applies to a subset of widgets/controls (the html/css stuff, not the tab bar, address line, bookmarks menu, etc.).
Web development is one big gigantic shit storm. Why can't we use a real statically typed language without this shitty JS overhead, generate a compiled binary, and send that to users?
I personally hate writing javascript.
To demonstrate the potential, this needs a demo of a "middle-tier" service implemented in Node.js. The service needs to talk to a MySQL database and expose a REST API to the web application.
If that works, it means many intranet applications can be server-less: both the JavaScript app and the "middle tier" run in the browser, and the middle tier talks to the MySQL server, which is also on the intranet. All you need is a file server to serve the static JS files.
If the middle-tier and the app are both written in JavaScript, and you're developing them both simultaneously in the browser, does it make sense for them to be communicating via REST any more?
Wouldn't it make more sense for the app to send SQL prepared statements to the database (via a tiny standardised wrapper on the server to enforce per-user permissions based on the authenticated session)?
It probably depends on whether other apps are requesting data from the same database, in which case it might make sense to convert from the relational model to JSON objects in a tier which exposes the objects over REST.
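A sketch of that "tiny standardised wrapper" idea from above (the /sql endpoint and the payload shape are invented for illustration; a real wrapper would do the permission checks server-side):

```js
// Browser side: ship a parameterized statement to a thin server wrapper
// that authenticates the session, enforces per-user permissions, and only
// then forwards the prepared statement to MySQL.
const rows = await fetch('/sql', {
  method: 'POST',
  credentials: 'include', // the wrapper derives the user from the session
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    sql: 'SELECT id, title FROM notes WHERE owner_id = ?',
    params: ['@session_user'], // placeholder the wrapper fills in server-side
  }),
}).then((r) => r.json());
console.log(rows);
```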
Node.js, like any other native Linux/Windows/Mac application, talks to the OS via a system API. You can emulate that API using WASM, mapping I/O to browser APIs, which is why they mention the network is implemented via service workers, for example. File I/O can be mapped to IndexedDB... threads can be mapped to web workers, I guess... and so on.
emscripten[1] did this a long time ago for C, which is how a bunch of native applications have already been ported to run in the browser.
webcontainers[2] seem to do a similar thing, but focused on exposing the browser API to the native apps in a way that integrates well with the JS environment.
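To make "mapping I/O to browser APIs" concrete, here's roughly what Emscripten's virtual filesystem looks like from the JS side (assuming a build that exposes the FS runtime object and links in IDBFS):

```js
// The compiled program's open()/read()/write() syscalls land in this
// in-memory tree (MEMFS) instead of the real OS.
FS.writeFile('/home/web_user/hello.txt', 'hi from the browser');
const text = FS.readFile('/home/web_user/hello.txt', { encoding: 'utf8' });
console.log(text); // "hi from the browser"

// Persistence can be layered on by mounting IDBFS, which syncs the tree
// to IndexedDB on demand.
FS.mkdir('/data');
FS.mount(IDBFS, {}, '/data');
FS.syncfs(false, (err) => { if (err) console.error(err); });
```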
People have made Linux run on the browser, so... Although I don't think this is what they did here.
Man, this yet-another-layer they've put between the code and the CPU makes me uncomfortable. Imagine finding a bug and having to go through 20 layers to troubleshoot it.
Oh no! The user experience must surely not be enriched! Damn those smart ass web developers who aren't like the rest of us that care about code quality and engineering principles!
This is super cool! I think pairing this with the upcoming native support for modules and a CDN like skypack could lead to a truly powerful browser development experience.
No bundlers and a full IDE? Maybe this could even mean that there's a chance for a some of the 'just fiddle with the page' experience to come back to the web! (/rainbows and unicorns)
If it really runs inside the browser: how did they implement adapters for networking and crypto/TLS?
I mean, a console is nice and all. But I'm highly sceptical as to how the filesystem and networking work, because fetch cannot be used to create dgram or tls/net sockets. And that is literally the primary reason Node.js exists.
> WebContainers include a virtualized TCP network stack that's mapped to your browser's ServiceWorker API, enabling you to instantly create live Node.js servers on-demand that continue to work even when you go offline.
I was trying to refer to the simulated iframe that's the live preview... which probably uses the service worker's responses to emulate the behaviour rather than, say, a real end-to-end HTTP server pipeline.
This sounds very much like http module injection: the Node.js http module isn't really doing what it thinks it does, but is using an injected API behind the scenes.
Raw TCP is only available to Chrome extensions, on Chrome OS, IIRC.
(Setting aside the lack of TLS or a real crypto module that isn't using the browser's Web Crypto API approach.)
For now. It works in Firefox today (minor issues keeping it feature flagged for now) and Safari is close to shipping WASM Threads, so this will likely work on all major browsers by EOY.
I'm trying to get my head around how the networking works. Is local.webcontainer.io being overridden somehow in the browser, like a hosts file might do? I'm trying to figure out how I'm hitting that URL from another tab.
There is a service worker registered on the domain name (for example https://nextjs-agaw8i--3000.local.webcontainer.io/), and when you visit that URL in another tab it basically shares the service worker from the StackBlitz tab.
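My guess at what that looks like inside the service worker (all names below are invented; this is a sketch of the pattern, not StackBlitz's actual code):

```js
// sw.js on *.local.webcontainer.io: every request to the preview origin is
// intercepted and answered by the emulated server instead of the network.
self.addEventListener('fetch', (event) => {
  event.respondWith(handleVirtualRequest(event.request));
});

async function handleVirtualRequest(request) {
  const { status, headers, body } = await forwardToVirtualServer(request);
  return new Response(body, { status, headers });
}

async function forwardToVirtualServer(request) {
  // Hypothetical bridge: a real implementation would serialize the request,
  // hand it to the in-page Node.js HTTP server (e.g. over a MessageChannel),
  // and await the response bytes. Stubbed here so the sketch is complete.
  return { status: 200, headers: { 'Content-Type': 'text/plain' }, body: 'ok' };
}
```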
If by "works" you mean "serves up an error page". It's a standard domain, not real localhost, and it needs the service worker from the editor page running to work properly.
Isn't the opposite of "running remotely" "running locally"? Meanwhile, if pressed for the opposite of "running natively", I would answer "running virtualized".
The web devs where I work have started calling pretty much everything "native" as long as it interacts with the host in some way since people assume it means fast.
Tried this out with a basic set of React deps in package.json
Reloading now spends 10+s "installing dependencies" - pretty quick, and the UI is still interactive.
Also, Chrome has two "Google Chrome helper" processes each sitting at 550MB+ RSS. I'm not sure if "no node_modules black hole on disk" is quite as compelling if the black hole has merely been relocated to RAM.
Does this mean we can now have an SOA in the browser? True separation of concerns: render code deals with writing to the DOM; when data is needed, it calls a node API proxy server, which in turn interfaces with REST/grpc/graphql servers; then there's also an auth service, and a local (literally local) cache service. All in the browser. Damn.
I noticed that outgoing http requests don't seem to work: there's no `curl`, and after installing a CLI tool I work on via npm (which worked like a charm!), the outgoing API calls it makes were blocked. I assume that's intentional at this stage. Is that changing in the future?
Their repo implies that network requests are subject to the same-origin policy, since it's in a browser after all. If you open up devtools you will probably see complaints along those lines. In short: the endpoints you request need CORS enabled.
Quoting https://github.com/stackblitz/webcontainer-core:
> We're limited by the browser's ability to make network requests, so connecting to processes like MongoDB, Redis, PostgreSQL, etc. are not currently possible
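In other words, the endpoint has to opt in, because the "Node" process's outgoing HTTP is really a browser fetch() underneath. A minimal Express sketch of the server-side change (the origin value is just an example):

```js
const express = require('express');
const app = express();

// Allow the in-browser container's origin to call this API.
app.use((req, res, next) => {
  res.set('Access-Control-Allow-Origin', '*'); // or a specific origin
  next();
});

app.get('/api/data', (req, res) => res.json({ ok: true }));
app.listen(3000);
```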
This is going full inception mode: Node.js is compiled to WASM (which is like assembly for your browser) and then loaded inside the browser, which can then run JavaScript.
So a full JS engine is loaded, completely separate from the built-in one.
Yeah, that's what I gathered: JavaScript already runs in the browser; then we got Node.js because people liked JavaScript and wanted to run it natively; and now we've got this, which lets us run Node.js in the browser again? It looks to me like we've gone full circle with extra overhead :D But I guess people have reasons to run Node.js in the browser rather than plain JavaScript, so idk.
I guess the idea is to access some APIs that are not present in the browser, like fs. This was mentioned in a question to StackBlitz's CEO, but he seems to be an AI that just pastes the same text over and over again.
> I guess the idea is to access some APIs that are not present in the browser, like fs.
That's why Electron exists. But this can't do that; it's still limited by the APIs that are present in the browser. It's seemingly just full overhead for... some... reason?
Basically, node exposes all these other APIs, like reading a file from disk. The browser also has APIs, like detecting when the page is fully loaded, that wouldn't be available in Node. What JavaScript syntax is supported can also vary between Node and browsers.
It can't access the file system, but it could implement an API that emulates a file system inside the browser sandbox. Presumably that's what they're doing here.
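Something like this toy shim, presumably with far more fidelity (an illustration of the idea, not their actual code):

```js
// Emulate a sliver of Node's fs API on top of a Map in the sandbox.
const files = new Map();

const fs = {
  writeFileSync(path, data) { files.set(path, String(data)); },
  readFileSync(path) {
    if (!files.has(path)) throw new Error(`ENOENT: no such file '${path}'`);
    return files.get(path);
  },
  existsSync(path) { return files.has(path); },
};

fs.writeFileSync('/tmp/hello.txt', 'hi');
console.log(fs.readFileSync('/tmp/hello.txt')); // "hi"
```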
Alright, so you can create an IDE in the browser, but what else? I'm just not seeing the value in running servers in your browser. Presumably, most sites will still need to connect to remote back-end services to get at data and whatnot, and that back-end isn't going to be running locally in a customer's web browser.
I don't know about your dev environment, but in mine we run server code separate from web code. So you could toggle your local WebContainer to access either localhost:3001 or a remote branch, staging or prod environment for whatever server you wanted to test against.
You are right. In Firefox I get "StackBlitz v2 Beta currently works in Chrome and Chromium-based browsers. We’re hoping to add support for more browsers as they implement the necessary Web Platform features."
Do I, as a user, have any possibility of protection against things like ads or cookie/tracking abuse with this? I'm really ignorant about this, so I don't even know if I'm asking the right questions.
The future is going to be wild. You could bundle V8 compiled to WASM with a graphics toolkit that uses a syscall-like interface to access a canvas backed by WebGPU.
At that point you could replace the browser with ZINE (Zine Is Not Electron) and run webapps natively, just like WINE works.
My personal pet peeve is "golang". No language is named that; the language's name is Go. Calling it golang because of the project's domain name is like saying "typescriptlang" instead of TypeScript.
Names don't have any such requirements - but we should use them correctly. You don't rename Windows even though it's not a hole in the wall stuffed with glass.
Building on the really good work of the WASM folks: running Node.js, with all its design problems, on top of it.
I am forever astounded anew by the hubris and naivety of the Node.js crew. It is an example of "it is easier to write than read, easier to talk than listen, easier to build than design".
It isn't open source. It has a name, WebContainers, that implies it's based on the web, but it's designed around Node.js, which isn't built on browser technologies as much as something like Skypack or Deno is.
It also seems like it's going to be a memory, disk, and CPU hog, and that it's going to be pretty complex. I like where Skypack is headed and this seems like the opposite direction.
It's pretty cool but it seems like they're trying to create a lot of hype around it.
> I like where Skypack is headed and this seems like the opposite direction.
I did a quick google search because I had never heard of Skypack.
Skypack seems to be a far cry from what WebContainers is claiming to do. They're completely different products. I guess I don't understand why you're even comparing the two.