I like these tech "fights" (evolutions). Even if bun never overtakes node, it will make node better. We've seen this many times with other projects (js/coffeescript/typescript/..., php/HHVM/HPHPc, webpack/vite/...).
I hadn't heard of io.js before, but from reading up on it, it seems to be a unique case, since they had the explicit goal of merging back into node eventually.
Can't really find what the main benefits of io.js were, though...
Why wouldn't bun, if it keeps its performance promises on its way to 100% node compatibility? I am intently keeping tabs on bun's progress because a better-engineered, faster, and leaner node-compatible runtime means $$ saved in server costs.
Besides, from the effort going into bun, it looks like the node community has its work cut out for it.
Also, a big reduction in dependencies could make hardening a Bun application something that’s realistic. It’s unimaginable with the typical node stack that I’ve seen.
I'm starting a project that requires a lower-level language. Ideally I'd have tighter control of memory and no GC. I want to move fast and be safe. Go gives me the speed of development I desire, but is a little higher level than this project calls for. Rust is in theory the right choice, but my development speed in it is like molasses. Given I hope for this project to turn into a company I seek VC backing for, I'm uncomfortable investing in a tool that slows me down so early on.
How has Zig been for you in this regard? Do you have any regrets building your company’s flagship software around it at this stage?
IMO Zig right now is fairly buggy, and the rate of bug introduction has outpaced the rate of bug fixes over the past few years. (Not a knock on the project; move fast and break things.) Trying to develop production-quality software in Zig is like trying to hit a moving target. Zig is not production ready, and they mean it: the ABI is not stable, and features are removed or retooled from version to version (e.g. the removal of async). This is all stated upfront, however. Zig is a WIP.

That being said, if you've read the warning labels and are still excited, there's /tons/ of promise. They are on the right track to being a modern C replacement/augment/mux with an integrated build system. It's a joy to program in, and the community is pretty great. The best way forward is joining the community and contributing in one way or another, because Zig will be quite special once it's done. Bun is a clear indication of this.
I'm also interested in this. The segmentation faults Primeagen found in Bun (https://youtu.be/qAYFepR4GcE?t=370) were concerning, though they might have been fixed by now.
I was seriously looking at Zig, but I keep getting faster in Rust, and it feels like the extra complexity is well worth it for larger projects.
What's the target level of compatibility with existing npm modules? 100%? Some lower percentage?
Hate to carry forward the baggage of past design choices, but it's likely essential to really get widespread adoption. I'd definitely start using Bun for my (non-production) projects today if it works seamlessly with existing packages.
I suppose a compatibility shim for features that should not go into the new core could work. It would have a performance cost, but as long as this cost is limited to some deprecated APIs it would be OK.
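To make that concrete, here's a minimal sketch of what a userland shim for one deprecated API could look like, using the long-deprecated callback-style `fs.exists` as a stand-in for "a feature that shouldn't go into the new core" (purely illustrative; not how Bun actually handles compatibility):

```ts
// Sketch only: the deprecated callback-style fs.exists reimplemented on top
// of the modern fs.promises.access. Old packages that still call exists()
// keep working; the extra promise hop is only paid by code on the
// deprecated path.
import { access } from "node:fs/promises";

export function exists(path: string, callback: (exists: boolean) => void): void {
  access(path)
    .then(() => callback(true))
    .catch(() => callback(false));
}
```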
I've built a few utility apps with it at work. I absolutely love it.
I hijacked your JSX support so that I had built-in server-side templating without having to pull in any external libraries (e.g. React). The process of building my own TSX bindings was pretty trivial, but it did feel like a hack (I created a React package.json entry that was a file path to my local source folder).
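In case anyone wants to try the same trick, here's a rough sketch of what a hand-rolled, string-returning JSX factory can look like. The names (`h`, `Fragment`) and the tsconfig settings are illustrative, not the commenter's actual code; the idea is that with `"jsx": "react"`, `"jsxFactory": "h"`, and `"jsxFragmentFactory": "Fragment"` in tsconfig.json, TSX compiles down to plain calls to these functions and no React is involved:

```tsx
// Minimal React-free JSX factory that renders straight to HTML strings.
// Assumes tsconfig.json contains:
//   "jsx": "react", "jsxFactory": "h", "jsxFragmentFactory": "Fragment"

type Props = Record<string, string | number | boolean> | null;
type Child = string | number | boolean | null | undefined | Child[];

// Tell TypeScript that JSX expressions produce plain strings.
declare global {
  namespace JSX {
    type Element = string;
    interface IntrinsicElements {
      [tag: string]: Record<string, string | number | boolean>;
    }
  }
}

function renderChildren(children: Child[]): string {
  return children
    .map((c) =>
      Array.isArray(c) ? renderChildren(c) : c == null || c === false ? "" : String(c)
    )
    .join("");
}

// The function every <tag ...>...</tag> expression compiles into.
export function h(
  tag: string | ((props: Record<string, unknown>) => string),
  props: Props,
  ...children: Child[]
): string {
  const kids = renderChildren(children);
  if (typeof tag === "function") return tag({ ...(props ?? {}), children: kids });
  const attrs = Object.entries(props ?? {})
    .map(([k, v]) => ` ${k}="${String(v)}"`)
    .join("");
  return `<${tag}${attrs}>${kids}</${tag}>`;
}

// <>...</> just concatenates its children.
export function Fragment(props: { children?: string }): string {
  return props.children ?? "";
}
```

A `.tsx` file can then write `const page = <h1 class="title">Hello</h1>;` and get back the literal string `<h1 class="title">Hello</h1>`, with no runtime dependency.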
Bun seems really cool! I had a question about this part:
> Bun now works in more Linux environments, including Amazon Linux 2 and builds for Vercel and Cloudflare Pages. (Previously, you might have seen errors like: "version 'GLIBC_2.29' not found")
How would building for Vercel and CF pages work? Like normal but installing the relevant build tools using bun?
Additionally, I'd love to see the PR where this change was implemented. I've been trying to convince Zola, a static site generator written in Rust, to support older versions of GLIBC.
Most of the code is here. It needs some linker config, but basically you wrap the symbols that depend on too-new versions of glibc and either dynamically load them or call some other implementation internally.
Are there any exclusive features that Bun has, that are particularly well-suited for writing databases or other low-latency, high-throughput I/O applications?
Seems like being written in Zig might give it a good foot in the door here.
Can Bun help solve the hell that is running `npm install` in a project and seeing an error mentioning `node-pre-gyp` in the output, caused by some platform-specific native-dependency build issue?
I'm surprised JSC doesn't get more press. Lots of news/articles about V8, but seems JSC has eclipsed it on perf by some metrics.
Overall though, Bun honestly looks like it has a shot at supplanting Node if npm package compatibility reaches a sufficient level, or at least at encouraging Node to work much harder on perf. Deno feels a bit too esoteric/theoretical in its approach, vs Bun, which looks to be much more focused on ease of use.
FWIW this similarly-trivial "Hello world" benchmark shows Bun's lead at ~25% faster than Deno at the moment, which doesn't seem insurmountable. https://github.com/denosaurs/bench
"Take all benchmarks with a grain of salt" is my middle name, but the benchmark I linked to is an open source project with 17 contributors. In contrast, I'm not seeing any source referenced by the Medium benchmark article I responded to.
Well then... don't write your back end in javascript? The next most popular alternative is probably Ruby, which has even worse performance. I don't disagree that performance matters, but the industry doesn't seem to agree.
Not that I want to cause scope creep, but it would be amazing to have something like the certmagic library (the Caddy team’s automatic TLS Go library) without having to reach for a 3rd party.
There's AssemblyScript, which is designed to be Typescript-esque and compiles to WebAssembly, but apart from that, there's not much. The thing is that there's not much value in a Typescript runtime. The semantics of Typescript are fundamentally the semantics of Javascript, but with labels attached to each variable giving a rough hint as to what the type might be.

For static analysis, that's really useful: rough hints are mostly good enough for human-facing things like editor hints and typechecking, which are allowed to be incorrect. But when it comes to executing the code, the type hints don't actually have that much value. It's very easy for them to be incorrect ("A as B" is a valid Typescript construct that simply asserts that a variable has a type without checking that it actually does), so the runtime engine can only ever treat them as hints. And with the JIT engines that most runtimes use, the interpreter already has a pretty good idea what the type is going to be, because it has already executed the code and inspected the runtime value. So you don't get a huge amount of practical value by using the type hints.
And if the type hints aren't useful for the runtime, then there's no real reason to enforce that they be present. A Typescript runtime that ignores types is just a Javascript runtime with a more pedantic syntax, and if you're going to that effort, you may as well support both.
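To make the "hints can be wrong" point concrete: the snippet below type-checks, but the assertion is simply false at runtime, and only executing the code reveals it.

```ts
// `as number` asserts a type without checking it, so the compiler is happy...
const port = JSON.parse('"8080"') as number;

// ...and this type-checks as number + number, but at runtime `port` is the
// string "8080", so `+` does string concatenation instead of addition.
const next = port + 1;
console.log(next); // "80801", not 8081
```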
Not now, but there is work going on in the JS standardisation process that would mean valid TypeScript is interpreted as valid JavaScript (basically: run it and ignore the type annotations).
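(That's the TC39 "Type Annotations" proposal. Roughly, the idea is that an engine would parse annotation syntax like the snippet below and then skip over it, much as it skips comments, so the file could run directly without a compile step; the details are still being worked out.)

```ts
// Under that proposal, a JS engine would parse these annotations but never
// act on them, so this could run as-is with no build step.
function add(a: number, b: number): number {
  return a + b;
}

console.log(add(2, 3)); // 5; the annotations never affect runtime behaviour
```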