The title "Javascript to Rust and Back Again" had me believing that this would be a tale of how Rust was abandoned in favor of Javascript! I'm so glad to see we're actually talking about wasm interop
> The wasm-bindgen code generation has been designed with the future host bindings proposal in mind from day 1. As soon as that's a feature available in WebAssembly, we'll be able to directly invoke imported functions without any of wasm-bindgen's JS shims.
Sweet.
Once again, Rust team knocking it out of the park.
That will be when the clock starts ticking on Javascript.
Who'da thunk one language might eventually eat both a significant chunk of C and a significant chunk of Javascript someday?
I'm not saying Rust will completely eat Javascript; there is no chance that all web developers would, or even necessarily could, switch to Rust. But the framework war will take a very interesting turn if Rust gets faster access to the DOM than JS can provide, is a faster language than JS, and also tends to afford more efficient constructs than Javascript can (e.g., you can't do anything about cache coherency in JS, and you can't directly manage how much copying you do). And if the ball starts rolling on this, the browsers will start doing things like allowing WASM to register correctly-typed event handlers that don't have to be marshaled into JS and back... it's been a long road, and there's a long road ahead still, but do I see the glimmer of a world in which browsers can actually perform within a factor of 2 of native UIs?
>Who'da thunk one language might eventually eat both a significant chunk of C and a significant chunk of Javascript someday?
Erm... PSA: Rust isn't the only language that is compiling to WASM.
I'm glad for Rust developers (How exciting is for them to be able to target WASM), but what is really exciting is to have alternatives; even more exciting if they have nothing to do with javascript.
"Erm... PSA: Rust isn't the only language that is compiling to WASM."
No, but Rust will still be sitting in a sweet spot, having many of the characteristics I named. For instance, Python compiling to WASM may be useful, but it won't have any of those advantages. Python running on WASM is hardly a threat to JS at all, because it won't do anything JS can't. Rust conceivably will be in the near term, and as I said, if it takes off and the browsers start adapting to statically-typed functions running in WASM, the gap could open even more.
I think IoT will push embedded development towards Rust. As well as security critical desktop/server code. Especially when said code needs to be a loadable library (.dylib, .so, .dll).
While Rust certainly isn't the silver bullet, it demonstrably significantly reduces memory and concurrency related bugs.
Yeah, the last firmware I developed was in C and assembler. No, I could not have done it in Rust, mainly due to no Rust compiler available for the arch. Then again, I could not have done it in pure C either...
For future platform selection, availability of a Rust compiler is already a factor. Even if we'd still end up using C/C++ for now.
Since Rust does not yet really work on ARM Cortex or AVR, it's still a long way from taking over IoT. Only devices with an OS, like a Raspberry Pi, are properly supported by Rust at this point. That's a huge limitation. GCC can target literally hundreds of bare-metal devices with C and even C++. Developers have been using C and C++ successfully in IoT devices for years now.
Rust may be the bee's knees, but it has many hurdles to overcome before it dominates a language that has been around for 46 years.
ARM (thumb 2, Cortex M0-M4) is by far the most important target. Second is cores like Cortex A7 and A53.
I think AVR is pretty minor outside hobbyist circles. After ARM support, I'd rather first see MSP430 (amazing for low power), some FPGA softcores (like NIOS, MicroBlaze, etc., common in low-volume, industrial & medical devices) and even 8051 (these buggers seem to still be everywhere, but I guess targeting Rust for this arch is the ultimate challenge).
I guess RISC-V would be cool as well, I can see this getting more popular in the future, eating ARM market share.
Embedded use cases including ARM Cortex-M and AVR support seems to be in the focus on the Rust 2018 roadmap: "We want embedded programming to reach first-class status this year." https://blog.rust-lang.org/2018/03/12/roadmap.html
It's still too early. There's tons of good reasons to write C code today for many projects. Furthermore, there's so much C code out there that even if nobody wrote a single new line of C today, it'd be decades before it would go away.
It's really about growing the pie rather than replacement anyway, in my opinion.
Although, a lot of us are working towards replacement of C in a lot of areas. Being able to say you've written something in Rust is one thing, but having the whole stack Rust all the way down is something else. Not to mention, it's fairly simple to create C bindings from Rust libraries using Cargo workspaces and cbindgen.
Even if it is objectively better in every way (not saying it is, only going for the most extreme end point for purposes of making a point) it is not guaranteed to replace C. Too many people have invested too much time mastering C and learning to deal with its quirks to simply give it up.
There may come a day where basically 0 new C projects are made. I do not think that will be in my lifetime. And I say that as someone with near 0 investment in the language but intends to get more serious about learning rust in the near future.
We've already thrown away POSIX compatibility. Linux isn't POSIX. Mac OS X isn't POSIX. Windows isn't POSIX. POSIX only continues to exist to varying degrees in different systems.
The other question is whether there actually is a reason that it will eventually do so. I personally have my doubts. C has found that perfect sweet spot between not getting in the way of doing things and running on almost any platform. Rust still has to achieve that status and judging from other competitors like C++/D it seems hard to get to that point. Heck, even Java was advertised as systems language on Java CPUs for a while.
> The other question is whether there actually is a reason that it will eventually do so. I personally have my doubts. C has found that perfect sweet spot between not getting in the way of doing things and running on almost any platform.
That is the question. The way I see it, Rust's attractiveness to C developers goes up considerably the more accountable they are for the code they ship. If IoT and embedded devices start having repercussions for out of date, exploitable firmware/services, either through public opinion, legal or legislative means, making a switch to a language that provides more guarantees starts to look a lot more feasible and attractive to a lot of C die-hards.
I think more importantly Rust has to clearly demonstrate that it really is better for projects that use C currently. I think the failure of wider adoption of D is a good case study.
> I think more importantly Rust has to clearly demonstrate that it really is better for projects that use C currently.
I halfway agree. More evidence is always welcome, but the nature of that evidence will almost always be subjective, and I'm not sure a lot of language adherents even care all that much when the evidence is objective. So more is better, but "clearly demonstrate" may be pie in the sky thinking.
> I think the failure of wider adoption of D is a good case study.
I always considered D interesting, but not worth the effort for the gains it offered. A more ergonomic C/C++ is nice, but ergonomics just steer you the right way; they're a far cry from preventing a whole class of errors. A GC does offer some help in that direction, but comes with a clear continuous cost. Compiler-level error prevention with zero-cost abstractions comes across as something a bit more new and novel, and changes the cost/benefit analysis some. It's now a one-time up-front cost, with the possibility that the cost will lessen over time as you become more accustomed to the language.
From an outsider's perspective, Nim was actually a lot more attractive than D as a replacement for C/C++. The cost seems almost negligible since it compiles to C, which makes the benefit fairly good (if overall of less magnitude than I perceive the benefit of Rust to be).
"I halfway agree. More evidence is always welcome, but the nature of that evidence will almost always be subjective, and I'm not sure a lot of language adherents even care all that much when the evidence is objective. So more is better, but "clearly demonstrate" may be pie in the sky thinking."
One way to clearly demonstrate this in my opinion would be big projects that are implemented in Rust. I remember when Java came up there were plenty of people saying that it would totally replace C/C++ but you just had to look around to see that not much big software was written with it. Same for C#. Once we see something widely used like git, a big database, a web browser or similar written in Rust then we know Rust has arrived.
In that respect, I think Servo and Firefox is the big project. Unfortunately (or fortunately, depending on the direction you look at it from), due to Mozilla putting good engineering principles ahead of marketing and pulling in proven pieces of Rust code from Servo piecemeal, most people will likely never know. The best we can hope for is not that it's just a win, but that it's such a large, obvious win that when people ask "How is Firefox so awesome?" the answer is Rust. That's a tall order to fill. :/
" putting good engineering principles ahead of marketing and pulling in proven pieces of Rust code from Servo piecemeal, most people will likely never know"
That's really not a good strategy. At work I can't just start using Rust without justification, but if I can point at large projects being written in Rust, it suddenly has credibility.
The problem is that if Rust doesn't establish itself in the broader market then the effort of developing it and adding to the browser is pretty questionable. You generally don't want code in a language nobody else uses.
I hope rust will make it but it's very hard to establish a language long-term.
I mean, it might eventually happen, but currently hardware support and vendor support favor C, and all major Unix operating system implementations are written in C, and the maintainers don't look like they will change anytime soon. Also, C has lots of inertia behind it.
> there is no chance that all web developers would or even necessarily could switch to Rust. But the framework war will take a very interesting turn if Rust gets faster access to the DOM than JS can provide, and is also a faster language than JS,
Maybe a good way of kicking this off is if someone writes a DSL/hosted language that compiles/transpiles/whatever down into Rust. Have this language be nice and easy to write (a la Python) and have its compiler turn it into performant Rust. Kind of like how Nim works with respect to C.
This way we could let web devs worry about the actual logic that runs the animations and stuff but have it compiled down into efficient, performant code. Everybody wins: we can finally get rid of JS, code that runs on our machines gets safer, faster and less resource hungry and JS devs are happy.
The bottleneck varies depending on the specifics of the JS, DOM and CSS. I find that when something is slow the culprit is usually someone doing something dumb in JS. For non-dumb causes, layout is usually the bottleneck (reflow, applying styles, building the display list) but it can be other things. As an example, I worked on a site 5 years ago or so where the bottleneck was on paint, mostly due to lots of shadows. I believe caching of rendered shadows has made its way into most engines but it caused us issues at the time.
Mozilla has been doing a lot of work on this front. The Servo project demonstrated you can do style and reflow in parallel. Pieces of that effort have been pulled into Firefox under the "Quantum" label, but not the actual reflow part. They're in the process of finishing and incorporating WebRender to make paint more efficient and GPU-driven, and there have also been efforts at building new drawing APIs (Pathfinder, Lyon) as part of that project.
I know Chrome had been doing experiments with implementing DOM methods in Javascript to avoid having to cross the JS/C++ barrier in v8 but I haven't heard about that in a while nor any other major DOM-related perf projects out of them.
Fairly long answer to a short question. If the new APIs retain the synchronous nature of the current DOM APIs, where reads of DOM state after previous DOM manipulations block until a forced layout happens (I hope they do), then it'll still be possible to DOM-thrash and kill your perf that way. I also suspect there's enough overhead in JS->DOM calls that WASM could pick up a double-digit perf gain in microbenchmarks, but not an order of magnitude, with the caveat that that's a fairly uninformed guess.
I understand. My question is whether it's the Javascript code that uses CPU cycles or the DOM manipulations that cause the browser to recalculate the layout and do other stuff all the time? If it's the browser then it doesn't matter whether you use Javascript or webasm.
I don't really mind Javascript that much, although I do agree that there are definitely nicer languages, and that Rust is probably one (I haven't worked with Rust yet).
That said, one really nice thing that I fear we'd lose is the interoperability of the ecosystem. There are a lot of good libraries available for Javascript that help doing webdev, and I wonder what effect fragmentation of languages could have on that. Will we trade having nicer languages for availability of useful libraries in your language of choice?
One project you'll hear about soon is wasm-pack; it's a tool that lets you take wasm-bindgen, compile your Rust to wasm, and then upload an npm package containing just the wasm, so that anyone using Node can use your stuff.
The real issue is the various runtimes, not the actual libraries themselves.
Rust has some significant advantages compared to languages like Kotlin, Java, and C# here. A really big one is binary sizes. To get those languages to work, you have to ship the entire runtime. For example, see https://www.infoq.com/news/2018/01/mono-cs-webassembly
> (the "hello world" example is 10 megabytes)
The "hello world" example in Rust is ~100 bytes.
Some applications and some people can pay these costs, it's true. But tiny binaries is quite appealing. It's even part of the reason that WebAssembly was created in the first place.
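For context (these are commonly used Cargo knobs, not something claimed in the thread), tiny Rust/wasm binaries usually also rely on a size-focused release profile, roughly:

```toml
# Size-focused release profile (common settings; exact savings vary).
[profile.release]
opt-level = "z"   # optimize for size rather than speed
lto = true        # cross-crate inlining and dead-code removal
panic = "abort"   # drop unwinding machinery from the binary
```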
C# etc. shipping a runtime is a current limitation of wasm and those ports to wasm. But in principle once wasm has GC support they can also compile to tiny binaries.
Not for everything, of course - if you use certain C#/Java/Kotlin features you'll need bundled runtime support. But if you avoid them, you don't.
For comparison, Rust can't emit tiny binaries if you use malloc/free, because it needs to bundle those. This is actually an area where Rust/C/C++ are at a disadvantage once wasm has GC, as GC will be "free" while malloc/free won't be, so many programs will be smaller as C#/Java/Kotlin rather than Rust/C/C++.
Is it really on the roadmap to support GC before a simple API-level malloc/free? It seems like the sane progression path would be to have WASM support an API for malloc/free first (which would be useful for any system that supports a pluggable memory allocator, as Rust is getting), then an API for requesting garbage-collected bytes of memory (which I assume, with fairly little prior knowledge, malloc/free would be useful for).
I'm not aware of any plans to support a malloc/free API currently. GC plans are already underway.
Thinking about it, it's actually not that obvious how to support a malloc/free API. It seems like it would need to be deterministic to fit properly on the Web. But writing a spec for that is not easy since efficient malloc/frees are fairly complex and detailed. And once specced out, it could never be improved.
GC on the other hand already exists on the Web, and all the complexity is not noticeable (except for things like proper weak refs and finalizers, which is why those have not been added to the Web yet) so the spec is fairly simple and it allows constant optimization on the implementation side.
Yeah, the question is, how much can you realistically not use those features? I don't actually know. You don't have to give up any of the Rust language.
> You don't have to give up any of the Rust language.
I don't know Rust that well, but what about unwinding and multithreading for example - don't you need to give those Rust features up if you don't want to ship any runtime code?
> It's true that you need to ship malloc/free, but that can be really tiny too
True, yeah. It's a tradeoff, though: tiny mallocs will be much slower than an optimized malloc (like dlmalloc) on real-world benchmarks.
Rust doesn't have a runtime, so there's no runtime code to ship in the first place. It's as low level as C, but with a modern syntax and accompanying core and standard libraries. Thread support is done by using existing OS primitives for threading.
Kotlin specifically has a "native mode" where it doesn't use the JVM and ships its own small runtime with a ref-counting cycle detecting gc, C interop and not much else.
The binaries aren't 100 bytes but I suppose with more optimisation they could be. And although Kotlin/Native isn't actually exactly the same language as Kotlin/JVM it's got very good usability and IDE support already. So I think it can be quite a strong competitor for Rust in many areas.
Doesn't the small size only apply if you compile to assembly? If you compile to webasm shouldn't things be different? I am assuming a C# to webasm would be very different from the current C# to IL compiler.
A previous poster states that a "hello world" in Rust is 100 bytes. Is that Rust compiled to machine code or webasm? Simple C# compiled to an exe is bigger, but not by much. But you need a big runtime. Now if you compiled C# to webasm, you potentially wouldn't need the .NET runtime, so the result should be quite similar to the Rust code. In addition, if you modified the C# compiler to have a real linker that only includes code from the runtime that's actually used, I would think the output size shouldn't grow much.
> Now if you compiled C# to webasm you potentially wouldn't need the .NET runtime
Why not? How does C# work without the runtime? As the docs for .NET Native say:
> You can continue to take advantage of the resources provided by the .NET Framework, including its class library, automatic memory management and garbage collection, and exception handling.
So you still have that runtime code in your binary, even if it's not JITted.
I think what he means is that C# apps would have a smaller size if AOT compiled to wasm instead of shipping the mono runtime with .NET assemblies. (sorry for my English)