
Wasmer (https://wasmer.io/) | Frontend Software Engineers | Hybrid (EU Timezones + Madrid Office) | Full Time

At Wasmer we are working on the software that will power the next generation of Cloud Computing platforms using WebAssembly. Similar to Node.js, we are bringing WebAssembly to the server side, but completely emancipated from JavaScript.

We are seeking a skilled frontend developer with industrial-strength software engineering skills to help us build out the Wasmer website. Stack: JS/TS, React/Next.js + Tailwind + GraphQL (Relay).

Reach out to syrus [at] wasmer.io or apply via Work at a Startup [1]

[1] https://www.workatastartup.com/jobs/45795


Since Porffor can compile itself (you can run the compiler inside of Porffor), any calls to eval could be compiled to Wasm (by executing the Porffor compiler in the Porffor JS engine) and executed performantly in the same JS context *

*or at least, in theory
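
A rough sketch of what that could look like from the host side, assuming a hypothetical compilePorffor() entry point that turns JS source into Wasm bytes (this is not Porffor's actual API):

    // Hypothetical sketch only: compilePorffor() stands in for "run the
    // bundled Porffor compiler on the eval'd source"; it is not a real API.
    async function evalViaWasm(source, imports) {
      const wasmBytes = compilePorffor(source);               // JS source -> Wasm bytes
      const { instance } = await WebAssembly.instantiate(wasmBytes, imports);
      return instance.exports.main();                         // run the freshly compiled code
    }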


I haven't used it, but from their landing page, Porffor says its runtime is vastly smaller because it is AOT. If the compiler had to be bundled with the executable, the size of the executable would grow much larger.


Yeah, but you'd only need to include Porffor in the compiled code if it used eval.

And most devs stay away from eval for well-deserved security reasons.


That wouldn't work: The Wasm spec does not allow for modifying an already running program (e.g. JIT).

AFAIK the only option is to include an interpreter.


You don't need to modify an already running program: you can plug new functions into an existing Wasm program via a table, and even attach variables via globals or function arguments.
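
As a rough illustration from the embedder side (assuming the module exports a growable funcref table and the new code arrives as a separately compiled Wasm module; all names here are made up):

    // Sketch: extend a running instance by growing its exported function
    // table and placing a newly instantiated function into the new slot.
    const table = instance.exports.table;        // a WebAssembly.Table of funcref
    const slot = table.grow(1);                  // grow() returns the index of the new slot
    const { instance: extra } = await WebAssembly.instantiate(newWasmBytes, imports);
    table.set(slot, extra.exports.newFn);
    // The original program can now reach newFn via call_indirect at `slot`,
    // with extra state threaded through globals or function arguments.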

I'd recommend checking out the work on making SpiderMonkey emit Wasm as a backend.


It's awesome to see how more JS runtimes are approaching Wasm. This project reminds me of Static Hermes (the JS engine from Facebook for improving the speed of React Native projects on iOS and Android).

I've spent a bit of time trying to review each, so hopefully this analysis will be useful for some readers. What are the main commonalities and differences between Static Hermes and Porffor?

  * They both aim for JS test262 [2] conformance [1]
  * Porffor supports both Native and Wasm outputs while Static Hermes is mainly focused on Native outputs for now
  * Porffor is self-hosted (Porffor is written in pure JS and can compile itself), while Static Hermes relies on LLVM
  * Porffor currently doesn't support async/promise/await while Static Hermes does (with some limitations)
  * Static Hermes is written in C++ while Porffor is mainly JS
  * They both support TypeScript (although Static Hermes does it through transpiling the TS AST to Flow, while Porffor supports it natively)
  * Static Hermes has a fallback interpreter (to support `eval` and other hard-to-compile JS scenarios), while Porffor only supports AOT compiling (although, as I commented in another thread here, it may be possible to support `eval` in Porffor as well)
In general, I'm excited to see if this project can gain some traction so we can speed up JavaScript engines on the Edge! Context: I'm Syrus, from Wasmer [3]

[1] https://github.com/facebook/hermes/discussions/1137

[2] https://github.com/tc39/test262

[3] https://wasmer.io


For the record, Static Hermes fully supports compiling JS to WASM. We get it basically for free, because it is an existing LLVM backend. See https://x.com/tmikov/status/1706138872412074204 for example.

Admittedly, it is not our focus; we are focusing mainly on React Native, where WASM doesn't make sense.

The most important feature of Static Hermes is our type checker, which guarantees runtime soundness.

Porffor is very interesting, I have been watching it for some time and I am rooting for it.


Thanks for the correction Tzvetan! Keep up the great work on Static Hermes.


Contributor for Porffor here! I think this is a great comparison, but Porffor does technically support promises, albeit synchronously. It's a similar approach to Kiesel, https://kiesel.dev/.


Not sure what you mean by synchronously, but if you mean what I think you mean, then that is not correct behaviour. This is important to ensure predictability.

Eg.

    Promise.resolve().then(() => console.log("a"));
    console.log("b");
must log ["b", "a"] and not ["a", "b"].


This type of test does work as expected. "Sync" means that it does not feature a full event loop (yet), so it cannot easily support async I/O or some more "advanced" use cases.


There's a WASI async functions proposal I think? Are you looking at supporting that so you don't have to bring your own event loop?


JavaScript doesn’t make the guarantee you are claiming here


Yes, it does. Promise continuations always run on the microtask queue per the standard. I guess if someone mutates the promise prototype it's not guaranteed, but the spec does guarantee this order.


What do you mean? Does JavaScript allow the `then` of a promise to execute before the contents of the promise?


Good comparison and thanks! A few minor clarifications:

- Porffor isn't fully self-hosted yet, but it should hopefully be possible! It does partially compile itself for builtins (e.g. Array.prototype.filter, Math.sin, atob, ...) though.

- As of late, Porffor does now support basic async/promise/await! Not very well yet, though.


Just wanted to say I really appreciated the high-quality comparison. How something compares to existing work is my #1 question whenever I read an announcement like this.


Thanks!


You make it sound bad to rely on LLVM.


Yeah... It is unclear to me how not using LLVM is a good thing. You'd inherit millions of man-hours of optimization work, code gen, and general thought process.

Is there a technical reason why?


In this case, being self-contained will help implement things like `eval()` and `Function()`, since Porffor can self-host. That would be much harder with an LLVM-based solution.


I'm amazed at how well this article fits with a new product that we have been working on at Wasmer. AWS Lambda is great, but it doesn't really solve the cold-start problem of dynamic languages. Nor does FastCGI.

We are very close to launching Instaboot, a new feature of Wasmer Edge that, thanks to WebAssembly, is able to bring incredibly fast cold starts to dynamic languages: 90ms cold-start times for WordPress (compared to >1s on state-of-the-art cloud providers).


Fully agree with your take.

I have no idea what motivates the people guiding WASI, but it certainly doesn't seem aligned with the community's motivations... which might be hurting the ecosystem in the long run.

For the exact reasons the parent comment mentioned, at Wasmer we aimed for full POSIX compatibility as part of WASIX (a superset of WASI 0.1): https://wasix.org/. WASIX has support for many syscalls not available in WASI: fork, exec, longjmp, setjmp, threads, sockets, and many more. It's telling that many people, especially those close to the Bytecode Alliance, neglected the contributions, tried to push down the project, and even tried to make us rename it: https://github.com/wasix-org/cargo-wasix/issues/4


Great analysis on all the runtimes.

I love all the improvements that Wasmi has been doing lately. Being so close to the super-optimal interpreter Stitch (a new interpreter similar to Wasm3, but made in Rust) is quite impressive.

As a side note, I wish most of the runtimes in Rust stopped adopting the "linker" paradigm for imports, as it is a completely unnecessary abstraction when setting up imports is a close-to-zero cost.


Thank you! :)

When using lazy-unchecked translation with relatively small programs, setting up the Linker can sometimes take up the majority of the overall execution time with ~50 host functions (a common average number). We are talking about microseconds, but microseconds start to matter at these scales. This is why for Wasmi we implemented the LinkerBuilder, for a 120x speed-up. :)


I see [1], thanks for sharing. I'll need to dig a bit deeper into your implementation!

[1] https://github.com/wasmi-labs/wasmi/blob/master/crates/wasmi...


I've been following Tantivy for a little while. The grit that the founders have, and the performance that Tantivy has been able to achieve lately, are impressive.

Mad props to all the team! I'm a firm believer they will succeed in their quest!


Author here. Faster than it has ever been on the Edge via WebAssembly :)

But you are completely right to point out that there's still some room to improve, especially when compared to running PHP natively.

Right now there's some price to pay for the isolation and sandboxing, but we are working hard to reduce the gap to zero. Stay tuned for more updates on this front!


Hey, you should really check out fly.io if you want to run stuff on the edge. They have it pretty much figured out.

FrankenPHP is also really good and probably a much smarter play than trying to get your code running under WebAssembly.


Given that we run PHP on the edge, what is the point of running the PHP interpreter on top of a WebAssembly interpreter (Wasmer) instead of just running the PHP interpreter directly?

The latter will always be faster.


From what I can tell it's because some of these "edge" service providers will expect you to give them a WASM binary instead of a PHP script.

The other caveats about "edge" throughout this discussion aside, if I needed to do this, I'd try to write something in Zig or (gag) JS or something else that compiles to WASM directly rather than writing a script for an interpreter that runs under WASM.


Is this tech meant for developers' needs only, or can regular, already existing PHP websites (e-shops...) somehow take advantage of it as well?


It's meant for direct usage, not just for developers.

You can deploy apps on the Wasmer Edge cloud, and also host things yourself if you want to, though in the latter case the setup will be non-trivial.


It's refreshing to see that we are not alone in our thoughts on how the community is not being stewarded towards its own interests. I applaud the author for how clearly he made the argument.

For those that aim to continue working on top of WASIp1, WASIX (https://wasix.org) might be a great way to get your programs with sockets and threads fully running on Wasm.

Note: I work at Wasmer (https://wasmer.io), a WebAssembly runtime.


This is really awesome. Mad props to Chris for working on weval, and to Max for the great overview in the blog post.

I found the video on Vimeo quite useful for understanding how it works: https://vimeo.com/940568191

