I like Wasmer, but I very much dislike the idea of another major package repository being owned and operated by a commercial entity. They seem like good people doing valuable work now, but that doesn't tell us very much about who the company will be in five or ten years.
I've grown uncomfortable with NPM being operated by NPM Inc instead of The Node.js Foundation, but it's a hard thing to change once it's established. We should hesitate to support the establishment of another community package manager by a for-profit company.
I completely understand your concerns, especially given the issues that NPM has had lately.
We are an open-source focused company and our success is tightly tied to the success of our open-source solutions.
Here are some of the things that we are doing in order to resolve (or minimize) the concerns:
* Completely open API to retrieve the registry data (it's a work in progress, but our GraphQL API is open to everyone to use: https://registry.wapm.io/graphql )
* De-centralized hosting of the packages (news to come soon!)
Do you have ideas on how you will make the governance of the registry open? Who will make important decisions around policies, and how will they be made? I personally find this to be the core reason I find it hard to use startup-run package registries.
We are just starting to research governance, and to date nothing is defined (since we have only just launched).
However, I would love to follow up with you personally to make sure we make the right choices from the start.
Can you write me an email to syrus@wasmer.io so we continue the conversation there?
How are you going to make money? How are you going to fund offices in CA + developers + the operating costs of the registry?
I really don’t think a private entity should be running a registry. Ruby is just about the only example of a registry handled correctly - via the community and non-profits.
Sigh. There is a difference between architectural design (namespacing), and funding income.
I am -not- discussing the software architecture of rubygems here, please don't start a bikeshed with that. Rubygems is a shining example of how non-profits and the ruby community fund rubygems, not yet-another-californian-company-without-clear-income-plans.
Also, there's no need for highly embellished language ("languages are chopping their hands off").
WASM should use something like Go, where you can use any Git/Mercurial/Bazaar repository to fetch the packages and third-party services (e.g. godoc.org) to index them.
Not a good idea. A semver range on dependencies plus a lock file is the holy grail of dependency management (ask the Maven people), and it cannot be done with git (VCS) registries. You’re tied to pinned versions (lockfile) only.
Don't let the perfect be the enemy of the good. WASM is barely off the ground and already we have a company trying to establish itself as the authoritative package manager and registry host the way NPM has become for javascript.
It's more important to prevent centralization by demanding that there be no authority other than the end user, and by building tools to enforce those expectations, than to have a "holy grail" of dependency management be controlled by a single corporate interest.
I'd rather WAPM be built with their registry as a default option, but designed to be completely agnostic regarding registries or repositories. I'd like to use it and never have to touch their servers if I don't want to.
We’re not talking about good vs perfect. We’re talking about «a big ball of hairy non-compatible dependencies» vs «an upgradeable declarative definition of dependencies».
> I'd rather WAPM be built with their registry as a default option, but designed to be completely agnostic regarding registries or repositories.
That’s how npm works, yet you don’t see git/vcs dependencies very often. That’s because they don’t work.
In fairness, the OP mentioned Go, but you're dropping details.
I.e., in Go it works just fine, but it is more than just plain Git: there is a dependency file. Semver works with the repo itself, based on repo tags.
I don't think you're arguing against the spirit of the OP's post, i.e. the "like Go" part. The Go approach does work: Go uses it, and it includes a lock file, just as you mentioned.
There are definitely downsides to using repos as dependency resolution hosts, but in my view none of them are the ones you mentioned. Lockfiles alongside repos are not complex or unsolvable; I'm not sure why you portray them as such - again, look at Go.
If you want to talk about why repos shouldn't be used, imo, talk about their volatility. A (de)centralized host specifically targeted at distributing source seems far less likely to disappear than arbitrary git repos. It's rare to have packages disappear from Cargo (Rust's package manager), but I've had it happen multiple times in Go.
Agreed. I wonder why package managers can't follow Go's lead in re-using the DNS system for packages and then offering a service that just indexes what is available.
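For concreteness, here is a minimal sketch of the Go model this subthread keeps pointing at: domain-rooted import paths resolved over DNS/HTTPS, a semver constraint in go.mod derived from repository tags, and go.sum acting as the lock file. The module path and the Add function below are placeholders, not a real package.

```go
// A minimal sketch of the Go-style decentralized model discussed above:
// the import path is a DNS name plus a repository path, so fetching does not
// require a central registry, only HTTPS/DNS. "example.com/alice/mathutil"
// and its Add function are placeholders, not a real module.
package main

import (
	"fmt"

	"example.com/alice/mathutil" // fetched from the host named in the path, versioned by repo tags
)

func main() {
	fmt.Println(mathutil.Add(1, 2))
}

// The accompanying go.mod records a semver requirement derived from the repo's
// tags, and go.sum serves as the lock file with content hashes:
//
//	module example.com/me/app
//
//	require example.com/alice/mathutil v1.2.3
```

An index like godoc.org then only has to crawl and catalogue what is already reachable at those domains; it never has to host the code itself.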
Julia uses git and GitHub. This was a stroke of brilliance until Microsoft bought GitHub. It will probably still prove to be a clever decision; GitHub could be phased out in favor of another git host.
Why a new package manager? Why not port an existing well-designed generic package manager such as Nix?
Making a new package manager is very expensive in terms of developer time - developers have to learn yet another new thing to use your platform. Can you justify this?
Which general purpose package manager works correctly on Windows (not WSL, actual Windows) and Linux? Cross-platform is more important than not creating another package manager.
I think a far more interesting question is: does the package repository support falling back to e.g. curl? That would allow those who don't want another package manager to still use the registry.
You should distinguish between "can run on Windows" and "can manage Windows software as packages". Most package managers could probably run fine on Windows; at worst they could use cygwin or, indeed, WSL. Managing Windows software is much harder, but isn't necessary: This is a package manager for WebAssembly packages, not Windows packages or Linux packages.
And anyway, wasmer itself doesn't even support Windows.
Well, obviously both WSL and Cygwin run on Windows (and nowhere else), so, rhetoric aside, could you explain why they are insufficient for your purposes?
Perhaps I don't want to install several GBs of invasive, hard to manage, and vulnerable linux dependencies to run a package manager unrelated to linux?
I'm not sure I understand the problem. Are you having to use all of these dozens of package managers, or is this more of a moral objection to the existence of similar but distinct things?
I have to deal with many of these regularly, yes. But more generally, I'm expressing a desire for fewer ecosystem-specific package managers and more ecosystem-agnostic package managers.
Right, but why? Generalization isn't free, and there doesn't seem to be any real benefit here. This sounds to me like arguing that there should be one programming language, or one version control system, or one declarative sysadmin automation language.
I think that the cost in complexity for one Uber package manager is greater than the waste in duplication from many smaller, more focused ones.
Because, among many other reasons, I work with a lot of projects that aren't written in just one programming language or use one type of environment. It's not just "I don't want to use five package managers for five different projects in five ecosystems", it's "I don't want to bridge between five package managers for one project that touches five ecosystems". (Or a project in one ecosystem with dependencies from another...)
AppFS [0] is cross-platform and general purpose. It should work on Windows via cxfuse [1], but I have not tested it. The data structure is simple and could also be handled by an offline fetching system.
Because I don't want to deal with packages for Linux (deb, apt, rpm, tarballs, whatever), HP-UX depots, Solaris packages, AIX packages, IBM i catalogs, IBM z packages, ClearPath packages, Windows (exe installers, msi, msix, appx), iOS (ipa), Android (apk, aab), and packages for the thousands of embedded OSes out there.
I love WebAssembly/WASI, but I'm concerned about us repeating the mistakes of the past.
How does the dependency model work? Can I safely install parallel streams of software? Does it support unprivileged installation? Is the installation stateless (no scriptlets/lifecycle scripts)?
It is my understanding that wasmer is trying to create a new ecosystem that doesn't rely on native modules.
> run initial post-installation configuration scripts
It's been my experience, as a long-time Linux user, that this is actually a bad thing. Stateless systems are far easier to work with -- a package manager can be far faster and simpler if it just extracts an archive. I can't think of a single case where post-installation configuration scripts couldn't be replaced by something simpler, using the filesystem.
Scripts aren’t truly platform independent. You’ll need branching logic for different platforms. Why not capture that with different declarative structures for each platform?
Running scripts as part of software install is something we, as an industry, need to solve.
Currently dependencies are keyed by namespace, name and version. There is a global namespace that has restricted access. WAPM does not do any dynamic linking of WebAssembly and currently only resolves dependencies one node deep. This will likely change in the future as the story on WebAssembly libraries and dynamic linking becomes more concrete.
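To make the namespace/name/version keying concrete, a dependency declaration in a package manifest might look roughly like the sketch below. The field names and layout here are assumptions for illustration only; the announcement does not spell out the manifest format.

```toml
# Hypothetical manifest sketch -- field names and layout are assumptions,
# illustrating dependencies keyed by namespace, name, and version.
[package]
name = "my-namespace/my-tool"
version = "0.1.0"

[dependencies]
"other-namespace/libfoo" = "1.2.0"
"another-namespace/libbar" = "0.3.1"
```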
> Can I safely install parallel streams of software?
WAPM operates synchronously at the moment, but there is no reason why WAPM should not be able to install dependencies concurrently.
> Does it support unprivileged installation?
WAPM installs packages into a "wapm_packages" directory in the current directory. WAPM will probably support global installs in the future. WASI enables a "capability-oriented system", but this is a concern of wasm runtimes, and not WAPM. WAPM only manages wasm binaries.
> Is the installation stateless (no scriptlets/lifecycle scripts)?
WAPM installs with a single command and there are no lifecycle scripts. Ideally, one would install their wasm packages with wapm-cli and not require any intervention from other tools. It was designed to be unobtrusive.
> WAPM operates synchronously at the moment, but there is no reason why WAPM should not be able to install dependencies concurrently.
I'm sorry, I didn't word that well. In Fedora, there is a feature called Modularity that enables you to switch between different release streams (for instance, Node.js LTS or current). You can do so with `dnf module install nodejs:11` or `dnf module install nodejs:10`. Since `dnf` is installing into a single global space, though, it isn't possible to install both nodejs:11 and nodejs:10, and the Fedora project recommends containers as a solution here. Could WAPM support this use case, without requiring one to buy into the entire container tooling ecosystem?
> no lifecycle scripts
Yes! This could enable a declarative OCI image build tool, which could be used to bridge the gap between Wasmer and the Docker/Kubernetes stack.
Are all the 'packages' here reproducible? I'd love to be able to run something like `wapm verify <package>`, or `wapm verify --all`. Running arbitrary code from random people on my computer is sketchy even when I can look at the code. It's far more sketchy when I can't.
We are working on supporting signed packages to ensure they can't be tampered with.
At the same time, WebAssembly provides some nice sandboxing capabilities, and we are working to add permissions on top of syscalls, so packages will not be able to do what they are not supposed to do.
Reproducible builds are definitely great, but they are quite tricky. However, we are very open to hearing more thoughts on how to do them!
Signed packages are useful, but still don't solve the problem.
One fairly simple thing you can do to improve this would be to include build scripts as part of the package, and allow people to run those build scripts through wapm. The exact versions of any involved tools would need to be recorded too, but since there aren't too many ways to generate wasm blobs yet, this shouldn't be too out there.
Yeah, instructions on how to build the package would be highly valuable. Environments where binaries are very backwards compatible (which is itself a good thing!) lend themselves to losing the knowledge of how to build something. You could quickly land in a situation where you need to recompile some particular binary for some reason, but figuring out how to compile it is really hard. Like how Microsoft fixed a vulnerability in an Office program by changing the binary instead of the source code.
IMO any open source centric repository should have developers upload the source code instead of binaries so that the repository can compile it themselves to give binaries or source code to users. At the start they could use docker where the docker file is part of the uploaded artifacts, and later they could use WASI based toolchains directly, e.g. clang compiled to wasm or rustc compiled to wasm.
This "developers upload binary artifacts, source code is an afterthought" idea of npm rubs me wrongly.
Yes, definitely source+build instructions should be uploaded rather than binaries.
You can still support proprietary software by just uploading the binaries as source (and maybe doing some build process to adapt it to the packaging format).
Definitely, proprietary software should be supported. You could either let developers upload the binary directly, or just not offer sources to download for users, optionally deleting them after they have been built.
Debian has developed an amazing set of standards for software repositories which I think should be applied universally to all package management ecosystems that want to have open source at their core.
E.g. they require that the source code has to be actual source code, aka "preferred form of modification" and not something minified. They also have separate repositories for proprietary and DFSG-free software, allowing users to choose whether they want to use proprietary software or not. Also, they run their builds without internet access, so the source code (as well as the dependencies tracked by the system) is actually everything needed.
The documentation, such as it is, doesn't even tell me what this is supposed to do (or at least not after a superficial look). Is it for using WebAssembly binaries on the command line? Or in the browser? Integration with webpack or the like?
There is a link on how to install from source, pointing to github, but the link is dead and the organization has no repositories...
At the moment, there is not enough momentum behind individual WebAssembly packages, as in "many people wanting to include the same wasm binaries". If at some point there are runtimes for languages like C#, Java, Python, etc. that people agree on, they might try a CDN approach, such that users visiting multiple websites don't need to download everything at every site.
It's a Java/.NET/JS-style VM target, so you can distribute your app as a single binary download instead of different ones for different operating systems.
1. Package managers are very hard! The same problems w.r.t. dependency resolution pop up again and again. Whatever you do, I recommend, uhm, copying whatever Yehuda et al. did for Ruby, Rust, and Yarn, i.e. Bundler/Cargo/Yarn.
Then submit that to HN, showcasing what wapm can really do.
Instead, people here are rightly cynical. I do not want another private company managing something as fundamental as a package registry.
I want them to answer my question about how they are actually going to fund themselves. Especially with the dubious decision to pay for office space in San Francisco.
It'd be cool to see what language the parent library was written in, so you could explore packages in a way similar to GitHub's language-based browse feature.
Although I guess the whole point, as far as the end user is concerned, is that it doesn't matter what language it was written in. But it's still very relevant to people using these packages as libraries.
Does that mean everything installs globally by default? If there is one thing I love about npm, it's that it's local by default and global is the exception.
Is it possible to integrate wapm with the web browser as a plugin, so that a web site would be able to request up-to-date wasm packages from a local store instead of downloading its own copies?