Last time I tried Deno there was some friction with depending on npm packages that didn't natively support Deno without vendoring, is that easier nowadays? Can we seamlessly import anything from npm inside Deno-run code and use the Deno stdlib side-by-side with node_modules code? We love the Deno direction and would even be willing to donate $ to grow its development, but for us it's pretty much a non-starter to switch to a different runtime until we can use the wealth of packages available on npm without any additional friction.
That really depends on the concrete package; a lot of npm packages work perfectly fine in Deno, especially if they're available via CDNs like Skypack.
For packages that use native Node APIs, there's a Node compatibility layer being developed as part of the standard library: https://deno.land/std@0.80.0/node. It's still missing a lot of modules and doesn't provide a seamless experience, but it gets better with every release.
Thanks, that's useful info; we'll probably wait until there's >90% compatibility with the Node APIs then. We're never going to put URLs into our imports because we want to be able to run things offline without depending on 3rd party servers to stay up & consistent over time, but if we can depend on local ./node_modules libs once compatibility is improved then that's great.
> We're never going to put URLs into our imports because we want to be able to run things offline without depending on 3rd party servers to stay up over time, but if we can depend on local ./node_modules libs once compatibility is improved then that's great.
You can achieve the same thing with Deno! By default Deno downloads all dependencies into a central cache directory, but by setting the DENO_DIR env variable to a path of your choice you can tell Deno to use a different cache dir. And those files are cached indefinitely; Deno will not try to fetch them again on the next run (unless you opt in with the --reload flag).
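For example (the deps.ts convention and the cache path here are illustrative, not required by Deno):

    // deps.ts: re-export pinned, fully versioned URLs in one place,
    // so the whole project shares one dependency set
    export * as path from "https://deno.land/std@0.80.0/path/mod.ts";

    // pre-fetch everything into a project-local cache directory:
    //   DENO_DIR=./deno_dir deno cache deps.ts
    // later runs resolve from that cache and never hit the network,
    // unless you explicitly pass --reload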
Local caching alone loses all the benefits of using a package manager though.
I want to have a central repository of all the package versions with enforced monotonically increasing version numbers and a public, explicit chain of trust. Otherwise it's basically curl | sh with all its associated problems https://docs.monadical.com/s/against-curl-sh
Using URLs means there are no rules enforced: the code hosted at that URL can change out from under you without any warning. A new developer checking out our repo could fetch totally different packages than everyone else on the team, with no warning that they're different and no recourse if they wanted to fetch a previous version.
The beauty of Deno is that it's agnostic about where you import your code from. At the moment deno.land/x only allows tagged versions to be published (no semver range resolution) and doesn't allow versions to be removed or updated. nest.land is another popular registry, and is built on top of the Arweave blockchain, bringing the chain of trust you mention.
The ecosystem is still evolving, but in the long run I expect it to stabilize around a few generic registries for smaller libs, with larger libs hosting their code themselves. The point is: while URLs _can_ be a very loosey-goosey way to address code, they can also be made very strict; it depends on the actual server behind them.
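On the strictness point: Deno also has built-in lock files that pin a hash of every fetched module, which gives you subresource-integrity-style guarantees no matter how the hosting server behaves. A sketch (file names are arbitrary; commands shown as comments; this uses the std 0.80-era server API):

    // main.ts: a module pulled from a pinned URL
    import { serve } from "https://deno.land/std@0.80.0/http/server.ts";
    for await (const req of serve({ port: 8000 })) {
      req.respond({ body: "hello\n" });
    }
    // record a hash for every file in the dependency tree:
    //   deno cache --lock=lock.json --lock-write main.ts
    // enforce those hashes on each run: if the code behind the URL
    // ever changes, the integrity check fails instead of running it:
    //   deno run --allow-net --lock=lock.json --cached-only main.ts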
As a side note, npm is already pretty poor at providing those guarantees anyway; I find it interesting that it's usually assumed to be a safe way to install dependencies.
I would probably just vendor them into my own directory structure and import them as a local module.
You lose automatic updating, but I'm not sure I want that anyhow. A script that goes looking for new versions of 3rd party modules would be fairly trivial I think.
There's nothing preventing you from still using a central repository (or its mirrors). I highly recommend "Deno is a Browser for Code" by Kitson [1] which discusses this subject in more depth.
Hmm, I don't find this article very convincing. From my perspective this is a security and sysadmin nightmare (for the reasons in my link above). Lockfiles actually provide repeatable builds almost all the time; the only times they fail are when depending on non-JS builds like node-gyp or other C++/etc dependencies, where you can't lock on system build tooling versions because they're outside the scope of what npm can lock.
The real appeal of Deno for our org is the stdlib, which as a side effect means we can depend on fewer packages. The wholesale removal of the package manager seems like an unwanted pain that will only keep us away from switching and gaining the stdlib benefits.
You are still not thinking of it as a browser. Consider a use case where Deno is the client, not the server. One example is a replacement for piping random curls into bash:
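(A sketch; the std welcome script here stands in for any remote install script.)

    // instead of `curl https://example.com/install.sh | sh`, you run
    // the remote script directly, inside Deno's sandbox:
    //   deno run https://deno.land/std@0.80.0/examples/welcome.ts
    // with no --allow-* flags granted, it cannot write files, open
    // sockets, or read env vars. For reference, welcome.ts is just:
    console.log("Welcome to Deno 🦕");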
I mean that's a neat feature and all, but my primary use for Javascript/Typescript is to write frontend and backend code using the tens of millions of lines of useful library code available in the existing JS/npm ecosystem.
It's strange to me that Deno seems as if it could decide overnight to be a drop-in replacement, but there's deliberate friction designed into the system here to try and push people away from node_modules? The upshot of that choice is that I'm unlikely to switch to Deno until that's changed (and I suspect that's the case for many other companies as well).
I love the direction the Node compatibility layer is going in though (https://deno.land/std@0.80.0/node); now I just wish it supported normal import statements from node_modules (not just require()). I'm quite excited about Deno overall, just waiting for it to get to the drop-in point.
> It's strange to me that Deno seems as if it could decide overnight to be a drop-in replacement
In the 1.0 announcement post the team mentioned:
> For some applications Deno may be a good choice today, for others not yet. It will depend on the requirements. We want to be transparent about these limitations to help people make informed decisions when considering to use Deno.
and:
> Over time, we expect Deno to be able to run more and more Node programs out-of-the-box.
I don't think they've ever claimed to be an immediate drop-in replacement.
I know I sound like a broken record at this point, but wouldn't it be helpful if Deno just built something like this in automatically, generating import maps from a given lockfile:
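(A sketch of what the generated map and invocation could look like; the package and the pinned Skypack URL are made up for illustration, and around v1.6 import maps still sat behind a flag.)

    // import_map.json, generated from package-lock.json:
    // {
    //   "imports": {
    //     "lodash": "https://cdn.skypack.dev/lodash@4.17.20"
    //   }
    // }

    // main.ts: with the map applied, the bare specifier resolves
    // just like it does under Node:
    import _ from "lodash";
    console.log(_.chunk([1, 2, 3, 4], 2)); // [[1, 2], [3, 4]]

    // run with:
    //   deno run --import-map=import_map.json --unstable main.ts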
This way you'd get web standards compliance from the generated explicit import maps, plus backward compatibility and ease of migration for npm/yarn users.
How is using a URL different from using a NPM package? In both cases you can specify a module, a version, and need to trust some remote server that it is sending you the correct files.
> the code hosted at that URL can change out from under you without any warning
The same can and has happened with NPM. See left-pad.
The difference is that NPM as an org has a lot more to lose if they mess up everyone's packages or serve incorrect versions than some random person's website does.
Left pad was promptly fixed! That's an argument for a centralized package manager, not against. If it were hosted on some private server we'd all still be screwed.
> The difference is that NPM as an org has a lot more to lose if they mess up everyones packages
How is that important? Most Deno packages are imported from GitHub (or deno.land). Neither NPM nor GitHub wants to lose your code.
> Left pad was promptly fixed! That's an argument for a centralized package manager, not against.
This is not an argument for a central package manager, but an argument for a central package repository.
Deno is already a "central package manager". Similar to NPM in Node development, Deno is the default tool to download code in Deno development. With both Node and Deno, you can download code in other ways, too. Nobody forces you to load code from URLs via import statements, or NPM packages via npm install and CommonJS require. (Also, when it comes to executing random code from the internet, Deno has a sandbox. Node doesn't.)
And yes, well maintained package repositories are great. Whether centrally or decentrally managed repos are better is up for debate, though.
In any case, if you want to use NPM packages in Deno, I'd recommend https://www.skypack.dev/. It's "NPM packages from a URL", so, as we have established earlier, it's just as much reliant on trust and potentially unstable as anything in life, but at least their left-pad is patched...
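For example (version pinned arbitrarily here; Skypack converts the CommonJS package to an ES module on the fly):

    // npm's infamous left-pad, served as an ES module from a pinned URL
    import leftPad from "https://cdn.skypack.dev/left-pad@1.3.0";
    console.log(leftPad("foo", 5)); // "  foo"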
I am not proposing that Deno remove URL support, I think it's great that they allow importing from URL as an option. I just wish they also supported importing from local npm packages installed in node_modules without needing to specify a URL/full path. This would allow full inter-compatibility with the existing packaging ecosystem and allow people to continue using whatever packaging method they prefer.
Ah cool, that helps a ton. The existence of this feature as the linchpin for compatibility with npm is not clear from the rest of the Deno docs though. Perhaps it could be linked to from this page: https://deno.land/manual@v1.6.0/examples/import_export.
Also, couldn't ./node_modules/ be assumed as the default automatically?
Considering it's the default in all other JS environments, wouldn't that save the hassle of the dev having to define this for their entire tree of JS dependencies? Then Deno would be a drop-in replacement and we could move our whole codebase over to it overnight (once the Node APIs are up to par).
> Also, couldn't ./node_modules/ be assumed as the default automatically?
No, there is no magic node_modules directory in any JS environment other than Node. Deno aims to be compatible with web standards. Import statements and import maps are web standards; require and node_modules aren't.
> Deno would be a drop-in replacement and we could move our whole codebase over to it overnight
The reason that you cannot move your existing codebase to Deno tonight is essentially the poor web compat of the existing Node ecosystem.
Node is the de facto standard environment for web development; even frontend code is built 90% of the time using Webpack or another bundler running in a Node environment and pulling from node_modules.
I don't think you can blame the incumbent tool with complete market dominance for "poor compatibility"...
I really like the direction of the Node compatibility layer though (https://deno.land/std@0.80.0/node); I suspect it will be enough to make Deno a drop-in replacement soon. Now it just needs support for normal `import` statements from node_modules instead of just `require()`.
I don't understand, jQuery works fine in node-built environments. Obviously any DOM mutation stuff needs to happen in the frontend when it runs in the browser, but you can absolutely import jQuery and use parts of it during server-side compiling or rendering steps in Node:
    import $ from "jquery"
> Deno will never ever do that
Why would Deno take such an antagonistic approach to supporting the most common setup that everyone uses with npm? Wouldn't it be trivial to just fall back to checking node_modules for named packages? I want to use Deno! This seems like it's deliberately making transitions difficult for anyone using npm.
Sorry, jQuery was a bad example - I remember jQuery not working in Node at all, but that was roughly 10 years ago. Things have improved.
The fact remains that the most popular JS env is the browser. It has APIs such as window that don't exist in Node, and Node has APIs like __dirname or require that don't exist in the browser. That's why tools such as Browserify and webpack exist to bridge the gap.
In Deno the gap is much smaller. Obviously, the Deno namespace is not available in the browser (but there's a shim for most APIs; e.g. Deno.writeFile and readFile are implemented with a virtual FS) and some web APIs are not available in Deno (yet), but the compat story is much better.
This is no surprise since web compat is a core goal of Deno. Node compat is not.
> Wouldn't it be trivial to just fall back to checking node_modules for named packages?
No, the resolution algorithm is not trivial (nor performant). It's also not necessary: there is already a standard for how to import code in JS; it's import statements. Import statements do not allow bare specifiers, e.g. import $ from "jquery" does not work in the browser. Except, again, with import maps.
If Deno was only "Add TypeScript support", "Add security capabilities", and "Add URL imports", etc. it would simply be a new version (or multiple new versions) of Node.
These (and other) disruptive breaking changes are about fixing mistakes that cannot be fixed (or at least, are hard to fix) in Node.
Maybe, in a few years, you'll say something like "Oh, I wish legacy Node would be more Deno compatible" because Deno will be the de-facto server-side JS scripting runtime. Equally, it's possible that Deno will fail, but that many good ideas will be incorporated into Node as breaking changes.
Node and Deno as well as their environments can grow further together or further apart. I think it's too early to tell which future is more likely.
_If_ the Node library uses explicit file imports and ESM instead of CJS, it _is_ already possible to just do `npm install` and import from "./node_modules". Again, the problem is that Node's import resolution algorithm is horribly complex and not very explicit, but it's possible to be explicit in Node, and that would make the code compatible with Deno.
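Concretely, something like this works today (the package name and function are hypothetical; the point is the explicit relative path, file extension, and ESM):

    // no resolution magic: point at the actual file on disk,
    // assuming the package ships standard ES modules
    import { greet } from "./node_modules/some-esm-lib/mod.js";
    greet("Deno");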
Considering you can generate the import maps from a given npm/yarn lockfile (https://www.npmjs.com/package/@import-maps/generate/v/0.1.0) wouldn't it be possible for Deno to provide a command to do this and get the benefits of both backward npm/yarn compatibility and forward web compatibility with import-maps?
I think Go is the perfect example of how not to do module source distribution. Because of its URL dependency system, Golang projects work great as static binaries, but they're almost impossible to distribute as source builds via system package managers. A lot of my arguments for why this is a bad thing are already laid out here: https://docs.monadical.com/s/against-curl-sh (I don't know how many more replies we have left before we hit the HN nesting limit, but this is a thing I care deeply about and I'm down to keep chatting on Twitter or other forums if anyone here prefers)
I think Deno is great because it lets me be decentralized. We did so much with open source only to throw it all away with npm and go back to a centralized corporate entity :(
Maybe... it depends on the amount of _trust_ that you put in the remote domain. My prediction is that the Deno ecosystem will aggregate around a few large repositories with good guarantees around immutability and good track records of addressing vulnerabilities.
For large projects like React, lodash, eslint, whatever, I expect some of them will start hosting their libraries on their own networks, like it used to be when JavaScript was frontend-only and you would have a script tag importing jQuery directly from jQuery's CDN. The reason it worked was because jQuery was widely known and trusted.
to be fair, you _should_ probably already be proxying npm and importing from an internal domain (same would be true in any language, really)
The truth is, even with a centralized repository, we're still importing user code, written by humans who may not be well intentioned or may simply not know that their code is vulnerable. Proxying within your network and running periodic checks against the contents of the local cache would be good practice, no matter where the code came from.
That is interesting. Does it mean I will be able to run my Node.js program on Deno? That would be really huge in terms of letting users migrate from Node.js to Deno.
The Deno docs here almost seem to purposefully avoid answering this question: https://deno.land/manual@v1.6.0/examples/import_export