Under ESM, each of your dependencies could go back to being a CDN- or /vendor-directory-hosted "release." The release would have been packed into a single JS file + a single CSS file by its developer when it was published. There would be no reason to further run a bundler just to turn ~5 such pairs of files into one pair; over HTTP/2 or HTTP/3 that difference is negligible.
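To sketch what that looks like in a browser (the URLs and version numbers here are hypothetical, just illustrating one prebuilt ESM file per dependency):

```html
<!-- Each dependency is a single prebuilt module file, imported directly.
     No bundling step on the consuming side. -->
<script type="module">
  import { h, render } from '/vendor/preact-10.0.0.module.js';
  import confetti from 'https://cdn.example.com/canvas-confetti@1.6.0/confetti.module.mjs';
</script>
```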
It's non-negligible for "thousands of dependencies", of course, but nobody would be doing that. "Thousands of dependencies" is for library development, not for library release packaging.
NPM (or Pika) is still useful under such a paradigm, for the same reason Cargo or Bundler or Pip or Mix or whatever other tools are useful: these tools let you specify your direct dependencies using symbolic version constraints, and then retrieve+update+lock those dependencies. This is helpful whether your direct dependencies are "baked-down" release packages or raw deep-dep-tree source.
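To make "symbolic version constraints" concrete, here's a minimal sketch of how a caret range like `^1.2.3` is resolved. This is a hypothetical helper, not npm's actual implementation (real semver also treats `^0.x` ranges specially, and handles prerelease tags, which this skips):

```javascript
// Does `version` satisfy the caret range `range`?
// "^1.2.3" means: same major version, and at least 1.2.3.
function satisfiesCaret(range, version) {
  const want = range.slice(1).split('.').map(Number); // "^1.2.3" -> [1, 2, 3]
  const got = version.split('.').map(Number);
  if (got[0] !== want[0]) return false;              // major must match
  if (got[1] !== want[1]) return got[1] > want[1];   // newer minor is fine
  return got[2] >= want[2];                          // same minor: patch floor
}

console.log(satisfiesCaret('^1.2.3', '1.4.0')); // true: compatible upgrade
console.log(satisfiesCaret('^1.2.3', '2.0.0')); // false: breaking major bump
```

The tool's job is then just: resolve each constraint to a concrete version, fetch it, and record the result in a lockfile.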
That's worse because you lose the ability to "tree-shake".
You'd be back to importing all of Underscore from a CDN instead of bundling only the few functions you need into the main bundle.
That never really works out in practice: there are so many versions in circulation that shared-cache hits are rare, and then you're at the mercy of a public CDN, which can become the bottleneck, especially for critical code.
Sure, you can fall back to local copies, but that just means your app hung there for X amount of time first, plus you've added the complexity of the fallback logic.
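For reference, the widely used CDN-with-local-fallback pattern looks something like this (the /vendor path is hypothetical; note the page has already blocked on the CDN before the fallback check even runs):

```html
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined;
  // fall back to a self-hosted copy.
  window.jQuery || document.write('<script src="/vendor/jquery-3.6.0.min.js"><\/script>');
</script>
```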
Plus, if you look at the size of these libraries, especially utility libraries (date-fns, Underscore, etc.), they're huge, and you usually only need a few functions. Relying on a shared cache will never be a bigger benefit than tree-shaking.
Sure, you could manually try to extract those functions, but that's a much, much worse developer experience.
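"Manually extracting" means hand-copying utilities into your own code, something like this pick-style helper (a sketch, not Underscore's actual source). The DX problem: you now own this code, with no upstream bugfixes, docs, or tests:

```javascript
// Hand-copied stand-in for a library's pick(): keep only the listed keys.
function pick(obj, keys) {
  const out = {};
  for (const key of keys) {
    if (key in obj) out[key] = obj[key];
  }
  return out;
}

console.log(pick({ a: 1, b: 2, c: 3 }, ['a', 'c'])); // { a: 1, c: 3 }
```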
So glad this was such dominant practice and advice for the last 10 years. I wonder how many of those influential practitioners have known it's more of a lazy include than clever cache reuse. Cargo culting writ larger than most.
It really depends. I think CDNs are still great if you're making a smaller content website with a few libs for things like lightboxes, sliders, etc. It works out better than bundling: there's much less complexity without a build process. Many websites don't need much JS, and devs tend to overengineer them nowadays.
If you're talking about products with a lot of dependencies, then of course bundle, but don't forget that the bundler situation used to be completely different. There's not that much difference (besides the number of requests) between concatenating libs in sequence and loading libs in sequence.
I'm speaking to the jQuery example. 99% of how-tos have instructed people to pull it from a CDN because everybody (probably) already has it cached. It has been conventional wisdom for a long, long time.
I just checked, and googling "jquery cdn faster" brings up plenty of examples.