> With just an `import thing from "https://example.com/foo.js"` I get a fully initialised library, even if it does async requests as part of its initialisation.
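A minimal sketch of that behaviour, with Node's built-in `node:url` module standing in for the remote URL (the specifier and example values are illustrative): `import()` resolves only once the target module has fully initialised, including any top-level `await` it performs, so the caller always receives a ready-to-use library.

```javascript
// Dynamic import() returns a promise for the fully initialised module.
async function loadLib(specifier) {
  return import(specifier); // e.g. "https://example.com/foo.js"
}

loadLib("node:url").then(({ URL }) => {
  // The library is ready to use as soon as the promise resolves.
  const u = new URL("/foo.js", "https://example.com");
  console.log(u.href); // https://example.com/foo.js
});
```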
This is indeed convenient if you want to quickly try a lib. However, what happens in a larger project with dozens of dependencies? Those HTTP requests quickly become inefficient, and you'll be back to compiling everything with Babel or similar.
I agree that bundling isn't going anywhere, but the web isn't all single-page apps built with perfect engineering and top-notch frameworks.
Plenty of websites are server-side rendered and just need to sprinkle in some client-side JS. For them it's perfect to be able to drop in a script without worrying about bundling, or about polluting the `window` object when loading external scripts.
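For that server-rendered use case, the drop-in is a single module script; module scope keeps the imports off `window`. A sketch, where the library URL, the `enhance` export and the selector are all hypothetical:

```html
<!-- Served straight from a server-rendered template; no build step. -->
<script type="module">
  // Module scripts have their own scope, so neither "enhance" nor
  // anything else imported here leaks onto the global window object.
  import { enhance } from "https://example.com/foo.js";
  enhance(document.querySelector("#signup-form"));
</script>
```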
If I’m serious about shipping a production app/page, I want complete control over all of this anyway. I never want to require/import an entire .min.js file; I want to tree-shake the 15% of it I actually need.
This feels like a myth in production environments. Importing a library on the fly from a source you don't control is a security risk, and caching is per user: you're banking on each specific user having already visited a site that happens to use the same version of the same library, served from the same domain you included it from, so that it's cached.
You're also now fighting for response time and bandwidth from a public resource you don't control. You are beholden to their traffic spikes, their downtime and their security incidents.
Just send it from your servers, or your edge nodes. They already have the DNS hit cached, they are already getting everything else from there. Chances are high you're sending image data that far exceeds the JS library anyway. This is especially prudent if you serve users in your own country, and that country isn't the US. Chances are very high your site's largest response delays are US CDN resources if you use them.
Privacy concerns led to browsers caching per-user and per-site, so there is even less advantage to "shared CDNs" in 2023's browsers.
That said, tree-shaking can sometimes be a premature optimization if your site isn't a SPA with a comprehensive view of its tree to shake. Some MPA designs may still benefit from caching the whole ESM .min.js of a site-wide dependency and letting each page import just the parts it needs at runtime.
And it was always a pretty minimal benefit. It depended on the exact same version of the library being cached from the same CDN; in the days of jQuery hegemony, maybe jQuery cache hits could be a thing, but even that was probably minimal. These days JS usage is much more diverse.
It was an idea people had that this would be a cache benefit, but it was theory, not real-world observation. I recall several people investigating how often cache hits would actually happen in these cases and finding it wasn't that often in real-world use. But I can't find them now; Google is crowded out by people talking about what you link to above!
Due to security/privacy concerns, browser caching is now partitioned by the site loading the content, so linking to popular libraries from CDNs provides no caching benefit when someone loads your site for the first time.