> It's everybody else's fault for including such a trivial module in their code (I mean, those who included it directly), which leads to much bigger exposure to such things (because there are many more modules to depend on)
But npm disallowing unpublishing makes that entire point moot. There is no longer any difference.
> Times the number of modules. Or rather, (1.0 - likelihood) ^ count(modules). You end up with a much more fragile application.
That's not how it works. A tiny module making an incompatible change to its sole function is unlikely to bump a minor version, but quite likely to bump a major one. A tiny module also has fewer updates overall, because there is no need to constantly add new stuff to it, as is often the case for large modules.
When I add lodash as a dependency, I'm N times more likely to have to update it during a given period than if I had just added padStart as a dependency (where N is the number of functions lodash has).
If I only use 30% of lodash (not a small percentage), it's quite likely that 70% of those updates are about parts I don't care about and haven't taken the time to understand.
Small modules have very different dynamics and costs compared to large ones. They are only the same in name (and maybe they shouldn't be called modules, actually).
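To put a rough number on the update-frequency claim, here is a toy model (the per-function change rate, the function count, and the independence assumption are all invented, purely for illustration):

```js
// Toy model: assume each function independently has probability p of shipping
// a change that needs my attention in a given period (numbers made up).
const p = 0.02;             // per-function rate of "I have to look at this"
const functionCount = 300;  // rough size of a large utility library

// Depending on the whole library: an update matters whenever ANY function changes.
const pLibraryUpdate = 1 - Math.pow(1 - p, functionCount);

// Depending on a padStart-sized module: only that one function matters.
const pTinyModuleUpdate = p;

console.log(pLibraryUpdate.toFixed(3));    // ~0.998 -> practically every period
console.log(pTinyModuleUpdate.toFixed(3)); // 0.020
```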
> First, the new release was probably totally unnecessary. Why not bundle several such changes together? And the semantic versioning scheme calls for an increase of the major number whenever a change is backwards-incompatible. Too bad that the change was tiny. So no, it's not "okay if they bump the minor version number".
There we go, another cost of large modules. They are both slower to update things you care about and less likely to have an update you do care about :)
And now suppose you are another user who doesn't use that one particular function out of 100. But for some reason the library bumped its major version number! Why? Because they changed a function you don't care about. Great, now you have to review every single change, including the ones you don't care about.
> Second, if you were only using one function out of those 100 functions, you screwed up. You probably didn't need that one tiny function that much, so you added an unnecessary dependency. It's the same mentality of micromodules that leads to a dependency fractal, which in turn leads to a fragile codebase.
What if I need two, or three, or 30%? Where does it stop? Usually when I add a micro-module, I need close to 100% of it. Seems good enough to me.
-
This only leaves us with malicious (or protesting) actors who would push a malicious update. But that risk grows with the number of authors, not with the number of modules. Still, because the problem isn't the module size itself, there are other ways to solve it :) (keeping your own author whitelist, deep-pinning version numbers for less trusted modules to ensure they stay immutable, a web of trust, reputation systems, and so on)
-
It saddens me that everyone took this as a chance to attack tiny modules as the scapegoat. I think that the node community is really onto something here. It's not yet perfectly developed by any means - there is a lot of work left - but that doesn't mean we have to scrap the entire idea. There is a chance here for npm to try hard to come up with nice trust/security/immutability features to accommodate these new module dynamics better.
edit: Another reason it saddens me is that everyone missed the point of azer's protest. I guess people just don't care unless their stuff is on the line. If leftPad had been named `kik`, perhaps we would be having an entirely different conversation now.
> It saddens me that everyone took this as a chance to attack tiny modules as the scapegoat.
Not really. For instance, I have been claiming for some time already that too many dependencies are a bad thing, the left-pad farce being merely an example of why that is.
It takes some long-lived code to see the cost of adding a dependency, and several such cases (and possibly deploying the code in different ways in different environments) to realize that the cost was paid because of having a dependency.
> There is a chance here for npm to try hard to come up with nice trust/security/immutability features to accommodate these new module dynamics better.
These new module dynamics feel like a constant fall, or maybe like running in front of a locomotive that won't ever stop, so you need to keep running or be run over. It doesn't look healthy.
> Not really. For instance, I have been claiming for some time already that too many dependencies are a bad thing, the left-pad farce being merely an example of why that is.
I claim that left-pad is an example of a malicious actor + a system that doesn't protect against that, not of a failing of small modules. And actually, `kik` is almost the same example. "Luckily" nobody cared about kik.
I also claim that having many small dependencies is very different from having many large dependencies.
> It takes some long-lived code to see the cost of adding a dependency, and several such cases (and possibly deploying the code in different ways in different environments) to realize that the cost was paid because of having a dependency.
It also takes some experience with small modules to see they are not like large modules at all, and you cannot apply the same thinking to them.
Example: once leftPad is performant and working, it's done. You can just pin it. When is lodash done? Probably never.
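For reference, pinning just means declaring an exact version with no semver range in package.json (the version number here is only an example):

```json
{
  "dependencies": {
    "left-pad": "1.1.3"
  }
}
```

With no ^ or ~ prefix, npm keeps installing exactly that version until you change it yourself.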
> These new module dynamics feel like a constant fall, or maybe like running in front of a locomotive that won't ever stop, so you need to keep running or be run over. It doesn't look healthy.
And dependency hell also looked impossible to solve before npm and CommonJS brought a solution to the mainstream :) Let's see if they can solve the rest of the problems too.
> I claim that left-pad is an example of a malicious actor + a system that doesn't protect against that, not of a failing of small modules.
It is mainly that, that's right. But small modules make such a shitstorm more probable, simply because there are many more of them.
> And actually, `kik` is almost the same example. "Luckily" nobody cared about kik.
Do you believe it won't happen in the future? The more modules there are, the more likely it is. And a project built from JavaScript modules has an order of magnitude or two more modules than a project in other languages.
> I also claim that having many small dependencies is very different from having many large dependencies.
Of course, but that's not what everybody is talking about. It's many small modules vs. only a few big ones.
> It also takes some experience with small modules to see they are not like large modules at all, and you cannot apply the same thinking to them.
But you can't apply different thinking to small modules, because you don't just use small modules, you use big ones as well, and the two are in no way distinguishable except by the amount of code.
> Example: once leftPad is performant and working, it's done. You can just pin it. When is lodash done? Probably never.
Oh, if only it were that easy. You don't control all the instances of left-pad you include (because of indirect dependencies), so pinning it in your own project is far from enough.
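You can see this on any non-trivial project with `npm ls left-pad`: it usually turns up extra copies nested under other packages, roughly like this (the other package name and the versions are invented):

```
node_modules/
├── left-pad/                <- the copy you pinned yourself
└── some-templating-lib/     <- invented name; any direct dependency will do
    └── node_modules/
        └── left-pad/        <- its own copy, with whatever range it declared
```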
> And dependency hell also looked impossible to solve before npm and CommonJS brought a solution to the mainstream :)
Oh really? I thought it was solved a dozen years earlier by various package managers in Linux distributions, then re-solved by PIP and gems, and only after that came npm.
> Oh, if only it were that easy. You don't control all the instances of left-pad you include (because of indirect dependencies), so pinning it in your own project is far from enough.
Yes, let's scrap the entire thing just because we cannot deep-pin individual dependencies. How about adding support for deep-pinning individual dependencies instead?
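A sketch of what such support could look like in package.json (the `overrides` field and the exact version here are illustrative assumptions, not a claim about what npm currently offers; `npm shrinkwrap` is the closest existing tool, but it freezes the whole resolved tree rather than one package):

```json
{
  "overrides": {
    "left-pad": "1.1.3"
  }
}
```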
> re-solved by PIP and gems
AFAIK, those two don't solve the problem. If, within project P, dependency A depends on C v1 and dependency B depends on C v2, they just can't resolve that conflict. That's a non-starter for tiny modules.
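npm sidesteps the conflict by letting each dependency carry its own nested copy, so both versions of C can coexist in one install (a minimal sketch; A, B, C and the versions are placeholders):

```
project-P/
└── node_modules/
    ├── A/
    │   └── node_modules/
    │       └── C/   (v1, as A requires)
    └── B/
        └── node_modules/
            └── C/   (v2, as B requires)
```

A flat, one-copy-per-environment installer has to pick a single C and break either A or B.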
Sorry for using lodash as an example :) It's a good example for this debate precisely because it's also available as tiny modules, so it can be used for comparisons without confounding factors.
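Incidentally, the comparison is easy to make concrete, since lodash also publishes per-method packages on npm (assuming the usual `lodash.<method>` naming, e.g. `lodash.padstart`):

```js
// Pulling in the whole library for one function:
const _ = require('lodash');
console.log(_.padStart('42', 5, '0'));   // "00042"

// Pulling in only that function as its own tiny package:
const padStart = require('lodash.padstart');
console.log(padStart('42', 5, '0'));     // "00042"
```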