Interesting history, but a more interesting article for me would be about what actually caused modern software repositories/registries to be introduced and take off, because that is the point where software reuse went into hyperdrive.
Software registries and dependencies are often sneered at by engineers as a source of bloat and poor-quality code. But having a blanket negative attitude towards any software registry is a sign of a lack of perspective. Those registries have been the biggest improvement in software engineering ever.
And people who suggest otherwise don't know what software engineering is or its history.
I never saw anyone ever, even in the deepest pits of the internet, complaining about software registries per se. Even ultra-crackpots rarely get to this level. The regular complaints are always about very specific points around the culture of individual package registries. Even the most ardent NPM critic will be the first to admit that the general idea is per se great, and that in practice it is useful, but the execution is lacking in some aspects, often exclusively cultural. Nobody is trying to throw the baby out with the bathwater.
Package registries are definitely useful and important. They don’t need someone misrepresenting valid criticism in order to be useful.
If anything, it is the lack of empathy and the blindness to those tradeoffs that shows a lack of understanding of engineering itself, not only its history.
Beware this mindset. While it is completely reasonable, know that probably hundreds of new software developers enter the workforce every day; they hear us bitching about package registries like npm, and they lack the experience to distinguish baby from said bathwater.
Go ahead. I'm not telling you not to complain, just reminding you that when you do complain, if you don't provide context, you deny others that knowledge.
> Even the most ardent NPM critic will be the first to admit that the general idea is per se great, and that in practice it is useful,
It's enough to be blackpilled about humanity. They're a natural, useful, obvious thing to do, like solving world hunger. And then everyone gets obese, and, somehow, it feels like we're worse off. It's tragic. Because of the leftpad incident, I now read Nick Land.
> but the execution is lacking in some aspects, often exclusively cultural.
Is that a thing I can do? Can I just pull the "cultural" emergency lever to justify anything?
It's fine if everyone owns 3 firearms; the reason shootings are up is a cultural issue!
> Nobody is trying to throw the baby out with the bathwater.
Reject modernity; embrace tradition. c:<
Can you think of any way to prevent the slouch toward mediocrity and the supply-chain injection attacks that come along for the "software registry" ride? I can't. It's like trying to reverse a thermodynamic reaction. It's trying to uninvent gunpowder. It's trying to have networking with 100% uptime. It's trying to have a free lunch.
I was around for the initial discussions on CTAN, having run a popular FTP archive for TeX-related software in the late 80s/early 90s. Given the heterogeneity of computing platforms in those days (EBCDIC vs ASCII, big-endian vs little-endian, CR vs LF vs CRLF to separate lines, blocks vs streams for file structures), even a relatively straightforward text-based archive had its potential footguns.

Add in the lack of anything resembling a reliable cross-platform scripting language (Perl was the first contender in this space, and didn’t have the insane cross-platform reach TeX did at the time) and there were some rather heroic efforts to use TeX itself as the scripting platform. The best-known and most widely used of these would be docstrip, which was used as a sort of WEAVE/TANGLE for TeX code, but I remember there being an effort to also do some translation of files into and out of BASE64-encoded format for reliable cross-platform transport.

The original CTAN FTP server in the UK also had the cool feature that if you wanted the contents of a directory, adding .zip to the end of that directory name would return the directory’s contents zipped.
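For anyone who never met it: docstrip works by running a small TeX “driver” file that reads a .dtx source, strips out the documentation lines, and writes only the code guarded by named tags to the runtime files. A roughly minimal driver, with hypothetical file names, looks something like this:

```tex
% mypkg.ins -- minimal docstrip driver (file names are hypothetical)
\input docstrip.tex
% Read mypkg.dtx, drop documentation lines, keep the code guarded by
% %<*package> ... %</package> tags, and write it out as mypkg.sty.
\generate{\file{mypkg.sty}{\from{mypkg.dtx}{package}}}
\endbatchfile
```

Running `tex mypkg.ins` extracts the .sty, while running latex on the same .dtx typically typesets the documentation instead, which is the WEAVE/TANGLE parallel.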
Being a first mover has had its drawbacks, of course. Most notably, there is no versioning of dependencies, and everything is functionally in a single flat directory (even if on disk it’s arranged in a hierarchical tree, that tree is flattened for all inputs). People who complain about the functionality of newer dependency managers have no idea how much better even the worst of them is compared to what it was like in the past.
Back in the day (the late 90s, if I’m remembering correctly), after experiencing it for a while, I would roll my eyes when encountering a CPAN package… and look elsewhere. I had endless trouble just getting things to build reliably. I always ended up feeling like it took much longer to use CPAN than it did to just download the script. Maybe I am alone in that, but that was my experience. And, of course, much has probably changed since then; in particular, the unixverse is much more homogeneous than it used to be.
With NPM, I just roll my eyes. But generally speaking it works.
I think part of it comes down to a mix of transitive dependencies (common on both platforms) and a handful of core dependencies requiring native compilation (I don’t think anything in npm does this, but I don’t have the familiarity). It’s the latter that creates the fragility. I thought I knew make until I had to deal with cross-platform makefiles. If you had the bad luck to want to use a module which had somewhere in its dependency tree a module whose compilation had never been tested against your environment, insert gif of mushroom cloud here.
I think you’re right. I used a lot of less common unices back in the day, DG/UX and HP-UX in particular, and I remember a lot of C compiler failures, though I don’t remember much else.
NPM certainly has native builds, but with my use case involving the predominant targets (Linux and macOS) I just don’t hit the barriers I used to.
That said, by the late 90s my desktop was Linux and I still avoided CPAN!