Hacker News

Remember back when people recommended committing node_modules into git?



Ah - that would explain why at my current job there was a node_modules directory in git with nearly 2 million lines of JavaScript within.

It is gone now.


The days before lock files were a thing and 'it works on my machine!' was rampant.


Lock files protect you from the version changing out from under you, but modules disappearing from npm is a thing that happens. Yes, you can use Artifactory or similar as a proxy, but that requires infrastructure that you may not want to run. That is all to say: there are situations where committing node_modules is the least evil.
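For anyone weighing that option: vendoring node_modules is mostly a .gitignore change. A sketch, assuming the usual starter template that ignores the directory (the lines here are illustrative, not from any particular project):

```gitignore
# Most templates ignore the dependency tree; comment this out
# (or delete it) to vendor node_modules into the repo:
# node_modules/

# Build output can stay ignored either way:
dist/
```

The trade-off is repo size and noisy diffs in exchange for checkouts that carry the exact dependency tree, registry outages or not.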


... are we no longer doing 'works on my machine' ?


Ostensibly if it works in Alice's Docker instance, it will run in Bob's Docker instance too.


Well... unless some devs have M1 Macs, and some of the Docker images are not available for arm64, or the other way around, not available for amd64. That gives you some interesting issues.


Except for weird Docker edge cases (extremely rare, but does happen).


Not rare at all.

Docker is a congregation of technologies held together with duct tape and glue.

E.g. permissions handling is completely different on Macs with Docker Desktop from the Linux dockerd setup: on Macs, it automatically translates user ownership for any "mounted" local storage (like your code repository), whereas on Linux user IDs and host filesystem permissions are preserved. Have some developers use Macs and others use Linux, attempt to do proper permissions setup (otherwise known as "don't run as root"), and you are looking at some fun debugging sessions.
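One common workaround on the Linux side is to run the container as the host user so bind-mounted files keep usable ownership. A minimal compose sketch (service name, image, and command are made up for illustration):

```yaml
# Hypothetical docker-compose.yml fragment.
# On Linux, running as the host UID/GID keeps files created in the
# bind mount owned by the developer. On Docker Desktop for Mac this
# is effectively a no-op, since mount ownership is translated anyway.
services:
  app:
    image: node:20
    # UID/GID are not exported by default; export them in the shell
    # (e.g. `export UID GID`) before `docker compose up`.
    user: "${UID:-1000}:${GID:-1000}"
    working_dir: /app
    volumes:
      - .:/app
    command: npm test
```

It papers over the difference rather than fixing it, which is rather the point being made above.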


> Docker is a congregation of technologies held together with duct tape and glue.

No, it's not. What a wild conclusion to reach from the example you gave.


At companies that don't check in node_modules or build folders, and that use standard packaging tooling like Maven or Yarn or npm or what-have-you? Yes: I haven't experienced that in like 15 years.


Ugh

The price of letting less experienced people "go crazy" in the repo


npm didn't support lockfiles until version 5, released in 2017; Yarn had them at launch in 2016. Before that, committing node_modules was often used as a form of vendoring, to get reproducible builds.

If a new project these days commits node_modules to git, it's likely a mistake, but for legacy projects started before 2017 it was the lesser of two evils.

Edit: spelling.


Hm, this project was started in 2017. The node_modules directory was for Serverless (a tool written in JavaScript), not the website itself (which was written in AngularJS - probably not the best choice in 2017 either).


s/was often used as a/was the only practical/

Prior to lock files (and potentially after, as checked-in files are beyond trivial to modify and review, and that can be worthwhile), committing dependencies in some form was basically the only reasonable way to have reproducible builds, unless you wanted to build your own package manager / lock file implementation.

Which is what Yarn did.


Or sane people wanting to have some cheap, low effort way to track changes in their project's dependencies.


Based on how brittle GitHub Actions is, I'd be ready to commit node_modules, except that I'm building cross-platform software with native dependencies.


`npm rebuild` should rebuild native code in your committed node_modules


Pretty sure that recommendation came from a Git hosting service that charged by the megabyte.



