
That might be referring to Yarn's "offline mirror" feature. When enabled, Yarn will cache package tarballs in the designated folder so that you can commit them to the repo. When someone else clones the repo and runs `yarn`, it will look in the offline mirror folder first, and assuming it finds packages matching the lockfile, use those.

This takes up _far_ less space than trying to commit your `node_modules` folder, and also works better cross-platform.

I wrote a blog post about setting up an offline mirror cache a couple years ago:

https://blog.isquaredsoftware.com/2017/07/practical-redux-pa...

I used it on my last couple of projects at work, and it worked out quite well for us.
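For reference, in Yarn v1 ("classic") the whole setup is a couple of `.yarnrc` entries (the folder name here is arbitrary, pick whatever suits your repo):

```
# .yarnrc -- Yarn v1 offline mirror settings

# Store downloaded package tarballs in this folder (commit it to the repo):
yarn-offline-mirror "./npm-packages-offline-cache"

# Delete tarballs from the mirror when they drop out of the lockfile:
yarn-offline-mirror-pruning true
```

Run `yarn install` once to populate the folder and commit it; after that, a fresh clone can install with `yarn install --offline` and never touch the network.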




That's quite interesting, although back in the day we did the same thing for C dependencies that weren't packaged well, and it quickly ballooned the size of our repo, since Git has to treat tarballs as opaque binaries. Even if a patch version changes only a few lines of the dependency, you re-commit the entire 43 MB tarball (obviously that depends on the size of your tarball).


You could use Git LFS to store anything ending in a tarball extension. It's pretty well supported by most Git servers (GitHub and GitLab both support it, off the top of my head). You do need the LFS extension installed for Git to use it.
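For anyone who hasn't used it, the setup is small; the `*.tgz` pattern and folder name below are just examples, match whatever your mirror actually uses:

```
# One-time per machine:
git lfs install

# Per repo: track tarballs (this writes a rule into .gitattributes)
git lfs track "*.tgz"
git add .gitattributes

# Commit tarballs as usual -- Git history only stores small pointer
# files; the blobs live on the LFS server
git add npm-packages-offline-cache/
git commit -m "Add offline mirror tarballs via LFS"
```

Updating a tarball then only adds a new LFS object instead of another multi-MB binary delta in the packfiles.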


Another similar approach is to build in containers and use Docker layers to hold the dependencies.
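The usual trick there is to copy only the manifest files before the install step, so the dependency layer stays cached until the lockfile changes. A sketch, assuming a Yarn v1 project (image tag and build command are illustrative):

```
# Dependency layer: rebuilt only when these two files change
FROM node:18-alpine
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile

# App layer: rebuilt on any source change, deps above stay cached
COPY . .
RUN yarn build
```

Editing application code then reuses the cached `yarn install` layer instead of re-downloading everything.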


verdaccio aims to do this as a proxy: https://github.com/verdaccio/verdaccio
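A minimal sketch of that setup (4873 is verdaccio's default port):

```
# Run the proxy; it caches upstream packages on first request
npx verdaccio

# Point npm at it:
npm set registry http://localhost:4873/

# or per-project for Yarn v1, in .yarnrc:
#   registry "http://localhost:4873/"
```

After the first install, subsequent installs are served from verdaccio's local cache, so nothing needs to be committed to the repo at all.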



