
FWIW I had a lot of success using https://github.com/jazzband/pip-tools to have dependencies automatically managed in a virtualenv.

* Basically I would have a single bash script that every `.py` entrypoint symlinks to.

* Beside that symlink is a `requirements.in` file that just lists the top-level dependencies I know about.

* There's a `requirements.txt` file, generated via pip-tools (`pip-compile`), that pins every dependency (transitive ones included) to an explicit version number.

* The bash script then makes sure there's a virtual environment in that folder & that the installed package list exactly matches the `requirements.txt` file (i.e. any extra packages are uninstalled, any missing or version-mismatched packages are installed correctly) - a sketch of that script is below.
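
Here's a minimal sketch of what that wrapper could look like. The `.venv` location, the hash-stamp shortcut, and the "symlink name -> sibling `.py`" convention are my assumptions; the actual sync is just pip-tools' `pip-sync`, which installs/uninstalls until the environment matches `requirements.txt` exactly.

    #!/usr/bin/env bash
    # Shared wrapper: every tool's symlink points at this script.
    set -euo pipefail

    tool_dir="$(cd "$(dirname "$0")" && pwd)"    # folder containing the symlink
    entry="${tool_dir}/$(basename "$0").py"      # assumed convention: foo -> foo.py
    venv="${tool_dir}/.venv"

    # Create the virtualenv on first run & give it pip-tools so pip-sync exists.
    if [ ! -x "${venv}/bin/python" ]; then
        python3 -m venv "${venv}"
        "${venv}/bin/pip" install pip-tools
    fi

    # Re-sync only when requirements.txt has changed since the last run.
    # (sha256sum assumes GNU coreutils; use shasum -a 256 on macOS.)
    stamp="${venv}/.requirements.sha256"
    want="$(sha256sum "${tool_dir}/requirements.txt" | cut -d' ' -f1)"
    if [ ! -f "${stamp}" ] || [ "$(cat "${stamp}")" != "${want}" ]; then
        # pip-sync installs missing/mismatched packages & removes extras,
        # so the venv ends up matching requirements.txt exactly.
        "${venv}/bin/pip-sync" "${tool_dir}/requirements.txt"
        echo "${want}" > "${stamp}"
    fi

    exec "${venv}/bin/python" "${entry}" "$@"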

This was great because during development, if you wanted to add a new dependency or change an installed version (e.g. `pip-compile -U` to update the whole dependency set), it didn't matter what the build server had installed & any diff could be tested independently & inexpensively. When developers pulled a new revision, they didn't have to muck about with the virtualenv - they could just launch the script without thinking about Python dependencies. Finally, unrelated pieces of code had their own dependency chains, so there wasn't even a global project-wide set of dependencies (e.g. if one tool depends on component A, the other tools don't need to pull it in).
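
Day-to-day, the pip-tools side of this was roughly the following (the package names are just placeholders):

    # requirements.in - only the top-level deps you actually care about:
    #   requests
    #   click

    pip-compile requirements.in      # resolve & pin everything into requirements.txt
    pip-compile -U requirements.in   # re-resolve, upgrading the pins to the latest versions
    git add requirements.in requirements.txt   # pin changes show up in review like any other diff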

I viewed the lack of a `setup.py` as a good thing - deploying new versions of tools was a git push away, rather than relying on Chef or having users install new versions manually.

This was the smoothest setup I've ever used for running Python from source without adopting something like Bazel/BUCK (which add a lot of complexity for ingesting new dependencies, since you can't leverage pip & they don't support running the Python scripts in place).


