Hacker News

> Sure you can pip freeze your dependencies out to a file but this includes dependencies of dependencies, not just your app's top level dependencies.

Isn't that a good thing?

> no concept of a separate lock file with pip.

setup.py/.cfg vs requirements.txt, no?



> Isn't that a good thing?

Yes, a very good thing.

> setup.py/.cfg vs requirements.txt, no?

A lot of web applications aren't proper packages in the sense that you pip install them.

They end up being applications that you run with a Python interpreter: they happen to have dependencies, and you kick things off by running an app server like gunicorn or uwsgi.

To translate what other languages do into Python terms: you would have a requirements.txt file with your top-level dependencies, and when you run a pip install it would auto-generate a separate requirements.lock file with all deps pinned to their exact versions. You'd commit both files to version control, but you would only ever modify requirements.txt by hand. If a lock file is present, it gets used during a pip install; otherwise pip would fall back to your requirements.txt file.
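As a sketch of that hypothetical workflow (the lock-file name and every version number below are illustrative, not real pip behavior), the two files might look like:

```text
# requirements.txt -- hand-edited, top-level deps only
flask>=1.1,<2.0
gunicorn>=20.0

# requirements.lock -- auto-generated, every dep pinned exactly
# (hypothetical; pip does not produce this file itself)
click==7.1.2
flask==1.1.2
gunicorn==20.0.4
itsdangerous==1.1.0
jinja2==2.11.2
markupsafe==1.1.1
werkzeug==1.0.1
```

The point of the split is that the first file stays readable while the second one makes the build deterministic.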

The above workflow is how Ruby, Elixir and Node's package managers operate out of the box. It seems to work well in practice for keeping your top-level deps readable and your builds deterministic.

Currently there's no sane way to replicate that behavior using pip. That's partly why other Python package managers have come into existence over the years.


I don't understand the distinction you're making. Are you pip-installing or not? If not, why not?

My method for deploying a web application is to have a Dockerfile which pip-installs the Python package, but I could see someone using a Makefile to pip-install from requirements.txt instead. In fact, I use `make` to run the commands in my Dockerfile.
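A minimal sketch of that kind of Dockerfile (the base image, layout and the `myapp` module name are made up), installing the application as a package rather than from requirements.txt:

```dockerfile
# Assumes a setup.py/setup.cfg at the repo root describing the package.
FROM python:3.9-slim
WORKDIR /app
COPY . .
# Install the app itself; pip pulls in its declared dependencies.
RUN pip install .
CMD ["gunicorn", "myapp.wsgi:app"]
```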


> Are you pip-installing or not? If not, why not?

I run pip install -r requirements.txt when I install new dependencies. I happen to be using Docker too, but I don't think that matters much in the end.


Docker does matter, because the Docker image should take the place of requirements.txt (your "locked" dependencies) in your deployment process. I suggest you pip-install the package, rather than the package's requirements.txt file.


> Docker does matter, because the Docker image should take the place of requirements.txt (your "locked" dependencies) in your deployment process.

In practice it doesn't, though.

Let's say I'm working on a project without a lock file and commit a change that updates my dependencies. I get distracted by something and don't push the code for a few hours.

I come back and push the code. CI picks it up and runs a docker-compose build and pushes the image to a container registry, then my production server pulls that image.

With this workflow there's no guarantee that I'll get the same dependencies of dependencies in dev vs prod, even with Docker. During those few hours before I pushed, a dep of a dep could have been updated, so CI is now different from dev. Tests will hopefully catch any resulting breakage, but ultimately it boils down to this: you can't depend on version guarantees with Docker alone.

There's also the issue of multiple developers. Without a lock file, devs A and B could end up with different local dependency versions when they each build their own copy of the image.
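One way to see that drift concretely is to diff the `pip freeze` output from two environments. This helper is just an illustration (the function names and the sample version numbers are made up):

```python
def freeze_to_dict(freeze_output):
    """Parse `pip freeze`-style 'name==version' lines into a dict."""
    deps = {}
    for line in freeze_output.strip().splitlines():
        name, _, version = line.partition("==")
        deps[name.lower()] = version
    return deps

def dependency_drift(dev_freeze, ci_freeze):
    """Return {package: (dev_version, ci_version)} where the two differ."""
    dev, ci = freeze_to_dict(dev_freeze), freeze_to_dict(ci_freeze)
    return {
        pkg: (dev.get(pkg), ci.get(pkg))
        for pkg in sorted(set(dev) | set(ci))
        if dev.get(pkg) != ci.get(pkg)
    }

# A dep of a dep (werkzeug) was released between the dev build and the CI build:
dev = "flask==1.1.2\nwerkzeug==0.16.1"
ci = "flask==1.1.2\nwerkzeug==1.0.1"
print(dependency_drift(dev, ci))  # {'werkzeug': ('0.16.1', '1.0.1')}
```

With a lock file committed, both environments feed identical pins to pip and this diff is empty by construction.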

I've seen these types of issues happen all the time with Flask development. Flask doesn't restrict Werkzeug versions, so you wake up one day, rebuild your image locally because you changed an unrelated dependency, and suddenly your app breaks: you had Werkzeug 0.9.x, but 1.x was released, and you never pinned Werkzeug in your requirements.txt because you assumed Flask would constrain it. The same goes for SQLAlchemy: it's easy to forget to pin it because you brought in and pinned Flask-SQLAlchemy, but Flask-SQLAlchemy doesn't restrict SQLAlchemy versions.
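The defensive fix is to pin the transitive deps you actually rely on, even when a direct dep brought them in (the version numbers below are only examples):

```text
# requirements.txt
flask==1.1.2
werkzeug==1.0.1          # pinned explicitly: Flask does not cap it
flask-sqlalchemy==2.4.4
sqlalchemy==1.3.20       # pinned explicitly: Flask-SQLAlchemy does not cap it
```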

Long story short, a lock file is super important with or without Docker.


Use the same method to verify in dev as in staging (Docker image). If you don't know it works in staging, then you didn't know in dev either.


Yes, but don't underestimate the power of convention.

If you make pip run 'pip freeze > requirements.txt.lock' after every 'pip install whatever', you almost solve that particular problem, provided setup.py is configured to parse that file (it isn't by default, and there's no easy way to do it!).
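A hedged sketch of what "configuring setup.py to parse that" could look like. The lock-file name and the helper are invented; this is not built-in pip or setuptools behavior:

```python
from pathlib import Path

def read_lock_file(path="requirements.txt.lock"):
    """Parse pinned 'name==version' lines from a pip-freeze-style lock file,
    skipping blank lines and comments."""
    return [
        line.strip()
        for line in Path(path).read_text().splitlines()
        if line.strip() and not line.strip().startswith("#")
    ]

# In setup.py you would then do something like:
#
#   from setuptools import setup, find_packages
#   setup(
#       name="myapp",                       # hypothetical package name
#       packages=find_packages(),
#       install_requires=read_lock_file(),  # exact pins from the lock file
#   )
```

Feeding exact pins into install_requires is unusual for libraries, but for an application it mimics the lock-file behavior described above.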


That's the whole point of distinguishing between logical dependencies and reproducibility dependencies. I use setup.cfg to describe the logical dependencies, and I supply a requirements.txt (or environment.yml, or a Dockerfile) to provide the tools necessary to create a deployable build.
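A sketch of that split (package names and version bounds invented): loose logical deps in setup.cfg, exact pins in requirements.txt:

```text
# setup.cfg -- logical dependencies: what the code needs, loosely bounded
[options]
install_requires =
    flask>=1.1,<2.0
    sqlalchemy>=1.3,<1.4

# requirements.txt -- reproducibility: one known-good environment
flask==1.1.2
werkzeug==1.0.1
sqlalchemy==1.3.20
```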




