You make sure everyone uses the same Python version by setting it as a dependency. I mentioned I only pin dependencies to the minor version (e.g. 2.7) rather than the patch version (e.g. 2.7.3), but the latter is supported - for the Python version as well as for any other package.
You make sure the exact versions you want are in use by editing and curating the requirements.txt rather than freezing it. It really is that simple, but somehow that's not a common workflow.
prod vs. dev is the only one I didn't address because I don't have experience with that - but I know people who manage "requirements_prod.txt" vs "requirements_dev.txt" and it seems to work for them.
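To make the version-pinning point concrete, here is a minimal sketch of what a conda environment file with a minor-version Python pin might look like (the project name and package versions are purely illustrative, not from the original comment):

```yaml
name: myproject
dependencies:
  - python=2.7       # minor-version pin, as described above
  - numpy=1.16       # minor-version pin
  - pandas=0.24.2    # exact patch-version pin is equally supported
```

The same pinning syntax (`name=version` for conda, `name==version` for pip-style requirements.txt) applies whether you pin loosely or exactly.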
How do you manage the specific versions of transitive dependencies? For example, I install library A and it requires library B, but my code doesn’t use B directly.
A better question: what’s your workflow for installing a dependency and then, 6 months later, updating it?
Just put them in your requirements.txt even if you don’t depend on them directly.
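A hedged sketch of what that looks like in practice (the packages and versions below are illustrative examples, not taken from the thread):

```text
# requirements.txt - manually curated, including transitive deps

# direct dependency
requests==2.31.0

# transitive dependencies of requests, pinned explicitly even though
# our own code never imports them
urllib3==2.0.7
certifi==2023.7.22
```

Pinning the transitive packages alongside the direct ones is what makes the file behave like a lock file rather than a wishlist.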
That’s basically the whole idea of conda’s solver, I think. If your list works - fine. If not, it will find a set of package installs that makes it work.
I guess that’s also why my resolution times are way faster than what some people describe.
I treat requirements.txt closer to a manually curated lock file than a wishlist (which most req.txt files in the wild are).
This is a giant pain in the ass compared to every other modern language. Either you’re manually managing precise versions for all direct and transitive dependencies, or you aren’t running exactly the same library versions across team members and environments - which would be horrifying.
There is a command (not freeze) that dumps the exact versions deployed, similar to a lock file. I never used it, because it doesn’t work well across operating systems (e.g. do it on Windows and try to use it on Linux). This is less of a Conda problem and more of a Python problem - same package name/version has different dependencies on different architectures.
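For reference, one conda command that produces this kind of exact, lock-file-like dump is the explicit list export (I’m assuming this is the sort of command the comment means; it may not be the exact one):

```shell
# Dump the exact package set of the active environment, including
# build strings; the resulting file is platform-specific.
conda list --explicit > spec-file.txt

# Recreate the environment from that file (same OS/architecture only).
conda create --name myproject-copy --file spec-file.txt
```

Because the file records platform-specific builds, a spec dumped on Windows generally won’t reproduce on Linux, matching the cross-OS caveat above.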
But manually curating the dependencies has been painless and works fine for me and my team for almost a decade now.