I also use poetry for everything. I have 0 problems; things work on my Mac, my intern's PC, AWS instances. I don't even see what problem people are having. Before that I was using pipenv, and before that just good old requirements.txt - there were a few occasional issues, but really not much even then. At this point, I suspect it is more about regurgitating a complaint than a real issue. But I could be lucky and completely wrong...
- until a few months ago no way to sync an environment with a lockfile (remove packages that shouldn't be there)
- no way to check if the lock file is up to date with the toml file
- no way to install packages from source if the version number is calculated (this will likely never be fixed, as it's a design decision to use static package metadata instead of setup.py, but it is an incompatibility with pip; rough sketch of what I mean just after this list)
- no way to handle multiple environments: you get dependencies and dev-dependencies and that's it. You can fake it with extras, but it's a hack
- if you upgrade to a new Python minor version you also have to upgrade to the latest poetry version or things just fail (something to do with the correct selection of vendored dependencies; this may have since been fixed -- new Python versions don't come out often enough for me to run into it. In fairness, the latest pip is typically bundled with each Python release, so it avoids that issue)
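To make the calculated-version point concrete, here's a rough, hypothetical setup.py of the kind I mean: the version is computed at build time (from the git tag in this sketch), so there's no static string for Poetry's metadata to record.

```python
# Hypothetical sketch only: a setup.py whose version is calculated at build
# time, which static pyproject.toml metadata can't express.
import subprocess
from setuptools import setup, find_packages

def compute_version():
    # Derive the version from the most recent git tag, e.g. "v1.2.3" -> "1.2.3"
    tag = subprocess.check_output(
        ["git", "describe", "--tags", "--abbrev=0"], text=True
    ).strip()
    return tag.lstrip("v")

setup(
    name="example-package",     # hypothetical name
    version=compute_version(),  # calculated, not a static literal
    packages=find_packages(),
)
```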
I still use poetry because it's more standard than hand-rolled pip freeze wrapper scripts, and there's definitely progress (the inability to sync packages was a hard requirement for me, but that is now fixed), but it's not quite there yet.
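In case it's unclear what I mean by syncing: roughly, anything installed in the environment that the lock file doesn't list gets removed. A crude sketch of the idea (this is not how Poetry implements it, and package-name normalisation is hand-waved):

```python
# Crude sketch of "sync": uninstall anything present in the environment that
# poetry.lock doesn't mention. Illustration only, not Poetry's implementation.
import subprocess
import sys
import tomllib  # Python 3.11+; use the third-party "tomli" on older versions

with open("poetry.lock", "rb") as f:
    locked = {p["name"].lower() for p in tomllib.load(f)["package"]}

installed = {
    line.split("==")[0].lower()
    for line in subprocess.check_output(
        [sys.executable, "-m", "pip", "freeze"], text=True
    ).splitlines()
    if "==" in line  # skip editable/VCS entries
}

# Anything installed but not locked is a candidate for removal.
for extra in sorted(installed - locked):
    subprocess.run(
        [sys.executable, "-m", "pip", "uninstall", "-y", extra], check=True
    )
```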
Interesting, I usually rebuild my env from packages so I don't notice 1, 2, or 3. I guess 2 should be fixable by poetry by including more of the toml in the lock file. Point 4 also didn't bother me, as I generally just have the main and dev deps; this seems an easier thing for poetry to fix, though. I actually have encountered 5 when fiddling around with pyenv.
If you don't need C or C++ dependencies it's ok. If you do, it's very, very painful. To be fair, most of the DS libraries can be handled by conda, but if you need both conda and pip, then you're going to have a bad time. (Source: this is my life right now.)
Oh man, this is my life right now, too. In my case, we're using tensorflow or tensorflow-gpu, depending on the host system and, unfortunately, only Conda offers tensorflow-gpu with built-in CUDA. Add to this that the tensorflow packages themselves are notoriously bad at specifying dependencies and that different versions of tensorflow(-gpu) are available on conda-forge, depending on your OS.
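To illustrate the "depending on the host system" part: this is roughly the kind of branching that ends up in our setup scripts (a hypothetical sketch with crude GPU detection; it sidesteps the conda/CUDA wrinkle entirely):

```python
# Hypothetical sketch of host-dependent package selection: install
# tensorflow-gpu when an NVIDIA GPU seems to be present, plain tensorflow
# otherwise. Checking for nvidia-smi is a crude heuristic, nothing official.
import shutil
import subprocess
import sys

package = "tensorflow-gpu" if shutil.which("nvidia-smi") else "tensorflow"
subprocess.run([sys.executable, "-m", "pip", "install", package], check=True)
```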
Tensorflow is the worst (along with ReAgent from FB).
I think it's because they have their own internal build systems, but they never play well with pip/conda et al.
One of my recent breakages was installing the recsim package, which pulled in tensorflow and broke my entire app. There's actually a recsim-no-tf package on PyPI, presumably because this happens to loads of people.