It's worth mentioning that the package manager component of Anaconda is released as a separate (small) installer: Miniconda[1]. It includes only Python and the package manager (conda), not the 700+ packages installed as part of a full Anaconda installation.
With its ability to install Python and non-Python packages (including binaries), conda is my go-to for managing project environments and dependencies. Between the bioconda[2] and conda-forge[3] channels, it meets the needs of many on the computational side of the biological sciences. Being able to describe a full execution environment with a YAML file is a huge win for replicable science.
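For anyone who hasn't seen one, a minimal environment.yml looks something like this (the project name and pinned versions here are just illustrative):

    name: myproject
    channels:
      - conda-forge
      - bioconda
    dependencies:
      - python=3.9
      - numpy=1.21
      - samtools=1.13

Then `conda env create -f environment.yml` rebuilds the whole environment on another machine, non-Python binaries like samtools included.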
It is overkill for pure-Python packages or packages with simple C extensions, though. Conda was developed specifically to handle non-Python dependencies, which would be difficult to build in setup.py.
Also, a conda package is not a replacement for a distutils/setuptools package. When building a conda package, one still calls setup.py, so every Python conda package has to be a distutils/setuptools package anyway.
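To make that concrete: a minimal conda-build recipe (meta.yaml) for a hypothetical pure-Python package just wraps the setuptools machinery, roughly like this:

    package:
      name: mypkg
      version: "0.1.0"

    source:
      path: .

    build:
      script: python setup.py install --single-version-externally-managed --record=record.txt

    requirements:
      host:
        - python
        - setuptools
      run:
        - python

The build script is still setup.py; conda-build only packages the result.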
Thanks for the caveat. Nonetheless, Anaconda makes my life so much easier when working with Python libraries. If anybody has other reasons to be careful with it, I'm interested!
conda environments support pip and arbitrary pip commands. So if you use pip to, say, install a specific version of a library directly from GitHub, that information will be stored in your conda environment and reproduced every time you recreate it.
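For example, an environment.yml can carry a pip section alongside the conda packages (the repo URL and tag below are hypothetical):

    name: myproject
    dependencies:
      - python=3.9
      - pip
      - pip:
        - git+https://github.com/someuser/somelib.git@v1.2.3

`conda env create -f environment.yml` installs the conda packages first, then hands the pip section to pip, pinned git tag and all.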
It seems that Anaconda is under-appreciated outside of pydata circles. Before using it I had no idea that it could manage virtual environments, dependencies, and different versions of Python.
The fact that it's not a community-driven project might be one of the reasons.
Meanwhile, in a galaxy far away, people are also using buildout.
I have been revisiting Buildout recently, and I wish there were something that merged the ease of use of Pipenv with Buildout's concepts. Perhaps with something like Nix thrown into the mix, but more specific to Python projects. I've heard that I can do this with Conda, but I've never tried.
Being able to define and install external dependencies (e.g. ImageMagick, libsodium, etc.) from a configuration file local to a project is the thing I missed the most, especially when I'm working on several projects at once.
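That is exactly what a project-local environment.yml gives you with conda: both packages named above are on conda-forge, so a sketch (project name illustrative) would be:

    name: mediaproject
    channels:
      - conda-forge
    dependencies:
      - python=3.9
      - imagemagick
      - libsodium

Each project carries its own copy of the external libraries inside its env, so they don't collide across projects.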
Nix does everything I want, but I find it hard to convince friends and coworkers to try it out. I think this is partly because Nix doesn't belong to the Python ecosystem, so the barrier is higher than, say, "Yeah, Pipenv is just virtualenv + pip."
- there is Miniconda, which doesn't force you to install all the PyData packages.
- virtual envs and the packages they need are all defined in a simple YAML file.
- it works well with pip, so if a package isn't in a conda channel, you can install it from pip. The annoyance here is that you must try conda, fail, and then try pip.
- you can easily clone envs, so you can keep some base envs with your usual packages (or one for Python 2 and another for Python 3) and just clone them to start a new project; see the commands sketched below.
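A sketch of that workflow (env and package names are made up):

    # create an env from a YAML spec
    conda env create -f environment.yml

    # fall back to pip when a package isn't in any conda channel
    conda install somelib || pip install somelib

    # clone a base env to start a new project
    conda create --name newproject --clone py3base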
For one, as a package developer, publishing a source distribution of a package on PyPI is almost trivial. Publishing on Anaconda Cloud requires you to build binary packages for every OS you want to support (and for every Python version you want to support), which most people delegate to some CI. So there is a whole new level of complexity involved.
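Roughly, the difference in effort looks like this (package name and artifact path are hypothetical, and the conda builds have to be repeated on a machine or CI runner per OS):

    # PyPI: one source distribution covers every platform
    python setup.py sdist
    twine upload dist/*

    # Anaconda Cloud: build per platform and per Python version, then upload each artifact
    conda build recipe/ --python 3.7
    conda build recipe/ --python 3.8
    anaconda upload ~/conda-bld/linux-64/mypkg-0.1.0-py38_0.tar.bz2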
When I have used it (and I have to for a certain project), it is incredibly slow to resolve dependencies. Enough so that I go for a walk or do something else for 15 minutes while it thinks about whatever it's doing.
Yeah, this is becoming a real problem. This didn't use to happen, but now conda is slow to the point of being unusable if you need to create environments a lot (like during testing)