I don't use venv and other tools (I use docker for this)
But here are some points I found interesting when comparing vanilla pip to npm (the tools listed in the article fix them):
1. You have to freeze packages manually (instead of an automatic package-lock.json)
2. Each time you install or remove a package, its dependent packages are not removed from the freeze; you have to do that manually. (Interesting link: https://github.com/jazzband/pip-tools)
3. The freeze is a flat list (npm can restore the tree structure)
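A minimal sketch of the difference (file names are examples; the pip-tools commands at the end are illustrative and assume pip-tools is installed):

```shell
# vanilla pip: the "lock file" is a manual, flat snapshot of everything installed
python3 -m pip freeze > requirements.txt
head requirements.txt

# Uninstalling a package later leaves its dependants pinned in requirements.txt
# unless you prune them by hand. pip-tools (linked above) automates both steps:
#   pip-compile requirements.in   # top-level deps in, fully pinned lock file out
#   pip-sync requirements.txt     # make the environment match the lock exactly
```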
https://python-poetry.org/ and pyproject configuration are pretty decent these days and handle all those points for you. We may be forever stuck with 2 layers in Python (dependency management and actual installation) where node has them merged into 1, but good tools solve that pretty well.
pip is not very comparable to npm imo because it never evolved the ability to really handle projects, create venvs on its own, create lock files, etc.
Poetry is what's most comparable, and personally, I am not working in Python projects that don't use poetry anymore. Whenever I work on a new one, I port it to use poetry.
I also insist on poetry for any Python modules or applications. If it's going to support multiple versions of Python, then throw pyenv in for development and CI/CD.
I agree and wholeheartedly recommend this blog post series.
It's something every python developer should read because if you're not using poetry for example, then you're making your development life more difficult than necessary.
Poetry still has to deal with Python's flat dependency structure which means it needs a complicated dependency resolver. This isn't a bad thing though, imo. There are pros and cons.
My intuition: a Python interpreter gets one `PYTHONPATH`, so when my app does `import my_dep` and another dependency also imports `my_dep`, the interpreter resolves both to the same place; they cannot point at different source locations for `my_dep` (being able to do so is, I believe, what's called "vendoring").
Again, I'm not sure whether that is why Python can't support a dependency tree or simply why it doesn't currently support one; I searched for a PEP explicitly defining dependency resolution but couldn't find one.
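That flat-namespace intuition is easy to demonstrate: imported modules are cached in `sys.modules` keyed by name alone, so every importer in the process gets the same module object. Here the stdlib's `json` stands in for a hypothetical shared dependency:

```python
import sys

# Python caches each imported module in sys.modules, keyed only by its
# name -- there is no per-importer namespace. So when your app and one
# of its dependencies both import the same name, they necessarily get
# the same module object (and therefore the same version).
import json as seen_by_my_app
import json as seen_by_some_dep   # stand-ins for two independent importers

print(seen_by_my_app is seen_by_some_dep)     # True: one shared object
print(sys.modules["json"] is seen_by_my_app)  # True: the flat cache entry
```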
Sure, but then you lose the intentionality signal of an explicit version. Ruby gems works very nicely by not requiring version numbers but always respecting the lockfile unless you explicitly update one or more packages, so when you add a package with no version you get the latest by default, and you can easily do incremental upgrades that respect version dependencies. Later if you need to lock to a specific version you add it (ideally with a comment about why it's locked to that version) and still utilize the automatic dependency version resolution for updates.
Throwing this out there for criticism... I use `python3 -m pip install -t .pip -r requirements.txt` and add .pip to my PYTHONPATH. That works for me without having to use any of the Python virtual env tooling; it's basically an attempt to get something more npm-like.
I don't work on any significant python code bases, so I expect it has limitations when compared to the virtual env options like developing with specific (or multiple) python versions.
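A runnable sketch of that layout. The actual pip line from above is shown commented out to avoid a network fetch; a stub module stands in for an installed package:

```shell
# install dependencies into a project-local .pip directory instead of a venv
mkdir -p .pip
# python3 -m pip install -t .pip -r requirements.txt
echo 'GREETING = "hello from .pip"' > .pip/localdep.py   # stub "package"

# prepend .pip so the interpreter resolves imports from there first
PYTHONPATH="$PWD/.pip:$PYTHONPATH" python3 -c 'import localdep; print(localdep.GREETING)'
# prints: hello from .pip
```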
Hm, this sounds like something I would want to use, but I just tried it with an Ubuntu Bionic server on Dreamhost (Python 3.6) and got an exception installing flask and flup:
distutils.errors.DistutilsOptionError: can't combine user with prefix, exec_prefix/home, or install_(plat)base
I don't understand why this doesn't work, but it seems like it should.
I was able to install those 2 packages in a virtualenv (although weirdly Dreamhost requires you to build your own Python 3 to do that; python3 -m venv doesn't work).
I've been programming Python since 2003 and have no idea what's wrong :-(
TBH I thought I was a luddite for avoiding virtualenv and pip for a long time. I would download tarballs and use distutils to build them! But for this project I'm using flask, which has more dependencies... Gah. And now I'm seeing all the downsides...
I love this kind of article: it takes a tool I use often but never had enough motivation to figure out exactly what it does.
I learn best how something works when I try to re-create it, but there isn't enough time to do that for everything. Explanations like this one simulate that process for me.
Regarding the big list at the beginning of the article, which may seem daunting, IMO you just need venv, and I'd also add poetry. pyenv and tox are useful if you need to support multiple Python versions.
- pyenv is used to manage which Python versions you have installed
- venv comes with Python and is used to actually create virtualenvs
- `poetry install` will create a virtualenv for you, install your packages into it, and create a lock file (assuming your project is configured to use poetry)
- tox is used to run your test suite against multiple Python versions (but you don't use it to directly manage virtualenvs)
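For concreteness, here is a minimal `pyproject.toml` that `poetry install` would pick up; every name and version below is a made-up example, not taken from the article:

```toml
# minimal poetry-managed project; all names/versions are examples
[tool.poetry]
name = "my-app"
version = "0.1.0"
description = "example project"
authors = ["Jane Example <jane@example.com>"]

[tool.poetry.dependencies]
python = "^3.11"
requests = "^2.31"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

With this in place, `poetry install` creates the virtualenv, installs the dependencies and their transitive deps into it, and writes `poetry.lock` next to the file.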
pyenv is not just useful; I would argue it's a must, for the simple fact that you will eventually need a different Python installation than your machine's default. It also helps keep everything separate.
Tox is another great tool, but imho only if your project has a strict requirement to support multiple Python versions; otherwise it's wasted effort.
I found it quite interesting that your "what's the point" section only has one point in it: avoid conflicting dependencies.
I found it interesting because I am generally in the distro-package camp vs venvs, and I do not see any other point myself. And for conflicting dependencies, I strive to solve that with barebone VMs to run individual "services" (or containers if security is not a concern).
I really find it a bit sad the lengths all the Python devs will go to just to compensate for the entrenched core deficiencies in their platform, without actually uprooting said deficiencies once and for all.
What's stopping the wider community from finally adopting some sort of namespacing / switches / gemsets / per-project environments? And I mean automatic: you `cd` into the directory and it's handled for you, similar to the functionality of the `direnv` and `asdf` generic tools, and to how Elixir's mix, Rust's cargo, and Ruby's RVM isolate apps and their dependencies.
Why is Python lagging behind so many other ecosystems? Why is it so persistent in not fixing this? It's obvious it's not going anywhere and a lot of people are using it. Why not invest in making it as ergonomic as possible?
And don't give me the "backwards compatibility" thing now. Everyone I know that uses Python also uses several such tools on top of the vanilla experience -- so I'd argue the vanilla experience is mostly a theoretical construct for years now and can be almost safely assumed to not exist.
(And I get sadder engaging with HN these days. If you don't think I am right, engage in an argument. This downvote-and-navigate-away practice that's been creeping from Reddit into HN isn't doing this place any favours, and over time it erodes the community aspect for me.)
Neither the practice of downvoting on HN nor the rules around it have changed in many years. If you're noticing it more, there must be some other reason for that.