- pip doesn't handle your Python executable, just your Python dependencies. So if you want or need to swap between Python versions (3.11 to 3.12, for example), it doesn't give you anything. Generally people use an additional tool such as pyenv to manage this. Tools like uv and Poetry do this as well as handling dependencies.
- pip doesn't resolve dependencies of dependencies. pip will only respect version pinning for dependencies you explicitly specify. So for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies. I've had many issues where my environment stopped working despite none of my specified dependencies changing, because underlying dependencies introduced breaking changes. To get around this with pip you would need an additional tool like pip-tools, which allows you to pin all dependencies, explicit and nested, to a lock file for true reproducibility. uv and poetry do this out of the box.
- Tool usage. Say there is a Python package you want to use across many environments without installing it in the environments themselves (such as a linting tool like ruff). With pip, you need another tool like pipx to install something that can be used across environments. uv can do this out of the box.
Plus there is a whole host of jobs that tools like uv and Poetry aim to help with that pip doesn't, namely project creation and management. With a single command, uv can scaffold a new Python project for an application or module in a way that conforms to the relevant PEP standards. It also supports workspaces of multiple projects that have separate functionality but need their dependencies kept in sync.
You can accomplish much or all of this using pip plus additional tooling, but it's a lot more work, and not every use case needs all of it.
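For a sense of what that looks like in practice, here's a rough sketch of the uv workflow (based on uv's current CLI; the project and package names are just examples):

```
uv python install 3.12          # fetch and manage an interpreter, no system Python needed
uv init myproject && cd myproject
uv add pandas                   # records the dependency in pyproject.toml and pins the full tree in uv.lock
uv sync                         # create/update the venv from the lock file
uv tool install ruff            # a tool usable across environments, kept in its own isolated env
```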
You are incorrect about needing to use an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment. In fact, this behavior is made more difficult by tools like `uv` or `pipx` if they're trying to manage Python executables as well as dependencies.
This assumes the Python version you need is available from your package manager's repo. This won't work if you want a Python version either newer or older than what is available.
> You are incorrect about needing to use an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment.
True, but it's not best practice to do that because while the tool gets installed globally, it is not necessarily linked to a specific python version, and so it's extremely brittle.
And it gets even more complex if you need different tools that have different Python version requirements.
>This assumes the Python version you need is available from your package manager's repo. This won't work if you want a Python version either newer or older than what is available.
And of course you could be working with multiple distros, and multiple versions of the same distro; production and dev might be different environments; and there are tons of other concerns. You need something that just works across all of them.
You almost need to use Docker for deploying Python because the tooling is so bad that it's otherwise very difficult to get a reproducible environment. For many other languages the tooling works well enough that there's relatively little advantage to be had from Docker (although you can of course still use it).
>> You are incorrect about needing to use an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment.
>True, but it's not best practice to do that because while the tool gets installed globally, it is not necessarily linked to a specific python version, and so it's extremely brittle.
"Globally" means installed with sudo. These are installed into the user folder under ~/.local/ and called a user install by pip.
I wouldn't call it "extremely brittle" either. It works fine until you upgrade to a new version of python, in which case you install the package again. Happens once a year perhaps.
The good part of this is that unused cruft will get left behind and then you can delete old folders in ~/.local/lib/python3.? etc. I've been doing this over a decade without issue.
> "Globally" means installed with sudo. These are installed into the user folder under ~/.local/ and called a user install by pip.
> It works fine until you upgrade to a new version of python, in which case you install the package again.
Debian/Ubuntu doesn't want you to do either, and tells you you'll break your system if you force it (the override flag is literally named `--break-system-packages`). Hell, if you're doing it with `sudo`, they're probably right - messing with the default Python installation (such as trying to upgrade it) is the quickest way to brick your Debian/Ubuntu box.
Incredibly annoying when your large project happens to use pip both to install libraries for the Python part and to install tools like CMake and Conan, meaning you can't just put it all in a venv.
Ok, getting it now. I said "upgrade Python", and you thought I meant upgrading the system Python in conflict with the distro. But that's not really what I meant. To clarify: I almost never touch the system Python, but I upgrade the distro often, and almost every Ubuntu/Mint release has a new system Python version these days.
So: upgrade to the new distro release, which has a new Python. Then `pip install --user` your user tools - twine, httpie, ruff, etc. Takes a few moments, perhaps once a year.
I do the same on Fedora, which I've been using more lately.
Nah, pip is still brittle here because it uses one package resolution context to install all your global tools. So if there is a dependency clash you are out of luck.
> exceedingly unlikely to because your user-wide tools should be few.
Why "should"? I think it's the other way around - Python culture has shied away from user-wide tools because it's known that they cause problems if you have more than a handful of them, and so e.g. Python profilers remain very underdeveloped.
There simply are few; I don't shy away from them. Other than the tools replaced by ruff, plus httpie, twine, ptpython, yt-dlp, and my own tools, I don't need anything else. Most "user" tools are provided by the system package manager.
All the other project-specific things go in venvs where they belong.
This is all a non-issue despite constant "end of the world" folks who never learned sysadmin and are terrified of an error.
If libraries conflict, uninstall them and put them in a venv. Why do all the work up front? I haven't had to do that in so long I forget how long it's been - early this century.
> This is all a non-issue despite constant "end of the world" folks who never learned sysadmin and are terrified of an error.
It's not a non-issue. Yes, it's not a showstopper, but it's a niggling drag on productivity. As someone who's used to the JVM but currently having to work in Python, everything to do with package management is just harder and more awkward than it needs to be (and every so often you just get stuck and have to rebuild a venv or what have you), and the quality of tooling is significantly worse as a result. And uv looks like the first of the zillions of Python package management tools to actually do the obvious correct thing, rather than leaving you to keep shooting yourself in the foot.
It’s not a drag if you ignore it and it doesn’t happen even once a decade.
Still I’m looking forward to uv because I’ve lost faith in pypa. They break things on purpose and then say they have no resources to fix it. Well they had the resources to break it.
But this doesn’t have much to do with installing tools into ~/.local.
> pip doesn't resolve dependencies of dependencies.
This is simply incorrect. In fact the reason it gets stuck on resolution sometimes is exactly because it resolved transitive dependencies and found that they were mutually incompatible.
Here's an example which will also help illustrate the rest of my reply. I make a venv for Python 3.8, and set up a new project with a deliberately poorly-thought-out pyproject.toml:
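Something along these lines - the exact metadata and build backend don't matter, the two pins are the important part (they show up again in the error below):

```
[build-system]
requires = ["setuptools>=64"]
build-backend = "setuptools.build_meta"

[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.8"
dependencies = [
    "numpy==1.17.3",
    "pandas==2.0.3",
]
```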
I've specified the oldest version of Numpy that has a manylinux wheel for Python 3.8, and likewise the newest such version of Pandas. Each is acceptable for the venv on its own, but they are mutually incompatible on purpose.
When I try to `pip install -e .` in the venv, Pip happily explains (granted the first line is a bit strange):
```
ERROR: Cannot install example and example==0.1.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    example 0.1.0 depends on numpy==1.17.3
    pandas 2.0.3 depends on numpy>=1.20.3; python_version < "3.10"

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip to attempt to solve the dependency conflict
```
If I change the Numpy pin to 1.20.3, that's the version that gets installed. (`python-dateutil`, `pytz`, `six` and `tzdata` are also installed.) If I remove the Numpy requirement completely and start over, Numpy 1.24.4 is installed instead - the latest version compatible with Pandas' transitive specification of the dependency. Similarly, if I unpin Pandas and ask for any version, Pip will try to install the latest version it can, and it turns out that the latest Pandas version that declares compatibility with 3.8 indeed allows for fetching 3.8-compatible dependencies. (Good job not breaking it, Pandas maintainers! Although usually this is trivial, because your dependencies are also actively maintained.)
> pip will only respect version pinning for dependencies you explicitly specify. So for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies.
Well, sure; Pip can't respect a version pin that doesn't exist anywhere in your project. If the specific version of Pandas you want says that it's okay with a range of Numpy versions, then of course Pip has freedom to choose one of those versions. If that matters, you explicitly specify it. Other programs like uv can't fix this. They can only choose different resolution strategies, such as "don't update the transitive dependency if the environment already contains a compatible version", versus "try to use the most recent versions of everything that meet the specified compatibility requirements".
> To get around this with pip you would need an additional tool like pip-tools, which allows you to pin all dependencies, explicit and nested, to a lock file for true reproducibility.
No, you just use Pip's options to determine what's already in the environment (`pip list`, `pip freeze` etc.) and pin everything that needs pinning (whether with a Pip requirements file or with `pyproject.toml`). Nothing prevents you from listing your transitive dependencies in e.g. the [project.dependencies] of your pyproject.toml, and if you pin them, Pip will take that constraint into consideration. Lock files are for when you need to care about alternate package sources, checking hashes etc.; or for when you want an explicit representation of your dependency graph in metadata for the sake of other tooling.
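For instance, going back to the Pandas/Numpy example above, a minimal sketch of doing that in pyproject.toml (the Numpy pin is just the version the resolver happened to pick above; adjust to taste):

```
[project]
dependencies = [
    "pandas==2.0.3",
    "numpy==1.24.4",   # transitive dependency, pinned explicitly so it can't drift
]
```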
> This assumes the Python version you need is available from your package manager's repo. This won't work if you want a Python version either newer or older than what is available.
I have built versions 3.5 through 3.13 inclusive from source and have them installed in /opt and the binaries symlinked in /usr/local/bin. It's not difficult at all.
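The whole process is roughly this (the version number and prefix are just examples):

```
wget https://www.python.org/ftp/python/3.12.8/Python-3.12.8.tgz
tar xf Python-3.12.8.tgz && cd Python-3.12.8
./configure --prefix=/opt/python3.12 --enable-optimizations
make -j"$(nproc)"
sudo make altinstall    # altinstall avoids installing an unversioned `python3`
sudo ln -s /opt/python3.12/bin/python3.12 /usr/local/bin/
```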
> True, but it's not best practice to do that because while the tool gets installed globally, it is not necessarily linked to a specific python version, and so it's extremely brittle.
What brittleness are you talking about? There's no reason why the tool needs to run in the same environment as the code it's operating on. You can install it in its own virtual environment, too. Since tools generally are applications, I use Pipx for this (which really just wraps a bit of environment management around Pip). It works great; for example I always have the standard build-frontend `build` (as `pyproject-build`) and the uploader `twine` available. They run from a guaranteed-compatible Python.
And they would if they were installed for the system Python, too. (I just, you know, don't want to do that, because the system Python is the system package manager's responsibility.) The separate environments don't matter, because the tool's code and the operated-on project's code don't even need to run at the same time, let alone in the same process. In fact, it would make no sense to be running the code while actively trying to build or upload it.
> And it gets even more complex if you need different tools that have different Python version requirements.
No, you just let each tool have the virtual environment it requires. And you can update them in-place in those environments, too.
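Concretely, something like this (the interpreter pin is just an example):

```
pipx install build                      # each tool gets its own venv
pipx install twine
pipx install --python python3.11 ruff   # tie a tool to a particular interpreter if it needs one
pipx upgrade twine                      # update in place, inside its own venv
```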
> This is simply incorrect. In fact the reason it gets stuck on resolution sometimes is exactly because it resolved transitive dependencies and found that they were mutually incompatible.
The confusion might be that this used to be a problem with pip. It looks like this changed around 2020, but before then pip would happily install broken versions. Looking it up, this change of resolution happened in a minor release.
You have it exactly, except that Pip 20.3 isn't a "minor release" - since mid-2018, Pip has used quarterly calver, so that's just "the last release made in 2020". (I think there was some attempt at resolving package versions before that, it just didn't work adequately.)
Ah thank you for the correction, that makes sense - it seemed very odd for a minor version release.
I think a lot of people probably have strong memories of all the nonsense that earlier pip versions caused - I know I do. I didn't realise this was now a mostly solved problem; the absence of an infrequent issue is hard to notice.
> Well, sure; Pip can't respect a version pin that doesn't exist anywhere in your project. If the specific version of Pandas you want says that it's okay with a range of Numpy versions, then of course Pip has freedom to choose one of those versions. If that matters, you explicitly specify it
Nearly every other language solves this better than this. What you're suggesting breaks down on large projects.
>Nearly every other language solves this better than this.
"Nearly every other language" determines the exact version of a library to use for you, when multiple versions would work, without you providing any input with which to make the decision?
If you mean "I have had a more pleasant UX with the equivalent tasks in several other programming languages", that's justifiable and common, but not at all the same.
>What you're suggesting breaks down on large projects.
Pinned transitive dependencies are the only meaningful data in a lockfile, unless you have to explicitly protect against supply chain attacks (i.e. use a private package source and/or verify hashes).
IMHO the clear separation between lockfile and deps in other package managers was a direct consequence of people being confused about what requirements.txt should be. It can be either, and has been able to for ages (`pip freeze`), but the defaults were not conducive to clear separation. If we had started with lockfile.txt and dependencies.txt, the world may have looked different. Alas.
The thing is, the distinction is purely semantic - Pip doesn't care. If you tell it all the exact versions of everything to install, it will still try to "solve" that - i.e., it will verify that what you've specified is mutually compatible, and check whether you left any dependencies out.
If all you need to do is ensure everyone's on the same versions of the libraries - if you aren't concerned with your supply chain, and you can accept that members of your team are on different platforms and thus getting different wheels for the same version, and you don't have platform-specific dependency requirements - then pinned transitive dependencies are all the metadata you need. pyproject.toml isn't generally intended for this, unless what you're developing is purely an application that shouldn't ever be depended on by anyone else or sharing an environment with anything but its own dependencies. But it would work. The requirements.txt approach also works.
If you do have platform-specific dependency requirements, then you can't actually use the same versions of libraries, by definition. But you can e.g. specify those requirements abstractly, see what the installer produces on your platform, and produce a concrete requirement-set for others on platforms sufficiently similar to yours.
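i.e. a sketch like this, with arbitrary file names and an example requirement:

```
echo 'pandas>=2,<3' > requirements.in    # abstract, hand-maintained
pip install -r requirements.in
pip freeze > requirements-lock.txt       # concrete pins, transitive deps included, for similar platforms
```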
(I don't know offhand if any build backends out there will translate abstract dependencies from an sdist into concrete ones in a platform-specific wheel. Might be a nice feature for application devs.)
Of course there are people and organizations that have use cases for "real" lockfiles that list provenance and file hashes, and record metadata about the dependency graph, or whatever. But that's about more than just keeping a team in sync.
Most developers I know do not use the system version of Python. We use an older version at work so that we can maximize what will work for customers and don't try to stay on the bleeding edge. I imagine others do want newer versions for features, hence people find tools like uv useful.
That assumes that you are using a specific version of a specific Linux distribution that happens to ship specific versions of Python that you are currently targeting. That's a big assumption. uv solves this.
(I've just learned about uv, and it looks like I have to pick it up since it performs very well.)
I just use pipx. Install guides suggest it, and it is only one character different from pip.
With Nix, it is very easy to run multiple versions of the same software. The path will always be the same, meaning you can depend on specific versions. This is nice glue for pipx.
My pet peeve with Python and Vim is all these different package managers. Every once in a while a new one comes out and I don't know if it will gain momentum. For example, I use Plug in Vim now, but I notice documentation often refers to different alternatives these days. With Python it's pip, Poetry, `pip search` no longer working, pipx, and now uv (I've probably forgotten some things).
Pipx is a tool for users to install finished applications. It isn't intended for installing libraries for further development, and you have to hack around it to make that work. (This does gain you a little bit over using Pip directly.)
I just keep separate compiled-from-source versions of Python in a known, logical place; I can trivially create venvs from those directly and have Pip install into them, and pass `--python` to `pipx install`.
>With Python it is pip, poetry, pip search no longer working, pipx, and now uv (I probably forgot some things).
Of this list, only Poetry and Uv are package managers. Pip is by design, only an installer, and Pipx only adds a bit of environment management to that. A proper package manager also helps you keep track of what you've installed, and either produces some sort of external lock file and/or maintains dependency listings in `pyproject.toml`. But both Poetry and Uv go further beyond that as well, aiming to help with the rest of the development workflow (such as building your package for upload to PyPI).
If you like Pipx, you might be interested in some tips in my recent blog post (https://zahlman.github.io/posts/2025/01/07/python-packaging-...). In particular, if you do need to install libraries, you can expose Pipx's internal copy of Pip for arbitrary use instead of just for updating the venvs that Pipx created.
Yeah, venv is really the best way to manage Python environments. In my experience other tools like Conda often create more headaches than they solve.
Sure, venv doesn't manage Python versions, but it's not that difficult to install the version you need system-wide and point your env to it. Multiple Python versions can coexist in your system without overriding the default one. On Ubuntu, the deadsnakes PPA is pretty useful if you need an old Python version that's not in the official repos.
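For reference, the deadsnakes route looks roughly like this (the version is just an example):

```
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.9 python3.9-venv
python3.9 -m venv .venv    # point the new env at the older interpreter
```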
In the rare case where you need better isolation (like if you have one fussy package that depends on specific system libs, looking at you tensorflow), Docker containers are the next best option.
Oh wow, it actually can handle the Python executable? I didn't know that, that's great! Although it's in the article as well, it didn't click until you said it, thanks!
I would avoid using this feature! It downloads a compiled portable Python binary from some random GitHub project, not from the PSF. That very same GitHub project recommends against using their binaries, as the compilation flags are set for portability over performance. See https://gregoryszorc.com/docs/python-build-standalone/main/
https://github.com/astral-sh/python-build-standalone is by the same people as uv, so it's hardly random. The releases there include ones with profile-guided optimisation and link time optimisation [1], which are used by default for some platforms and Python versions (and work seems underway to make them usable for all [2]). I don't see any recommendation against using their binaries or mention of optimising for portability at the cost of performance on the page you link or the pages linked from it that I've looked at.
My understanding is that the problem is that the PSF doesn't publish portable Python binaries (I don't think they even publish any binaries for Linux). Luckily there's some work being done on a PEP for similar functionality from an official source, but that will likely take several years. Gregory has praised the attempt and made suggestions based on his experience.
https://discuss.python.org/t/pep-711-pybi-a-standard-format-...
Apparently he had less spare time for open source, and since Astral had been helping with a lot of the maintenance work on the project, he happily transferred ownership over to them in December.
I still don't understand why people want separate tooling to "handle the Python executable". All you need to do is have one base installation of each version you want, and then make your venv by running the standard library venv for that Python (e.g. `python3.x -m venv .venv`).
But any tool you use for the task would do that anyway (or set them up temporarily and throw them away). Python on Windows has a standard Windows-friendly installer, and compiling from source on Linux is the standard few calls to `./configure` and `make` that you'd have with anything else; it runs quite smoothly and you only have to do it once.
Really? I was told Mint was supposed to be the kiddie-pool version of Linux, but it gave me GCC and a bunch of common dependencies anyway.
(By my understanding, `pyenv install` will expect to be able to run a compiler to build a downloaded Python source tarball. Uv uses prebuilt versions from https://github.com/astral-sh/python-build-standalone ; there is work being done in the Python community on a standard for packaging such builds, similarly to wheels, so that you can just use that instead of compiling it yourself. But Python comes out of an old culture where users expect to do that sort of thing.)
Having to manually install Python versions and create venvs is pretty painful compared to, say, the Rust tooling, where you install rustup once and then it automatically chooses the correct Rust version for each project based on what that project has configured.
uv seems to provide a lot of that convenience for Python.
Lots of reasons, starting with: you may want many people to have the same point release.
They have early builds available without needing to compile from source, and they have free-threading (nogil) builds. I think they might even have PGO builds. Not to mention that not all distro releases will have the right Python release. Also, people want the same tool to handle the Python version, venv creation, and requirement installation.
Because it's easier. Because it fits together more nicely and consistently. Also because uv is well written, and written in Rust, so all the parts are fast. You can recreate a venv from scratch for every run.
Also, as silly as it is, I actually have a hard time remembering the venv syntax each time.
`uv run`, after a checkout with a lock file and a .python-version file, downloads the right Python version, creates a venv, and then installs the packages. No more needing throwaway venvs to get a clean `pip freeze` for requirements. And I don't want to compile Python; even with something like pyenv helping me compile and keep track of builds, a lot can go wrong.
And that assumes an individual project run by someone who understands Python packaging. `uv run`, possibly in a wrapper script, will do those things for my team, who don't get packaging as well as I do. Just check in changes, and the next time they `uv run`, it updates everything for them.
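The team-facing version is roughly this (the script name is a placeholder):

```
git pull
cat .python-version    # e.g. 3.12
uv run main.py         # fetches that Python if needed, syncs the venv from uv.lock, then runs
```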
I guess I will never really understand the aesthetic preferences of the majority. But.
>Because it's easier. Because it fits together more nicely and consistently. Also because uv is well written, and written in Rust, so all the parts are fast. You can recreate a venv from scratch for every run.
This is the biggest thing I try to push back on whenever uv comes up. There is good evidence that "written in Rust" has quite little to do with the performance, at least when it comes to creating a venv.
On my 10-year-old machine, creating a venv directly with the standard library venv module takes about 0.05 seconds. What takes 3.2 more seconds on top of that is bootstrapping Pip into it.
Which is strange, in that using Pip to install Pip into an empty venv only takes about 1.7 seconds.
Which is still strange, in that using Pip's internal package-installation logic (which one of the devs factored out as a separate project) to unpack and copy the files to the right places, make the script wrappers etc. takes only about 0.2 seconds, and pre-compiling the Python code to .pyc with the standard library `compileall` module takes only about 0.9 seconds more.
The bottleneck for `compileall`, as far as I can tell, is still the actual bytecode compilation - which is implemented in C. I don't know if uv implemented its own bytecode compilation or just skips it, but it's not going to beat that.
Of course, well thought-out caching would mean it can just copy the .pyc files (or hard-link etc.) from cache when repeatedly using a package in multiple environments.
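Easy to check on your own machine; a rough sketch (paths are arbitrary, the timings in the comments are from my old machine and yours will differ):

```
time python3 -m venv --without-pip /tmp/venv-bare   # just the venv: ~0.05 s
time python3 -m venv /tmp/venv-withpip              # the extra ~3 s is bootstrapping pip into it
```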