Python has had multiple points in its history where it could have fixed its environment and packaging problems. For reasons I don't understand, the PSF has never taken a hard stance on a solid way forward; it's always been pushed back as a community issue.
The problem is lack of leadership on this more than anything
- "-m" was introduced.
- Wheel replaced eggs.
- Manylinux target now exists.
- pypi was scaled uo, a new api
introduced and 2fa added.
- setup.py got replaced by
setup.cfg and now
pyproject.toml is gaining speed.
- the py launcher is a thing.
- Import priority changed.
- Import lib changed.
- Zipapps were added.
- Venv ships with python.
- Pip replaced easy_install,
then ensurepip was created.
- The new dependancy solver
was funded.
- Distutils has been sunsetted.
Setuptools is on its way.
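Several of those bullets add up to a stdlib-only workflow you can try today. A minimal sketch (the `demo-env` and `demo-app` names are just for illustration):

```shell
# venv ships with Python (PEP 405), and ensurepip bootstraps pip into it
python3 -m venv demo-env
demo-env/bin/python -m pip --version

# zipapp bundles a directory containing a __main__.py into a runnable archive
mkdir -p demo-app
printf 'print("hello from a zipapp")\n' > demo-app/__main__.py
python3 -m zipapp demo-app -o demo.pyz
python3 demo.pyz
```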
This doesn't include the dozens of third-party projects aimed at improving packaging that have come out in the last two decades. pip-tools by itself has changed the game.
A lot has been done, and the situation is 10 times better than it used to be. It is still bad, but not because nothing was done; simply because there is still a lot left to do.
People don't realize the sheer scale of the task, and the inertia that comes with a language as successful and as old as Python.
You have a language that is 4 years older than Java, highly dynamic, yet uses Fortran/C/assembly code in very popular libs (SciPy/NumPy), across all popular OSes, plus WebAssembly and ARM, in 32 and 64 bits. People routinely install multiple versions of Python on their machines, Linux repos freeze upgrades and split the packages, the Mac and Windows stores ship broken versions, and half the Python userbase is composed of people who are not professional coders and can't use a terminal. Of course companies will complain if you change anything that breaks their 5-year-old servers, and devs will complain if they don't get the latest great feature.
It's a really, really hard problem, dealt with by a FOSS community that 10 years ago was still running mostly on volunteers and the budget of a small startup. And all that just to get heat in social media comments and no thanks for the effort, as if you owe total strangers your free work, flawless on top of that.
Not to mention, as you can imagine, packaging is not the only thing the core devs have to work on.
> This doesn't include the dozens of third-party projects aimed at improving packaging that have come out in the last two decades.
One would think that two decades of packaging improvement initiatives should have been a strong enough signal to the PSF to prioritize building an official standard solution.
A third party can ignore legacy. It can break stuff. It can avoid supporting platforms. It can have a big bug and push a fix the next day. It can skip red tape. It can document after the fact. It can drop features the next year. It can require PyPI dependencies when the stdlib is not sufficient. It can be the work of a single author who doesn't have to ask anybody what they think before creating something. It can skip security issues and focus on practicality.
CPython cannot.
So any change to the packaging story will always take years, even for the smallest thing.
That's the same for removing the GIL: touching the C API is a huge deal for the scientific and machine learning stack.
That's why requests has never been included in Python despite being vendored with pip: you can never update it fast enough once it's in the stdlib.
That's why we don't have a node_modules equivalent yet: auto-loading code from the current directory would be a big security risk for a language that is included in many OSes by default, so we need a good design.
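The risk is easy to demonstrate with the path behavior Python already has: by default, running the interpreter with `-c` (or a script) puts the working directory on `sys.path`, so a file dropped there can shadow a stdlib module. A minimal sketch (the `json.py` filename is just for illustration):

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as d:
    # A malicious "json.py" dropped into the working directory...
    with open(os.path.join(d, "json.py"), "w") as f:
        f.write('print("shadowed!")\n')
    # ...gets imported instead of the stdlib json module when Python
    # is launched from that directory.
    result = subprocess.run(
        [sys.executable, "-c", "import json"],
        cwd=d, capture_output=True, text=True,
    )
    print(result.stdout.strip())  # the local json.py ran, not the stdlib
```

(Python 3.11 added the `-P` flag and `PYTHONSAFEPATH` precisely to opt out of this behavior.)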
Don't assume the team is incompetent or deaf. Given what has already been achieved, that would be a total mismatch.
> Don't assume the team is incompetent or deaf. Given what has already been achieved, that would be a total mismatch.
I am not making any such assumption (and I don't think most sane people are either); people are just unhappy with the lack of an official standard solution, given that it was clear quite a while ago that one was needed. I also don't understand the comparison with the GIL or the C API: changes to those could be breaking changes, whereas a fresh officially recommended solution would by definition be a new solution, i.e. not a breaking change. It can be introduced and go through iterations while people take their time to migrate from the myriad of third-party packages (or not, if they are happy with what they are using).
What people who don't want to deal with choosing between third-party tools, or putting their faith in their longevity, are looking for is a solution that is official, part of the standard library, and managed by the PSF. That way the decision is basically made for them, e.g. like Rust does via cargo.
Fresh new solutions are what put us in this situation: distutils2 was fresh, setuptools was fresh, pip was fresh, wheels were fresh... So now you have a lot of complexity that comes from having a lot of solutions running in parallel that are slow to sunset.
But the dirty secret of Python packaging is that most of the problems don't come from packaging, but from bootstrapping Python. And that is a huge can of worms I haven't even mentioned.
Also, plenty of things require the participation of other communities, like Debian splitting pip out of the main package, or Anaconda not being compatible with pip.
All in all, the way you hand-wave the problem away is typical of critics who have a very narrow view of the situation.
> Don't assume the team is incompetent or deaf. Given what has already been achieved, that would be a total mismatch.
To be perfectly clear, I don't think that. I just think there have been missed opportunities over the years to build better ways of doing environment and package management. For instance, on auto-loading code from a folder: make it a feature of virtual environments only. I know Python can detect that it's in one. Or engage with distros not to build Python to allow it; they already do custom builds (a distro Python is always missing something from a standard installation, after all).
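The detection mentioned above is documented behavior: inside a virtual environment, `sys.prefix` points at the venv while `sys.base_prefix` still points at the interpreter the venv was created from. A minimal sketch (the helper name is just for illustration):

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv these differ; outside one they are equal.
    return sys.prefix != sys.base_prefix

print(in_virtualenv())
```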
Node also has madness that Python does not, like support for circular dependencies. I would definitely not point to Node as a positive example like Cargo.
Cargo is the gold standard, but reaching that level is going to take 10 years at least. I'm not kidding; that's what will be needed to clear all the roadblocks.
The first thing we need is a whole new way of installing Python, because half of the problems stem from that. That alone is a huge endeavor.
It certainly seems that way. Guido wasn't interested in solving it, and it was probably one of the things, like the gradual typing system he was interested in, that he should have made a core issue because of its centrality.
Curiously, I wonder whether current leadership are unwilling to acknowledge that the current problems are essentially a leadership problem. See for example my short conversation with Pradyun on Mastodon (https://mas.to/@maegul/109726564552419983) where I’m not sure they were being open and rational (though they clearly know more than me).
Not disputing that there's a lack of leadership on the matter, but there's also conflicting use-cases: for example, the requirements for the "Python is just a tool that comes with my OS, and if I need any additional modules I'll install them with my OS package manager" mindset, and the "I'm developing a web application with Python and I want an isolated development environment with controlled dependencies" mindset, are quite different.
> "Python is just a tool that comes with my OS, and if I need any additional modules I'll install them with my OS package manager
IMO this isn't really viable, because on most operating systems you will eventually run into version conflicts in the transitive dependencies of the Python applications you're using or developing.
The version(s) that ships with an OS should only be used for shipping applications that are themselves part of the OS/distro.