GIL Becomes Optional in Python 3.13 (geekpython.in)
109 points by thunderbong 4 months ago | 193 comments



I kind of have the opposite feeling of many other people here... to me, Python is already pretty much perfect for my use, which is things like build scripts and random tools I make, plus it's used by a bunch of software I need to compile. The one big problem it has is that it keeps changing. I'll try to compile some dependency that I haven't updated in some time and find that it fails to build because Python changed in some backward-incompatible way. Or a tool I made will suddenly start printing deprecation warnings because Python has deprecated a library.

I get that for people for whom Python is their main language, all these changes make their lives better. But for me it kinda just gets in the way. Personally, I wish Python would continue to be maintained as is, but stop changing so much. Stop breaking my builds.


This reads a bit like "My neighbourhood is already perfect, I wish it would stop changing". Except as a sibling pointed out, you can live in your old unchanging neighbourhood just fine, you don't have to update it.

Python changes for three reasons: addressing bugs and insecure patterns; supporting new usage patterns in the community; and keeping compatibility with newer devices and systems.

The first one is a given. The second and third are basically "the world is evolving, with or without you". Even if you built a VM on your perfect machine, with your perfect python and perfect libraries and froze everything for 10 years, the moment you connected it to the internet you'd be faced with "Ah, my perfect version does not support this new thingamajig being used on the web".

Freezing everything in time is just archiving.


> Freezing everything in time is just archiving.

None of this explains why python has the need to drop modules from the standard library.

They claim "nobody is maintaining them"… but they are well funded and can easily afford someone to maintain them.

I think at the core they don't care about backwards compatibility, but instead of breaking it all at once like with python2/3 they are now doing it little by little, creating a python4 without calling it that.


> They claim "nobody is maintaining them"… but they are well funded and can easily afford someone to maintain them.

You still have to find someone who both wants and can maintain them. That's not always a given.

And also, why? If I donate to the PSF, do I really want my money to go into the maintenance of ... checks notes ... MacOS 9 path manipulation functions (https://docs.python.org/it/3.7/library/macpath.html)?

Just because you're well-funded doesn't mean you get to spend money on useless things. And most of the python community would consider this useless. Nobody even made a pypi copy of it.


Agreed. The ironic downside of not having Windows-style telemetry is you can't prove how rarely-used obscure old packages are, or identify the small set of people still using them a decade after everyone else stopped. (You can search github repos for references, as a poor alternative.)


More useless or costly than not implementing rate limiting at all on pypi, so that noobs at gigantic companies never ever use a cache or a mirror?


No idea how that’s related


> now doing it little by little, creating a python4 without calling it so.

That's the unfortunate modern trend. The web folks that move fast and break things love living on the bleeding edge, create strong pressure for everyone to keep up, and treat library compatibility as "wait until someone complains, fix the direct cause, cut a new release, and let the dependents fix their own code".


> updates for compatibility with newer devices and systems.

Python is a couple abstraction layers above that; it's not supposed to even be directly exposed to things that vary frequently between devices. That's the job of the OS, the C compiler, and the libraries (most of which just link to C ones anyway).


The rest of the world keeps changing. I can expend the effort to maintain, package and distribute my own Python fork, but things like the build scripts for libraries I use would eventually start requiring newer Python versions.

I don't think Python is perfect, I don't mind changes. But I do mind backward incompatible changes a lot.


> I get that for people for whom Python is their main language, all these changes make their lives better

Python is my main language and backward compatibility is far more important to me than performance.

Almost everything I do spends most of its time waiting for IO, or in a C library, or another process, or performance is unimportant. That is why Python is my main language.

I think it is possible that Python's popularity with people doing numerical stuff might end up making it worse for the rest of us.


>I think it is possible that Python's popularity with people doing numerical stuff might end up making it worse for the rest of us.

I've been using Python since long before it became the go-to choice for numerics inside and outside academia, so let me tell you: it was even worse back then. The transition from 2.7 to 3.0 was, in both its fundamental idea and its implementation, the worst thing I have ever seen happen in any major language. This could (maybe should) have been a language killer. They should have just branched it out as something completely new. Van Rossum is totally right in saying that he never wants a Python 4 for that reason, so it seems at least he learned a little bit since then.


It is only in the past few years that the 2 -> 3 transition stopped hurting for me personally. A lot of Google's Python build system code (which I used to build e.g. WebRTC and Chromium) got transitioned to work with Python 3 relatively quickly; HOWEVER, it still assumed it could execute a Python interpreter by invoking the `python` command. It would work regardless of whether this python interpreter was 2 or 3, but it had python scripts running `python path/to/script.py` using `subprocess`.

As systems started shipping Python 3, the convention among most of them became to leave `python` as a Python 2 interpreter and then introduce `python3` as a Python 3 interpreter. As they then stopped shipping Python 2, they generally just removed /usr/bin/python, so the only Python on the system was /usr/bin/python3.

In my own scripts to build Google stuff, I always had to make a folder with a single script called `python` which would contain just `exec python3 "$@"`, and then prepend that directory to the `PATH` when invoking Google's tools.

Google eventually did update their stuff to work on systems where the only Python is `python3`, but man this transition was rough. As I understand it, Google had good reason to do it their way at the time too; there was simply no consistency in how systems handled the 2 -> 3 switch. I suspect that this is due to a severe lack of guidance from the Python foundation.


I don't really understand such criticism, which I often see of the python 2 to 3 transition. The OS didn't want to change `python` to be v3 by default, which is understandable for backward compatibility, and the fix is simple and immediate with e.g. the python-is-python3 apt package, or an alias python=python3 in the shell, or a symlink in PATH, etc.


We went from a situation where essentially every UNIX-like system had the same-ish Python available as `python` to a situation where every system has different names for different versions of Python. That's a pretty big change.

Imagine if GNU released a new, highly backwards incompatible version of bash called Bash 6, and then some platforms kept /bin/bash as Bash 5 and introduced /bin/bash6, some systems replaced /bin/bash with Bash 6 and introduced /bin/bash5, and some systems removed /bin/bash entirely and /bin/bash6 was the only Bash on the system.

Python had the opportunity to be, and was, to some extent, just a standard part of a UNIX-like system, like bash. But now it's just another language where you need to care a lot about interpreter versions and how different distros package it differently and keeping code up to date to work with the latest Python and keeping Python up to date to work with the latest code.

This doesn't mean that Python is bad, but it would have been much more useful to me, and quite a few others, if it had stayed a ubiquitous tool which is more like Bash and Perl.


> Van Rossum is totally right in saying that he never wants a Python 4 for that reason, so it seems at least he learned a little bit since then.

Just because the BDFL doesn't increment the major version doesn't change the fact that myriad breaking changes arrive in Python 3 PEPs.

This is purely an optics hack. Python 3.0 is very different from Python 3.13. Had the frog not been boiled slowly, it might as well have been Python 4.

The people who could stop this churn will not. It's a full-employment ticket for them, chasing dot-release upgrades and educating us about changes that we couldn't live without but somehow did for decades.

It takes a big person to say when something is good enough despite its warts. Contrast the Python churn with Linux and git: never break user space.


Python 3.0 is older than Windows 7 and Gnome 3.0. And if GUIs can change in 16 years, programming languages can as well.


Gnome 40 is a thing.

Windows 11 is a thing.

I didn't say things can't change. I said don't play games with major version numbers to window dress optics.


Me too. I have used Python mostly for web stuff (lots of Django) although I have done bits of everything (ML, desktop GUI for existing code, embedded, one off little scripts, ...).

A big attraction of Python is that it can do everything. Very useful if you need more than one of the above in a single project.

On the other hand, I do feel that Python has been bad for my personal development: because I can always find a Python library for anything and everything, it has reduced my motivation to learn other languages properly (i.e. actually using them for real work; I have learned a few in theory). I have been stuck with the same main (to the point of dominance!) language for far too long.

Yes, 2 to 3 was pretty painful, although it was not too bad for me because I was mostly able to delay moving until I was sure all the libraries I used had made the transition.


> worst thing I have ever seen happen in any major language

Survivorship bias. Python just barely survived the transition, and you only still use it now because it did make it. Take a look at Perl 6 to see a major language that failed.


At least Perl had admitted it.

Dealing with Python after mostly taking a pause around the time 3.0 was introduced, I was shocked at the mess. And unlike with Perl 6, which took quite a clean break even before it decided to rebrand, you ended up with a situation where both were around semi-equally, and there were "fun" bugs introduced in Python 3 that caused explosive failures we couldn't figure out at the beginning, because things would always work on developer machines...


Perl strikes me as the only major language which failed a transition though. I mean where the actual developers of a language developed a new version, released as an official new version of the language, and then the entire community stayed behind on the old version to the degree that the new version was renamed and made into its own separate project. Can you think of any other example?


I think it depends on which numerical bits though. Some people want to write Python that is fast, vs deferring the fastness to libraries in other languages (and/or using tools like numba or cython which transform Python into something faster). The former group seem to want no-GIL, and there's been a noticeable amount of breakage recently over CoW not being as effective as it used to be (which is reflected in the switch away from using fork by default on POSIX systems).


>I think it is possible that Python's popularity with people doing numerical stuff might end up making it worse for the rest of us.

It drives me _nuts_ when I find a simple library to do a small-ish thing that I don't want to write/maintain myself only to discover that it uses numpy or pandas because they're "fast".


> or another process

Is it by any chance a process started by multiprocessing? /s

On a serious note, I feel like this is a valid usage, and as someone who really likes the no-gil approach I still would want python to handle the gil and no-gil flags in a better way. I don't know yet how this will manifest in real life and whether keeping the gil will be an option in the long run, but it is a trade-off. There is no fundamental reason why python cannot have proper multithreading support.
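
For what it's worth, a 3.13 build does let you check at runtime which mode you're in. A minimal sketch (note `sys._is_gil_enabled()` is an underscore-prefixed, provisional API):

    import sys

    # _is_gil_enabled() appeared in 3.13; absent on older versions.
    if hasattr(sys, "_is_gil_enabled"):
        print("GIL enabled:", sys._is_gil_enabled())
    else:
        print("pre-3.13 build: the GIL is always on")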


It affects single-threaded performance, which is still, and likely always will be, the primary use case. The only reason this has gotten so far is that there have been some major improvements in single-threaded performance, which are being used to mostly cancel out the negative effects of no-gil support. I personally would've preferred to have access to that boost instead of tacking on this major change.


> Stop breaking my builds.

You don’t have to install the latest version if you don’t want to. If you use a version manager, you can keep using whatever version you want for many years.


The value of Python is its ubiquity. That disappears if you have to make sure to install old Python versions because the system Python got updated and the update broke something.


most tutorials I'm aware of actively discourage use of the system python version for this very case.

i don't think it's too surprising that versions differ. there are good tools to manage the python version used -- i personally like pyenv.


This is the exact problem I have. No bash scripting tutorial would tell you to not use your system bash. No perl tutorial would tell you to avoid your system perl. No C tutorial would discourage using your system's gcc or clang.

As I said, a lot of the value I see in python is in its ubiquity. This disappears if you can't use the system python.


You can always use the system python to create a venv. It's ubiquitous in the sense that you don't need root access to use properly, not that it comes set-up and ready to use for all purposes.


Hopefully no complaints when you break something in system python and your system has a negative reaction to that breakage.


I don't understand this comment.


There's the risk of breaking the system when using system python. Installing some dependency for your script that changes something a core service depends on in an unexpected way can lead to hours of tracking down the issue. Or a full reinstall in the extreme. That's why users are told to avoid using the system python.

But if you value that ubiquity more than your sanity, I hope you find and fix any breakage yourself and not pester other devs for support.


You can just use rye and retain your sanity like everyone else.


As I said, a lot of the value I see in python is in its ubiquity. I don't understand why I need to keep repeating this.


>>The value of Python is its ubiquity.

+1

Most Python use today is somewhat along the same lines as Perl: having a stable large install base. Availability of libraries, backwards compatibility, and performance matter way more than new features at this point.

There is no new replacement language in sight, so I'm guessing we will be using Python long into the future.

I also wish Python had something along the lines of CPAN. We are decades into the journey, and Perl still shines like the Sun in this regard.

Part of me feels sad Perl 5 didn't go the way Python 3 did.


Python 2 -> 3 transition is probably considerable part of why it's becoming less ubiquitous.

Especially considering how a major distro[1] that defaulted to Python 2.7 was only just dropped by many[2] - but not all, because some "heroes" jumped in at the last moment to provide binary-compatible support options

[1] CentOS 7

[2] Especially if you wanted to be at all acceptable to sell to FedRAMP-requiring or similar clientele; in Europe, NIS 1 (already in force) and NIS2 (enforcement starts this year) ban software that is past end-of-support


> Python 2 -> 3 transition is probably considerable part of why it's becoming less ubiquitous.

What do you mean? I've been using python since 1.5 and it is more ubiquitous today than it has ever been at any other time in the past.


BLUF: It's definitely more ubiquitous in terms of general use, but I find it becomes less ubiquitous as in "always available on a Linux system" and increasingly painful.

I remember Python around 2.2 becoming the ubiquitous and "obvious" option as "better than perl and also certainly available". Python people would even trash-talk other communities for not being able to "just download the libraries from distro repo" or for needing specific versions, whereas you at worst grabbed an updated python for your distro.

(The following is a somewhat chronological rant)

Fast forward a few years and me getting out of university, which for various reasons was (apparently blissfully) Python-free. In between we got through the "huge" transition from Ruby 1.8 to 1.9/2.0 (I even had to help a professor update coursework).

The 2->3 transition interrupted that, and I am suddenly facing that a) python3 actually got released b) despite several years, it's not even guaranteed default c) you now have to explicitly use python2 or python3 in shebangs because you don't know which version of python is the default d) virtualenv wtf e) there's still new code being written depending on py2 f) the library situation can be a maze.

Since I mainly worked as a "DevOps" person, not a python programmer, I got to deal with increasingly borked python deployments. Suddenly authors' names in comments caused deployments to fail on servers but work on dev machines, including mine. It took long to figure out; the same code working when connecting over SSH vs not working when started by systemd showed it was python deciding encoding by locale and barfing on 8-bit characters (including utf-8).

Virtualenv, increasingly necessary multiple python versions, all driving increasing replacement of python in admin/management tooling. Chef and Puppet (finally) laughing from their omnibus setups at ansible exploding depending on what python is default on target distro and people who pushed ansible confused because "it only uses SSH, there's no agent". Even more movement to Golang because of static binaries.

Virtualenv becomes the effective default; forget about installing packages through the distro (subjective take, I know, this is a vibe; I know it's still possible).

Python 2 goes EOL and out of support but it's still default on distros used by paying customers (this changes - but not fully - this year on June 30th... Thanks to government regulation - thanks Obama /s)

I now mostly see python deployed with not just batteries but the whole jungle included (Docker). If it's a good container it doesn't have a complete copy of the development environment.

Increasing mess with packaging. Distros reducing python dependencies; system python increasingly seems mostly for use only by OS components. Python equivalents of rbenv/rvm increasingly suggested as the "right tool". Package build times exploding because of transitive rust deps.

Customers sometimes demanding OS installs without Python. Increasingly encountering distros with no python at all.

Python packaging still feels worse today than rubygems were a decade ago.

Finally, increasingly suggesting to prospective projects that the benefit/cost ratio of python goes below 1.

I used to love python back in the 2002-2006 timeframe. Rails making it so damn easy to handle some projects, and finally starting to grok Lisp (thanks to a detour through Haskell due to XMonad), made me look away. Even the "ML ecosystem" mostly gives me more reasons to rant against Python (TensorFlow's packaging as a pip wheel causes me problems even when I am going to use Python).

So, yeah, lots of code in Python, ubiquitous python jobs. But a retreat from "write your tool in python, it will be easy to distribute, every distro effectively preinstalls python unless it's some LFS or Slackware weirdo".


> The one big problem it has is that it keeps changing

What prevents you from using an old Python version?


I'd love to stick with python 2.7 forever, but it isn't getting even minor updates just to keep up to date with operating system updates, and is starting to stop being packaged.

My current solution is a docker container which includes python 2.7 and a downloaded copy of every package I use.


What do you prefer about it?


That it has a lot of my old code working perfectly for years, and I don't have spare time to port it to another language and keep catching subtle bugs for months.


Out of curiosity, how many lines of code are we talking about here?


"Port it to another language". You're saying this as if the minor asinine breaks in python 2 Vs 3 make it a whole new programming language, when in reality it is mostly bullshit like print going from statement to function.


My problem is some libraries which never transitioned from 2 to 3. I don't want to try to convert them, particularly as they involve Unicode which is one of the hardest bits to get right in my experience.


Improved unicode (string vs byte separation) handling was exactly one of the larger reasons for the breaking change to 3.


You are right that unicode is much better in 3, but in my experience this is a pain point porting programs 2 -> 3, because often programs seem to work fine until there is some exciting unicode, or malformed characters, in the input; then they crash where they used to just rumble on.


That sounds about right. Fail fast and loud so things have to be fixed, rather than continue silently and likely corrupt data.


Being able to use the system's Python (or in the case of macOS, Homebrew's Python) is extremely convenient, especially when Python is used as a language for build scripts across a team. I'm not interested in forking Python and taking on the responsibility of maintaining, packaging and distributing it across a range of platforms.


> What prevents you from using an old Python version?

I think the fact that it's not on the distributions is a major factor here.


Usually us plebs use the system python.

With no python version manager (I don’t even know any off the top of my head).

Also virtualenvs only get you so far, I’ve come back to many projects on my machine with broken virtualenvs, I guess due to some dynamically linked dependency of python that my system no longer has.

:(


> With no python version manager (I don’t even know any off the top of my head)

Try a tool called `pyenv`. It's pretty great. I pair it with the package manager called `poetry`.


Looks like pyenv does the same thing as asdf, except only for python.

https://github.com/asdf-vm/asdf

And here's a list of everything it supports: https://github.com/asdf-vm/asdf-plugins/tree/master/plugins


Or https://github.com/astral-sh/uv can replace both. Various features are tagged as "experimental", and I guess the api might change, but it seems to work already. I have high hopes for it eventually making python's fragmented package management issues go away.

Edit: Somebody's going to respond with the XKCD aren't they? I know about the XKCD and I nevertheless have high hopes for `uv`.


It's the sledgehammer approach but... Docker?


Using Docker in order to avoid ever updating Python sounds like a great way to end up in severe tech debt tbh


Arguably it was also a non-trivial part of why some places started using Docker...


Using an old Python version means you're stuck using libraries that work with that version. I think you have to figure out which ones work on your own as well.


Because libraries also change, for the same reasons python changes: to keep compatibility with newer devices, systems, and usage patterns.

Besides, "figure out which ones work" is a solved issue with a package manager such as poetry.


That would be true if all libraries were super stringent in their dependency bounds, which in my experience no one is. So it's a crapshoot whether it will work or not.


This is more of an indictment of how bad Python is about mixing system-level packages with library dependencies than anything else. The classic example I always give is psycopg2 depending on a system install of the Postgres client library.


The irritating thing about python is that since it is mostly glue around C, all the annoying things about C leak through.


It's not a solved issue, as others have pointed out. To elaborate, most packages are not maintained continuously. Some don't get updates anymore and some get updates that break older versions of Python. The authors of these libraries definitely aren't rooting out all the problems that can occur due to mismatched versions. And when you change one library, any other related library can be affected. It's like solving a puzzle sometimes. In many cases people throw up their hands and say "forget this, I'm not upgrading anything!" This is also the big driver of solutions like virtualenv and Docker. All languages have this issue to some extent but Python breaks stuff more often than anything else I use.

I don't especially like language-specific package downloaders either but that's not a Python-only problem, and there are tradeoffs to pushing the work downstream into more stable systems.


As somebody who welcomes killing off the GIL, I would agree that Python is changing too much. It seems to be in this constant state of flux starting from around the Python 2 to 3 era (though maybe it was always like this and I wasn't familiar enough with it to know how it was before that). While all languages change, Python's changes seem to be existential in nature.

For such a long-lived language to have this much immaturity is worrying. I guess it's the side effect of such popularity later in life.


I think partly to blame are the people who refused to move to 3.x. I remember in 2016-17 I worked on a new project and people argued that we simply had to use 2.x and that it would never go away. There weren't even 2.x dependencies in the project. I feel like that kind of mentality really slowed down migration and ultimately helped create a culture in which python would always be changing.


It's an interesting point, both the argument it makes and the background.

I'm one of the people who was unkeen to move to Py3. My reason was mostly the numerous and totally unnecessary minor syntax changes (print -> function, map and filter becoming lazy, str/unicode etc) with no major benefits in the upgrade for my use case (data science), other than compatibility. Some of the new syntax was neither bad nor good (what's so wrong with print being a statement? Add a printfn function if you really want one), some was IMHO outright bad (map and filter becoming lazy, reduce being kicked out to a module); str->unicode was maybe the only good one, but even that is controversial. The deliberate neutering of `bytes` was unnecessary too. Python 3 brought various implementation improvements and nice new syntax, which could have been implemented on top of Python 2 syntax too (dict and set comprehensions, for example).
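
To illustrate the laziness change I mean, a quick sketch (Python 3 semantics; the commented outputs show the difference):

    nums = [1, 2, 3]
    doubled = map(lambda x: 2 * x, nums)
    print(doubled)        # Python 2: [2, 4, 6]; Python 3: <map object at 0x...>
    print(list(doubled))  # [2, 4, 6] -- Python 3 needs an explicit list()

    from functools import reduce  # reduce was demoted from builtin to a module
    print(reduce(lambda a, b: a + b, nums))  # 6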

So I saw no reason why Python 3 couldn't in fact be syntactically compatible with 2, and saw the whole thing as a bit of an upgrade circus: wanna keep using the language? Make these arbitrary changes because someone thought print ought to be a function.

You could argue it was the same disregard for stability as the now many versions of Python 3.


Why print as a function?

From the PEP[0]:

    print is the only application-level functionality that has a statement dedicated to it. Within Python’s world, syntax is generally used as a last resort, when something can’t be done without help from the compiler. Print doesn’t qualify for such an exception.
    At some point in application development one quite often feels the need to replace print output by something more sophisticated, like logging calls or calls into some other I/O library. With a print() function, this is a straightforward string replacement, today it is a mess adding all those parentheses and possibly converting >>stream style syntax.
    Having special syntax for print puts up a much larger barrier for evolution, e.g. a hypothetical new printf() function is not too far fetched when it will coexist with a print() function.
    There’s no easy way to convert print statements into another call if one needs a different separator, not spaces, or none at all. Also, there’s no easy way at all to conveniently print objects with some other separator than a space.
    If print() is a function, it would be much easier to replace it within one module (just def print(*args):...) or even throughout a program (e.g. by putting a different function in __builtin__.print). As it is, one can do this by writing a class with a write() method and assigning that to sys.stdout – that’s not bad, but definitely a much larger conceptual leap, and it works at a different level than print.

[0] https://peps.python.org/pep-3105


Yes, if Python were made from scratch, then perhaps a function is better (I don't really mind).

But breaking one of the most commonly used statements for what is essentially aesthetics? No thanks. Introduce a function that does the same if you like.

That's precisely my point, this syntax change (and many others) was entirely unnecessary, put people off upgrading, then got them told off for being dinosauric luddites.

Frankly, it's not even that consistent. Is `del dict[key]` necessary when you can call `dict.pop`?

Either way, these would be fine discussions to have, but not when you consider the people upgrading their codebases.


> Is `del dict[key]` necessary when you can call `dict.pop`?

Reason for Python 4 detected!


Can't override/hack `print` as a statement, which is something I've found myself doing often enough to be happy that it's a function.
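
For example, something like this, which was impossible with the print statement (a rough sketch; the "[debug]" tag is just for illustration):

    import builtins

    def print(*args, **kwargs):
        # Shadows the builtin in this module; tags output for grepping later.
        builtins.print("[debug]", *args, **kwargs)

    print("reticulating splines")  # [debug] reticulating splines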


I'm still not clear on what the real benefit of python3 over 2 was.

I use python3 now by default, as it has fstrings, and probably some iteration things that I now rely on, and is faster for a lot of things.

But for the longest time the only difference that I bumped into was that print isn't a keyword anymore, but assert still is.


The main breaking change was Unicode for strings, which is what caused all the pain as lots of code was written to assume ASCII strings.

But it's good that it's Unicode.
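
For anyone who never felt the pain: the split makes the encode/decode boundary explicit. A tiny sketch:

    s = "héllo"                # str: a sequence of Unicode code points
    b = s.encode("utf-8")      # bytes: what actually crosses files and sockets
    print(b)                   # b'h\xc3\xa9llo'
    print(b.decode("utf-8"))   # héllo -- decoding is an explicit, visible step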


Ahhh yes, that brings back memories. This also meant that stuff coming from things like UARTs and sockets was binary, rather than strings, in python3.


For anyone who doesn't remember: A python3 design decision they went partially back on meant it was easier to go from 2.7 to 3.3 than it was to go from 2.7 to 3.0

(I think it was 3.3 when they compromised, maybe 3.4?)


3.3 - they reallowed unicode literals.
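
That's PEP 414, which made the redundant prefix legal again purely to ease porting:

    s = u"text"  # SyntaxError on 3.0-3.2; legal again since 3.3 (PEP 414)
    assert s == "text"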


I mean, I do think Python 3 introduced some pretty bone-headed things. Its strings are horrible if you need to interact with the system. Most system APIs on UNIX-like systems don't promise that anything is UTF-8-encoded, so you can't e.g. use strings to store paths. You need to use byte strings, and they have WAY worse ergonomics in many ways than strings.

I would never use Python 2 now, but I do understand why some people would choose 2 back when 3 was new (and even slower than 2!).
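
To be fair, the escape hatch that eventually landed is os.fsencode/os.fsdecode, which round-trip arbitrary bytes through str via surrogateescape. A sketch (assuming a UTF-8 locale; the ergonomics are still debatable):

    import os

    raw = b"/tmp/caf\xe9"            # not valid UTF-8
    path = os.fsdecode(raw)          # invalid bytes become lone surrogates
    assert os.fsencode(path) == raw  # round-trips losslessly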


> so you can't e.g use strings to store paths. You need to use byte strings, and they have WAY worse ergonomics in many ways than strings

Would you not use Pathlib?


But isn't that exactly the reason for the string/byte split?

Rust has a similar thing with OsStr. In my opinion a clear type-based separation between different kinds of strings (text, data, os path, ...) is just required to write robust software.


oh foo.

I wrote some python programs that did multithreading, but they didn't work and I had to go elsewhere.

So... it is possible you won't find many people in your sample that disagree because the folks that went elsewhere... aren't part of the sample.


“640K ought to be enough for anybody.” Bill Gates on RAM (he didn't really say it apparently).


I’m sorry, I’m not sure how the quote is relevant to the comment you are replying to.

Mature languages don’t tend to break working code. That was true when the quote was coined and that’s still true today.

Python is fairly unique in its propensity to be an unstable target. I personally can only think of the JS ecosystem as being worse, but even then that’s mostly a library thing, as the VM will run even the oldest JS code you can find. I wish the same could be said of Python.


I honestly don't know where this "python is brittle" comes from. I've been using python heavily since the 2.5 days, and I can count on maybe 2 hands the times I've had a library or dependency break on me. Most of those were due to weird library function signature changes, that type-hints have now arguably fixed anyways.


I avoid type hints (and instead use other linters/static analysis tools like pylint) because type hints break between Python versions...


> I honestly don't know where this "python is brittle" comes from.

From Python being brittle?

It’s one of the rare languages which deprecate and remove APIs between minor versions. They have to provide migration guides with every release.


I've yet to have any python feature/function/API change on me between minor versions. Yes, the big 2->3 thing did happen, and there were a few breaking changes that came along with that. And I've been the semi-proud owner of codebases that had to simultaneously cater to versions 2.5, 2.7 and 3+ at the same time. Other than the print statement, new forms of try-catch statements and the obvious addition of type-hints eventually, I've always been able to make the code cater to all three versions. Eventually, 2.5 fell away, then 2.7 fell away, and now it's just 3+, which has been a breeze. These days I'm excited to install the latest version, and I don't even worry about any breaking changes.

I'm also the type of person that doesn't do version-pinning as I think the concept is an anti-pattern that came from the JS world as a way to manage the chaos borne-about by the constant API changes, thousands of tiny libraries, and library dependency updates that it circularly-co-caused.


can you give an example? I'm struggling to find an example of something removed in a minor version.


Why are people building their software in Python when they need performance? There are other languages that excel at this. Python strengths lie elsewhere and they are increasingly being diminished with all these "upgrades". Splitting the ecosystem with async was already a major blow and now we are looking at another split along gil/non-gil axis. It's just sad.


Python is the de-facto default and main language in the data science and ML community. You start because it is easy and has a great ecosystem, but then after some time you start to worry about performance. And it takes time, effort and money to rewrite in Rust, C++, etc.

I think running inference engines is one particular case of that. You can train and tweak your model using python with its wonderful pytorch and numpy. Then at some point, when scale and performance matter, this becomes a potential problem.


To be a bit pedantic, I guess maybe it is fairer to say python is the main glue language for scientific computing, of which DS and ML are a part. The “big bang” seems to have started from Numpy/Scipy and friends and cascaded from there.

Companies like Facebook also have huge interests in speeding up python as they have a lot of stacks written in python.


Mostly projects/products that start off with very low usage, where python is perfectly fine (why over-optimize?). And then it becomes useful, and a rewrite isn't worth it.


How is that Python’s problem though? If you paint yourself into a corner by choosing the wrong language then eat the rewrite or eat the hardware costs.

There was a consensus after the Python 3 debacle of “No major breaking changes”. We seem to have lost that because of moneyed interests and that’s sad.


That's such a strangely distorted world view. If a car company releases an, idk, trunk extension in response to customer feedback, would you go "How is this Ford's problem? If you didn't think about the trunk size, eat the loss and buy a new car"? Python developers want python to remain useful to the developers who want to keep using the language. It's not an incomprehensible motivation.


If I attach something to my car it doesn’t affect your car.

Who does no-GIL benefit? For the majority who use Python for single threaded code, no-GIL will make their code slower because a thread-safe interpreter is always going to be slower than one that can assume ST operation.

For the minority who want parallelism, there’s two other options: OS processes and subinterpreters. If you can use either of these then you will get better performance with a GIL for the same reason.

So no-GIL will only be faster for a minority of a minority who want parallelism but can’t use OS processes or subinterpreters.

Meanwhile everybody else writing libraries has to make sure that their code is no-GIL safe, to support this tiny minority, and if no-GIL ever becomes default then everybody else has to do something to turn it off.

It’s such a stupid idea.


> If I attach something to my car it doesn’t affect your car.

Yes, that was my point.

> For the majority who use Python for single threaded code, no-GIL will make their code slower because a thread-safe interpreter is always going to be slower than one that can assume ST operation.

I'm almost sure the python developers said that they will compensate for the slowdown with other optimizations, so that you'd never have single-threaded performance degradation version-to-version.

> So no-GIL will only be faster for a minority of a minority who want parallelism but can’t use OS processes or subinterpreters.

One would hope it will 1. open new use cases for python, thus attracting developers who would otherwise not have given the language consideration, and 2. let other users benefit from new optimizations that could be implemented further down the dependency stack.

Of course there's no guarantee that that will materialize, but the idea that adding support for an established, lightweight and well-supported concurrency primitive is so obviously a "stupid idea" shows me that your (rudely expressed) opinion is entirely self-centered and nearsighted.

I might add that the move from python 2 to 3 was incredibly painful, but I assume most agree (with the benefit of hindsight) that it was entirely correct.


> I'm almost sure the python developers said that they will compensate the slow down with other optimizations

Those optimisations are not there to compensate for anything; they will improve performance of single-threaded code with or without the GIL.


That's what I meant to express, yes. Those non-GIL-related optimizations would soften the blow of any slowdown from the GIL removal project.


Maybe, but making the GIL optional (rather than removing it completely) solves both problems.


It however creates another, arguably bigger problem: it fragments the ecosystem.


You can then rewrite performance-critical bits in a fast language. Much Python usage is a result of people making use of this.


There is a certain beauty to the GIL, and without it there will be more unnecessary complexity for the 80% of applications in the future and potential maintenance issues for current software.

To also enable this as the default in the coming years is crazy.

I would love to hear what Guido thinks about this.


It's a thirty-year-old design decision that even at the time was fairly naive, but somewhat excusable given the scarcity of multiprocessor systems back then. Undoing it and tackling all the technical debt is overdue. The main challenge never was that it's not doable (see other languages for a variety of solutions), but simply that it's a bit of work.

Basically it won't become a default until after it works. Which is the opposite of crazy; it's very reasonable.

Guido has spoken out on this and I don't think he's against this. It's more that he's in favor of stability and moving forward.

The challenge with python is that you can't do a lot of things in it very efficiently that people keep on doing in it anyway. Like trying to use multiple threads. Making those things work a bit better is not a bad thing.

There's a simple solution for code that depends on the GIL, which is simply not to do the thing that's fairly pointless with the GIL anyway: using more than one thread. The GIL only serves a purpose if you use more than one thread. And since it makes doing that kind of impractical anyway, most python code doesn't need the GIL because it is single-threaded. That code will only break if the GIL is gone and multiple threads are used. Simple solution if that affects you: don't do that.


Tell that to the dependency of a dependency that does find threads useful and for which you can find no decent alternative.


He's ok with all the breaking changes I think, from what I can see on the python forum.


Python 3.12 had a ton of breaking changes too so that doesn't surprise me. They clearly didn't learn from the Python 3 experience! Or maybe they did and their lesson was "still make breaking changes but just don't change the major version number".


Yes I think that was their takeaway, so now instead of having a slooooow migration we just have tens of libraries that break at every python update.

Also there's no way to get access to pypi and fix the abandoned libraries. So they get patched in distributions and remain broken forever on pypi.


Which libraries are you claiming break at every Python update? (3.12 or later)


It's not every Python update. 3.12 was particularly bad. Blessed is an example. I believe they had fixed it in the latest version, but we weren't using the latest version.

Another example of breaking changes in the Python ecosystem: when Numpy updated to 2.0 (with breaking changes) we had a package that depended on `numpy >= 1.something` but it didn't actually work with numpy 2.0.

You can say "well they should have known to use the very unobvious non-default syntax 'numpy >= 1.0, < 2'" but even that isn't right because it turns out Python packages don't use semver; they just deprecate things and then remove them at arbitrary versions, so there's no reasonable upper bound you can actually put there.
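
For the record, the bound I mean looks like this (a hypothetical requirements.txt line; the exact pins are made up):

    # requirements.txt -- guard against the numpy 2.0 breakage
    numpy>=1.24,<2.0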


The last updates deleted modules from the standard library like distutils. Any library that used those modules is now broken.


But that's not "break at every Python update", I asked you to substantiate your claim "breaks at every Python update". distutils was deprecated and being phased out since 2014 [https://peps.python.org/pep-0632/]. It was removed in 3.12. People were preparing for this in 2021 already [https://news.ycombinator.com/item?id=26159509].

Python almost never does breaking changes in non-major versions, and only when necessary.

And even at that, "Python" didn't break, only those (abandoned) packages that depend on distutils, and they already got/are getting fixed/removed. Honestly this isn't "breaks at every Python update".

All those of us who aren't using those distributions don't even see the abandoned libraries in the first place. (I asked you guys for a list of the libraries.)


why is it crazy? Java, c-sharp, and more all have normal threading enabled, why couldn't Python? clearly it's going to be massively useful.


Because there are decades' worth of libraries that were written under the assumption that Python has a GIL. All of these need to be updated; it’s Python 3 all over again.


There are many C extensions written with that assumption, but if you were relying on the GIL to avoid Python-level concurrency bugs, it probably didn’t; they’ll just be slightly rarer and harder to reproduce.

That was certainly my experience working on TruffleRuby, and I doubt Python is very different. We found a few bugs because we didn’t have a global lock, but they were often problems that showed up in a soak test on an implementation with a GIL, just not quite as often so they had escaped notice and were likely causing real failures in production somewhere.


AIUI if you want the GIL, you can just not disable it. I don't think the GIL is going away in the near future.


It would be, if it weren't massively slower too.


Glad to see it, but it's early days and care is needed when porting existing code or writing new code. There's also the question of modules. Will those get updated? I am expecting a slow migration over the next couple of years. Python has made me money for over 15 years, but if I want to write code that can use all those cores and threads, I use Golang.


Exciting! I'm going to be locking all my projects to --no-gil or whatever and targeting 3.13 by default as soon as possible.

Every time I've needed just a bit of multicore performance, getting stuck with multiprocessing has sucked. I'd much prefer just to use threading and not worry about the pickling process.
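
e.g. being able to write something like this and have it actually use all cores, instead of shuttling pickled chunks between worker processes (a sketch; assumes a free-threaded 3.13 build, on a GIL build it runs but serialized):

    from concurrent.futures import ThreadPoolExecutor

    def crunch(chunk):
        return sum(x * x for x in chunk)

    chunks = [range(i * 1_000_000, (i + 1) * 1_000_000) for i in range(8)]

    # Threads share memory: no pickling of arguments or results.
    with ThreadPoolExecutor(max_workers=8) as pool:
        print(sum(pool.map(crunch, chunks)))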


I'm really looking forward to PyTorch supporting this, as it'll make writing custom dataloaders way easier, being able to use multiple threads directly rather than having to deal with all the weird edge cases and copying that come with multiprocessing.


I wonder if I will see Python as a browser scripting language in my lifetime



Will --disable-gil ever be the default that should be targeted by everyone? Or will python have “requires-gil” libraries and “allows-no-gil” libraries?


Yeah, it will be the default in the future. They laid out their roadmap a while ago. It's a long roadmap, to allow libraries time to update. IIRC it would be default by 3.16 at the earliest. I think there was even a stage where there would be GIL and GILless distributions.

https://www.infoworld.com/article/2338862/python-moves-to-re...


The world needs both more and less concurrency FUD. It needs more because too many people think their libraries are safe due to things like the GIL when they likely have bugs that will occur occasionally, and it needs less because that belief makes people terrified that removing the GIL will break their code.

If your pure Python code fails without the GIL it can probably fail with the GIL, and testing it without the GIL might help you find those bugs a lot quicker.
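
The classic demonstration: `+=` compiles to several bytecodes, so updates can be lost even with the GIL (how often depends on the interpreter version and switch interval):

    import threading

    counter = 0

    def bump():
        global counter
        for _ in range(100_000):
            counter += 1  # read-modify-write: not atomic, GIL or no GIL

    threads = [threading.Thread(target=bump) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)  # can be well under 400000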


> If your pure Python code fails without the GIL it can probably fail with the GIL

No, if you remove the GIL there is a whole class of bugs that will appear in code that was safe with the GIL in place.


For example? I don't think the GIL affects the behavior of pure python code.


Yeah. You're right. It guards some CPython internals, but I don't think it does affect code the way I thought it did. Thanks for the prompt to investigate.


After GIL they must add Result/Option and drop exceptions.


In Python, exceptions are effectively free, unlike on the JVM or .NET. Every for loop in Python relies on them.
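
i.e. the for statement is sugar over exactly this:

    it = iter([1, 2, 3])
    while True:
        try:
            item = next(it)    # what a for loop does under the hood
        except StopIteration:  # normal loop termination is an exception
            break
        print(item)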


It's good to see Python finally able to get rid of the GIL. Looking forward to seeing how much performance it can improve.


We already know the upper bound of perf improvement – existing perf * number of cores. It will be worse than that though, as all the GILectomy plans make single-threaded performance worse.

So if you're expecting something better than that, you will be disappointed.


All the GILectomy plans IIRC also include single threaded performance improvements to offset any such costs. So while performance vs GIL is maybe worse for single threaded for the same Python version, performance will still be ahead of where it is today for single-threaded python (assuming everything goes according to plan). That's also why multi-threaded performance will be more than just existing perf * number of cores (vs what it is today, not what removing the GIL alone provides).


But it doesn't offset anything since you get all the other improvements anyway, they're not tied to gil/nogil


I could be misremembering, but I thought that the MSFT team proposed those performance improvements specifically to offset any concerns about single-threaded performance degradation from removing the GIL. Thus even if the development is happening in parallel by independent teams (which I thought it wasn't - I thought it was all one team doing this work), it was predicated upon nogil being accepted in the first place. Thus if the GIL were to remain in Python, this performance work wouldn't be happening.


Maybe the work wouldn't be happening without the noGIL work, but once it's happened it's not tied to the GIL, you can pick those improvements and continue with a GIL-only Python


This post is literally about step 1: add this behind an unsupported experimental flag to get more insights. Step 2 is mid-term to make it a supported option based on readiness (within another 2 years). Step 3 is making it the default & then removing the GIL [1]. Steps 2 and 3 may not happen if some major unsolvable obstacle appears. But I doubt it's going to be so easy to reverse this direction. Given MSFT is driving all of this right now, it's hard to imagine there's going to be much appetite to break their trust; MSFT is more likely to cut funding before completion which would create some chaos than the steering committee is to violate an agreement around funding (MSFT has made specific long term commitments they're going to keep, but those commitments are only for a few years IIRC).

[1] https://developer.vonage.com/en/blog/removing-pythons-gil-it...


Why is MSFT so interested in funding this?


MSFT are keen to invest in popular development tools such as Python, Javascript, and Git. They hired Guido; they bought NPM; they bought GitHub.

I don't know what the plan is, but they haven't succeeded (outside of the corporate world, really, for C# and the functional world in F#) in making modern tooling and languages people want to use.


If you have too steep a hill making inroads into a desired community, and there's enough money, just buy the thing that brings said community together.


The GIL causes a huge performance hit in data processing/ML by forcing the use of multi-process, which leads to a bunch of unnecessary copying of memory between processes unless you put in a bunch of effort to explicitly share memory. So in some cases the savings will be gigantic, from no longer unnecessarily copying huge dataframes between processes.


But usually, in spaces where you need speed, Python is just an orchestrator or glue between pipelines, and the actual calculations are done by a db or some c/c++/fortran library.


Yes, pandas/numpy call C++ to do calculations efficiently, but the "glue" can still introduce significant slowdown relative to that when it's copying tens of gigabytes of dataframe unnecessarily between processes. Of course that slow part itself could also be moved to C++, but that's much more effort than just parallel-mapping over the dataset in Python with no copying/multiprocessing, as will be possible with no-gil.
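
Concretely, something like this sketch, assuming a free-threaded build for actual parallelism; with multiprocessing every part and every result would be pickled across process boundaries:

    import pandas as pd
    from concurrent.futures import ThreadPoolExecutor

    df = pd.DataFrame({"x": range(10_000_000)})
    step = len(df) // 8
    parts = [df.iloc[i * step:(i + 1) * step] for i in range(8)]  # sliced in-process

    def transform(part):
        return part["x"] * 2  # shared memory: nothing gets pickled

    with ThreadPoolExecutor() as pool:
        results = list(pool.map(transform, parts))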


Bad code/quick hacks will always be slow (but can be great for prototypes), and sometimes it's worth planning how you're going to process something rather than piling on multiprocessing. Once you reach the point of multigigabyte IPC, it's worth spending the time doing it right.


Building libraries on a GIL-less Python would enable people to access that power without them all building it from scratch themselves.


GIL-less Python isn't magic pixie dust; the same group of users who have slow, poorly structured code will at best run into deadlocks. GIL-less Python can be used by well-designed libraries to achieve speedups, but that's not code written by the aforementioned pandas users, and speaking from experience, there's a lot more room for order-of-magnitude speedups from fixing quick hacks than from running things in parallel, and it's usually a lot easier than managing multithreaded code.


> GIL-less Python can be used by well-designed libraries to achieve speedups, but that's not code written by the aforementioned pandas users

Yes, that's why having something like Pandas use it would be better than getting all users to write their own version.


I would be shocked if pandas wasn't already using multithreading where they could. Naturally, free-threaded Python (to use the actual name it's being called) gives libraries like pandas more options (which I think is a good thing, even if I think things aren't going to be as smooth as people would like), but there's only so much pandas can do for badly written code. This would be like postgresql moving from multiple processes to multiple threads, sure there may be speedups for some users, but for users that haven't added any indices, there's a lot of performance left on the table.


If the libraries are thread-safe, can they not release the GIL to avoid copying?

I am pretty sure you are going to say there is a reason this cannot be done; I would just like to know what it is!


What libraries? If you're writing some pandas code and want to parallelise some part of your data pipeline, as far as I'm aware Pandas doesn't have much support for that, you need to manually use multiprocessing to process different parts of the dataframe on different threads. Yes there are pandas alternatives that claim to be a drop-in replacement with better parallelism support, but the more pandas features you use, the more likely you are to depend on something they don't support, meaning you need to rewrite some code to switch to them.


But that's such a small fraction of total Python use, that it cannot serve as a validation to make it the default.


It is a fraction of usage that is commercially important to people who fund a lot of Python development.


Aka a power grab for short-term gain.


I would use python much more if every version did not have this many breaking changes, especially with the removal of the GIL. Shame they did not learn from 2 to 3.


[flagged]


I was with you right up until the last sentence. Is the GIL really a "political" issue?


Awesome to see Python turning more into a full-fledged programming tool. Would be neat to see some kind of static typing, or at least enforced duck typing.


No, I'd rather it stayed close to its original design in that regard. I sometimes feel like the success of Python has attracted fans of features available in other, less popular programming languages. Those people want to turn Python into their favourite language. I am not a fan of those attempts. Python is a forgiving and malleable programming language, but I sometimes want to quit when I open the repo at a new client only to find code written by someone who was told that functional programming and GraphQL are the future only to get bored halfway through the exercise of writing the app. You literally have to torch the code and start from scratch in those cases.

To those who want to turn Python into Golang/Rust/Haskell/Java ... guys, other programming languages are available.


As someone that has used Python on and off since Python 1.6, mostly for UNIX scripting stuff, if Python is imposed on me as a big boys' programming language, I at the very least expect the same tooling experience as Common Lisp, Smalltalk, Scheme, SELF, and JavaScript, in performance, and not having PyPy feel like an outsider in its little corner.


I understand the sentiment. I don't think Python can satisfy one group of devs without losing another, so the best way forward would be to stay true to itself while allowing those groups to adapt it to their expectations. I too have my own little wishes, like having had an option to turn off the GIL much earlier. Python has become such a popular language that it's now harder to steer the ecosystem in one direction or the other.


I think it’s a bit disingenuous to put it that way. The comment you are replying to is not asking for a wildly different programming style. They want typing, which is perfectly understandable.

It’s common that as a project grows larger and more unwieldy, people ask for more warranty and help from their tooling. I think Python’s position on this strikes a good balance: provide the infrastructure for adding type hints but delegate type checking to a third party.


Python uses dynamic typing by design. Removing it would remove one of its strengths (and weaknesses, as some insist). It is one of the things that made Python so successful. We have learned over time that adding mypy to the pre-commit hooks is a good thing, but I think that's where we should stop. Give devs an option, but don't force it upon them.


I think you could also argue for some sort of setting that adds additional optimisations based on type hints, but it would have to default to off.


If it’s a small application, starting as a functional application makes a lot of sense; it’s like putting rebar in place before pouring concrete. Even if you start from scratch again, you know someone has already worked with the previous structure, meaning that work was not wasted.

Anything to avoid the overcomplexity associated with delivering content with Node.js; otherwise it's like you missed out on building your computational foundation.


There is a problem and it is not connected to any particular language. The majority of courses, reference material, and existing code is not built around functional programming paradigms.


Agree to disagree.


Do you mean like mypy and optional type hints, or something different and enforceable, like TypeScript?


It's not runtime-enforceable, though. Personally, whilst I do provide type hints (I have the linter set up to remind me), they are only really useful as a documentation tool.

As they are not enforced at runtime, you can easily return bollocks and not know.
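
For example (a minimal sketch): the annotation is just metadata, so this returns nonsense without any complaint:

  def get_ids() -> list[int]:
      return "definitely not ids"  # CPython never checks the annotation

  ids = get_ids()  # no error at runtime; the hint only lives in __annotations__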

I really like how C# does it, which is to have strict typing on by default but allow you to turn it off where you want to be loosey-goosey. Not having to do a bunch of type checks on every operation would also speed up a bunch of things inside Python.

However, that would make it a different language. Perhaps Python 4? (ducks)


See typeguard[0] for runtime type checking. I activate it only during testing, avoiding unnecessary overhead in prod.

[0] https://pypi.org/project/typeguard/
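
A minimal sketch of the decorator API (typeguard validates arguments and return values at call time):

  from typeguard import typechecked

  @typechecked
  def mean(values: list[float]) -> float:
      return sum(values) / len(values)

  mean([1.0, 2.0])  # fine
  mean("oops")      # raises a type check error at call time

Enabling it only under tests, as described above, keeps the isinstance overhead out of production code paths.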


It would be nice to have typing in the core language/interpreter, including runtime type errors, instead of this weird add-on approach. I constantly run into situations where the code is fine and mypy complains, or the code is broken and mypy says it's fine. It all just feels like one gigantic fragile workaround instead of a proper type system that you can depend on.


I wouldn't really say TypeScript is enforceable, given that the types aren't actually considered at runtime.


That is the case for most type systems considered safe, save for introspection use cases. For instance, Haskell, a language with strong type guarantees, erases types at runtime.


In fact, people should see TypeScript more as a linter/bundler tool than as a programming language in itself, ignoring the design mistakes of how enums and namespaces came to be.


Almost no languages have types considered at runtime.


Well, Python does, after the famous PEP to pack all types into strings got parked (probably forever).

There are packages to typecheck at runtime, even to check the parameters to a function.
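
A hand-rolled sketch of that idea (the check_params decorator here is hypothetical; real packages like typeguard are far more thorough and handle generics, unions, and so on):

  import functools
  import inspect
  from typing import get_type_hints

  def check_params(func):
      # get_type_hints() also resolves string annotations (the PEP 563 style)
      hints = get_type_hints(func)
      sig = inspect.signature(func)

      @functools.wraps(func)
      def wrapper(*args, **kwargs):
          bound = sig.bind(*args, **kwargs)
          for name, value in bound.arguments.items():
              expected = hints.get(name)
              if expected is None:
                  continue
              try:
                  ok = isinstance(value, expected)
              except TypeError:
                  continue  # parameterized generics need real machinery; skip
              if not ok:
                  raise TypeError(f"{name}: expected {expected.__name__}, "
                                  f"got {type(value).__name__}")
          return func(*args, **kwargs)
      return wrapper

  @check_params
  def scale(x: int, factor: int) -> int:
      return x * factor

  scale(3, 4)     # fine
  scale(3, "no")  # TypeError at call time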


Pyre has optional strict-mode typing if you want to enforce it. It is also more sound than mypy.


Interesting, first I've heard of Pyre. Seems to be a Facebook project? How does it compare to Pyright, which is also more or less sound (unlike Mypy)?


There is also the rarely mentioned pytype from Google, written in Python. And pyright from Microsoft is written in TypeScript, pyre at Facebook in OCaml. Last time I checked, these had better type inference algorithms (Hindley-Milner?) than mypy.

https://github.com/google/pytype https://github.com/facebook/pyre-check https://github.com/microsoft/pyright


In my experience Hindley-Milner type inference is a misfeature. Pyright doesn't use it and its typing system is very good. My only typing complaint with it is that it exactly types dict literals by default, which is usually not what you want.

By that I mean if you do

  foo = {"a": 2}
  foo["b"] = 3
you will get a type error because the inferred type of foo includes knowledge of the keys. But well-written code doesn't use dicts when you know which keys are going to be present - you use a dataclass.
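
For illustration, a sketch of the dataclass version of that example:

  from dataclasses import dataclass

  @dataclass
  class Foo:
      a: int
      b: int = 0

  foo = Foo(a=2)
  foo.b = 3  # a declared field, so the checker is happy
  foo.c = 4  # Pyright flags this: Foo has no attribute "c"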

I guess you could argue most Python is not well written, but then most Python developers are too amateur to use Pyright anyway.


Now it only needs a proper JIT, to catch up with Common Lisp in 1984.


Common Lisp doesn't specify anything about JIT.


So what?

Lisp has compiled to native code since 1962, so having a JIT is pretty much a given.

I only mentioned Common Lisp because of its wide library, because it already has the implementation experience of the Lisp Machines from Xerox PARC, TI, and Genera, and because it is as dynamic as Python, if not more so, which sidesteps the usual dynamism excuse for the lack of Python JITs.


Lisp's dynamic nature does not come from having or not having a JIT; none of the implementations you mention had one.

Compiling to native code isn't JIT. It was very common to run straight interpreted code on the Lisp Machines, since compiling took a long time, and the final object file then had to be loaded from slow disks.


JIT is a synonym for dynamic compiler in CS speak.

Feel free to show us the manuals of those implementations without a chapter about (compile ....), (disassemble ....) or similar.


Your claim was that JIT was something CL had; it doesn't. But now you're switching topics.

The Lisp Machine compiler is incremental, but it is not a dynamic compiler; it never changes a compiled object once it has been compiled at run-time. This is similar in all Lisp implementations, and even in Python. Code is also not compiled by default on the Lisp Machine; it is interpreted.

https://docs.python.org/3/library/functions.html#compile https://docs.python.org/3/library/dis.html
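
A minimal sketch with those exact building blocks (user-driven, incremental compilation in stock CPython):

  import dis

  source = "def foo():\n    return 'bar'\n"
  code = compile(source, "<string>", "exec")  # compile at runtime, on demand
  namespace = {}
  exec(code, namespace)          # installs the compiled function object
  dis.dis(namespace["foo"])      # inspect the generated bytecode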

Python has all the building blocks that Lisp does in this regard, and has had for many, many years. But I'm not allowed to quote the Lisp Machine manual on the topic, since it has chapters on the compiler, so I'll jump out of this conversation now...



That describes incremental compilation, something Python is also capable of.

Dynamic compilation (i.e. compilation during execution and/or modifying the emitted compiled output during run-time -- depending on which school one prefers) was never a thing on the Lisp Machine, and is not even a thing in modern Lisp implementations like, say, SBCL, which just does static compilation (the compiled object is never modified once it has been compiled).

    (print-herald)
    LM-3 System, band 4 of AMS-LISPM-2. (LM-3)
    2048K physical memory, 16127K virtual memory.
     Experimental System     300.0
     Experimental Local-File  54.0
     Microcode               323
    AMS Lisp Machine Two, with associated machine FS.
    
    (defun foo () "bar")
    FOO
    (compiled-function-p #'foo)
    NIL


From https://en.wikipedia.org/wiki/Just-in-time_compilation

"The earliest published JIT compiler is generally attributed to work on LISP by John McCarthy in 1960"

As for the LM-3 example, I would rather be proven wrong by you pointing me to PyPy as a counterexample, but the poor fellow doesn't even come up for this.


Yeah, but that's wrong. It's the usual confusion between just-in-time dynamic compilation by the system and incremental compilation by the user.

LISP I had an incremental compiler. The user calls the compiler with a list of functions or function definitions to compile. It's not dynamic compilation by the system, and it also does not use information about the running function (call statistics, call argument statistics, ...).

There is no "just in time" functionality provided. The developer is responsible for deciding what code to compile and when to compile it. A function needs to be compiled by the developer before it is invoked, otherwise it won't run compiled. In a JIT-compiled setting, the system decides when to compile the code: either on start or during runtime, triggered by the system. The code will be compiled "just in time".

This Wikipedia article explains the difference:

https://en.wikipedia.org/wiki/Dynamic_compilation

"Just-in-time compilation is a form of dynamic compilation."

"Unlike dynamic compilation, as defined above, incremental compilation does not involve further optimisations after the program is first run."

Later Lisp systems may have aspects of JIT compilation. For example some CLOS implementations optimize code (-> method combinations) at runtime, possibly via compilation.


COMPILE is AOT, not JIT. Typical Lisp compilers will not dynamically optimize code based on runtime performance metering.


From https://en.wikipedia.org/wiki/Just-in-time_compilation

"The earliest published JIT compiler is generally attributed to work on LISP by John McCarthy in 1960"

I can naturally also go hunting for those papers in my digital cellar.


Other than type hints?


Single-threaded performance in the no-GIL build is currently ~45% slower, which is a far bigger regression than the figures advertised when the feature was proposed:

  def f():
      x = 0
      for i in range(10000):
          for j in range(10000):
              x += i*j
      return x

  print(f())
With Python-3.13.0rc1:

  ./configure --enable-gil: 0m5.981s
  ./configure --disable-gil: 0m8.683s
Stock Debian Python 3.7:

  0m5.973s
So, the regular GIL build shows no speed improvement over 3.7, and the no-GIL build is far slower for single-threaded code.
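
For reference, the workload the no-GIL build is actually aimed at is the threaded one; a minimal sketch (under the GIL these four calls serialize, while on a free-threaded build they can use four cores):

  from concurrent.futures import ThreadPoolExecutor

  # reuse the CPU-bound f() defined above
  with ThreadPoolExecutor(max_workers=4) as ex:
      results = list(ex.map(lambda _: f(), range(4)))
  print(results)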


Stop creating new accounts to post the same comment on each No-gil submission on HN.


Stop pretending to be a scientist if you suppress benchmarks. That is how data "science" is done, eh?



