
> All three of those have declined. It's less readable than it used to be, it's definitely more complicated (not just complex, complicated), and the standard library is declining rapidly in relevance as it ages.

I find it much more readable, and more importantly more expressive. Certain new features are missteps IMO, but I just don't use them. But more importantly, the language has been moving away from cryptic %-encodings and other C idioms.
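A small example of what that shift looks like (made-up values):

    name, score = "Ada", 0.9125

    # Old C-inherited formatting: cryptic % codes
    print("%-10s %6.3f" % (name, score))

    # The same output as an f-string, readable at a glance
    print(f"{name:<10} {score:6.3f}")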

As for the standard library, that was already happening for a long time, and is inevitable. The world has fundamentally changed. In Python's heyday it was much harder to download and install and use a third-party library, so a rich standard library was an asset. Now it's full of specialized code that handles obscure and increasingly irrelevant data formats; multiple overlapping hacks for binary data; terrible and confusing date support; awkward interfaces that haven't stood the test of time (particularly all the networking stuff; Requests is one of the most downloaded PyPI packages, along with its dependencies which are probably almost never downloaded for any other reason); etc.
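To make the networking complaint concrete, here is roughly what a JSON GET looks like with the standard library versus Requests (the endpoint is made up):

    import json
    import urllib.request

    # Standard library: workable, but verbose for the most common case
    req = urllib.request.Request(
        "https://api.example.com/items",          # hypothetical endpoint
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        data = json.load(resp)

    # Requests: the interface most people actually reach for
    # import requests
    # data = requests.get("https://api.example.com/items", timeout=10).json()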

Lots of people still seem to think that the 2->3 migration was a mistake. They couldn't be more wrong. The old way of handling "strings" was abysmal, and spit in the face of the Zen. Error messages were confusing and implicit conversions abounded.
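For anyone who has forgotten what that looked like, a sketch (the Python 2 lines are historical, shown as comments):

    # Python 2 (historical): bytes and text mixed silently -- until they didn't.
    # >>> "caf\xc3\xa9" + u"!"
    # UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 3: ...

    # Python 3: the bytes/text boundary is explicit, so errors surface where they belong.
    text = b"caf\xc3\xa9".decode("utf-8") + "!"
    print(text)  # café!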

Also, just for the record: Guido van Rossum was in favour of the walrus operator. In fact, he co-authored the PEP (https://peps.python.org/pep-0572/), along with Tim Peters.




The walrus operator is nice, except in comprehensions. f-strings are great, except for the `=` debugging operator. Dictionary merging and update operators contradict the "one way to do it" with weird and confusing syntax that's completely redundant to methods that already exist.
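For reference, the three ways to merge dicts that now coexist (trivial example):

    defaults = {"host": "localhost", "port": 8080}
    overrides = {"port": 9090}

    merged_a = {**defaults, **overrides}        # unpacking (3.5+)
    merged_b = defaults | overrides             # PEP 584 operator (3.9+)

    merged_c = dict(defaults)
    merged_c.update(overrides)                  # the method that always existed

    print(merged_a == merged_b == merged_c)     # True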

Type hints are a sore spot for me. They're good enough when you just don't remember whether an argument is an object or a string, for example, but once you start type hinting deep into data structures, your hints become a mangled soup of nonsense for basically no real benefit. Actual typing errors are rare (maybe one a year in most projects), yet we clutter our codebases with verbosity that serves the type checker rather than the developers reading the code.
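What I mean by "mangled soup", sketched with a made-up signature:

    # A made-up, but not unrealistic, "typed all the way down" API
    from collections.abc import Callable, Mapping, Sequence

    Handler = Callable[[str, Mapping[str, Sequence[tuple[int, str]]]], dict[str, list[int] | None]]

    def register(name: str, handler: Handler,
                 routes: dict[str, list[tuple[str, Handler]]] | None = None) -> None:
        ...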

There's a lot that's just straight up redundant. Dicts are ordered now, but is OrderedDict deprecated? No, because it's just slightly different in weird and mostly unimportant ways. `frozenset` is a builtin, for all 3 programmers worldwide who use it. Python resisted match/case syntax for decades, but when it finally arrived, it did so in a way that’s anything but standard -- good luck figuring it out without consulting the documentation.
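The classic surprise, for the record: bare names in `case` are capture patterns, not constants, so matching against named values needs dotted names (an Enum here, as one way to do it):

    from enum import Enum

    class Color(Enum):
        RED = 1
        GREEN = 2

    color = Color.GREEN

    match color:
        case Color.RED:      # dotted name: compared by value, as you'd expect
            print("red")
        case Color.GREEN:
            print("green")
        case other:          # a bare name always matches and just binds the subject
            print(f"unrecognized: {other}")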

Obviously some improvements are real; every new version brings something genuinely useful. But think back to Python's old selling point -- pseudocode that runs. That's just not true anymore. The simplicity has slipped away, and it's never coming back.

And the standard library? A very real problem, right now, in computer security is the software supply chain. Remember polyfill from like, yesterday? This is the era when we should double down on having a million dependencies from all over GitHub, from unknown developers with no commitment, because ... npm's hellscape is a model to follow?

I would argue the contrary. There's dependency hell, of course, but there's also dependency risk. If you were evaluating a product right now and saw that its lockfile depended only on a specific version of the Python Standard Library, that would give you exactly one product to evaluate and exactly one team of developers to depend on. pip is great and all, but dependency resolvers have quietly let a hundred trojan horses and a thousand unmaintained dependencies into tons of projects, and nobody noticed it was even happening.

Python in 2005, when everyone depended on the standard library, was a safer place than npm is today.


> f-strings are great, except for the `=` debugging operator

What's wrong with the `=` debugging operator in f-strings?


>Dictionary merging and update operators contradict the "one way to do it" with weird and confusing syntax that's completely redundant to methods that already exist.

I find the unpacking syntax elegant. I can think of further generalizations of it that haven't been realized yet.
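For example (toy values), the same idea already reads naturally across displays and calls:

    a, b = {"x": 1}, {"y": 2}

    merged = {**a, **b, "z": 3}     # dict displays
    items = [*range(3), *"ab"]      # list displays
    print(dict(**a, **b))           # keyword-argument unpacking in calls
    print(merged, items)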

> Type hints are a sore spot for me.... Python resisted match/case syntax for decades, but when it finally arrived, it did so in a way that’s anything but standard

Many people expect match/case to be "a switch statement" but it really is not designed for this purpose. I agree that it's an awkward fit and I don't use it. Similarly, I only use type annotations for documentation purposes.
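What it is designed for is destructuring nested data -- a rough sketch (the event shapes are invented):

    def handle(event: dict) -> str:
        match event:
            case {"type": "click", "pos": (x, y)}:
                return f"click at ({x}, {y})"
            case {"type": "key", "key": str(key)}:
                return f"key {key!r}"
            case _:
                return "ignored"

    print(handle({"type": "click", "pos": (3, 4)}))   # click at (3, 4)
    print(handle({"type": "key", "key": "q"}))        # key 'q'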

> Dicts are ordered now, but is OrderedDict deprecated? No, because it's just slightly different in weird and mostly unimportant ways.

Large amounts of existing code are dependent on those ways, because the code was written to use that design. The ordering of dicts since 3.6 is an accidental consequence of an unrelated space optimization. In my view, the team erred by deciding in 3.7 to guarantee that ordering. I have concretely identified a further space optimization which is prevented as a result.
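For concreteness, two of the ways OrderedDict still differs from a plain dict:

    from collections import OrderedDict

    # Plain dicts ignore order when comparing; OrderedDict does not.
    print({"a": 1, "b": 2} == {"b": 2, "a": 1})              # True
    print(OrderedDict(a=1, b=2) == OrderedDict(b=2, a=1))    # False

    # OrderedDict also has reordering operations plain dicts lack.
    od = OrderedDict(a=1, b=2, c=3)
    od.move_to_end("a")
    print(list(od))                                          # ['b', 'c', 'a']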

> Python in 2005, when everyone depended on the standard library, was a safer place than npm is today.

The flip side of dependency risk is security risk from lack of maintenance. For example, the standard library `json` module is a frozen-in-time old version of `simplejson` (it even remembers a useless version number). That project is still actively maintained (https://github.com/simplejson/simplejson) but none of those improvements - even if they fix security - will make it into Python except by parallel work by the core dev team. (Or accepting a patch; but that also requires either the maintainer or a third party to notice that a recent `simplejson` change is a security fix, figure out how to backport it to the much older version, and make a PR.)
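That frozen version string is still visible, incidentally (on the CPython builds I've checked):

    import json
    print(json.__version__)   # '2.0.9' -- simplejson's old version number, frozen years ago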

There are other ways to solve the problem. For example, an organization similar to PyPA could publish a set of "core" libraries, versioned independently from Python but explicitly tested as part of the CPython release process. (That would also allow for fixing the problem that the standard library isn't namespaced - which is at the root of the problem whereby beginners e.g. put their toy lottery project in `random.py` and get an error from a circular import, or - I swear I'm not making this up - having a `token.py` in the current working directory breaks the interactive REPL help - see https://stackoverflow.com/a/75068706).
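The shadowing failure mode looks like this (a sketch; save it as `random.py` and run it):

    # random.py -- a beginner's "toy lottery" script, saved under an unfortunate name
    import random                      # finds THIS file first, not the stdlib module

    print(random.choice(range(1, 50)))
    # AttributeError: partially initialized module 'random' has no attribute 'choice'
    # (most likely due to a circular import)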

So, yes, it would be nice to see lockfiles that "depend only on a specific version of the Python Standard Library". Right now, that dependency goes undeclared, and the maintenance work is distributed among people who are also busy with developing the actual language.



