
Your first example has to do with the fact that tuples are copied by value, whereas lists are "copied" by reference. This is a special case of an even larger (IMO) misfeature, which is that the language tries very, very hard to hide the concept of a pointer from you. This is a rampant problem in memory-managed languages; Java has similar weirdness (although it's at least a bit more consistent since there are fewer primitives), and Go is doubly odd because it does have a user-controllable value vs. pointer distinction but then hides it in a lot of cases (with the . operator working through pointers, and anything to do with interfaces).

I think the whole thing does a disservice to novice or unwary programmers. It's supposed to be easier to use because you "don't have to worry about it" - but you really, really do. If you're not familiar with most of these details, it's way too easy to wander into code that behaves incorrectly.
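
For instance, a minimal sketch of the kind of surprise this causes:

    a = [1, 2, 3]
    b = a            # no copy is made: two names, one list
    b.append(4)
    print(a)         # [1, 2, 3, 4] -- a changed through the alias

    t = (1, 2, 3)
    u = t            # also no copy, but immutability hides the sharing
    # u.append(4)    # AttributeError: tuples have no mutating methods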



    > This is a special case of an even larger (IMO) misfeature, which is that the language tries very, very hard to hide the concept of a pointer from you.
When I came to Python from Perl, it only took me about one day of Python programming to realize that Python does not have references the same way that Perl does. This is not flame bait. Example early questions that I had: (1) How do I create a reference to a string to pass to a function? (2) How do I create a reference to a reference? In the end, I settled on using a list of size one to accomplish the same thing. I use a similar trick in Java, but with an array of size one. In hindsight, it is probably much easier for junior programmers to understand the value and type system in Python compared to Perl. (Don't even get me started about the readability of Perl.) Does anyone still remember the 'bless' keyword in Perl to create a class? That was utterly bizarre to me coming from C++!
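
For what it's worth, the size-one-list trick looks like this (a sketch; 'upcase' and 'box' are just illustrative names):

    def upcase(box):
        # box is a one-element list acting as a mutable cell,
        # since strings themselves cannot be modified in place
        box[0] = box[0].upper()

    s = ["hello"]
    upcase(s)
    print(s[0])    # HELLO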


> Your first example has to do with the fact that tuples are copied by value, whereas lists are "copied" by reference.

My mental model for Python is that everything is '"copied" by reference', but that some things are immutable and others are mutable.

I believe that's equivalent to immutable objects being 'copied by value' and mutable ones being '"copied" by reference', but "everything is by reference" more accurately reflects the language's implementation.
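
You can see the uniform reference semantics with 'is' (a quick sketch):

    x = [1, 2]
    y = x
    print(x is y)    # True: one list, two names

    s = "abc"
    t = s
    print(s is t)    # True as well: strings are references too,
                     # just immutable, so the sharing is invisible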


Yeah, I know that's how it works under the hood - and why you have things like all integers with values in [-5, 256] being assigned to the pre-allocated objects - but I don't think it's a particularly useful model for actually programming. "Pass-by-reference with copy-on-write" is semantically indistinguishable from "pass-by-value".
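
The caching is easy to observe in CPython (an implementation detail, not a language guarantee):

    a = 256
    b = 256
    print(a is b)    # True: ints in [-5, 256] are pre-allocated

    c = 257
    d = 257
    print(c is d)    # typically False at the REPL: distinct objects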


There is no copy on write and no pass by reference.

Python is "pass by value", according to the original, pedantic sense of the term, but the values themselves have reference semantics (something that was apparently not contemplated by the people coming up with such terminology — even though Lisp also works that way). Every kind of object in Python is passed the same way. But a better term is "pass by assignment": passing a parameter to an argument works the same way as assigning a value to a variable. And the semantic distinctions you describe as nonexistent are in fact easy to demonstrate.

The model is easy to explain, and common in modern programming languages. It is the same as non-primitive types in Java (Java arrays also have these reference semantics, even for primitive element types, but they also have other oddities that arguably put them in a third category), or class instances (as opposed to struct instances, which have value semantics) in C# (although C# also allows both of these things to be passed by reference).

The pre-allocated integer objects are a performance optimization, nothing to do with semantics.

The model is useful for programming, because it's correct. We know that Python does not pass by reference because you cannot affect a caller's local variable, and thus cannot write a "swap" function. We know that Python copies the references around, rather than cloning objects, because you still can modify the object named by a caller's local variable. We know that no copy-on-write occurs because we can trivially set up examples that share objects (including common gotchas like https://stackoverflow.com/questions/240178).
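
All three observations, as a quick sketch ('swap' and 'grow' are just illustrative names):

    def swap(a, b):
        a, b = b, a        # rebinds the local names only

    x, y = [1], [2]
    swap(x, y)
    print(x, y)            # [1] [2] -- no pass by reference

    def grow(lst):
        lst.append(99)     # mutates the shared object

    grow(x)
    print(x)               # [1, 99] -- references are copied, not objects

    grid = [[0] * 2] * 3   # the gotcha from the link: three references
    grid[0][0] = 9         # to the same inner list, no copy-on-write
    print(grid)            # [[9, 0], [9, 0], [9, 0]]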


> I don't think it's a particularly useful model for actually programming

I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”. As you say, the latter is the case in Go, and it’s one of the few ways the language is more complex than Python.

You could argue that in Python you still have to learn which objects are mutable and which are immutable - but if it weren’t for bad design like `+=`, that wouldn’t be necessary. An object would be mutable if-and-only-if it supported mutating methods.
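
Concretely, `+=` sometimes mutates and sometimes rebinds, which is exactly why the mutable/immutable distinction leaks:

    a = [1, 2]
    b = a
    a += [3]        # mutates the list in place
    print(b)        # [1, 2, 3] -- b sees the change

    s = (1, 2)
    t = s
    s += (3,)       # rebinds s to a brand-new tuple
    print(t)        # (1, 2) -- t is untouched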


> I think “everything is by reference” is a better model for programming than “you need to learn which objects are by reference and which are by value”.

The model is: everything is a reference (more accurately, has reference semantics); and is passed by assignment ("value", in older, cruder terms).

> but if it weren’t for bad design like `+=`, that wouldn’t be necessary. An object would be mutable if-and-only-if it supported mutating methods.

`+=` is implemented by `__iadd__` where available (which is expected to be mutating) and by `__add__` as a fallback (https://stackoverflow.com/questions/2347265). The re-assignment of the result is necessary to make the fallback work. Reference for your "major wtf" is https://stackoverflow.com/questions/9172263.
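
The gotcha discussed in that second link, reproduced (CPython behavior):

    t = ([1, 2],)
    try:
        t[0] += [3]    # the list's __iadd__ mutates it in place, then
                       # storing the result back into the tuple fails
    except TypeError as e:
        print(e)       # 'tuple' object does not support item assignment
    print(t)           # ([1, 2, 3],) -- the mutation happened anyway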



