I only use python casually, but it seems to me that python library authors are always refactoring and breaking interfaces.
Recently, there was a bug in a second order dependency, and the version that fixed this bug was after a version that moved a function into a submodule.
So I had to make a local patch to my dependency that changed all of the imports.
Was the new interface more consistent? Yes, but could they have left a shim for backwards compatibility, or just lived with the old slightly less consistent interface? Seems to me like they could.
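For illustration, a backwards-compatibility shim like that is usually only a few lines. This is a hypothetical sketch (the function names are made up, not from any real library): the old name stays importable from its old location, forwards to the new implementation, and emits a DeprecationWarning so callers can migrate on their own schedule.

```python
import warnings

# Hypothetical "new" home of the function. In a real library this would
# live in a submodule, e.g. mylib.parsing.parse.
def _new_parse(text):
    return text.strip().split()

# Shim kept at the old import path: same name and signature, just
# forwards to the new implementation and warns the caller once.
def parse(text):
    warnings.warn(
        "parse() has moved to the parsing submodule; "
        "update your imports",
        DeprecationWarning,
        stacklevel=2,
    )
    return _new_parse(text)
```

Old code keeps working, the warning points users at the new location, and the shim can be dropped in the next major version.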
I think that depends on the library. The ones I use appear not to do that. In fact, I've actually seen that more in Go. What Go has going for it, though, is that your code won't compile until you fix it. In Python you could use mypy, but unfortunately it's optional and only works if both you and the library author use annotations.