> I'm completely and utterly tired of fixing the next 14916498th bug that's caused by package X and Y having shared dependency Z, then X updating Z to Z+1 and breaking Y
This is not "Linux", it's users who mess around with (and ultimately break) dependencies in order to get the latest version of programs at any cost.
If one uses, say, a single Ubuntu release and only third-party repositories built for that release, they're not going to run into dependency issues, since all the programs will share the same library versions (dependency bugs do happen once in a while, but they're rare exceptions).
I used Ubuntu and even Debian for many years. I strongly disagree with you; actually, I find Arch significantly more stable than Ubuntu. At least in Arch, when a dependency breaks, both X and Y are swiftly upgraded.
At this point, I'm also tired of trying to convince others that Linux package management is broken beyond repair. Until you experience my pain, you won't be convinced. I understand many people will never be able to sympathize, but all I can truthfully and honestly say is that I'm a very experienced GNU/Linux user. I've used all kinds of distros, from Ubuntu to Fedora to Arch to Gentoo, and no, it's all a complete and utter mess when it comes to dependency management. I end up spending hours every month fixing broken versions.

The sweet spot for me is using pacman for very basic things and AppImage for everything else. I don't care about memory efficiency; I want `MuseScore4.appimage` to contain everything the app needs, and I want it to behave exactly the same every single time I click on it. No, I don't tolerate even the slightest behavior difference. I do not want glibc to upgrade from x.y.z.t to x.y.z.t+1, because it causes insanity when t+1 triggers a behavior change in some random software synthesizer I happen to use. I know this probably doesn't make sense to 99% of users, but maybe I have a special case.

Case in point: in 2023, while trying to ship a lot of work I was producing, a single update broke tons of my workflows, and I finally decided that anything not frozen in an AppImage is cursed. If it doesn't work for you, I respect your patience and expertise; I just hope some people can understand the pain other users go through.
I have a degree in CS, I write code full time, I manage Linux containers in my day job, and I still can't keep up with the mess in my local Ubuntu/Arch installs at home. I don't know how people who can't code manage, but I do know that I'm done spending hours at a time fixing glibc. I just want to work on my hobbies, thank you very much.
EDIT: And before people come here, no, OSX and Windows are even worse. I won't consider using them either.
Zypper has excellent dependency resolution; it hasn't given me any trouble. DNF is just as good, from what I hear. Apt and yum are an earlier generation of package manager, and I've seen them get themselves into jams on more than one occasion, so I understand what you're saying and can sympathize with your perspective, but those shortcomings aren't inherent to the premise of traditional package management.
I use AppImage for the sort of software that won't get packaged (proprietary or obscure) and zypper for pretty much everything else (except a few programs I build myself, for reasons).
This is very gross disinformation and/or incompetence (which explains why "others" won't listen).
The default Debian tools, apt and apt-get (and related tools like gdebi/aptitude), won't put the system in an inconsistent state unless the user forces them to. Dpkg will, though, and I suspect that's the tool you're using.
Even when one adds a repository with incompatible package versions (say, the system is Focal and the repository targets Mantic), apt will stop before upgrading the conflicting packages.
What's happening is that you're forcing broken dependencies, likely to chase the "latest and greatest versions", then complaining that the package manager is broken.
Uh... such is Linux life: when you're hit with issues and complain, it's "This is very gross disinformation and/or incompetence". No wonder some people won't bother dealing with it.
Again, I've genuinely experienced all of this, and handling exactly these things is my day job, so I know firsthand it's neither of those.
What's happening is a variety of issues (not just one), anything from a package maintainer not realizing that a new dependency version is incompatible with their package, to dependencies being declared incorrectly in the package metadata. For example, if I write package X that depends on numpy, but don't realize numpy 2.24 breaks it, Debian will happily accept the update, breaking my package for users. It's the maintainer's fault, but ultimately, if the OS didn't upgrade shared dependencies for every program at once, it would have kept working. Which is why freezing dependencies is the way to go for stability.
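The numpy scenario above can be sketched in a few lines. This is a toy illustration, not a real resolver (the version numbers and the `satisfies` helper are made up for the example; real tools follow PEP 440 or dpkg's comparison rules): a loose constraint lets the shared dependency jump to a breaking version, while an exact pin refuses it.

```python
# Toy illustration: package Y was only tested against numpy 1.24.0.
# Package X's update pulls in numpy 2.24.0 system-wide. Whether Y
# survives depends entirely on how its constraint was declared.

def satisfies(installed: str, spec: str) -> bool:
    """Check a dotted version against '==x.y.z' (exact pin) or
    '>=x.y' (minimum). Deliberately simplistic, not PEP 440."""
    def parts(v: str):
        return tuple(int(p) for p in v.split("."))
    if spec.startswith("=="):
        return parts(installed) == parts(spec[2:])
    if spec.startswith(">="):
        return parts(installed) >= parts(spec[2:])
    return True  # no constraint: anything goes

# Loose spec: the breaking upgrade is allowed, so Y breaks silently.
assert satisfies("2.24.0", ">=1.24")
# Frozen spec: the upgrade is refused, so Y keeps its known-good version.
assert not satisfies("2.24.0", "==1.24.0")
```

The point being: a system-wide upgrade applies the loose-spec behavior to every installed program at once, which is exactly what per-app freezing (AppImage, pinned environments) avoids.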
There are countless other scenarios, for example a program compiled outside of Debian that dynamically links a system library, then Debian upgrading that library without knowing about the third-party program. It's all a trap for broken software.
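To make the dynamic-linking trap concrete, here's a minimal sketch of the run-time lookup a dynamically linked binary performs, done explicitly through ctypes. It assumes a glibc Linux system where the math library's soname is `libm.so.6` (that soname is an assumption about the target system); the key point is that the program gets whatever version of the library the distro currently ships, not the one it was built against.

```python
# A dynamically linked program resolves its libraries at launch time;
# ctypes makes that same lookup explicit. "libm.so.6" is the glibc
# soname on typical Linux systems -- an assumption for this sketch.
import ctypes

libm = ctypes.CDLL("libm.so.6")         # resolved against whatever is installed NOW
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

# Works today -- but if the distro swaps the library underneath you,
# the binary's behavior can change without the binary itself changing.
assert libm.sqrt(9.0) == 3.0
```

A statically linked or AppImage-bundled binary sidesteps this by carrying its own copy of the library, which is the freezing strategy described above.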