HEADLINE: Keep selected packages up-to-date on stable
DESCRIPTION: If you are using Debian, especially stable, you have to put up with outdated packages. This is especially a problem with browsers, although you do include security updates and track Firefox ESR, if I understand correctly. But things like WebKitGTK do not receive updates and fall behind both feature-wise and security-wise after a while.
I think keeping packages up to date and having a stable distribution are not inherently in conflict. To me, stable means no breaking changes and no need for reconfiguration when I update; it shouldn't mean frozen in time.
It would be great if certain packages received frequent updates even in stable:
- packages that are not dependencies, have a good track record of backwards compatibility, and are unlikely to break (one rough way to check the first point is sketched after this list)
- packages that have to be updated because of security issues (which I think is already addressed now)
- or because of a fast-moving ecosystem - even if it were safe, it is frustrating to use a very outdated browser component. I think many networked packages could fit in this category, e.g. BitTorrent or Tor clients, if there are protocol changes.
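For the first category, one rough way to check whether a package is a leaf on a given system is to ask apt which installed packages depend on it; the package name below is only an example of a BitTorrent client.

    # List installed packages that depend on "deluge" (example name);
    # an empty "Reverse Depends:" list suggests updating it can't break anything else installed.
    apt-cache rdepends --installed deluge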
I disagree. Stable should mean fixed, except for security bugs. You say "no breaking changes", but the Debian devs can only test the base system; they can't ensure that an update won't break whatever software is running on the user's machine. In fact, my system may have a workaround for a bug that shipped with stable, one which might break if the bug is fixed later! Stable is stable, (non-security) bugs and all.
Plus, the point of the pre-release freeze is to make sure the whole system is stable. If you're constantly releasing new versions of packages, you can never guarantee that kind of stability.
Personally, I think people should just try Unstable; it works fine for a desktop/laptop system. I haven't had a major issue in years.
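For completeness, the usual way to "just try" unstable on an existing install is to retarget the apt sources and upgrade. This sketch assumes the sources currently reference "stretch"; security and stretch-updates lines have no unstable equivalent and should simply be removed.

    # Point apt at unstable (sid) and upgrade the whole system.
    sudo sed -i 's/ stretch / unstable /g' /etc/apt/sources.list
    sudo apt update
    sudo apt full-upgrade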
I haven't used Debian in a while, so yes, maybe "stable" is not the right branch for me.
I would have been looking for something that you install once and that then just works without intervention for a long time. But at the same time I also want access to some of the latest software, without relying on third-party repositories (because the chance of breakage is very high) or compiling things myself (because then I have to track versions manually).
When I think of a "stable" system, I expect the underpinnings to be stable, with no structural changes. Don't laugh, but Windows XP was a bit like that for many people: install it, confirm the occasional update, and it just keeps running for a decade. (Of course, not with the level of security I would expect from a Linux system...) If I really want a frozen version of an app, I can just pin the version or compile it myself.
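To make the "pin the version" option concrete, apt can hold a package at a known version through a preferences file. A minimal sketch; the file name, package name, and version are only illustrative.

    # A priority above 1000 keeps this version installed even if the archive moves on.
    sudo tee /etc/apt/preferences.d/pin-webkit <<'EOF'
    Package: libwebkit2gtk-4.0-37
    Pin: version 2.18.*
    Pin-Priority: 1001
    EOF

For a blunter freeze, "sudo apt-mark hold <package>" achieves much the same effect.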
I think this is a very general problem in the Linux world today: you can choose between "frozen" (stable, LTS) and "unstable" (frequent releases, rolling). There is no "stable underpinning, fresh apps" distribution, although the underlying Linux/glibc layer is incredibly backwards compatible. If you compile all the necessary libraries, you can install almost everything on a distro a few generations old. But it is very tedious, and it would be great if you didn't have to do it.
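As a sketch of what "compile the necessary libraries yourself" looks like in practice (assuming a typical autotools project), building into a private prefix keeps the distro's own copies untouched:

    # Build a newer library or application into $HOME/.local, leaving system packages alone.
    ./configure --prefix="$HOME/.local"
    make -j"$(nproc)"
    make install
    # Make the result visible to whatever needs it:
    export PATH="$HOME/.local/bin:$PATH"
    export LD_LIBRARY_PATH="$HOME/.local/lib:$LD_LIBRARY_PATH"

This is exactly the manual version tracking mentioned above that nobody wants to do by hand.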
Probably you're right and I should give unstable a try next time. Or maybe an alternative would be to make backports more self-contained, so that they pull in the newer dependencies they need without breaking other apps? Maybe containers like snap etc. are a solution?
I think snaps + Debian stable could be that solution. You get the stability of Debian at the system level with the reliability of app updates from snaps.
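A minimal sketch of that combination, assuming snapd is available in the stable archive and the application exists in the snap store (chromium is only an example):

    # The base system stays on Debian stable; the browser comes from the snap store
    # and refreshes itself on its own schedule.
    sudo apt install snapd
    sudo snap install chromium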
This is supposed to be what backports is for, but it’s very much a volunteer effort. It would definitely be good to see the Debian project commit to supporting a set of leaf packages on a rolling basis in stable. Fragmenting the project in this way might just be too much work for the maintainers though. How many different Debian installs can you realistically support?
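For reference, this is roughly what the existing backports route looks like; "stretch" is assumed as the running stable release and "some-package" is a placeholder.

    # Enable the official backports archive for the current stable release.
    echo "deb http://deb.debian.org/debian stretch-backports main" | sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt update
    # Backports are never pulled in automatically; each package must be requested explicitly.
    sudo apt -t stretch-backports install some-package

The opt-in, per-package behaviour is deliberate, which is also why coverage depends entirely on a maintainer volunteering to keep a given backport alive.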
I partially agree, in the sense that I would port regression fixes constantly (THIS is important security-wise), but I would NOT like updates to major releases, as that means breaking changes and goes against the whole Stable philosophy. GCC 6.4 is going to be released soon and I hope Debian Stable will get all the regression fixes that update will inevitably bring to the table. After my experience with Jessie (which still offers GCC 4.9.2, although version 4.9.4 fixed an impressive list of bugs and isn't even available in the backports), I'm not going to be so optimistic. This is too important to be left to the volunteers' goodwill.
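A quick way to verify that kind of claim on a given system is apt-cache policy, which lists the installed version and every candidate the configured archives offer (gcc-4.9 is the package discussed above):

    apt-cache policy gcc-4.9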
I think the situation has improved a lot (https://blogs.gnome.org/mcatanzaro/2017/06/15/debian-stretch...), and it would be great to have a stable base in the future and still have up-to-date applications on top as far as possible.
DISTRIBUTION: stable (but also others)