Almost all Linux distros that use the concept of a "release" stay on the versions they originally shipped with. Security fixes are backported, but no other functionality is imported, so the feature set stays fixed to what was in the original release.
If you are maintaining or developing any kind of server or service where you need an uptime guarantee above 0%, you will appreciate this. Otherwise you will need staff on standby 24/7, ready to develop fixes for backwards-incompatible changes. Imagine having to deal with funky errors like this every minute of the day: https://gitlab.com/gitlab-org/gitlab-ce/issues/36028
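As a rough illustration of how that looks on a Debian/Ubuntu system (the version strings below are made up for the example): the upstream version to the left of the hyphen stays frozen for the life of the release, while the distro revision to the right keeps climbing as fixes are backported.

    # The upstream version (2.7.4) never changes within the release; the
    # "0ubuntu1.N" revision is where backported security fixes land.
    # (The numbers shown are illustrative, not real output.)
    apt-cache policy git
    #   Installed: 1:2.7.4-0ubuntu1.3
    #   Candidate: 1:2.7.4-0ubuntu1.3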
For Ubuntu, follow their USN security notices to stay informed: https://usn.ubuntu.com/usn/
I can't speak for Ubuntu, but when I was FreeBSD Security Officer we would regularly backport patches because importing an entire new release would regularly break existing functionality or even add new security vulnerabilities. It annoyed the heck out of vulnerability scanning tools, but I decided that giving users a system which didn't randomly break when they applied security patches was far more important.
PHP has gotten better about "no BC breaks in patch versions" over the years, but the Debian/Ubuntu teams still insist on making people effectively run e.g. 7.1.8 while the version indicator says 7.1.1.
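If you want to know which fixes you are actually running, the distro package metadata is more informative than what PHP reports about itself. A minimal check, assuming a Debian/Ubuntu system where the package happens to be called php7.1 (the package name varies by release):

    php -v                              # reports the pinned upstream series, e.g. 7.1.1
    dpkg -s php7.1 | grep '^Version:'   # the revision after the hyphen shows the backport level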
Ubuntu and Debian's PHP packages are largely the same.
> I've since convinced [most PHP devs I talk to] to stop using distro-provided packages, in favor of deb.sury.org.
You do realise that Ondřej Surý (of sury.org) is the primary Debian PHP maintainer? The point of his repositories (AIUI) is to allow users to mix and match PHP versions. The downside is that he's one person with (AFAIK) a bus factor of 1 when it comes to security updates. That makes it irresponsible to use in production.
In contrast, Debian's and Ubuntu's PHP packages, while essentially provided by that same person, also have teams (in both Debian and Ubuntu) who can pitch in when required.
AFAIK neither Debian nor Ubuntu lie about version numbers. They have stable versions that they backport security patches to, but they don't change the version number. That issue you link to doesn't contradict this.
This question is brought up every time this happens. Ubuntu uses stable (old) versions and does not upgrade to the latest (unstable) releases very often. However, they do backport and apply security patches to these old versions.
Case in point: 16.04 LTS is still on git 2.7.4, but it is fully patched and secure.
Yes, you're right. I was expecting to see the update by checking "git --version" and seeing 2.11.0-1 or something like that, but it's not visible that way.
Indeed, if I check with "apt-cache show git", the package is on version 2.11.0-2, and then I have to browse to the package web page at https://packages.ubuntu.com/zesty/git and click through to the changelog before I finally get to the update information, which clearly contains the text, "SECURITY UPDATE: Arbitrary code execution on clients through malicious ssh URLs."
So it was patched as expected, it just wasn't easy for me to see that without going through a few extra steps.
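For what it's worth, the same information is available locally without going through the package web page; something along these lines works (the grep pattern just matches the "SECURITY UPDATE" wording quoted above):

    # Fetch the package changelog and show the security entries in it.
    apt-get changelog git | grep -i -A2 'security update'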
It means that one group of developers is busy improving and fixing packages, while a different group is cherry-picking commits in order to maintain the illusion of stability (as mentioned above).
That second group has to duplicate all the testing work done by the first group, and additionally ensure that there are no new problems introduced.
It's very vulnerable to human error and adds a lot of unnecessary work.
It's better to have a competent "second group" do the backport job than to have every single user of a distro get the rug pulled out from under them every time they patch. New versions of software can quickly break your environment; for example, https://gitlab.com/gitlab-org/gitlab-ce/issues/36028 . If you had to re-test every script and web service against the daily churn of vulnerability fixes, you would get nothing done and you could almost never be sure your service was stable.
I agree, this is a problem, but it's not being solved by this practice.
The example you gave is specifically about a client application of git that wasn't updated to work with the current version of git - probably because of some twisted logic that said they didn't have to worry about it until some future date.
It's an example of how things break when you upgrade to new versions of software. If you avoid moving to new versions and instead just backport the essential security fixes, there will be far fewer surprises like this.
Imagine you had a daily backup script that uses git to commit and push to a remote server, perhaps managed by someone else. You upgrade to the latest and greatest version of git to patch this vulnerability, and surprise, your backup script fails! Now you get to spend the rest of the day getting the people running the git server to implement a workaround - and they might have already left for the weekend - leaving you with no backups, or manually trying to get your git working with that server again.
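To make that concrete, the whole "backup script" could be as small as this sketch (the paths, remote, and branch names are made up for the example); the push at the end is exactly the step where a behaviour change in a new git version would surface:

    #!/bin/sh
    # Hypothetical nightly backup: snapshot a directory into git and push it
    # to a remote managed by someone else.
    set -e
    cd /srv/backups/repo
    git add -A
    git commit -m "nightly backup $(date +%F)" || true  # nothing new to commit is fine
    git push origin master                               # the step a new git version could break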
Well then, I guess with that line of reasoning we should just turn off the computers and go live in the woods. There's obviously too much risk in doing anything.
I see it differently. Where do new bugs and vulnerabilities come from? When the main developers add features or make changes to existing features that go beyond fixing bugs.
From the point of view of many server administrators, using the latest versions of everything is inherently risky. What they want to use is a stable, solid version that has all the latest security fixes.
It's unlikely that these opposing viewpoints will ever be reconciled.
It's important for package developers to be aware of other software that depends on their interfaces or functionality.
Some cases will slip through occasionally, but over a couple of releases these should be ironed out.
>> Where do new bugs and vulnerabilities come from? When the main developers add features or make changes to existing features that go beyond fixing bugs.
Do you have stats for that?
Semantic versioning was supposed to be the fix for that, but as Rich Hickey has pointed out, that is also broken.
Everyone is their own server admin these days. We all want "a stable, solid version that has all the latest security fixes", but it's difficult to accept that that might be impossible.
Ideally, we'd see upstream release this fix, but we've all read https://news.ycombinator.com/item?id=14051106 and I don't want to be too impatient with software that upstream has graciously provided for free (in both senses of the word!).
I guess my point is that backporting might make sense when there are small changes that enhance a release. However, I agree that there is too much duplication of effort going on.
And, I'm not too surprised, KDE has a broken development and maintenance process.
I'm sure they think they're doing a great job - but that's because they don't even see or receive most bug reports.
> And, I'm not too surprised, KDE has a broken development and maintenance process. I'm sure they think they're doing a great job - but that's because they don't even see or receive most bug reports.
KDE actually fixed the bug, but it hasn't made it into a tagged release after two months. Maybe they think this small change doesn't warrant a release? I asked if KDE neon could pick it up, but they (understandably) seem to see this as feature creep. "KDE applications" means something different to the neon folks (only the things they ship with neon, from what I understand).
I agree. I have no doubt we are all trying to do our best. =)
The update this morning for Ubuntu 17.04 installs git 2.11.0... NOT the patched 2.11.3, NOR the latest 2.14.1.
Why isn't Ubuntu fully up to date?