Others have pointed out some errors with the methods employed in this report.
Still, who else is collecting data of this kind and fidelity? I found the trends very informative: first, that we're still a long, long way from having hardened base operating systems, but also that the trend is positive and slowly moving in the right direction.
Even just getting a breakdown of CVEs is interesting (though I would have liked better granularity than "bypass something"), both for the trends and to understand how many DoS issues come up per year versus, say, code injections or overflows.
If the data collection is faulty, then it doesn't matter how interesting the results are or how needed they may be, because they're compromised and unreliable. Yes, this should be studied, but to be at all useful the study has to be accurate.
"For each distribution, we downloaded all its packages, and analyzed the hardening schemes of their enclosed binaries... Our findings confirm that even basic hardening schemes, such as stack canaries and position independent code, are not fully adopted."
Is this finding that basic hardening techniques are not applied to every available binary? I would expect some hardening techniques to break some binaries.
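For anyone curious how such a scan can even work: here's a minimal sketch (not the report's actual tooling, which I assume is more thorough) of checking those two basic properties of an ELF binary with only the Python standard library. PIE is inferred from the ELF `e_type` field, and the stack-canary check is a heuristic based on the `__stack_chk_fail` symbol name appearing in the file.

```python
import struct
import sys

ET_EXEC, ET_DYN = 2, 3  # ELF e_type values: fixed-address executable vs. PIE/shared object

def check_hardening(path):
    """Heuristic check for two basic hardening schemes on an ELF binary."""
    with open(path, "rb") as f:
        data = f.read()
    if data[:4] != b"\x7fELF":
        raise ValueError(f"{path} is not an ELF binary")
    # Byte 5 of the identification block gives endianness (1 = little, 2 = big);
    # e_type is a 16-bit field at offset 16.
    endian = "<" if data[5] == 1 else ">"
    (e_type,) = struct.unpack_from(endian + "H", data, 16)
    return {
        # PIE executables are ET_DYN (note: shared libraries are ET_DYN too,
        # so a real tool would also distinguish them).
        "pie": e_type == ET_DYN,
        # Binaries built with -fstack-protector reference __stack_chk_fail.
        "stack_canary": b"__stack_chk_fail" in data,
    }

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else sys.executable
    print(target, check_hardening(target))
```

Run it over every binary in every package of a distribution and you get exactly the kind of adoption numbers the quote describes; the interesting (and error-prone) part is the heuristics, which is presumably where the methodological criticisms come in.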