
Personally, I find it amazing that a project the size of Linux has relatively few automated tests and so little testing done by its maintainers, leading projects such as this one (and the LTP, etc.) to come about to actually ensure ongoing quality.

How many other major projects the size of Linux have as little upstream testing?




There's a lot of upstream testing, even if not public.

You can bet pretty much every hardware vendor is running some kind of kernel CI internally.


The bet I usually make about hardware vendors is that, since their core competency is hardware, in many cases software is an afterthought.

Not to pick on hardware companies: nearly every type of company has a few select areas that they focus on, hire for, and are truly good at. With everything else, they do what it takes to get by, not because they don't care but because it takes a concerted effort to develop your organization into one that is highly competent in any particular area.


Maybe PC hardware vendors have some automated kernel testing. I think it's different with embedded. The SoC company I worked for a couple of years ago didn't have anything like that.


That's because SoC companies for the most part simply never update the kernel. Whatever kernel they were using when the chip tapes out is the kernel they're still using when the products hit EOL. Wireless routers with Broadcom 802.11ac are all running a kernel branch from 2010.



