
Let's say somebody gives you a USB stick and you plug it into your laptop. Which of the following scenarios would you like to see?

1. 0-day in the kernel's USB code. You're part of stuxnet now.

2. 0-day in the kernel's USB code. You're part of stuxnet now. You also get a message that tells you how and where to report the bug that was exploited.

3. 0-day in the kernel's USB code. Your computer crashes. You're not part of stuxnet. You also get a message that tells you how and where to report the bug that was exploited.

Linus is wrong (it happens). Exploit mitigation techniques aren't debugging tools. They're exploit mitigation techniques. The fact that they also produce useful debugging information is secondary.



This is exactly the kind of thinking Linus is talking about. In 3, I lost my work. Possibly very important work. To most people, being a part of stuxnet, while undesirable, is preferable to losing their work.

And you neglected a scenario 4: nobody is attempting to compromise my machine, but a buggy bit of USB code just crashed my system and took all my work with it.


You never learned at school to save what you are working on often? It's crazy; you are either too old to have needed a computer for school work, or too young to have lived through the years of constant bluescreens.

Both 3 and 4 are mitigated by saving your document often... it's not so bad, considering a crash can happen no matter what you do.

Nowadays, Word is made to keep saving your changes for that reason... They learned and designed it for the worst situation, which is a whole system crash. If you can't handle that, well, you aren't doing your work well.


Again, passing the buck. You know what I do instead of use your software that crashes all the goddamn time? I use someone else's software that doesn't.

Yes, of course we should save often, have decent backups, etc. But nobody is perfect and shit happens, and it'd be nice if the software you use didn't intentionally make it worse.


The problem is, what actually happened (in a previous commit) was:

The IPv6 stack does a perfectly sensible and legal thing. The hardener code misunderstands the legal code, and causes a reboot.

That is what Linus is worried about -- often it is hard to tell the difference between "naughty" code which can never be a security hole, and genuine security holes.

They should all be fixed ASAP, but making previously working code reboot a user's computer, when that code is perfectly fine, is not a way to make friends.


Bugs in the hardening code are obviously bad and annoying, but that's beside the point. All bugs are bad and annoying, especially ones that cause a kernel panic. I don't think anybody is going to argue with that.

That's not what Linus said though. What he said is:

    > when adding hardening features, the first step should *ALWAYS* be
    > "just report it". Not killing things, not even stopping the access.
    > Report it. Nothing else.
and:

    > All I need is that the whole "let's kill processes" mentality goes
    > away, and that people acknowledge that the first step is always "just
    > report".
"Not killing things, not even stopping the access." Oh boy.


Step back a bit: when developing a new selinux policy, won't you develop first in permissive mode, and only after it's working without warnings, enable enforcing mode? It's the same thing here: the hardening should be developed first in a "permissive" mode which only warns, and then, after it's shown to be working without warnings, changed to "enforcing" (in this case, however, the "permissive" mode can be removed after some time, since new code should be written with that hardening in mind).


I didn't mean that to sound like I'm in favor of turning the thing on right away.

(Also, the quotes I chose don't really help me make my case but I don't want to edit now since you've already commented on it. His first mail is way worse: https://lkml.org/lkml/2017/11/17/767)

Basically what I'm disagreeing with is that exploit mitigation's primary purpose is finding and fixing bugs. That's just not true. Its primary purpose is to protect users from exploitable bugs that we haven't found yet (but someone else might have).


By first step, Linus just means "for a year or two". Yes it would be nice to put super high security on today, but instead we slowly turn up the setting, from opt in to opt out to forced on, to ensure we don't break anything.


4. 0-day in the kernel's USB code. Your computer crashes. You're not part of stuxnet. You also get a message that tells you how and where to report the bug that was exploited, but the part of your computer that was supposed to log the message died with the rest of the system, so you never see it and the bug never actually gets reported. Your computer continues to crash randomly for the next few days as an infected computer keeps trying to spread.



