Hacker News

If Autopilot created a situation in which a crash was inevitable, with no warning to the user. For example, if it suddenly jerked the steering wheel and swerved right for no good reason, spinning the car out within a few hundred milliseconds.

Even if Autopilot caused a crash like the one above, you still have to run a cost-benefit analysis of whether it prevents more crashes than it causes. Considering the failure rate of human drivers, it's acceptable for Autopilot to have a non-negligible failure rate, and it doesn't really matter whether those failures happen in the same situations as the human ones.




Wouldn't Tesla say "you should have had your hands on the wheel, therefore it's not Autopilot's fault"?


Having your hands on the steering wheel doesn't mean you have affirmative control. That's why the example involves putting the car in an unrecoverable scenario faster than even an attentive person can react.



