
Of course, right now, when you get in your car, you are trusting your life to each driver of an oncoming vehicle (and side traffic, and...).

So the better question is whether you trust the median driver more than you trust the process used to approve the self driving system. That is a very different test from your disgruntled-engineer scenario.




If I get in a self driving car, I am still trusting my life to each driver on the road, but now I am also trusting the software of the car.

In an ideal world I would not mind self driving vehicles nearly so much if everyone had one. Advocates of self driving too often forget that there is an adoption period for new technology, and that for vehicles it will span a decade. During that time, why should I trust the firmware/software to react appropriately to other people's insane/suicidal driving maneuvers? For that matter, when those people are a minority on the road, I'm not sure they wouldn't become worse, engaging in ever more stupid maneuvers because they can assume the other car is 'smart enough' to move out of the way.

It's a great idea to remove the human element from driving completely in 99.99% of scenarios. The problem is that it's the 0.01% of scenarios that kill you. The other problem is that adoption doesn't happen overnight, and during that period I see no logical reason to put myself in more danger by removing my ability to react to my surroundings and instead trusting that my car's software can handle the suicidal driving of angry highway drivers.


Not really. The first rule of driving a car is to assume that everyone else on the road is trying to kill you, is blind, and is absolutely rubbish at driving.

If you keep that rule in mind at all times, you'll be pretty safe.

I would absolutely trust a random driver far more than a self driving car.



