> This happens over and over. So far, they've hit a crossing semitrailer, a street sweeper, a car at the side of the road, a fire truck at the side of the road, and two road barriers. This has been going on for five years now, despite better radars and LIDARs.

And in that time, how many people have crashed into other people, whether they're in cars or not? Buildings? Walls? Children?

Are you claiming that Tesla's safety performance is worse than that of people?




Man, this kind of argument is getting tiring... It is obvious that essentially 100% of humans in the OP's situation, barring some particular medical condition, would perform better than Tesla's autopilot.

Tesla's autopilot doesn't fail because of malfunctions such as short circuits, bad weather, or anything else unusual happening (as opposed to humans); it fails under perfectly normal conditions.

You can't even begin to compare autopilot safety performance to human performance unless you work for Tesla's marketing department, for the simple reason that no human is, in theory, authorized to let that thing drive itself without supervision. What you're actually comparing is human + autopilot vs. human alone.

What seems extraordinary is that human + autopilot actually performs worse than a human alone in some conditions, such as the OP's, simply because of the way autopilot was advertised.


> human + autopilot actually performs worse than a human alone in some conditions

This is well studied in aviation, and an entire subfield of psychology called "human factors" exists to study it.

There are many fascinating results from human-factors studies in aviation, but one I can point out is that humans pay less and less attention as you present them with more and more reliable automation, to the point where automation can actually start to decrease safety. Humans get so complacent, relaxed, and distracted (my instructor called this "flying fat, dumb, and happy") that when they ARE called upon to take over and act suddenly, they are simply unaware of the situation and need more time to get their bearings than there is before decisive action is needed.

Some reading:

https://www.skybrary.aero/index.php/Cockpit_Automation_-_Adv...

https://www.ise.ncsu.edu/wp-content/uploads/2017/02/Parasura...


There's a very simple fix, which is to have a human driving and allow autopilot to work only in an emergency, when it detects the car is about to crash. Then you get all of the touted safety advantages, and none of the bizarre accidents.
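
Roughly, that's an intervention-only control policy: pass the human's inputs straight through, and let the automation act only when a crash looks imminent. A minimal sketch of the idea (the time-to-collision threshold and the input/output format here are hypothetical illustrations, not anything a real car ships):

    # Intervention-only policy sketch: the human drives; automation overrides
    # only when the estimated time-to-collision drops below a threshold.
    # The threshold and the control interface are assumptions for illustration.
    TTC_THRESHOLD_S = 1.5  # hypothetical trigger, in seconds

    def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
        """Seconds until impact if nothing changes; infinite when not closing."""
        if closing_speed_mps <= 0:
            return float("inf")
        return distance_m / closing_speed_mps

    def control_step(distance_m: float, closing_speed_mps: float, human: dict) -> dict:
        if time_to_collision(distance_m, closing_speed_mps) < TTC_THRESHOLD_S:
            return {"steer": human["steer"], "brake": 1.0}  # full emergency braking
        return human  # normal case: the human's inputs pass through untouched

This is more or less what standalone automatic emergency braking (AEB) systems already do.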


Unfortunately, I don't think detecting a potential crash is that easy (and I think that's part of the original problem).

E.g., the OP's case. There could also be situations where autopilot thinks there's a problem when there's none.


Tesla has shipped ~1M cars in total [1], while the USA has ~250M registered vehicles [2]. Accidents happen to roughly 1 in 50 cars per year in the USA. Most human accidents happen while drunk, distracted, speeding, or driving recklessly; suspiciously, the DriverKnowledge source's causes add up to 103% [3].

But we can say that Tesla's autopilot is never drunk or distracted, and should never be speeding or driving recklessly. Crashing into an upturned truck, a fire truck, a street sweeper, or a parked car while sober, attentive, and driving carefully is worse than human driving. (A back-of-the-envelope version of the comparison is sketched below the references.)

[1] https://www.statista.com/statistics/502208/tesla-quarterly-v...

[2] https://www.statista.com/statistics/183505/number-of-vehicle...

[3] https://www.driverknowledge.com/car-accident-statistics/
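
To make that concrete, here's the back-of-the-envelope arithmetic using the figures above (a sketch only; I don't have a reliable Tesla-side accident count, so that input is left as a placeholder):

    # Back-of-the-envelope accident-rate comparison from the cited figures.
    us_vehicles = 250e6      # registered vehicles in the USA [2]
    accident_rate = 1 / 50   # accidents per vehicle per year, US average
    tesla_fleet = 1e6        # Teslas shipped in total [1]

    # At the US-average rate, a fleet of 1M cars would see about:
    break_even = tesla_fleet * accident_rate
    print(break_even)  # 20000.0 accidents per year

    def tesla_rate(accidents_per_year: float) -> float:
        """Accidents per Tesla per year; compare against accident_rate (0.02).
        The input is a placeholder: plug in a real count if you have one."""
        return accidents_per_year / tesla_fleet

So a 1M-car Tesla fleet would match the average human fleet at roughly 20,000 accidents per year; fewer than that and it's beating the average, at least on this crude per-car basis.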


The answer would depend on the number of incidents per km driven. Apples to apples.
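
In sketch form, with placeholder inputs (the point is just the normalization, not the numbers):

    # Apples to apples: normalize incidents by distance driven, not by fleet size.
    def incidents_per_billion_km(incidents: float, km_driven: float) -> float:
        return incidents / km_driven * 1e9

    # Compare e.g. autopilot-engaged km against human-driven km, ideally
    # matched by road type, since autopilot miles skew toward highways.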



