Hacker News

Interesting timing, as last night a Tesla crashed in my neck of the woods, bursting into flames and killing the 2 passengers. No one appeared to be driving, so it seemed they were stupidly relying on Autopilot.

https://www.click2houston.com/news/local/2021/04/18/2-men-de...



This is the reason Musk exaggerates in the other direction. For some reason* there is a fixation in the media such that every Tesla accident is a story. You wouldn't know the last time a Ford crashed off the top of your head.

Per [1], there are 22.5 billion miles driven in Teslas. Per [2], there have been 171 fatalities in Teslas. This comes to 0.76 deaths per 100 million miles driven. Wiki says US deaths per 100 million miles is about 1.1.
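The arithmetic above checks out; here is a quick sketch of the calculation, using the figures quoted from [1] and [2] (which I haven't verified independently):

```python
# Sanity check of the fatality-rate arithmetic above.
# Inputs are the figures quoted in this comment, not independently verified.
tesla_miles = 22.5e9   # total miles driven in Teslas, per [1]
tesla_deaths = 171     # fatalities in Teslas, per [2]

deaths_per_100m_miles = tesla_deaths / (tesla_miles / 1e8)
print(round(deaths_per_100m_miles, 2))  # 0.76

us_average = 1.1       # US deaths per 100M vehicle miles, per Wikipedia
print(deaths_per_100m_miles < us_average)  # True
```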

Tesla drivers are wealthier, and they're driving newer cars in safer places, so the comparison is not one-to-one. But the evidence does not support the claim that Teslas are less safe than other cars.

[1] https://lexfridman.com/tesla-autopilot-miles-and-vehicles/

[2] https://www.tesladeaths.com/

*tinfoil hat: there is a _lot_ of money that would prefer Teslas be thought of as unsafe.


> .. batteries kept reigniting. At one point, Herman said, deputies had to call Tesla to ask them how to put out the fire in the battery

FWIW, this is what happened when there was a Tesla accident in my area. But for sure, these are only anecdotes. We'd have to compare the data with how often non-Tesla cars burn.


That article does not make it clear whether there was actually no human in the driver's seat. It says two men died, one in the front passenger seat and one in the back seat. That implies, by omission, that no one was in the driver's seat. Or was there a third person, sitting in the driver's seat, who survived?

Obviously if there was no human in the driver's seat, these guys were "looking for trouble". And failing to negotiate a curve at an excessively high speed isn't something the autopilot should be expected to deal with (unless it has precise location data and precise map data which informs the reasonable limits of performance... but I don't think it has these).

It's absurd if people want to blame Tesla for this.


Yea how dare they blame the liars for lying about the vehicle's capabilities!

Tesla needs to stop calling it autopilot and FSD, this is complete and utter bullshit.

How many of these do fanboys need to see before they give up the mental gymnastics?

>It's absurd if people want to blame Tesla for this.

Are you fucking kidding me? That's some Olympic-level mental backflips right there. =)


The world, particularly the US, is full of over-promised marketing that borders on outright lying.

How many products and services on the US market use bold names and labels to suggest, or even directly state, a feature that doesn't come close to the expectation in actuality?

But on the autopilot topic, I would say it is you and the general public who have deemed "autopilot" to mean something it historically does not. The most common use of autopilot is in aircraft, and in that domain it can be anything from crude and simple to highly automated. With the exception of the advanced autopilot systems in modern commercial jets, a typical autopilot system merely holds a set heading, pitch, and altitude.

Even the FAA says this: "While the autopilot relieves you from manually manipulating the flight controls, you must maintain vigilance over the system to ensure that it performs the intended functions and the aircraft remains within acceptable parameters of altitudes, airspeeds, and airspace limits."

There are a lot of modern devices that are dangerous to use without some education. I would think someone willing to trust their car to drive them with no supervision is dangerously foolish, and possibly the same kind of person who might accidentally kill their buddy by flying a drone too fast and hitting him with it.

You can't dummy-proof everything. I doubt that the Tesla owner's manual says, "This autopilot system does absolutely everything. You can't crash it!" Instead I believe it tells you all the rules that you need to follow when using the autopilot. If you don't follow the instructions, that's on you.



