Hacker News

Tesla claimed[1] that:

> Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of. There are over 200 successful Autopilot trips per day on this exact stretch of road.

This might be technically true, but it conveniently ignores other, very similar crashes.
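As a rough sanity check on Tesla's own figures (assuming the statement was made about 90 days into the year, since the referenced crash happened in late March), the two quoted numbers are at least internally consistent:

```python
# Sanity check of the figures quoted in Tesla's blog post:
# ~20,000 Autopilot trips on this stretch since Jan 1, alongside
# the claim of "over 200 successful Autopilot trips per day".
trips_since_jan_1 = 20_000
days_elapsed = 90  # assumption: statement made in late March

trips_per_day = trips_since_jan_1 / days_elapsed
print(round(trips_per_day))  # roughly 222, consistent with "over 200 per day"
```

Note that internal consistency says nothing about what the numbers omit, which is the point of the comments below.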

[1] https://www.tesla.com/blog/what-we-know-about-last-weeks-acc...




It took people only a day to reproduce the conditions of the crash and end up in a near-collision:

https://www.youtube.com/watch?v=6QCF8tVqM3I


My question, which is the same as others': does Tesla register events where the driver overrides the system? And if they do register them, how do they account for them in these statistics?

If they are not even counting cases where the driver overrode the system because it was about to crash, then all their claims are suspect.
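To illustrate the concern with purely hypothetical numbers (none of these figures come from Tesla or any real dataset), a "zero accidents" count can hide a nonzero underlying risk if driver interventions are excluded:

```python
# Hypothetical illustration of how uncounted driver overrides can bias
# a "no accidents on this stretch" statistic. ALL numbers are invented.
trips = 20_000          # trips with Autopilot engaged (from Tesla's quote)
reported_crashes = 0    # what the headline statistic counts

# Suppose drivers intervened 40 times when the system steered toward a
# hazard, and assume half of those would have crashed without a human.
overrides_near_crash = 40     # invented
assumed_crash_fraction = 0.5  # invented

potential_crashes = overrides_near_crash * assumed_crash_fraction
implied_rate = (reported_crashes + potential_crashes) / trips
print(implied_rate)  # 0.001 -> one potential crash per 1,000 trips
```

The headline rate is 0 per trip; the override-adjusted rate is not. Whether the adjustment matters depends entirely on data Tesla has not published.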


Why? The car isn't driving by itself. What matters is whether Autopilot is a net safety improvement, which it currently appears to be. Why would you impose the odd additional requirement that the system be better with no human input at all?




