Note that it would need to drive those 275 million miles without a fatal crash just to demonstrate it's statistically safer than a human driver.
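For reference, that figure matches the standard zero-failure bound (e.g. the RAND "Driving to Safety" estimate). Here is a rough sketch of the arithmetic, assuming a US human fatality rate of about 1.09 deaths per 100 million vehicle miles and a 95% confidence level (both are assumptions, not figures from this thread):

    import math

    human_fatality_rate = 1.09 / 100e6   # assumed: ~1.09 deaths per 100M miles (RAND's figure)
    confidence = 0.95                     # assumed confidence level

    # Fatality-free miles needed before we can say, at the chosen confidence,
    # that the system's fatality rate is below the human rate, assuming
    # Poisson-distributed crashes:
    #   P(0 deaths in N miles | human rate) = exp(-rate * N) <= 1 - confidence
    miles = -math.log(1 - confidence) / human_fatality_rate
    print(f"{miles / 1e6:.0f} million miles")   # -> ~275 million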

Which for Tesla's FSD is obviously not the case.

https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-...




Your video and my response were talking about fatal crashes. Humans average roughly one fatal crash per 100 million miles; they don't go anywhere near 100 million miles between crashes in general.
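To make the gap between the two rates concrete, here is a quick calculation using commonly cited US ballpark figures (assumed for illustration, not taken from the thread):

    # Illustrative ballpark figures (assumptions, not from the thread):
    annual_vmt = 3.2e12          # US vehicle miles traveled per year, roughly
    fatal_crashes = 40_000       # traffic deaths per year, roughly
    all_crashes = 6_000_000      # police-reported crashes per year, roughly

    print(f"miles per fatal crash:    ~{annual_vmt / fatal_crashes:,.0f}")  # ~80 million
    print(f"miles per reported crash: ~{annual_vmt / all_crashes:,.0f}")    # ~530,000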

Has FSD had a fatality? Autopilot (the lane-follower) has had a few, but I don't think I've heard about one on FSD, and if their presentations on occupancy networks are to be believed there is a pretty big distinction between the two.


Isn't "FSD" the thing they're no longer allowed to call self driving because it keeps killing cyclists? Google suggests lots of Tesla+cyclist+dead but with Tesla claiming it's all fine and not their fault, which isn't immediately persuasive.


> Google suggests lots of Tesla+cyclist+dead but with Tesla claiming it's all fine and not their fault, which isn't immediately persuasive.

Those search results include crashes where a human was driving the Tesla -- are we blaming Tesla for those too?

You do you, but I'm here to learn about FSD. It looks like there was a public incident where FSD lunged at a cyclist. See, that's what I'm interested in, and that's why I asked if anyone knew about disengagement stats.


It appears that the clever trick is to have the automated system make choices that would be commercially unfortunate - such as killing the cyclist - but to hand control back to the human driver just before the event occurs. Thus Tesla are not at fault. I feel ok with blaming Tesla for that, yeah.


Is that real? I've heard it widely repeated but the NHTSA definitions very strongly suggest that this loophole doesn't actually exist:

https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_Da...

> The Reporting Entity’s report of the highest-level driving automation system engaged at any time during the period 30 seconds immediately prior to the commencement of the crash through the conclusion of the crash. Possible values: ADAS, ADS, “Unknown, see Narrative.”
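Read literally, that means a crash is attributed to the ADAS/ADS if it was engaged at any point in the 30 seconds before impact, so disengaging a second early changes nothing. A minimal illustration of that rule (the function and inputs are hypothetical, not the actual SGO schema):

    def sgo_reportable(seconds_before_crash_disengaged, window_s=30):
        # Hypothetical helper: per the SGO definition quoted above, the system
        # counts as "engaged" if it was active at any time within 30 seconds
        # of the crash. None means it never disengaged.
        if seconds_before_crash_disengaged is None:
            return True
        return seconds_before_crash_disengaged <= window_s

    print(sgo_reportable(1))     # True  -- shutting off ~1 second before impact is still reported
    print(sgo_reportable(None))  # True  -- engaged through the crash
    print(sgo_reportable(60))    # False -- disengaged a full minute before the window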


"It appears" according to what?

Stuff people made up is a bad reason to blame a company.


From here[1]:

> The new data set stems from a federal order last summer requiring automakers to report crashes involving driver assistance to assess whether the technology presented safety risks. Tesla’s vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

[1] https://www.washingtonpost.com/technology/2022/06/15/tesla-a...


You'd also need to cite Tesla actually using that shutoff to try to avoid fault.

Especially because the first sentence you quoted strongly suggests they do get counted.


Yeah, their very faux self-driving package.



