Hacker News new | past | comments | ask | show | jobs | submit login

Yes, but there was a time for all of those things in which they did not have a century of development. Deploying them is how you get that century of development. The risk of this stuff is really quite small and local. If the technology has faults, they'll become apparent relatively quickly and harm a relatively small number of people. I see no reason to be overly cautious here. It's not a massive correlated risk, like say, global warming.



We put the new tires and brakes on the fastest vehicles we could make, then had a professional run them around a track with a dozen other lunatics. Anything that broke was fixed forever, and relatively few lives were lost. The number that died was low because there were fewer participants. The critical technology was well-tested before it was sent to the production line.

That's how we got our century of development: testing, verifying, refinement. It wasn't by pushing new stuff out to a batch of users and seeing if everything worked out all right.


The number of recalls issued weekly by the car industry gives the lie to that claim. No, we just live with the risk. Because we love our cars so much.

Just recently it was discovered that a series of airbags, manufactured within a window of a couple of days, would, instead of protecting the occupants, emit high-speed shrapnel into their chest and face, killing them. Quite an oversight.


That recall is actually far larger than a few airbags made over a couple of days (it's more like 65-70 million airbags), but this is exactly my point. The operating principle of an SRS system is pretty straightforward: computer detects a crash, computer fires the airbags. It's actually a separate module that's only got the one job. Despite its relative simplicity, there are also recalls where it sometimes fails to fire the airbag [1], or sometimes fires it for no reason at all! [2]
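The detect-and-fire principle described above can be sketched in a few lines. This is purely illustrative: the threshold value and function name are assumptions for the sake of the example, and real SRS modules use calibrated, redundant crash sensors rather than a single number.

```python
# Hypothetical sketch of a detect-and-fire rule, NOT a real SRS spec.
# The deceleration threshold below is an assumed placeholder value.
CRASH_DECELERATION_THRESHOLD_G = 20.0

def should_fire_airbag(deceleration_g: float) -> bool:
    """Fire when measured deceleration exceeds the crash threshold."""
    return deceleration_g >= CRASH_DECELERATION_THRESHOLD_G
```

Even a rule this simple has two failure modes, which map exactly onto the two recalls above: returning False in a real crash (fails to fire) and returning True in normal driving (fires for no reason).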

If the automakers can't catch problems in a simple system like this, just imagine what sort of new and exciting failures we'll see in an inadequately-tested car autopilot! These things need to be very thoroughly tested before we allow lots of them on the road, and that needs to happen before they omit the steering wheel.

[1] http://www.automotive-fleet.com/channel/safety-accident-mana...

[2] https://www.autoblog.com/2015/10/30/fca-recalls-894k-total-v...


Don’t you think that the number of recalls would be even higher without motor racing being used as a proving ground?

If your airbag fires you're already in an accident and might die anyway; you've already accepted that risk. There are probably better recall examples, but I'm not sure they'd disprove the underlying point.


But isn't that what they're doing right now? The only difference, in terms of your argument, seems to be that it's not other race car drivers and spectators at risk but ordinary drivers and pedestrians. Then again, they're not racing; they're testing autonomous vehicles with human drivers as a backup.


The only difference.

Other than that, Mrs. Lincoln, how was the play?


What's the reasonable test surface for a new type of brake or tire or electrical system? These are things you can do fundamental engineering analysis on and say "that seems like a good design" or not. They're also things you can test in a way that gives you high confidence the design is sound. And even then, problems slip through.

Switching from a human driver to software-automated driving is a much bigger change in automobiles than any that has come before. It's also one with a much larger reasonable test surface, by orders of magnitude. Systemic risk? How about the Meltdown and Spectre vulnerabilities? The failure mode of most new hardware in cars requires edge cases or corner cases to result in the worst-case scenario of a crash at speed, and yet that's a very common failure mode in autonomous driving. You can't realistically have a car that drives itself 90% or even 99% of the time; it must work essentially 100% of the time to be worthwhile and safe.



