
> Responsible, ethical, and careful developments of dangerous technology, even if the danger promises to be reduced compared to the norm in some future iteration, is a must.

Agree. And I'd like to see an approach similar to an NTSB investigation of an air crash. Not to place blame, but to identify root causes and mitigate the chance that they recur.

Air travel is exceptionally safe. Has been for many decades. Yet we still investigate any air crash because any crash means something went wrong, perhaps something that can be corrected.

Self driving cars need to be treated the same way, at least for now, and until well after they have demonstrated themselves to be safer than human drivers.




This is the way to go, and it would help (I think) to ease the calls I’ve seen around for holding individual devs responsible. The point is not to collect scalps, but to prevent tragedy and malfeasance. Holding companies responsible, sure, but making some random programmer who missed a bug the scapegoat seems like a way to ensure companies avoid responsibility by foisting it off on someone low on the totem pole; that can’t be allowed to happen with lives on the line. Like an aircraft, self-driving car software is never going to be the work of a single dev. The NTSB isn’t perfect, but as these things go, it's very admirable.

IIRC the NTSB is investigating this case too.


“Responsible” and “ethical” are not words usually associated with Uber, but in this case it seems like it was simply an accident. Computers cannot beat physics: if the woman suddenly stepped in front of a car that was unable to brake in time, it’s a tragic accident.

The idea of air-travel-style investigations is pretty good, and also much easier to execute here given that these cars record everything on camera.





