Hacker News

Given that these gentlemen are known to have at least ordinary prudence, even in the face of a $10k loss, an operational definition might take the form:

  L5 = the person asserting that L5 has been achieved is willing to be driven by the vehicle, without access to the controls, through mixed conditions for X hours.


I feel that is pretty bad. It doesn't require the system to actually work very well. It says nothing about the system's effectiveness, only its safety. Still, effectiveness might come down to abstract criteria like:

  L5 = the vehicle can reach the same destinations as an average human driver, with similar travel times, in mixed conditions.



