Hacker News

I'd love to try it out, but I can't bring myself to trust (on any level) something that's essentially a mobile phone running a hacked-together mashup of C and Python. There's no redundancy, no independent components cross-checking each other, nothing as far as I'm aware that tries to provide any given level of reliability.

(Don't get me wrong, Python is great for research - I wouldn't regard it as highly reliable or good for high-availability systems, though.)




I’m not trying to convince you, but the a-ha moment for me was watching George talk about reverse-engineering the firmware for part of the cruise control in his car. After decompiling it, he found there was no functionality for verifying that the firmware had been written properly when the manufacturer flashed it. So just because a big, expensive car company made something doesn’t mean there is any redundancy or cross-checking.

By the way, the only cars you can install OpenPilot onto are cars that already have lane keeping assist (i.e. a camera that watches the lines) and adaptive cruise control (i.e. a millimeter-wave radar that makes sure you don’t get too close to the car in front of you). So OpenPilot is just a vastly improved version of the software written by the same folks who didn’t bother with a firmware checksum routine.
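For what it's worth, the check George found missing isn't much code. Here's a minimal sketch in Python, assuming an invented image layout where the last four bytes store a CRC32 of everything before them (real firmware would do this in C on-device, and the layout is purely illustrative):

```python
import zlib

def verify_flash(image: bytes) -> bool:
    """Verify a firmware image whose last 4 bytes hold a CRC32
    of everything before them (an invented layout, for illustration)."""
    payload, stored = image[:-4], image[-4:]
    expected = zlib.crc32(payload).to_bytes(4, "little")
    return stored == expected
```

A flashing tool would run this against a read-back of the flash contents before declaring the write successful; any bit flip in the payload changes the CRC and fails the check.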


Oh, I'm not saying car companies are perfect, and I'm well aware that the way the sausage gets made is nowhere near as rigorous as we'd like (just look at the Toyota unintended acceleration debacle!). I'm just saying I wouldn't add on top of that something that's essentially a phone app.

If nothing else, injecting stuff into your car's critical data bus seems like a bad idea. What if that less-than-carefully-written firmware you're talking about gets tripped up by a malformed CAN packet (which it would otherwise never have encountered) and fails in some fatal way?
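The usual defense against that failure mode is to validate every frame before touching its payload. A hedged sketch of the idea (the CAN ID, frame layout, and scaling factor here are all invented for illustration):

```python
def parse_speed_frame(can_id: int, data: bytes):
    """Defensively parse a hypothetical wheel-speed CAN frame:
    reject wrong IDs and wrong lengths instead of indexing
    blindly into the payload. ID, layout, and scaling are invented."""
    SPEED_ID = 0x1A0
    if can_id != SPEED_ID:
        return None               # not the frame we want
    if len(data) != 8:
        return None               # malformed DLC: drop it, don't crash
    raw = int.from_bytes(data[0:2], "big")
    return raw * 0.01             # km/h under the invented scaling
```

Firmware that skips the length check and indexes straight into a short payload is exactly the kind of code a malformed packet can trip up.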


On the other hand, maybe I lowered the chance of that happening by removing my stock LKAS system. I believe your implicit argument is that the manufacturer and the subcontractors and integrators who made the stock vehicle have a lower defect rate than the OpenPilot engineers, and I’m not sure there’s any data to support that.


A lot more people drive the stock version than the OpenPilot one, so the stock code has accumulated far more real-world miles to shake bugs out. All else being equal, it's more likely that OpenPilot still harbors a serious undiscovered bug.


While it was revealed that the Toyota software was terrible, was it actually shown to be the cause of the acceleration problems? Mostly it was the floor mats and people hitting the wrong pedal.


I don't know if they ever proved beyond reasonable doubt that any particular incident was caused by a software bug, but they did show that there were myriad ways in which the software in question could cause the acceleration problems. (In particular, the process that handled reading the accelerator pedal position could crash and leave the throttle value stuck, without tripping the watchdog timer; IIRC this was doubly bad because push-to-start vehicles had no physical key to turn off.)

Source: http://www.safetyresearch.net/blog/articles/toyota-unintende...

More: https://embeddedgurus.com/barr-code/2013/10/an-update-on-toy...
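The structural flaw here is worth spelling out: a watchdog that gets kicked from a timer interrupt keeps getting kicked even after the task it's supposed to supervise has died. A sketch of the safer pattern, where the watchdog is only satisfied when every critical task has checked in recently (task names and timeouts are invented; real systems do this in the RTOS, not Python):

```python
import time

class TaskWatchdog:
    """Sketch of a watchdog that only reports healthy if *every*
    critical task has checked in recently. A single kick from a
    timer interrupt would keep firing even after the pedal-reading
    task died -- the failure mode described above."""

    def __init__(self, tasks, timeout=0.5):
        self.timeout = timeout
        self.last_seen = {t: time.monotonic() for t in tasks}

    def checkin(self, task):
        self.last_seen[task] = time.monotonic()

    def all_alive(self):
        now = time.monotonic()
        return all(now - ts < self.timeout
                   for ts in self.last_seen.values())
```

With this structure, a crashed pedal-reading task stops checking in, `all_alive()` goes false, and the hardware watchdog is allowed to expire and reset the system.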


With respect to Python and RTOS: this is a copy-paste of what I posted a few months ago in a similar thread here on HN:

----

No safety-relevant code is written in Python. All the safety-relevant code runs in real time on an STM32 micro (inside the Panda); it's written in C and sits at the interface between the car and the EON. This code enforces the two main safety principles that a Level 2 driver-assistance system must satisfy: 1) the driver must be able to easily disengage the system at any time; 2) the vehicle must not alter its trajectory too quickly for the driver to safely react. See https://github.com/commaai/openpilot/blob/devel/SAFETY.md
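To illustrate principle 2, here's a hedged sketch of the kind of torque limiting such a layer applies; the constants and the function name are invented, not comma's actual values (their real check is in C in the Panda firmware):

```python
MAX_TORQUE = 1500      # invented actuator limit
MAX_DELTA_UP = 10      # max increase per control step (invented)
MAX_DELTA_DOWN = 25    # allowed to back off faster than ramp up

def clamp_torque(desired: int, last: int) -> int:
    """Bound both the absolute steering-torque command and its rate
    of change, so the trajectory can't change faster than a driver
    can react. A sketch with invented numbers."""
    desired = max(-MAX_TORQUE, min(MAX_TORQUE, desired))
    hi = last + MAX_DELTA_UP
    lo = last - MAX_DELTA_DOWN
    return max(lo, min(hi, desired))
```

The point is that even a wildly wrong command from the planner gets squashed to a small per-step change at the bus interface, leaving the driver time to override.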

Among the processes that run on the EON, you can find algorithms for perception, planning, and controls. Most of it is actually autogenerated C++ (see model predictive control). Python is used mainly as a wrapper and for the non-computationally-expensive parts. In functional-safety terminology, the EON functionality is considered QM (Quality Management): any failure to deliver the desired output at the right time is perceived as bad quality and has no safety implications. So, how often do those algorithms deliver the wrong output because some parts are written in Python? How often because RT isn't enforced? Negligibly. Pretty much all the mistakes of a Level 2 driver-assistance system are due to the quality of the algorithms, the models, the policies, etc. There is a long way to go before changing the coding language will be the lowest-hanging fruit for improving the system. Until then, using the simplest and most agile language (given performance constraints) is probably the best way to maximize quality.
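The "Python as a wrapper" pattern is the standard one: the numerics are compiled, and Python just marshals values across the boundary. A toy sketch using `ctypes`, with libm's `cos` standing in for an autogenerated solver (purely illustrative; the actual OpenPilot bindings look different):

```python
import ctypes
import ctypes.util

# The heavy numeric code lives in a compiled library; Python only
# marshals arguments in and results out. libm's cos stands in for
# a generated MPC solver here.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

def fast_cos(x: float) -> float:
    return libm.cos(x)
```

A slow Python loop calling a fast compiled kernel like this is "QM" in the sense above: if the wrapper hiccups, you get a late or missing answer, not an unsafe actuator command, because the safety bounds live downstream in the Panda.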


Claiming that the only "safety-relevant" (much less "safety-critical"!) functions of an auto-steer system are "must be able to easily turn it off" and "must only steer/accelerate/brake slowly" is pretty sketchy in the first place.

Claiming that these functions are adequately implemented by some software running on a single microprocessor? Just nope. No hard-wired shutdown/disconnect system, no redundancy/failover/self-checking, no visible attempts to follow any kind of coding standard... the whole thing's a science project, not a well engineered high-reliability system.

Edit: The reason I'm so adamant about this is that, while I don't consider myself a 'safety expert' or anything of the sort, I'm currently being forced to deal with this stuff (machine safety, not self-driving cars) in my day job, and it is WAY more in-depth, rigorous, and tightly regulated than any of the hand-wavy stuff being discussed here.


The question in my mind is: how do you ensure that the safety code itself doesn't have a bug that makes it impossible to disengage the system?


In real life, e.g. in aviation, you would formally verify the code and the firmware that interacts with the hardware.

https://en.wikipedia.org/wiki/Formal_verification
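As a toy illustration of the idea: if the disengage logic is a small enough state machine, you can prove the property outright by enumerating every state, rather than just testing a few cases. Real tools (SPIN, Frama-C, SPARK) do this symbolically at scale; the states and events below are invented:

```python
# Toy "model check": enumerate every state and show that braking
# always disengages, no matter what state the system is in.
STATES = {"OFF", "ENGAGED", "DRIVER_OVERRIDE"}

def step(state, event):
    if event == "brake_pressed":
        return "OFF"                 # disengage wins from anywhere
    if state == "OFF" and event == "engage_button":
        return "ENGAGED"
    return state

def always_disengageable():
    """Exhaustively prove: from every state, braking disengages."""
    return all(step(s, "brake_pressed") == "OFF" for s in STATES)
```

That's the difference between testing ("it disengaged in the cases we tried") and verification ("there exists no state in which it fails to disengage") that the commenter above is asking for.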


>There's no redundancy, no independent components cross-checking each other, nothing as far as I'm aware to try and provide any given level of reliability.

And yet, it works beautifully. The redundancy is you; it's a Level 2 system that works the same way cruise control does. There have been no accidents, with thousands of people using this to drive thousands of miles every day. You can keep waiting and hoping for a hypothetical perfect system, or you can have the future right now.


If anyone has been in a crash while driving with this, we won't know about it. Adding a third-party, unregulated, quasi-legal modification that can take over your car's steering, brakes, and acceleration will instantly invalidate your insurance, not to mention that using it on the road is potentially illegal. Nobody is going to admit they were in an accident while being driven by OpenPilot.



