Openpilot – open-source self-driving agent (github.com/commaai)
262 points by boramalper on May 12, 2019 | 111 comments



I work as a lead autonomous vehicle engineer.

Here's a few thoughts:

* This isn't really open-source. The most important part, the neural network perception pipeline, is closed source. Without the training data and the network architectures, the supporting code is not that useful.

* This system is non-deterministic, which is a no-no in critical systems like autonomous vehicles. There's no guarantee that vehicle detection or throttle/brake commands will be executed on an acceptably safe timeline.

* I think the supporting tools like Panda and Giraffe are great for those wanting to get into car hacking.

------------------------------------------------

By the way, if you want to become an autonomous vehicle engineer, I put together a resource guide here: https://becomeautonomous.com


1) They actually open sourced the vision pipeline code a few months ago. The visiond binary has been replaced by source code.

2) Yeah. The proper way to do these things is with a real RTOS, but at the same time, 20 Hz is pretty slow anyway.


I should also mention that the RT approach isn't the holy grail either for time-sensitive robotic control systems. Read up on the Mars Pathfinder/VxWorks priority inversion bug if you'd like to see how hard deadlines can sometimes cause more problems than they solve...


Priority inversion is well known and easily coped with.

Hard real-time, deterministic code is simply a prerequisite for safety in these types of systems; you have to be able to reason about how your system will behave in worst-case situations.


What’s the answer then?


The model, training tooling, and data appear to still be closed, no? So it’s still a black box, just now the “load the model binary and run it” part is open source.


> Without the training data

Big data engineer here...

This is what a LOT of open source hackers don't understand about modern systems.

It's all about the data.

Most "futuristic" technologies that we're going to see in down the road are going to be built using absolutely massive data sets.

Data sets that aren't in the public domain.

This effectively locks out hackers and open source enthusiasts and means that only companies like Facebook, Google, and Amazon get to play with all the cool toys.


Open data isn't glorified to the same degree as open source is by the common hacker, but it should be. That's where the future is headed.


I'd love to try it out but I can't bring myself to trust (on any level) something that's essentially a mobile phone running a hacked-together mashup of C and Python. There's no redundancy, no independent components cross-checking each other, nothing as far as I'm aware to try and provide any given level of reliability.

(Don't get me wrong, Python is great for research - I wouldn't regard it as highly reliable or good for high-availability systems, though.)


I’m not trying to convince you, but the a-ha moment for me was when I was watching George talk about reverse-engineering the firmware for part of the cruise control in his car, and after decompiling it, he found that there was no functionality for the manufacturer to verify that the firmware had been written properly when flashed by the manufacturer. So just because a big, expensive car company made something, doesn’t mean there is any redundancy or cross-checking.

By the way, the only cars you can install OpenPilot onto are cars that already have lane keeping assist (i.e. a camera that watches the lines) and adaptive cruise control (i.e. a millimeter wave radar that makes sure you don’t get too close to the car in front of you). So OpenPilot is just a vastly improved version of the software that those guys who didn’t write a firmware checksum routine wrote.


Oh, I'm not saying car companies are perfect, and I'm well aware that the way the sausage gets made is nowhere near as rigorous as we'd like (just take the Toyota unintended acceleration debacle!). I'm just saying I wouldn't add on top of that something that's essentially a phone app.

If nothing else, injecting stuff into your car's critical data bus seems like a bad idea. What if that less-than-carefully-written firmware you're talking about gets tripped up by a malformed CAN packet (which it would otherwise never have encountered) and fails in some fatal way?


On the other hand, maybe I lowered the chance of that happening by removing my stock LKAS system. I believe your implicit argument is that the manufacturer and the subcontractors and integrators who made the stock vehicle have a lower defect rate than the OpenPilot engineers, and I'm not sure there's any data to support that.


A lot more people drive the stock version than the OpenPilot one. All else being equal, it's more likely that OpenPilot has a serious bug than the stock system.


While it was revealed that the Toyota software was terrible, was it actually shown to be the cause of the acceleration problems? Mostly it was the floor mats and people hitting the wrong pedal.


I don't know if they ever proved beyond possible doubt that any particular incident was caused by a software bug, but they did show that there were myriad ways in which the software in question could cause the acceleration problems. (In particular, the process which handled reading the accelerator pedal position could crash and leave the throttle value fixed without tripping the watchdog timer, which iirc was doubly bad because in push-to-start vehicles there was no key to turn the car off.)

Source: http://www.safetyresearch.net/blog/articles/toyota-unintende...

More: https://embeddedgurus.com/barr-code/2013/10/an-update-on-toy...
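A toy Python sketch of the shape of that watchdog bug (nothing like Toyota's actual code, just an illustration): if the watchdog is serviced independently of the task that computes the throttle, that task can die while the system still looks healthy.

    import threading, time

    throttle_cmd = 0.30   # frozen at whatever the dead task last computed

    def control_task():
        # imagine a fault kills this task here; throttle_cmd never updates
        return

    def watchdog_kicker():
        while True:
            time.sleep(0.1)   # stand-in for refreshing the hardware watchdog

    threading.Thread(target=control_task, daemon=True).start()
    threading.Thread(target=watchdog_kicker, daemon=True).start()
    time.sleep(0.5)
    print("watchdog never fired; throttle_cmd stuck at", throttle_cmd)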


With respect to Python and RTOS, this is a copy-paste of what I posted a few months ago in a similar thread here on HN:

No safety relevant code is written in Python. All the safety relevant code runs real-time on a STM32 micro (inside the Panda), it's written in C and it's placed at the interface between the car and the EON. This code ensures the satisfaction of the 2 main safety principles that a Level 2 driver assistance system must have: 1- the driver needs to be able to easily disengage the system at any time; 2- the vehicle must not alter its trajectory too quickly for the driver to safely react. See https://github.com/commaai/openpilot/blob/devel/SAFETY.md
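To make principle 2 concrete, here is a minimal Python sketch of a steer-torque rate limiter in the spirit of that safety code. The constants and names are illustrative only; the real checks live in the Panda's C firmware.

    MAX_STEER = 1500     # illustrative absolute torque limit
    MAX_RATE_UP = 10     # max allowed increase per control tick
    MAX_RATE_DOWN = 25   # max allowed decrease per control tick

    def limit_steer_torque(desired, last):
        # clamp the absolute torque the system may ever request
        desired = max(-MAX_STEER, min(MAX_STEER, desired))
        # clamp how fast it may change, so the car can never alter its
        # trajectory faster than the driver can react (principle 2)
        if desired > last:
            return min(desired, last + MAX_RATE_UP)
        return max(desired, last - MAX_RATE_DOWN)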

Among the processes that run on the EON, you can find algorithms for perception, planning, and controls. Most of it is actually autogenerated C++ code (see model predictive control). Python is used mainly as a wrapper and for parts that aren't computationally expensive. To use functional safety terminology, the EON functionality is considered QM (Quality Management). This means that any failure to deliver the desired output at the right time is perceived as bad quality and has no safety implications. So, how often do those algorithms deliver the wrong output because some parts are written in Python? How often because RT isn't enforced? Negligibly often. Pretty much all the mistakes of a level 2 driver assistance system are due to the quality of the algorithms, the models, the policies, etc. There is a long way to go before changing the coding language will be the lowest-hanging fruit for improving the system. Until then, using the simplest and most agile language (given performance constraints) is probably the best way to maximize quality.


Claiming that the only "safety-relevant" (much less "safety-critical"!) functions of an auto-steer system are "must be able to easily turn it off" and "must only steer/accelerate/brake slowly" is pretty sketchy in the first place.

Claiming that these functions are adequately implemented by some software running on a single microprocessor? Just nope. No hard-wired shutdown/disconnect system, no redundancy/failover/self-checking, no visible attempts to follow any kind of coding standard... the whole thing's a science project, not a well engineered high-reliability system.

Edit: The reason I'm so adamant about this is that, while I don't consider myself a 'safety expert' or anything of the sort, I'm currently being forced to deal with this stuff (machine safety, not self-driving cars) in my day job, and it is WAY more in-depth, rigorous, and tightly regulated than any of the hand-wavey stuff being discussed here.


The question in my mind is: how do you ensure that the safety code does not have a bug that makes it impossible to disengage the system?


In real life, e.g. aviation, you would formally verify the code and the firmware that interacts with the hardware.

https://en.wikipedia.org/wiki/Formal_verification


>There's no redundancy, no independent components cross-checking each other, nothing as far as I'm aware to try and provide any given level of reliability.

And yet, it works beautifully. The redundancy is you; it's a level 2 system that works the same as cruise control. There have been no accidents, with thousands of people using this to drive thousands of miles every day. You can continue waiting and hoping for a hypothetical perfect system, or you can have the future right now.


If anyone has been in a crash while they were driving with this, we won't know about it. Adding a third-party, unregulated, quasi-legal modification to your car which can take over steering, brakes, and acceleration will instantly invalidate your insurance. Not to mention it being potentially illegal to use this on the road. Nobody is going to say they were in an accident while being driven by openpilot.


> The most important part, the neural network perception pipeline, is closed source.

Knowing what things are is certainly a part, and it may be the least solved part, but surely there is a lot of value in rolling everything else that goes into autonomous driving into a car-pluggable package. Trivial setup for sensing, fusing, actuating, and thrusting is pretty big for most people even if you don't get to see the more fuzzy stuff.


What do you mean it's non-deterministic? Because it's using floating point operations? Can you elaborate?


I'm not OP, but I think he means that the final decisions acted upon are not deterministic: given the same input, it won't necessarily produce the same decision every time.

For instance, if actions were sampled from the model's output and it predicted "accelerate" with 95% certainty, then 5% of the time it would do some action other than accelerating.
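In code terms, the distinction looks roughly like this (a toy sketch, not openpilot's actual decision logic):

    import random

    probs = {"accelerate": 0.95, "brake": 0.03, "coast": 0.02}

    # deterministic policy: the same input always yields the same action
    action = max(probs, key=probs.get)                    # always "accelerate"

    # stochastic policy: sample from the distribution, so ~5% of the
    # time something other than "accelerate" comes out
    action = random.choices(list(probs), weights=list(probs.values()))[0]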


Might it also be 'non-deterministic' if it accelerates 100% of the time, but at different rates 5% of the time? Or is that too granular a view?


Deterministic code would be running under a hard real-time kernel such as VxWorks, or on bare metal, and would be designed such that all timing has fixed upper bounds.

This involves reasoning about the scheduling of tasks to ensure that in worst-case situations, your timing requirements are still being met.

All safety-critical systems would be deterministic; you simply would not get certified otherwise.

Note: There's much more to safety-critical application design, but basically, you have to be able to prove the behaviour of your code under all circumstances, which typically means no garbage collection, no dynamic allocation, etc.
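On a general-purpose OS, the best you can do is detect deadline misses after the fact rather than prove they can't happen. A toy Python sketch of the difference (run_control_step and disengage are hypothetical):

    import time

    PERIOD = 0.05   # a 20 Hz control tick

    def run_control_step():
        pass        # hypothetical: read sensors, compute commands

    def disengage(reason):
        print("disengaging:", reason)   # hypothetical fail-safe path

    while True:
        start = time.monotonic()
        run_control_step()
        elapsed = time.monotonic() - start
        if elapsed > PERIOD:
            # a hard real-time design proves a worst-case bound up front;
            # here we can only notice the overrun after it has happened
            disengage("missed the %d ms deadline" % (PERIOD * 1000))
        time.sleep(max(0.0, PERIOD - elapsed))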


Not a real-time OS?


Yep, it seems to be built to run on NEOS, which is an Android distribution: https://github.com/commaai/neo


I'd guess it's about guaranteed response times (which is the definition of a realtime system - it must respond within the specified time). Since it's running a non-realtime OS, you could get arbitrarily long gaps in execution, which is not great for something that's driving a car.


Maybe they are talking about the numerical stability of the algorithms?


I looked at your website. I just want to work on the perception part (CV and Deep Learning pipeline). Do I still need to learn all the robotics stuff you mentioned?


I worked on Openpilot for around a year. Although many of the "cool" parts are not open source, there are still lots of interesting goodies in there worth checking out.

For example, there is a semi-standalone CAN processing library: https://github.com/commaai/openpilot/tree/devel/selfdrive/ca...

A shitload of hacks and workarounds that make the system work on dozens of cars: https://github.com/commaai/openpilot/tree/devel/selfdrive/ca...

Saving state on devices that arbitrarily drop successful file writes when the battery dies.

https://github.com/commaai/openpilot/blob/devel/common/param...
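The pattern in that params code is the classic write-temp, fsync, rename dance. A minimal sketch of the idea (not openpilot's actual API):

    import os, tempfile

    def atomic_write(path, data):
        # data is bytes; write to a temp file in the same directory first
        d = os.path.dirname(path) or "."
        fd, tmp = tempfile.mkstemp(dir=d)
        try:
            os.write(fd, data)
            os.fsync(fd)            # force the bytes to disk
        finally:
            os.close(fd)
        os.rename(tmp, path)        # atomic on POSIX: old file or new, never torn
        dir_fd = os.open(d, os.O_RDONLY)
        try:
            os.fsync(dir_fd)        # persist the rename itself
        finally:
            os.close(dir_fd)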


The value is definitely in the platform-relevant hacks.

Stuff like this makes the reader think the authors have never seen a basic transactional database before: https://github.com/commaai/openpilot/blob/devel/common/param...


I've been using Openpilot in my 2018 Corolla for the past few months. Some takeaways:

It's mind-blowingly good. Easily on par with Autopilot (that is, Autopilot's ACTUAL capabilities, not what they show in marketing promos). For long highway trips, it's easy to go hours without a disengagement.

The full setup costs about $800, is completely plug-and-play, and you can order everything you need on the site (EON/Panda/Giraffe). See: https://www.youtube.com/watch?v=UfF9l2-orTk

There's a huge active community working on adding support for more cars and even custom forks with more advanced features like dynamic follow. Someone even recently got a VW bus working with it. https://discordapp.com/invite/avCJxEX

Also, it's just really freaking cool to have a self-driving Corolla. I can't stress how much this thing has changed the way I view driving and going places. It's no longer a pain but a joy to sit in stop-and-go traffic. It's not perfect, but it's pushing the border of level 3 autonomy and is far better than anything any manufacturer besides Tesla is shipping.


I had no idea the software was in use and this stable! Is the discord community the best way to follow progress or are there other resources?


>I had no idea the software was in use and this stable! Is the discord community the best way to follow progress or are there other resources?

Yep, the Discord is super active and there's plenty of people to help you out if you hit any snags. But the hardware is super easy to install and ready to go if you have a compatible car. There's also a lot of active Youtubers like ku7 (https://www.youtube.com/channel/UCXmUBvIuFLjLRuJ0mX298Ng) and arne (https://www.youtube.com/user/arneschwarck) that are building their own forks.


I'm surprised I didn't get mentioned, since I have a fork dedicated to the Corolla. But yes, the system is actually in use, and comma have logged over 10 million miles already. On top of that, the Discord community reports any and all accidents, and none have been due to openpilot. There have been a few accidents where people got rear-ended or sideswiped, but fortunately there has never been an at-fault accident with the system engaged or due to openpilot.


Openpilot is not on par with Autopilot. There's a long list of things Autopilot can do that Openpilot will never be able to do with existing hardware.


Of that long list, what do you think Tesla owners value the most? ACC+Lane Keeping (what openpilot does) is the answer: https://pbs.twimg.com/media/D4y0Fb7XsAEKhFh.jpg


Well yes, of course those are valued highest. What's the point of auto lane changing if you can't ACC and lane keep?


>There's a long list of things Autopilot can do that Openpilot will never be able to do with existing hardware.

Such as what? Autopilot is completely vision/radar based as well. Sure, it has a few more cameras for a greater margin of safety, but its effective capabilities are exactly the same. It's still completely reliant on tracking lane lines, the user has to disengage at stop lights, and it can't make 90-degree city street turns. The only real difference I can see is automatic lane switching, which is available in some Openpilot forks.


Such as Navigate on Autopilot (freeway interchanges and exits), speed-based lane changing, merging, and object detection. There are 8 cameras, 12 ultrasonic sensors, and a radar. They are not the same.


For those who haven't seen Openpilot before, or recently:

* George Hotz (not "Hortz") is working on other stuff now [0], and is no longer CEO.

* While "I'm running open-source self-driving software from GitHub" sounds terrifying, the software is very good at what it does, which is very limited (LKAS / ACC / driver monitoring). Browsing test videos on YouTube (https://www.youtube.com/results?search_query=openpilot+0.5) is a good way to get an impression for how well it works; the consensus I've found is that it's better and safer than any stock systems except Tesla AP and Super Cruise.

[0] http://pixie.tech


Damn, I was actually planning on applying to Comma for an internship because George Hotz was there, just to see if I could learn anything from him.


I've toyed with the idea of applying to Comma; it seemed like a good way to learn tons about computer vision. But if anything, Hotz himself would be why I wouldn't do it. I enjoy the roots of his f-u attitude, but "we're so much smarter than anyone else in this market" is simply not true.


I knew he stepped down but I didn't know he was starting a new venture already. Is there any indication what the new venture is about? The site doesn't say much unfortunately.


Per the site and his Instagram [0], it's an AR thing of some sort. He was playing around with a North Star prototype [1] on stream a few days ago [2]. But it's still very early; if he actually sticks with this project, I imagine it will be at least a few months before anything substantial is announced.

[0] https://www.instagram.com/p/Bw3GEQNHywS/

[1] https://developer.leapmotion.com/northstar

[2] https://www.youtube.com/watch?v=hnQBlUvejWc


One side of me thinks this is really, really cool.

The other side is running away as fast as possible. If anything shouldn't be developed by committee, it's something that controls a few thousand pounds hurtling down the highway at 70MPH with at best an observant human watching over it.

Here's my reasoning:

    Mistakes in Django? Bad things happen.
    Mistakes in the Linux kernel? Potentially really bad things happen.
    Mistakes in self-driving cars? Really bad things certainly happen.
Oh…and it looks like anyone can install it with just a few extra electronic parts?


To be fair, all self driving systems need to be fully open, to be inspected by the public. They are way too safety critical to be closed silos controlled by each car company.

But I agree, projects like this are also potentially dangerous. How extensive is the testing? Where is the liability? Is that unit running a real-time OS that will not deadlock or thrash in a way that could result in a crash? I've written about these and other concerns a few months back:

https://penguindreams.org/blog/self-driving-cars-will-not-so...

Drivers do need to be held to having their hands on the wheel and their eyes on the road (I think it's the Chryslers that actually track your eye movements and alert if your eyes aren't forward).

There is some early evidence to suggest these safety features could make people less attentive or more likely to let themselves be distracted.

Personally I'll stick to my 2006 5-speed car with no auto-cruise or lane assist. It forces me to stay fully aware, and I have to use all four of my appendages constantly. It's a very active, non-passive driving experience. I live in a city and don't drive often (I've fueled up once since December), so I prefer driving to be engaging.


Your car is 13 years old, and you could stand to benefit from the safety advantages available in a modern "no auto-cruise or lane assist" vehicle. The safety tech has advanced a lot in that time. This is quite a good way to invest your scrip, imo.


As long as it's statistically better than a human driver, it's all good. Human drivers fail all the time, resulting in mass casualties in every single country.


Some might argue the transparency and eyeballs are even more crucial in such a situation.


Anyone could anyway. I'd imagine this is safer than a solo effort.

I think it's cool. Driving is risky anyway; this sort of thing could help improve that. I'm not sure it's any worse than a closed-source alternative.


What if a bug in this software kills an innocent pedestrian?


How do you think the proprietary systems work?


Well, they're often (always? can't say that as I haven't gone all the way into all of them, I suppose) built using actual real-time systems with latency guarantees rather than Python running on off-the-shelf Linux, to start...



ROS (Robot Operating System) is not an RTOS (Real-Time Operating System). ROS itself is very, very non-deterministic and has no guarantees about scheduling.


What's the alternative for something as complex as self-driving system?


Of course it's extremely cool. I wish this had existed when I was in uni and we built self-driving prototypes. I also remember how often we messed up completely and would have killed ourselves if we hadn't been driving at 10 km/h on closed-off airstrips.

The thing is, how this project explicitly tells people to use it is way over the line.

"THIS IS ALPHA QUALITY SOFTWARE FOR RESEARCH PURPOSES ONLY. THIS IS NOT A PRODUCT. YOU ARE RESPONSIBLE FOR COMPLYING WITH LOCAL LAWS AND REGULATIONS. NO WARRANTY EXPRESSED OR IMPLIED."

This is not going to help Hotz or the other programmers in the project when the government comes for them.


This would have been great for the DARPA team back in grad school. But yes, even back then our DARPA crew limited car speed to 10mph during a lot of the initial work, and it was either out in the old football field or the campus police would let us tape off the parking lot in the engineering building on weekends.

Even as far as the open-source tech has come, videos fully explaining how to use this in production on real roads are pretty irresponsible, especially since there are no warnings in the videos and no mention of liability.


This seems mind bogglingly dangerous. At least with commercial self-driving vehicles, there is a responsible legal entity (the car manufacturer). The thought of thousands of amateurs and enthusiasts releasing their hacked self-driving cars on the public roads is terrifying.

The disclaimer "YOU ARE RESPONSIBLE FOR COMPLYING WITH LOCAL LAWS AND REGULATIONS" is nice, but I can't that stopping the mindset of "I've just tweaked the algorithm a bit; let's see how it goes".


> At least with commercial self-driving vehicles, there is a responsible legal entity (the car manufacturer).

That's not correct; the responsible legal entity in self-driving cases is likely to be the "backup" driver. For example, Uber was not found criminally liable for the Arizona accident a couple of years back that killed a woman, but the driver could face charges of vehicular manslaughter: https://www.insurancejournal.com/news/west/2019/03/06/519721...

Uber has settled the civil case out of court with the woman's family on the driver's behalf; there is nothing they can do for the criminal case (besides providing representation for the driver).


The video[1] says (verbatim):

>OpenPilot allows you to be driving hands free and feet free for long periods of time without intervention.

Holy fucking shit no. This is completely reckless.

If they want their code to be used for research purposes, then set up a site with contact information which researchers can use to get a copy of the source code. Having YouTube videos saying "buy this hardware, hook it up to your car, and now it's self-driving" (yes, they say self-driving in the description) is in no way responsible.

[1] https://www.youtube.com/watch?v=ICOIin4p70w


Some forward-thinking car manufacturers are already trying to get ahead of this. My new Jeep, for example, has a physical box upstream from the OBD2 port that acts as a firewall, blocking all non-read commands while in motion. (I of course disassembled my dash and installed a bypass so I could permanently disable the auto-off at stop lights.)


There is nothing illegal about modifying your own car.

People have been doing so since cars were invented.

"Amateur" car modifications are already very wide spread.


Sure there is, depending on what you’re doing and where you live. There’s all sorts of things you could do that would make a vehicle no longer street legal and/or a danger to yourself and others. And I would hope people would take the latter very seriously, even if what they’re doing isn’t illegal.


And nothing described here makes a vehicle no longer street legal.

This stuff is well within the realm of perfectly legal car modifications.


Is it? IDK, IANAL.


Yes, I appreciate that. Although it's worth mentioning that some modifications (such as removing seat belts or tail-lights) would render the car un-roadworthy and are effectively illegal.

My concern is less with legality and more with lethality.


How funny that this pops up just a few hours after I looked into it again.

It's an exciting product. Even if it has no direct use, I feel better being driven around by an open piece of software that I could in theory inspect and change. Closed-source stuff is just a black box expected to do magic. Doesn't feel good.

The story behind the man is quite interesting as well. When I was 9 he was the guy I got the jailbreaks from that would allow me to install cool tweaks and other non-standard software Apple wouldn't allow on my iPod Touch 3G.

Apart from this, I think this rap [https://youtu.be/9iUvuaChDEg] he published when he was sued by Sony for allowing alternative operating systems to be installed on the PlayStation 3 demonstrates his attitude quite well (admittedly the self-driving situation is more complicated, but what a response to a lawsuit).

You might guess that I'm still a GeoHot fan. While I'm not sure this will be competitive with commercial systems in the future, I'd love it if it were. Having the world's best self-driving system be free for everyone to inspect and use in their cars would be a societal gain similar to that of Linux.

Open Pilot will be the only self-driving system retrofittable to older cars (by consumers themselves), which is already very cool (it's just unfortunate that none of the supported cars are old enough for me to consider buying).

I wish them the best of luck (and may they get the tests to a level at which everyone is happy with them).


"Closed-source stuff is just a black box expected to do magic"

Open Pilot is about as much of a black box as all the other self-driving applications. The interfacing is open source.

The machine learning is closed source. I've looked into the repos and didn't find any sign of the actual hard self-driving parts, at least as of the last time I checked comma.ai.

But sure, you can tune the steering P-controller for your Subaru if you would like to.

https://github.com/commaai/openpilot/blob/devel/selfdrive/ca...

        apply_steer = apply_std_steer_torque_limits(apply_steer, self.apply_steer_last, CS.steer_torque_driver, P)
Or maybe you can patch the use of "volatile int" for thread concurrency in the machine model interface (I guess that's what it is).

https://github.com/commaai/openpilot/blob/devel/selfdrive/vi...


visiond was open sourced a few months back.


I work in AI for general usage applications.

I think that there's some valid criticism here, although I don't think the target demographic is kept well in mind when considering the project as a whole.

In the current iteration, a majority of commercial market consumers wouldn't be skilled enough to perform the installation properly, or at the least, adequately to achieve acceptable system performance. I think that the barrier to entry is just high enough to keep Cousin Vinny from sleeping in the back of his cab while his "self driving truck" got him from A to B.

I personally purchased a vehicle for the explicit purpose of installing OP after learning of Comma back in 2015. It's been one of the funnest and most rewarding personal projects I've taken on.

I never drive with my hands far from the wheel, or my eyes off the road. While there is certainly going to be some hyperbole surrounding new technology, my assessment is that this alpha quality, DIY solution has been adequate for my use cases over the last 2,000 miles in various weather and road conditions.

I think that the comments here about loss of life and limb are overblown. If someone engages in sex and uploads the video publicly to PH while mowing down a pedestrian in their "Tesla", I think that speaks more to the driver than to the technology. One could argue that the tech enabled them to do so, but so what? See footnote: Gun Control Debate.

I think what Comma is doing is rather smart as a company. Keep the barrier to entry high which naturally attracts skilled, and otherwise knowledgeable people, build out their codebase, and eventually release (or lease) their tech to car manufacturers that need help catching up in the marketplace. I don't really see another way a scrappy startup could accomplish as much in the self driving market without some risk.

I don't really lift weights, but I've disassembled my vehicle's EPS motor. I think I have a fair chance of overpowering it, even if the controller doesn't include "deterministic, real RTOS" or whatever flabblebabble. Commercial release? Sure. #shipit

But for now, I'd be more concerned about people staring at their crotch while driving browsing Instagram.


Cool, I'd really like to use this for a small scale toy car. I'd never put it on something that could do more than bruise an ankle.


I have two recommendations for you Duckietown (https://www.duckietown.org) and AWS DeepRacer (https://aws.amazon.com/deepracer).

Hope that helps!

-------

If you want to get more serious about robotics then check out my website https://becomeautonomous.com



Also https://diyrobocars.com and https://diyrobocars.com.au in Australia. If anybody wants an invite to our Slack group and our "Hack and Race" meetups, send an email with your location to the address on my profile.


In the first video, the voiceover tells people to buy and install the hardware and use this software, while showing what appears to be autonomous driving among other vehicles on an active public road. (Not "this is for research only, from the privacy and safety of your own driveway.")

Who thinks this is a good idea? Where in the US is this legal? In a crash, will insurance cover it, and will there be additional civil liability and possible criminal charges?


I'm reading a lot of commenters criticizing their strategy and quality of their system. Does anyone want to talk about the fact that they have 10.5 MILLION miles recorded without an at-fault accident? To me, this is quite remarkable.


By the way, the CEO of comma.ai will speak at Big Data & AI conference in Dallas, June 27-29, 2019. https://bigdataaiconference.com.


Seems like a neat project to get auto-steer driver assist going on a car that supports it. It's basically a Tesla Autopilot 1 (Mobileye) mod.


Are the main maintainers French-Canadian? I noticed that a lot of demos were filmed in the province of Québec.


This is going to be an interesting legal case when a vehicle using Openpilot is involved in an accident.


This is stupid. All the demos are on public roads. George Hotz needs to chill before he ends up in jail.


>This is stupid. All the demos are on public roads. George Hotz needs to chill before he ends up in jail.

There are literally thousands of community members using this software on a daily basis, driving thousands of miles in real-life situations daily. I've been commuting to work with it for 3 months now and it's the most life-changing piece of tech I've come across since the iPhone. I get on the highway, engage the system, and sit back sipping my coffee for the next 45 minutes. It costs $800 to get fully setup with an EON/Panda/Giraffe and almost every Toyota, Honda, and Hyundai since 2017 is now supported. Check out some of the videos on youtube and you'll want one too.


>I get on the highway, engage the system, and sit back sipping my coffee for the next 45 minutes.

Please, do not do this.


>Please, do not do this.

I get that people are unable to comprehend the fact that true level 2 autonomy is here and ready to use, but it is. It's perfectly safe to sit back completely disengaged from the controls on OpenPilot, so long as your eyes are on the road (which the software enforces with head tracking). Any sort of situation that could require you to take control is warned in advance.


>Any sort of situation that could require you to take control is warned in advance.

And what about false negatives, where something happens but it doesn't warn you?

The fact that comma.ai pulled out of releasing the Comma One when the NHTSA asked them some fairly basic questions is a massive red flag. They can't even guarantee that openpilot will detect low-slung trailers or motorcycles [1]. If they want to develop and release autonomous cars of any level, they should be going through proper procedures and regulations, not just sticking a "for research purposes only" sticker on everything and distancing themselves from any responsibility.

There are far, far too many unknown unknowns when it comes to autonomous vehicles. If you're already forced to keep your eyes on the road, stop being lazy and do everyone around you a favour: grip the steering wheel, keep your feet on the pedals, and focus on the road.

[1] https://opc.ai/faqs/does-openpilot-see-people-animals-motorc...


> If you're already forced to keep your eyes on the road, stop being lazy and do everyone around you a favour: grip the steering wheel, keep your feet on the pedals, and focus on the road.

You do that, and I'll continue enjoying my self driving car.


Until it kills an innocent pedestrian. What are you gonna do then?


This is extremely reckless


>This is extremely reckless

I find driving without it to be reckless at this point. Humans are susceptible to distraction, OpenPilot is not.


That might be the elephant in the room: contemporary driving is reckless _to begin with_. Autonomous assists are bringing this to light - not necessarily making it more or less reckless.


No more reckless than doing the same with Tesla Autopilot.


It's unlikely he chills until someone dies or property in excess of hundreds of thousands of dollars is damaged, and an attorney pursues him personally. A hacker ethos is no defense against a personal judgement you can never repay in a lifetime. Unlike most judgments, punitive damages awards are not dischargeable in bankruptcy as long as the relevant cause of action was based upon willful actions.

The audience won't be a room full of domain experts, but a jury of twelve.


No Tesla programmers have been arrested for Autopilot-related crashes.


Tesla makes it extremely clear to its users what the shared responsibility model is for Autopilot features (through warning messages at each Autopilot engagement and in the user's manual for all Tesla vehicles), and has general counsel on staff to reinforce this through legal actions indemnifying their staff. Openpilot is a drastically different model. It is a software breadboard for your own vehicle.

For example, this is laughable (from the repo):

(Top) > openpilot is an open source driving agent. Currently, it performs the functions of Adaptive Cruise Control (ACC) and Lane Keeping Assist System (LKAS) for selected Honda, Toyota, Acura, Lexus, Chevrolet, Hyundai, Kia. It's about on par with Tesla Autopilot and GM Super Cruise, and better than all other manufacturers.

(Bottom) > THIS IS ALPHA QUALITY SOFTWARE FOR RESEARCH PURPOSES ONLY.


I remember German regulators wanted Tesla to stop branding it as "auto-pilot" when really its features are more like the lane assist and adaptive cruise that many other vehicles have.

Also, Teslas have had their share of issues (the car that ran into an off-ramp barrier, killing the driver; the guy who was passed out for hours before his Tesla crashed), but they're also a large enough company that they've probably settled some or all of these cases out of court.


It is laughable, because it is probably true. The quality of the systems produced by the named manufacturers is abysmal.


Well, what you mention Tesla does is the disclaimer part, but what Elon boasts about is the headline/first paragraph part...


Tesla probably has the paperwork and certs in order for when the NTSB knocks on the door. The NHTSA has already shut down comma.ai products before. EDIT: I personally think it's a bit too reckless to let the general public use Autopilot in its present state.

https://comma.ai/shop/products/eon-gold-dashcam-devkit

"Retrofit your car with a copilot." "Note: This product is not designed to drive a car. This product is designed to be a dashcam."

With comma.ai hosting firmware on GitHub to do self-driving with it, it's hard to imagine that the NTSB will buy that.


To me, the fact that Hotz didn’t seem to realize that NHTSA was a thing that existed and that he should be talking to them makes me not trust anything the guy does. If someone can be that colossally ignorant about something that’s so central to what they’re trying to do, and they react to it by basically throwing a tantrum instead of trying to get their shit together and move forward constructively, it calls into question everything about them, including their actual expertise with stuff people think they’re experts in.

It's like finding out that the neurosurgeon who's about to operate on you is also a flat-earther: it doesn't directly impact his ability to be a neurosurgeon, but it would be a big red flag that calls everything else into question.


NHTSA came knocking, and then openpilot was open sourced. It's meant to be used as a prototyping tool.


One side of me agrees, the other thinks of the Wright brothers. Oh wait, they got someone killed.


I am genuinely interested, do you think that the Wright Brothers should not have developed the technology that they did because of the risk that someone would get killed? I am having trouble imagining any way (at least at the time, but probably today as well) of inventing manned flight without anybody dying in the process.


Inventing manned flight without anybody dying would have been completely possible. Use more caution, pay more attention to safety. But that's not the right hypothetical. This is inventing manned flight without endangering or killing innocent bystanders.


I don't think anything is possible risk-free. So it comes down to your/society's risk tolerance how safe, expensive, and slow (or risky, cheap, and fast) progress should be. Ideally the risk is taken by conscious and free individuals. But yes, bystanders will be hit as well.

For manned flight it was totally worth the risk.


He did? Wow, I have never heard this story:

https://www.thoughtco.com/the-first-fatal-airplane-crash-177...

...he killed an Army lieutenant during a demonstration and took some pretty bad injuries himself.


Yeah, it is cool.

However, I feel there's a huge difference between pioneering flight and risking the general public's lives to feed your proprietary machine learning model. Because Hotz hasn't released that part of the software, right? Last time I checked it was still closed source.


sci-fi idea: stop making cars, trains only



