
> If we believe that self-driving cars are inevitable

I am a programmer and I would never ever get in a self-driving car if I had a choice. I don't trust other programmers, I don't know if a backdoor has been put in by someone, and I don't want to get into a vehicle that can be hacked.

I believe self-driving cars will be a catastrophic failure.




> I don't trust other programmers

Then how are you commuting currently? Almost every part of the vehicle is electronically controlled nowadays. All the components are connected, one way or another, so it's easy to traverse around and make your way to the CAN bus. Once you are there, you can control the car.

https://www.kaspersky.com/blog/blackhat-jeep-cherokee-hack-e...
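To illustrate the point, here's a minimal sketch of how little code it takes to put a frame on the bus once you have access. This assumes a Linux SocketCAN interface and the python-can library; the arbitration ID and payload are hypothetical, since real IDs are model-specific and usually undocumented.

    # Hypothetical sketch: sending a raw CAN frame over SocketCAN with python-can.
    # The arbitration ID and payload below are made up for illustration.
    import can

    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    msg = can.Message(arbitration_id=0x123,
                      data=[0x01, 0x02, 0x03],
                      is_extended_id=False)
    bus.send(msg)  # whatever ECU listens for ID 0x123 will act on this frame

The hard part in practice is getting onto the right bus in the first place; that remote path is exactly what the Jeep researchers in the article above went after.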


The simplest solution is to stay away from cars that have networking ability.


Which basically means you are going to stay away from any car manufactured in the past 10 years.


Yep. My daily driver is 16.


Steering and brakes should work even if there's no electricity.


A backdoor on your current non-self-driving car can disable brakes and suddenly pull the steering wheel to the left as soon as you reach 65 mph.


It can pull the steering wheel, but I'm pretty sure the ratios are selected so that I can still overpower it. The steering wheel is still mechanically connected to the front wheels.

And it surely can't disable the brakes. It can disable ABS or force it to cycle, turn off power assist, and apply the electric parking brake, but the brake pedal is still connected to the brakes through some old-school hydraulics.


ABS is able to disengage your brakes even while the pedal is fully depressed. The old-school hydraulics run through the ABS unit, which can release the pressure so that the brakes disengage; that's what it does during normal operation. If it were to intentionally release the brakes and hold them open instead of "pumping", there's nothing you could do.

The same applies to power assist. It's quite powerful; steering at high speed is possible but quite hard if it simply turns off. However, if it were actively working against you (and did so suddenly, without any warning), you likely wouldn't be able to wrestle the wheel back before you'd already veered into the opposite lane.


Put your car in neutral and turn off your engine, and see what happens. You can still steer and brake, but on a sharp turn or down an incline you'd be surprised how hard it is without the steering servo and the vacuum brake servo.


Probably drives a 1967 Pontiac GTO.


If you don't trust the abilities of other people, who do you think is currently driving cars? You are trusting everyone else on the road with your life every time you get in a car.

You ever look at other drivers when you are stuck in traffic? You will see people looking at their phones. You will see people eating. You will see people shaving. You will see people doing their makeup. You will see people reading books. You will see people with dogs on their laps. And that is just the people who might accidentally kill you due to carelessness. Vehicular suicide is more common than most people expect and there is no telling if you might be collateral damage in one of those collisions.

Self driving cars don't have to be perfect to be dramatically safer than human drivers.


> Self driving cars don't have to be perfect to be dramatically safer than human drivers.

This statement is technically correct, but it severely downplays how close to perfect the autonomous cars have to be to be significantly better than human-driven cars.

If you look at European countries, current statistics are around 2 fatalities per billion km. The US is around twice that number. Either way it's actually quite a safe mode of transportation.

In fact, compare it to travelling by train: the European average over the past decade is around 0.2 deaths per billion passenger-km. Trains are driven by professionals, and still accidents happen.

In other words, the fatality rate for cars per billion passenger-kilometers is in the same order of magnitude as for trains driven by professional drivers. How much do you really expect autonomous cars to be able to improve on that?


Going from 2 to 0.2 deaths per billion km (a 10x improvement in safety!) would be an enormous success for autonomous vehicles. Globally, that would save hundreds of thousands of lives every year. Often the lives of innocent passengers and pedestrians.


First of all, it's closer to 5x (you have to divide the car number by the average number of passengers in a vehicle).

Second of all, I agree a 5x improvement would be impressive. But that's the level of a professional driver on a closed, dedicated track. I don't think autonomous vehicles will reach that level anytime soon.
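To make that arithmetic concrete, here's the back-of-the-envelope version; the ~1.5 occupants per vehicle is an assumed average on my part, not a figure from the statistics above.

    # Convert the car figure from per vehicle-km to per passenger-km and
    # compare with trains. Occupancy of 1.5 is an assumption.
    car_deaths_per_bn_vehicle_km = 2.0      # European car figure cited above
    avg_occupants_per_car = 1.5             # assumed average occupancy
    train_deaths_per_bn_passenger_km = 0.2  # train figure cited above

    car_deaths_per_bn_passenger_km = car_deaths_per_bn_vehicle_km / avg_occupants_per_car
    ratio = car_deaths_per_bn_passenger_km / train_deaths_per_bn_passenger_km
    print(round(car_deaths_per_bn_passenger_km, 2), round(ratio, 1))  # 1.33, 6.7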


I was going to post earlier that IMO the level they should be aiming at is that of a brilliant racer, otherwise who needs them.

There was a vid on here recently of the Porsche 919 beating the lap record at the Ring, and one of the comments was "I don't have the balls to drive that fast in a video game". Such is the skill level of humans at the wheel (a few of them, to be sure, but still and all, there it is).

Here's the vid btw. And it is ridiculous...

"https://www.youtube.com/watch?v=PQmSUHhP3ug"


For what it's worth I find the Nurburgring record lap done last year with an Alfa Giulia even more ridiculous (https://www.youtube.com/watch?v=5gEdJmIVqLY). The 919 is a unique super-car while the Giulia QF is almost a "normal" car, which one can see parked while shopping at Whole Foods.


Fine share!

+1


An autonomous driving AI doesn't need to perform like a professional racing driver in order to improve safety.

The causes of most crashes are human fallibilities that AI doesn't suffer from:

An AI isn't tempted to drive drunk. It isn't distracted by incoming text messages. It doesn't get fatigued after driving for hours late at night. It doesn't suffer from road rage, and it isn't tempted to drive aggressively and take risks because it's running late for work. Just by avoiding these issues you've made driving a lot safer!

An AI also has reaction times that no human can match, has no blind spots, can track other vehicles in 360 degrees at once, and can "see" through fog and rain using radar to detect obstacles.

Conversely, some of the things that are easy for humans are also the most difficult for AI to grasp: the social cues that we use to resolve pathing conflicts (eye contact, a wave, a blink of the headlights, etc), or our ability to intuitively predict the behaviour of other road users.

So while we may be some way yet from true, socially-accepted Level 5 autonomy, we're not far at all from semi-autonomous, driver-assistance technologies that significantly improve safety.


What you say about AI vision is true in the best case, but frequently we encounter the worst case. How many times have you had a stone chip your windshield? On a Level 5 car, a stone hitting the lidar means, best case, you're broken down on the highway; worst case, you're having an accident because the car can't steer properly. A whiteout on a road full of L5 autonomous cars would mean potentially hundreds of people completely stuck in a blizzard; emergency services would be overwhelmed.

And incidentally, many of the human problems are being solved without autonomous driving. Modern cars already have systems that detect you're falling asleep and pull the car gently over. People are working on in-cabin breathalyzers that detect alcohol in the air and make the driver blow into a straw to verify it's not the driver who's drunk (just a passenger). Better integration of phones with cars (CarPlay, Android Auto) and voice input will make texting-while-driving less convenient than just talking to the car with your eyes on the road.

Don't get me wrong, many of the new driver-assistance safety features are great. Stuff like blind spot warning and automatic emergency braking for pedestrians is awesome. But I think where we are now (autonomous highway driving in good conditions) will be approximately the endpoint for autonomous cars in the foreseeable future. I think none of us will live to see L5 cars (without any steering wheel at all) on the road.


Technically that's one order of magnitude of difference between the 2 fatalities per billion km and the 0.2 figure.


The units are different though, 2 is per billion vehicle-km and the 0.2 is per billion people-km. The average number of people in a car is greater than 1.


Where did you get 0.2 deaths per billion passenger km?

The EU typically cites a value of 0.14 for the EU-28. Numbers are (statistically) worse in the least developed EU members, and better in more developed ones. For example in both the UK and Germany it was 0.06 in the last two year periods I looked at.

So how much can we improve? Well apparently a LOT.


Comment deleted. I would have been happy to discuss the point in a calm and respectful manner. But I don't appreciate being accused of intellectual dishonesty and being told that what I said was "ridiculous".

And "buff neck", really? What did that have to do with anything I said, except as a way to shut down the conversation?


You're being incredibly intellectually dishonest with yourself here. Suggesting that shaving makes you swivel your head and become more aware of traffic is akin to suggesting that shoulder checking during lane changes is going to give you a buff neck. It's ridiculous.


Some other driver, right now, could decide to pull out their phone and text and you could be dead instantly. Worrying about backdoors and hacking is like worrying about dying in terrorist attack. It's not like it won't happen to someone but there are more mundane ways to die. Ironically, getting into a car is a very common way to die.


I was rear-ended once because of that: I stopped smoothly at a light that had just turned red, the driver behind received a phone call and compulsively glanced at his phone, paying attention to neither the light nor to the fact that I was fully stopped. In that split second he covered a distance that would have been safe had he been paying due attention, and hit me without braking. Luckily he was going below the legal speed, or else both cars would have had to be written off.


I'm a programmer and I would have absolutely no problem getting in a self-driving car, even at the level of current Waymo in Phoenix right now. I wouldn't yet trust a car at that level to drive me around a winding cliff road when it's snowing, but as it is, if I were in Arizona I'd already trust those Waymo-level cars more than my buddy, who might not be paying attention 100% of the time.


If you don't trust software, don't look too closely at reality. So many things that can directly or indirectly kill you are run by (often shitty) software that you'd never leave the house again. And those systems aren't looked at as closely as self-driving cars.


Do you not get into regular cars where safety features are controlled by software? (That is, any car from the last 30 years?)


There's a big difference in user control between safety features and a self-driving car.


I mean, not really. You're already trusting software to decide how your car should steer (automatic lane keeping), brake (automatic emergency braking), and accelerate (adaptive cruise control).


Those are not in every car from the last 30 years.


You're assuming people actually use those features.


Is there anybody driving a car that does not use acceleration and braking? I could understand your argument for lane control, but otherwise your argument makes zero sense. Everyone's car is controlled by software, period.


Braking systems are controlled by dedicated embedded software that's extremely well understood and well tested (probably even formally verified). All it does is check the ABS (and anti-skid) sensors and moderate the braking pressure to avoid locking the wheels. It also has a fail-safe path that gets rigorously tested, e.g. for the case where a sensor fails. All possible values of all inputs are known, and the configuration space of those inputs is small enough that you can exhaustively test correctness.

It's plainly obvious that braking-controller software is a very different beast from autonomous-driving software, and I have no problem understanding why someone would trust the former and not the latter.
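For a sense of the scale difference, here's a toy sketch of the kind of bounded control loop ABS firmware runs. It's grossly simplified and purely illustrative (the slip threshold and function shape are my own assumptions, not any real ECU's logic), but it shows how small the input/output space is compared to open-world driving.

    # Toy illustration of an ABS-style control step, not real firmware.
    # Compare each wheel's speed to the vehicle reference speed and release
    # brake pressure on any wheel that is about to lock.
    SLIP_THRESHOLD = 0.2  # hypothetical slip ratio treated as "locking"

    def abs_step(vehicle_speed, wheel_speeds, requested_pressure):
        commands = []
        for wheel_speed in wheel_speeds:
            slip = 0.0 if vehicle_speed <= 0 else (vehicle_speed - wheel_speed) / vehicle_speed
            if slip > SLIP_THRESHOLD:
                commands.append(0.0)                 # release: this wheel is locking up
            else:
                commands.append(requested_pressure)  # pass the driver's demand through
        return commands

The inputs are a handful of bounded sensor values and the output is a pressure command per wheel; that's a state space you can actually characterize, unlike "drive anywhere in the open world".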


> Braking systems are controlled by a dedicated embedded software system that's extremely well-understood and well-tested (probably even formally verified).

Are you sure? That code is closed source, and the peek behind the curtain we saw with Toyota's investigation did not inspire confidence:

http://www.safetyresearch.net/blog/articles/toyota-unintende...


That's a really interesting article for sure. But it does highlight that Toyota was being unusual in not following industry-wide (voluntary) coding standards for safety-critical embedded systems in cars, and they ended up losing big in court. So I do think (hope) that this example is worst-case, not normal.


The person you were replying to was talking about adaptive cruise control and automatic braking. The majority of cars on the road do not have those features. They are very different from plain acceleration and braking, and much more likely to have glitches.


I mean more like... brakes.


Which are also controlled by software. That’s literally what an ABS is.


ABS saved me once from a big accident. I had heard/read about the feature before, but man, actually seeing it operate in front of your eyes was just like beaming a torch through the darkness.


Yeah we are in agreement :)


Cars are already mostly software-controlled, drive-by-wire systems. Self-driving cars will also easily be safer than meat-bucket-controlled ones. You should try riding a bicycle regularly. It's scary how bad people are at driving. Most shouldn't be on the road. It's like they think what they learned to pass the test isn't relevant once they have their license.


How are cars drive by wire? Sure, newer cars have electric power steering, but I would guess that 99% of cars older than 10 years use hydraulic power steering and brakes.

Even then, it takes very little software to do electric power steering, so I'm not sure how you would consider this at all comparable to autonomous driving.


> Self driving cars will also be easily safer than meat bucket controlled ones.

This is a pretty generous assumption you're granting here.


We’re already granting the assumption that these self driving cars are generally popular. For that to be the case, they’ll have to be safer than humans or people won’t be okay with letting them on the roads in large numbers.


That's another mighty assumption. The new shiny doesn't have to be better or safer for it to be popular.


Human drivers are atrocious. It won't be hard to be better than them.


> It won't be hard to be better than them.

That's another very generous assumption wrapped in a truism.

I'd say it's one of the more difficult problems in the industry, given the billions of dollars poured into it.


It only needs to be solved once


Do you fly...?

Do you use electricity...?

Do you use indoor plumbing...?

All of those things are controlled by ... computers!


You may feel differently when you get to 80 and feel you can no longer drive reliably.


You don't trust it now, but once everyone is happily riding autonomous vehicles and they have demonstrated they are safer, you will change your mind. That's how we were convinced to get into metal tubes with wings hurtling through the air.


Of course you do. Do you ever fly? Do you drive in _current cars_, which are run by software?


I am assuming you have flown on a modern airliner? Anytime you land in instrument conditions, you are trusting programmers.



