Hacker News

It is not sufficient if, in aggregate, self-driving cars have fewer accidents. If you lose a loved one in an accident that could easily have been avoided had a human been driving, you're not going to be mollified to hear that, in aggregate, fewer people are being killed by self-driving cars! You'd be outraged by such a justification! The expectation, therefore, is that in each individual injury accident a human clearly could not have handled the situation any better. Self-driving cars have to be significantly better than humans to be accepted by society.


We don't make policy or design decisions as a civilization based on whether individuals are going to be emotionally outraged. We make those decisions based on data that leads to the best average outcome for everyone.

How could we possibly do anything else?


We make those decisions based on data that leads to the best average outcome for everyone.

That's great news for everyone who needs an organ transplant. You, anonporridge, are among 50,000 Americans who've been randomly selected as organ donors. Your donation of all your organs will save at least 10 lives at the cost of only your own, leading to the best average outcome for everyone.


> We don't make policy or design decisions as a civilization based on whether individuals are going to be emotionally outraged.

I feel like we're living in very different democracies. My provincial government just offered to pay over a third of a billion dollars for a hockey arena so they could have a better shot at winning an election. That's entirely playing to people's emotions.


I'm not sure that we do make policy decisions based on data, otherwise we'd be doing a lot more about climate change.


So next time there is a pandemic we should just bomb the city it originates in before it can spread, got it. Both human emotions and logic play parts in policy making.


This is such a naive take. Bills get passed based on emotional appeal, not data. That's why politics is chock full of "it's for the children"/ "don't let the terrorists win" rhetoric.


There are countries with both opt-in and opt-out organ donation. That alone invalidates your first sentence: there are real-world examples of both policies.


I'm not sure this is really a universal opinion. The main difference between this and every other scenario where we already make this kind of trade today (e.g. trusting medical professionals) is that here the party that is better on aggregate isn't another person. That will definitely sway some people, but certainly not everyone. "Someone will be upset" is also not the same thing as "society won't accept it": many people are upset with medical professionals, and there are plenty of cases of them doing plainly worse than a random person's guess, but the vast majority of society still relies on them.

That said, I agree it has to be more than "beats the average"; just how much more, and why, may differ wildly depending on who you ask. I suppose that's the crux of the debate, not that there is one obvious and well-known fact about acceptance that some people are missing.


So people killing people is ok, but software killing people is way out? I have seen plenty of human-made accidents that were very readily avoidable - one that springs to mind is a colleague who killed himself and his three passengers by driving down the wrong side of the M1 at 180 mph while absolutely slaughtered (pun intended).

I mean, you’re saying that he couldn’t have handled that any better?


How would doing something irresponsible be different with self-driving? He would just have overridden the car's controls and still driven at 180 mph. And if another poor soul were caught up in the accident, there is no way a self-driving car could have avoided a collision with a car suddenly appearing in its lane at that speed.


Indeed, and this is why we must surgically remove free will from all humans. It’s the only way to be safe.

Actually, if we were to simply euthanise all humans at birth then there would be no risk of them ever coming to harm.


> It is not sufficient if, in aggregate, self-driving cars have fewer accidents.

Morally, and in terms of our own personal opinions, it should be sufficient, even if emotionally and to broader society, it isn't. We as individuals should not be advocating for the modality that maximizes the number of deaths, regardless of other trivial factors like status quo bias.


> We as individuals should not be advocating for the modality that maximizes the number of deaths, regardless of other trivial factors like status quo bias.

I'm not sure it's that simple. Traffic deaths are not entirely random: there are actions you can take to decrease the risk for yourself and the other people in your car. If the total number of deaths only marginally decreases, the chance of death for some people (those who don't drive while intoxicated, don't use their phones, are more attentive, etc.) may actually increase.

Also, the state would have to grant legal immunity to car manufacturers so that they couldn't be sued into bankruptcy. That wouldn't exactly give them strong incentives to make their cars safer...


"Minimizing deaths" is probably too simplistic, but not by much. If we can be reasonably confident that replacing human drivers will lead to 30% less traffic deaths, I think it would take some pretty large extenuating circumstances for me to not want that to happen.

> Also, the state would have to grant legal immunity to car manufacturers so that they couldn't be sued into bankruptcy. That wouldn't exactly give them strong incentives to make their cars safer...

Indeed, we would need to be careful not to create the wrong incentives.


But there are already many measures that could decrease traffic deaths by up to 30% or so. They are expensive and/or inconvenient (though nowhere near as expensive as replacing all cars with self-driving ones would be), and we choose not to implement them for those reasons.

For instance, ban all cars made prior to 2008 or so, combined with massive investment in public transport (which would decrease average miles driven; e.g. many EU countries have far fewer traffic fatalities per 100k population, but about the same when adjusted for distance driven). That should get us about 30% if not more, and we don't even need self-driving cars...

https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8118...


Well, this is actually that exact situation: the driver was supposed to be driving.


What concrete definition of “significantly better than humans” are we talking about?

The fatality rate per 100 million vehicle miles traveled was 1.34 in 2020 in the US.

Would 0.134 be a significant improvement?
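To put those two rates on a human scale, here is a minimal sketch converting a per-100-million-mile fatality rate into expected deaths per year. The total vehicle miles traveled figure (roughly 2.9 trillion miles for the US in 2020) is my own assumption for illustration; only the 1.34 rate comes from the comment above.

```python
VMT_2020 = 2.9e12        # assumed US vehicle miles traveled in 2020 (approximate)
RATE_HUMAN = 1.34        # deaths per 100 million miles (2020, per the comment)
RATE_TENX = 0.134        # the hypothetical tenfold improvement

def annual_deaths(rate_per_100m_miles: float, vmt: float) -> float:
    """Convert a per-100M-mile fatality rate into expected deaths per year."""
    return rate_per_100m_miles * vmt / 1e8

human = annual_deaths(RATE_HUMAN, VMT_2020)  # ~38,900 deaths/year
tenx = annual_deaths(RATE_TENX, VMT_2020)    # ~3,900 deaths/year
print(round(human), round(tenx), round(human - tenx))
```

Under that assumption, a tenfold rate reduction would mean on the order of 35,000 fewer deaths per year, which is one way to anchor what "significant" might mean.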



