Hacker News

Honestly, if autonomous cars can reduce even 10% of the accidents caused by drunk driving and other human errors, they will have paid their way and more.



But speaking of paying: self-driving cars also won't speed, turn right on red without a full stop, etc., which means traffic tickets would be a thing of the past, along with the revenue they bring in.


Driving while black, failure to provide proof of ..., tail light out.

If a cop declares "your" car's driving was reckless or too fast for conditions, how exactly do you fight that in court?


You could probably fight it in court with the electronic logs that would presumably be present in a self-driving car.


With dashcam?


At the same time policing costs would decrease dramatically.


That's probably not true. I've gotten the impression that a sizable chunk of police funding comes from speeding tickets.

Paying an officer to do something low-risk like camping out next to a road for a shift isn't really expensive. Same with monitoring systems like cameras on traffic lights. Set-up cost is significant, but they probably pay for themselves within a year.


The initial cost doesn't matter. Even if an officer doesn't cost much, no longer having to pay him at all is still a 100% decrease in that cost.

Second, the officer is not the only cost involved. What about maintaining a fleet of top-of-the-line vehicles? How about paying all the support personnel, such as IT staff, dispatch, managers, etc.? And all of the other expensive toys carried around by the chaser?


> Second, the officer is not the only cost involved. What about maintaining a fleet of top-of-the-line vehicles? How about paying all the support personnel, such as IT staff, dispatch, managers, etc.? And all of the other expensive toys carried around by the chaser?

Arguable. It depends on how badly you want [insert autonomous car producer here] to run your local police department from the cloud.


I was thinking about this for other reasons (it's beneficial for the autonomous car to know about road closures), but a reasonable implementation has the vehicle interpreting rules data provided by the governments where it operates. Certification could then consist of making sure the vehicle 'correctly' interprets a given data set.

Having the rules data provided by the government is a fairly straightforward way for the vehicles to keep working even after the builder stops maintaining them (maybe that is better said as 'to continue to work longer after', but whatever).

(I'm not worried about the cloud; hand-wringing and paranoia are going to make these things at least function independently, or keep them off the road altogether.)
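A minimal sketch of that certification idea, in Python. Everything here is invented for illustration: the rules-data format, the field names, and the scenarios; the point is just that certification becomes replaying a fixed test set against the vehicle's interpretation of the government-provided data.

```python
# Hypothetical rules data as a government might publish it (format invented).
RULES = {
    "speed_limit_residential_mph": 25,
    "right_on_red_requires_full_stop": True,
}

def decide(rules, scenario):
    """Vehicle-side interpretation of the rules for one scenario."""
    if scenario["situation"] == "red_light_right_turn":
        if rules["right_on_red_requires_full_stop"]:
            return "full_stop_then_turn"
        return "yield_then_turn"
    if scenario["situation"] == "cruising_residential":
        # Never exceed the posted limit, regardless of desired speed.
        return min(scenario["desired_speed_mph"],
                   rules["speed_limit_residential_mph"])
    raise ValueError("unknown situation")

# Certification: a fixed set of (scenario, expected decision) pairs.
CERT_SET = [
    ({"situation": "red_light_right_turn"}, "full_stop_then_turn"),
    ({"situation": "cruising_residential", "desired_speed_mph": 40}, 25),
]

def certify(rules, cert_set):
    """Pass only if every scenario is interpreted as expected."""
    return all(decide(rules, s) == expected for s, expected in cert_set)

print(certify(RULES, CERT_SET))  # -> True
```

A nice property of this split is that when a jurisdiction changes a rule, it ships new data rather than requiring a software update from the (possibly defunct) builder.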


Who is responsible for paying the fine if the self-driving car breaks the law?


Currently, the driver.


Four people are in the car, two friends, and two people who are on the title and regs. There is no steering wheel. Whose license do the points go on?


Currently, that situation isn't possible, as self-driving cars still require an attentive driver who can take over at any time.

In the future? That's a hard question, and I'm sure however the legal system deals with it first will be wholly unfair and illogical. It'll be a battle between car owners, manufacturers, state and federal bodies, cities and counties, certification and safety agencies, etc.


Probably the same person that gets points when they cause an accident with their bicycle: i.e., nobody, because you don't get points for that.


Are we talking push bikes? At least in the UK you don't need a license for that anyway. I guess it's possible you wouldn't need a license for a self-driving car, but while regulations demand a fit driver who can control the vehicle that won't be the case.


My point is, I think regulations will change as technology improves. As transatlantic flights started, engines were unreliable, so regulators forced airlines to take awkward routes when crossing oceans (so that they would be 60 single-engine flying minutes away from a diversion airfield). Now that engines are more reliable, the regulations are significantly more relaxed, sometimes allowing 330 minutes to a diversion airport.

The point is, regulations are not set in stone. Companies can advance the state of the art, and ask the government to regulate less strictly.

As it stands now, self-driving cars are a research project, so we'd expect them to be regulated very strictly. As testing shows them to be safe, then we can relax the regulations. If self-driving cars without a supervising driver end up being safer than a normal car with a normal driver, it would make sense to not require a license. If an accident happens, it happens. Car accidents are nothing new.


I hope it's Google.


Actually, assuming 190 million licensed drivers (although drunks kill all manner of people, including themselves) and 10K drinking-related deaths per year (how many are actually caused by the drunk, as opposed to crashes that would have happened anyway but involved at least one driver who happens to be drunk?)...

Anyway, assuming all victims lose 100 years of life...

10e3/190e6 * 24 * 365 * 100 * 0.1 = a delta of about 4.6 hours of expected life per driver per year, so I think your hyperbole is pretty much mathematically correct.
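Spelled out, using the same numbers and assumptions as above (190M drivers, 10K deaths/year, 100 years of life lost per victim, a 10% reduction):

```python
licensed_drivers = 190e6
deaths_per_year = 10e3        # drinking-related deaths, per the comment above
years_lost_per_death = 100    # generous assumption from the comment
reduction = 0.1               # the 10% figure from the top comment

hours_per_year = 24 * 365
# Expected hours of life saved, per licensed driver, per year of driving.
expected_hours_saved = (deaths_per_year / licensed_drivers
                        * years_lost_per_death
                        * hours_per_year
                        * reduction)
print(round(expected_hours_saved, 1))  # -> 4.6
```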

Our local Krispy Kreme closed and the next closest is about 100 miles away, so if a "driveby" was purchased for me, it could actually burn up 4 or 5 hours of my time, especially during rush hour.


That's too idealistic. The reduction in accidents caused by human error could very well be offset by a substantial increase in machine-caused (OS/software/hardware) accidents.


But humans are really shitty drivers.

The software would have to be written by chimps for it to be worse than human drivers.

I fully expect all kinds of errors to cause human death. I just hope the backlash doesn't destroy something that could save very many lives.


That's pretty unlikely though. It's not like cars are going from manual-everything to autonomous all at once. Multiple levels of traction control and steering assistance are already in control of your car, and they prevent more accidents than they cause. Autonomy is going to be rolled out in pieces and tested little by little.



