
> The regulations would allow vehicles to operate autonomously, but a licensed driver would still need to sit behind the wheel to serve as a backup operator in case of emergency.

That alone limits the potential "disruptiveness" quite a bit.




It reminds me of the "Red Flag" laws passed in the early days of the automobile, which basically made cars useless because obeying them meant defeating the car's very purpose:

http://en.wikipedia.org/wiki/Red_flag_laws


In the United States, the state of Vermont passed a similar flurry of Red Flag Laws in 1894. The most infamous of the Red Flag Laws was enacted in Pennsylvania circa 1896, when Quaker legislators unanimously passed a bill through both houses of the state legislature, which would require all motorists piloting their "horseless carriages", upon chance encounters with cattle or livestock, to (1) immediately stop the vehicle, (2) "immediately and as rapidly as possible... disassemble the automobile," and (3) "conceal the various components out of sight, behind nearby bushes" until the equestrian or livestock is sufficiently pacified.


Is that real? O_o


I strongly suspect the "Quaker legislators" part is untrue, because I'm pretty sure Quaker dominance of Pennsylvania government had ended long before 1896. (I couldn't find a reference on this point though, so I might be wrong.)

I don't know if the rest of the quote is accurate.


Lawmakers are uncertain about the technology right now. When they're the ones sitting behind the wheel of one of these cars, reading a book while it drives, then we'll probably see new legislation removing the requirement for a backup driver.


Do lawmakers still drive their own cars? (not counting for fun)


For now. That will change eventually.


That's true, but it's probably a temporary measure (a decade or so?). And even with a backup driver, automated cars have the potential to vastly decrease traffic jams.


What kind of license do they need? Would a regularly licensed teenager now be able to "drive" taxis and other commercial vehicles?


I'm very annoyed by this requirement. It doesn't hold up to the slightest scrutiny as necessary, yet it completely hinders most opportunities for true disruption.


You don't think the novelty and risk associated with a fully automated system that's still in the experimental stages warrants any immediate caution?

Commercial planes have had autopilot for years, but they still require at least the pilot or co-pilot to sit behind the yoke.

When driverless cars have had a few million or billion hours without statistically significant incidents, I think we'll see the laws adapt to not even requiring a human fail-safe.


Actually, for commercial airliners it's "worse". They require both a pilot and a co-pilot behind the yoke, even on autopilot.

If an airliner is flown with fewer than two crew members in the cockpit, even for a short period of time, the remaining pilot/co-pilot must wear a mask when flying above 25,000 ft.


Airliners used to have a third crew member, the Flight Engineer, but that job has been automated into oblivion.


A friend mentioned that non-scheduled flights (no passengers or commercial freight) in large aircraft (e.g. a 737) can still operate with a single pilot and a rated flight engineer only. The pilot had to have her mask on (or clipped to the helmet) for the entire flight. This option is rarely used commercially, obviously.


I think the slightest scrutiny shows its complete necessity.

No licensed driver, and the car encounters a situation it's not programmed for.

Now what?


Car pulls over, unlicensed driver calls a cab.


Unfortunately, there is a very, very large set of possible situations a car may find itself in that require 'user intervention'. Especially without sophisticated context-aware systems or AI, the best error handler a driverless car has is "stop, inform driver". Is there a curb to pull over to? What if the system can't determine whether the path is clear because something like a branch is in the way?
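To make the point concrete, here's a minimal sketch of that "stop, inform driver" fallback logic. This is purely illustrative (Python, hypothetical names like choose_fallback and FallbackAction), not how any real autonomous driving stack is actually structured:

    # Hypothetical sketch of the fallback decision described above.
    from enum import Enum, auto

    class FallbackAction(Enum):
        CONTINUE = auto()        # planner is still within its competence
        PULL_OVER = auto()       # a usable curb/shoulder exists
        STOP_AND_ALERT = auto()  # no safe harbor: stop and hand control to the human

    def choose_fallback(path_is_clear: bool, curb_available: bool) -> FallbackAction:
        """Pick the safest action when the car hits a situation it can't handle."""
        if path_is_clear:
            return FallbackAction.CONTINUE
        if curb_available:
            return FallbackAction.PULL_OVER
        return FallbackAction.STOP_AND_ALERT

    # Example: a branch blocks the lane and there's no curb to pull over to.
    print(choose_fallback(path_is_clear=False, curb_available=False))
    # -> FallbackAction.STOP_AND_ALERT

The point is that the last branch is unavoidable today, which is exactly why the law still wants a licensed human in the seat.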


Well, my line of argument is that any driver in a self-driving car is going to be completely inattentive (think available-to-the-public, not a test-phase car). For any self-driving car to be considered safe enough for public use, it will need to be proven absolutely safe in "emergency" situations (e.g., auto-shutoff).

So, if the car is proven to be absolutely safe, then why do we need a driver? To pull it off the road after it shuts off or stops?


Take Professor Thrun's course on building self-driving robots at Udacity, and then you'll see why this requirement is necessary at this stage of the technology's development.


The whole purpose of the new autonomous vehicle law is to allow development of these systems in real-world conditions, before they can get to the next level: driving without a driver behind the wheel as a backup.



