
> Was it life threatening? Hardly.

Uhh, what? It seems you cannot go a week without reading about a pile-up on a freeway. Just last week a big rig lost a wheel that rolled into the oncoming lane, and drivers swerving and braking to avoid it caused a pile-up. Stopping even on the shoulder of a freeway is considered "risky" by most police officers, and many (triple digits) have been killed while stopped on the shoulder because other drivers drifted, failed to pay attention, or were otherwise distracted.

I cannot remotely begin to fathom how anyone can think a car going 0-10 MPH on a freeway ISN'T dangerous. It is absolutely life threatening. If a driver behind didn't notice the change in speed, panicked, and hit either you or the concrete barrier, that could very easily cost them their life, or leave them with life-long disabilities. Bigger things like trucks and those "road trains" are even bigger liabilities.

Honestly, I'll defend security research strongly in almost all contexts, but when you put people's actual lives in danger you clearly cross a line. There are no shades of gray there: endangering people's lives and health to, effectively, show off is absolutely immoral and should be illegal (and likely is).

Saying "well nobody got hurt" completely misses the point. It is the intent that is wrong, not the result. The result could have multiplied the wrongness of the intent and resulting in tens of years of jail time, but luckily for them their only "crime" this time was the intent of their dangerous actions.

And let's be frank here: luck is the only reason nobody got hurt, and the only reason these two won't be in jail for many years.




The driver was aware of their activities, so he is probably the only one with any legal culpability.

Impeding traffic is a misdemeanor in Missouri, which probably carries a maximum one-year jail sentence (note 6: http://www.nhtsa.gov/people/injury/enforce/stspdlaw/mospeed.... )


According to the article, they didn't actually tell him ahead of time what they were going to do to the car.


Accomplice liability would make the researcher exactly as guilty as the driver.


Poorly maintained vehicles that break down while driving surprise the driver. This happens daily on public roads. Should we fine them for failing to maintain their vehicle to your standards?

There are autonomous vehicles being tested on our roads with a failure mode of "coast to a stop". They may not even have a human inside to react to things around them. Do the operators deserve to be jailed?

People modify their cars with various after-market upgrades and take them onto the highway. If the car fails, do they deserve to be imprisoned?

What a slippery slope!

Driving is a risk. The most deadly risk you will take each day. Drive defensively, don't be a statistic.


> Should we fine them for failing to maintain their vehicle to your standards?

You might want to review existing laws. See, e.g.:

Georgia: http://law.justia.com/codes/georgia/2010/title-40/chapter-8/...

"O.C.G.A. 40-8-7 (2010) 40-8-7. Driving unsafe or improperly equipped vehicle; punishment for violations of chapter generally; vehicle inspection by law enforcement officer without warrant"

Ohio: http://codes.ohio.gov/orc/4513.02

"(A) No person shall drive or move, or cause or knowingly permit to be driven or moved, on any highway any vehicle or combination of vehicles which is in such unsafe condition as to endanger any person."

California: http://www.leginfo.ca.gov/cgi-bin/displaycode?section=veh&gr...

"24002. (a) It is unlawful to operate any vehicle or combination of vehicles which is in an unsafe condition, or which is not safely loaded, and which presents an immediate safety hazard."

This research appears to have happened in Missouri, where it's harder to find the actual laws on the subject. That said, I did find this: https://www.mshp.dps.missouri.gov/MSHPWeb/PatrolDivisions/MV... which tends to imply that there are laws to this effect that I cannot easily locate via internet searches.


Failure to maintain your vehicle such that it puts other people at risk is against the law.

The people testing self-driving cars had IRBs that go over their test cases. Do these guys even know what IRB stands for?


and it's rarely prosecuted.


Yeah, well, all they did was stall the engine; they didn't tell the car to apply the brakes.


With what certainty did they know that was going to happen?


Because they had previously tested it and knew what each function was doing. This wasn't the Hackers movie, with them flying around a computer and poking and prodding random things.


How did they know they would only affect the one car they were targeting? What if another similar car also stalled out?


They decelerated a car. The brakes weren't even applied. This happens all the time on highways. It is unfortunate that it happened where there was no shoulder on the road, but if an accident had happened, I'm not so sure the researchers or journalist would have been at fault.

Here's a scenario:

Let's say a person is driving a car when their engine fails. There's no shoulder for them to pull onto, so they are just slowly decelerating when they are rear-ended by the vehicle behind them. Would you say the car that had the mechanical failure is at fault, or the person behind who wasn't paying attention?


> but if an accident did happen then I'm not so sure the researchers or journalist would be at fault.

So the people who purposely tried to cause the accident wouldn't be at fault for the accident if it occurred...?

I find it highly amusing that in your scenario you're treating an unpredictable failure as equivalent to an intentional act.

A better scenario would be:

I open your car bonnet while you go to the bathroom. I half-cut some cables, knowing that they will fail when you knock them a few more times. You come out, get in your car, and drive down the freeway. A few miles later your car stops suddenly in the fast lane, and a big rig going 70 MPH crashes into you while you sit there stopped, and you die. According to you, I am not at all responsible for your death.

Or better yet still:

You stop on the freeway just for fun, to see what would happen. Someone drives into the back of you at 70 MPH and THEY die. According to you, you aren't at all responsible for that.


There's a huge difference between stopping on the highway and decelerating due to lack of engine power. The driver knew what was happening, turned on his hazard lights, and didn't apply the brakes. Slowing down on the highway, although annoying, shouldn't be an unfamiliar or unsafe scenario (e.g. construction, a traffic backup).

This would be a completely different story if the researchers had applied full force to the brakes or accelerator, since those are unexpected (to other drivers), sudden, and difficult to react to.


Everything else aside, slowing without good reason is likely a traffic infraction (in Missouri, a misdemeanor punishable by up to 1 year in jail!).

It isn't that convoluted to hold the driver responsible for the vehicle: he knew, prior to driving into an area with a minimum speed, that there was some intent to tamper with it.


They decelerated a car enough for other drivers to honk; it was slowed to a crawl. States have had minimum highway speeds for 50 years for a reason.

What if their proof of concept hadn't worked as predicted and had slammed the brakes instead? This is just a reverse-engineered hack that was unleashed on a highway while the radio was blasting too loudly for them to hear each other on the call.


> but if an accident did happen then I'm not so sure the researchers or journalist would be at fault.

Legally, you are required to do what you can to avoid accidents.

Ethically, it's all kinds of fucked up when you rationalize with "well, if something goes wrong I can blame someone else."


Just because similar events occur under different circumstances doesn't mean this is ok.

This test could easily have been done on a closed test track or, heck, even a large parking lot.



