Driver wasn’t paying attention, not surprising. You have to actively ignore the warnings telling you to pay attention, or hack together a rig that puts torque on the wheel.
I don't understand how paying attention can prevent this sort of thing.
Let's say you're paying attention with all your might. You see something up ahead, like an overturned tractor trailer. You expect autopilot will sense it, but you're preparing for it not to. But how do you know it's not going to sense it? Only once it hasn't started reacting within a reasonable time. And by then it's too late!
In order for a human to have time to figure out that the software isn't working, it would have to routinely operate with (much) slower reflexes than a human. But then it would be useless, since you can't slow normal driving down that much, and in any case, that's not how they designed it.
The hypothesis that people have unreasonable expectations of something called "Autopilot" seems unnecessary and irrelevant to me.
There's a notable time gap between the time when the autopilot should start braking at a reasonable rate, and the last possible time when a full-strength brake will be sufficient. That time gap should be long enough for an attentive human to apply the brakes.
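For a rough sense of how wide that gap is, here's a back-of-envelope sketch in Python. The deceleration figures are my own illustrative assumptions (roughly 2 m/s² for a comfortable brake, 8 m/s² near the friction limit), not anything Tesla publishes:

    # Time gap between "gentle braking should have started" and
    # "a full-strength brake is the only thing that still works".
    # Stopping distance under constant deceleration: d = v^2 / (2a)
    v = 30.0         # speed in m/s (~67 mph)
    a_comfort = 2.0  # comfortable deceleration, m/s^2 (assumed)
    a_max = 8.0      # emergency deceleration, m/s^2 (assumed)

    d_comfort = v**2 / (2 * a_comfort)  # point where gentle braking must begin
    d_max = v**2 / (2 * a_max)          # last point a full brake still suffices

    # The car covers the distance between those two points at full speed:
    gap = (d_comfort - d_max) / v
    print(f"window for the driver: {gap:.1f} s")  # ~5.6 s

Even with these made-up but conservative numbers, that's several seconds of "it should be braking by now" before a full brake stops being enough.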
I am, however, doubtful that it's really reasonable to expect humans to maintain 100% attentiveness, given an autopilot that 99.9% of the time doesn't crash into things.
Ideally there'd be obvious visual feedback that reflects the current state of the Autopilot and what it recognizes. Something that can't be ignored (front and center on the windshield, maybe) and that gives the driver enough information to decide whether or not to intervene. This doesn't seem to be on anybody's radar, though. It's like everybody assumes they'll figure out a 99.99%-working driving AI where driver intervention is never necessary.
It's even more egregious with Tesla, which is already shipping a half-baked self-driving feature in a state where accidents like the one in the article are inevitable.
I could even imagine the windshield outlining everything that the Autopilot sees. If you don't see an outline around an object, you know the Autopilot isn't aware of it. No idea how technically feasible that is.
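As a sketch of what I mean (all the names here, Detection, render_hud, and draw_outline, are hypothetical, not any real Tesla or HUD API): the perception stack hands the display a list of detections, and anything without an outline is, by construction, something the system missed.

    # Hypothetical sketch: outline every object the system has detected
    # on a HUD layer, so "no outline" means "the car doesn't see it".
    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    @dataclass
    class Detection:
        label: str                      # e.g. "vehicle", "debris"
        confidence: float               # 0.0 to 1.0
        box: Tuple[int, int, int, int]  # (x, y, w, h) in HUD coordinates

    def render_hud(detections: List[Detection], draw_outline: Callable) -> None:
        for det in detections:
            # Dashed outline for low-confidence detections, so the driver
            # can tell "the car is unsure" from "the car never saw it".
            style = "solid" if det.confidence >= 0.5 else "dashed"
            draw_outline(det.box, style=style, label=det.label)

    # Stub renderer, just to show the shape of the interface:
    render_hud(
        [Detection("vehicle", 0.92, (420, 310, 180, 90)),
         Detection("debris", 0.31, (610, 340, 60, 30))],
        lambda box, style, label: print(label, style, box),
    )

Distinguishing low-confidence from high-confidence detections matters here, since a missing outline and an uncertain one call for different driver reactions.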
I think this sort of problem generalizes to a lot of software design, including the sort of things I'm assigned at work (mundane and boring as they are, since I'm not a real software engineer).
People say "make a program that does <thing> for me". And maybe you whip up something that does 50%, or 80% or 99.5%. But that only makes them more unhappy when it fails. A partial solution is no solution.
So you have to come up with a model where the software augments the human. Rather than trusting it to make the decisions, the software needs to take a large amount of data and distill it so that humans can more easily make the decisions themselves.
But people always want to avoid making decisions. That's why managers/leaders have so much power even though they tend to be despised and seem incompetent.
> I don't understand how paying attention can prevent this sort of thing.
The simple answer is that you take over when something absurd is happening in front of you, like a trailer flipped on its side blocking your lane.
You don’t wait till the last moment to take over. You would immediately move over to the right lane, manually. AutoPilot helps in this case by keeping you perfectly in lane while you check your blind spot before getting over. That’s how it’s supposed to work anyway, and that’s how I use it.
In this particular case, maybe it was possible to avoid it, but in general, small changes in a car's trajectory can have big consequences. This tractor trailer may have been huge and lying crosswise, but there are plenty of things you're supposed to pass within inches of, and as long as you miss them, even by millimeters, there's no problem.
What I tell my mom, who also drives a Tesla with AP: “If the AP isn’t driving better than you could at that very moment, if you think ‘Oh, I’d rather be just a bit further over there’ - that’s when you disengage it. Don’t wait to see what will happen next.”
Turns out that driving is 99.9% utter monotony and AP actually drives better than me during all that time.
I'm sorry, but if you see a truck on its side, expect autopilot to stop, and it doesn't, you are 100% at fault.
And if you saw it on its side, why wouldn't you at the very least change lanes?
It's fun to bash on Tesla whenever autopilot fails, but they do make it very, very obvious that you're expected to keep paying attention, and those of you saying otherwise either don't own a Tesla or really don't pay enough attention.
Yes, I would say so, if it's something you'd have avoided. It's one of those things where, if you're going to ignore all of the warnings and agreements when you get the car... how is that not negligence on the driver's part? Normal ACC doesn't always stop either, but no one yells at those cars. My Passat and e-Golf would regularly fail to stop when traffic was at a standstill.