Assuming autopilot was indeed engaged, this seems to be clear evidence that cameras alone are insufficient to perceive big objects in a car's path OR judge its distance to them.
This car traveled in a straight line in near-perfect weather for at least 10 seconds toward a large motionless object blocking its lane and the next one, but still overlooked the danger.
Imagine if this had happened in bad weather or at night, where risk of failure is substantially greater. This degree of incapability implies that vision alone is not only inadequate to perceive obstacles ahead that should be obvious to any human, but profoundly inadequate.
It doesn't say anything about the limitations of cameras. It says a lot about the current software limitations of Autopilot. As does the owner's manual:
> Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles.
To be fair, Tesla has for years been pushing the point that Autopilot doesn't mean you can sit back and let the car drive, and that "Full Self Driving" is the term for that. Now that Full Self Driving doesn't look like it will live up to its name (not saying Teslas won't ever be fully self-driving, but the product they marketed as FSD to its first buyers is clearly never going to enable L4), they are saying FSD is just a collection of features in that direction, and "robotaxi" seems to be the next word meaning "really, really self-driving." Given the nonsensical timelines Musk has been giving, they'll have to find a way to subvert the expectations of that word too.
The occurrence of this crash shows that the self-driving system as a whole is insufficient. I don’t see how you can be sure the problem is the camera and not the software.
I don't think it's a problem with cameras. I think it's a problem with autopilot. We don't know exactly how it's coded, but lots of other adaptive cruise control systems don't try to look for stationary objects at all.
> lots of other adaptive cruise control systems don't try to look for stationary objects at all
At highway speeds they tend not to [0], since slamming the brakes for perceived stationary objects (signs on the side of the road, cars veering in front of you, etc.) might be even more dangerous; a rough sketch of that filtering trade-off follows below. Maybe the next generations of (reasonably priced) hardware will reliably detect such objects in the path of the car.
P.S. Of course it's not an excuse, it's an explanation for why this happens.
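Roughly what that kind of filtering might look like, to make the trade-off concrete. This is a toy sketch with invented names and thresholds, not anything derived from Tesla's or any other manufacturer's actual code:

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    range_m: float            # distance to the detected object
    closing_speed_mps: float  # how fast the gap is shrinking

def is_followable(track: RadarTrack, ego_speed_mps: float,
                  min_target_speed_mps: float = 2.0) -> bool:
    """Hypothetical ACC filter: only react to targets that are themselves moving.

    target speed = ego speed - closing speed; a parked firetruck and an
    overhead gantry both come out near zero, so both get filtered away.
    """
    target_speed = ego_speed_mps - track.closing_speed_mps
    return abs(target_speed) >= min_target_speed_mps

# Stopped truck ahead while we do ~110 km/h: closing speed equals ego speed,
# so the target speed is ~0 and the filter throws the detection away.
print(is_followable(RadarTrack(range_m=80.0, closing_speed_mps=30.5),
                    ego_speed_mps=30.5))  # False
```

The point is that a filter like this cannot tell a harmless roadside sign from a truck parked in the lane; both just look "stationary" to it.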
For stationary objects as big as a [fire]truck, that's not an excuse. They could see it early and slow down pretty gently if a lane change is out of the cards.
Edit: "P.S. Of course it's not an excuse, it's an explanation for why this happens."
For objects where you would not have to slam on the brakes, saying that it would be dangerous to slam on the brakes is not an explanation.
Slamming the brakes on the highway, interstate, or autobahn is dangerous for the driver behind you. In some countries (maybe all) it's illegal to do this without an immediate danger. The car may see a sign on the side of the highway as "in your path" simply because the road is curving and the car can't tell that the lane curves away and the sign sits outside it.
Between RADAR, LIDAR, and cameras I'm sure there's a hardware setup that can adequately resolve the issue, but that would be too expensive. Most manufacturers don't pretend their cars are self-driving. But one stands out, with Musk insisting the car is fully self-driving, safer than a human driver, and ready for prime time except for pesky regulation. It is not, and it will not be for a long time. They can probably self-drive in less than 1% of driving conditions, and even then they won't do it successfully 100% of the time. The only reason the statistics don't look much worse is that most drivers aren't suicidal and will keep saving the situation when the car shows that even driving straight between two painted lines is a challenge for a computer.
> Slamming the brakes on the highway, interstate, or autobahn is dangerous for the driver behind you. In some countries (maybe all) it's illegal to do this without an immediate danger. The car may see a sign on the side of the highway as "in your path" simply because the road is curving and the car can't tell that the lane curves away and the sign sits outside it.
I'm asking for a gradual slowdown, not slamming on the brakes.
There wasn't even a curve here. The lane went straight and visibly into the stationary object.
The whole thing about the "car doesn't know" part is that... the car doesn't know. The car just sees an object that sat there the whole time with speed 0 and is programmed to assume it is a stationary object on the side of the road, which is the likeliest explanation on a highway. Without LIDAR the car can't do much more than either keep braking for every object on the side of the road and get back up to speed once it realizes it's safe (however long that may take), or treat stationary objects as "most likely not in my path, will ignore." By the time it realizes a collision is imminent, it's too late to do more than slow down a bit.
Adaptive cruise control has one selling point: it maintains speed automatically, thus providing more comfort. If your car kept slowing down (or stopping) for such objects then nobody would use it, which is why it's programmed as such, for better or worse. Don't forget that the driver is still driving; the so-called "Full Self Driving" is just a collection of driver assists that require constant attention, since they can and will make fatal mistakes even in the simplest driving conditions.
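To put rough numbers on the "too late to do more than slow down" part (all figures here are illustrative guesses, not measurements of any real vehicle or system):

```python
def stopping_distance_m(speed_mps: float, decel_mps2: float = 8.0,
                        reaction_s: float = 0.5) -> float:
    """Distance covered while reacting plus distance to brake to a stop."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

ego_speed = 31.0           # ~112 km/h
confirmation_range = 50.0  # assumed range at which a vision-only system finally
                           # confirms the stationary object really is in the lane

print(f"need {stopping_distance_m(ego_speed):.0f} m, have {confirmation_range:.0f} m")
# need ~76 m, have 50 m -> only a partial slowdown is possible before impact
```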
There are points in the road where it's a bit unclear the exact path you will take. This is not one of them. The car has an extremely clear line of sight to the lane, enabling it to know that there are no curves before the obstacle.
Just like "emergency braking is bad" shouldn't remove the option of slowing down, "car can't always tell where the lane goes" shouldn't exclude using precise lane information when it is available.
> Humans do it with vision alone, do they not? This seems to be a failure of processing, not data.
In the best-case scenario, sure, but having been in motorsports since I was 16 and then spent a lot of my adult life in the auto industry, the reality is a resounding NO: humans are horrible behind the wheel of what is essentially a less-volatile bomb on four wheels.
I won't even mention my time riding around on a motorcycle on the street, as I have literally looked into the eyes of a driver only to see them make the mistake anyway; it's why I have to have a straight-pipe liter bike for the street (we've tested this and you can seriously hear me a third of a mile away). It's astonishing how bad 99% of drivers are; they realistically don't have the reaction times needed to drive anything with that much weight, much less power, in public, and yet we let them anyway.
Hell, during this weekend's protests I saw police and ordinary citizens running straight into people on the street holding signs; why isn't that seen in the same light?
So it is exactly because of those experiences that I have to side with this failure being just one of the many on the way to a more viable future where cars are fully autonomous. It's a shame; I'm just glad I got to live and learn in an era where humans were able to do what seems like an entirely absurd thing.
Yes and no, if you were driving based on a stream of fixed, low-res, low-framerate video you would definitely crash more often than you do now.
You're also disregarding the (hugely important) role of other senses like sound and stuff like "I saw a bike coming from the right 10 seconds ago, where should it be now?".
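For what it's worth, even a crude "memory" of past observations buys you something here. A toy constant-velocity sketch of the bike example; this is purely illustrative and nothing like how a production tracker is built:

```python
from dataclasses import dataclass

@dataclass
class LastSeen:
    x_m: float      # lateral position when last observed (positive = to the right)
    y_m: float      # forward position when last observed
    vx_mps: float
    vy_mps: float
    t_s: float      # timestamp of the last observation

def predict_position(obs: LastSeen, now_s: float) -> tuple[float, float]:
    """Dead-reckon the object forward under a constant-velocity assumption."""
    dt = now_s - obs.t_s
    return (obs.x_m + obs.vx_mps * dt, obs.y_m + obs.vy_mps * dt)

# A bike seen 40 m to our right ten seconds ago, heading our way at 4 m/s:
bike = LastSeen(x_m=40.0, y_m=5.0, vx_mps=-4.0, vy_mps=0.0, t_s=0.0)
print(predict_position(bike, now_s=10.0))  # (0.0, 5.0) -> it should be right beside us by now
```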
How do you think the objects are detected with "vision" by Autopilot? Are you considering classifying all the objects in the world in all possible situations so that the dumb computer can avoid them? So far that seems to be the plan.
You don't need to train with every object in the world to train in the general concept of road occluders.
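One way to read "general concept of road occluders" is as a free-space question rather than a classification one: don't ask what the thing is, ask whether the corridor ahead is open. A hand-wavy sketch, assuming some per-column depth estimate of the ego lane exists; this is an assumption for illustration, not a description of any shipping system:

```python
def corridor_is_blocked(lane_column_depths_m, horizon_m=120.0) -> bool:
    """True if any part of the ego-lane corridor returns a depth well short of
    the open-road horizon, i.e. something (whatever it is) is standing there."""
    return any(depth < horizon_m * 0.5 for depth in lane_column_depths_m)

open_road   = [118.0, 121.0, 119.5, 120.2]
truck_ahead = [118.0, 42.0, 41.5, 120.2]   # two columns stop at ~40 m

print(corridor_is_blocked(open_road))    # False
print(corridor_is_blocked(truck_ahead))  # True -> brake, regardless of object class
```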
Autopilot users are ALWAYS supposed to watch the road and take control in unusual circumstances. Like an autopilot system in an airplane, it will fly you straight into a mountain if you give it that course and don't monitor where you are.
> Autopilot users are ALWAYS supposed to watch the road and take control in unusual circumstances.
It looks like there is no single general concept of road occluders. If Autopilot can't avoid a big object in the middle of the road, what is it good for? Tesla claims to provide more than an airplane autopilot; they say its tech would avoid that mountain.