The most pertinent point here is that although collecting lots of self-driving-car data is easy, the tail of unexpected events that you actually care about is surprisingly rare. Many of these events can also be entirely new (never seen before). Current SOTA AI techniques simply require too much data and cannot reason well enough to solve this.
Leaving Musk aside, among those who are serious about self-driving cars (e.g. Waymo), there is still a clear emerging picture that current AI techniques are too weak to get there right now. The limitations of current sensors and compute hardware are not helping either. Going the other way and eliminating LIDAR isn't going to help.
> Current AI SOTA techniques simply require too much data and cannot reason well enough to solve this.
If they had someone working on using the tech to ID mushrooms growing along the road then they’d probably come away with some new techniques to make the cars safer. Trying to make a self-driving car algorithm by only thinking about driving-related things seems like it will just lead to overfitting.
And in fact any car that doesn't have enough cognitive surplus to identify mushrooms while driving in normal conditions may just not be safe to drive, period.
I agree that Musk ditching lidar is the wrong move. From what I have read, most if not all of the self-driving accidents that Teslas have been involved in could have been avoided if a lidar-type sensor had been involved. With the tech available, restricting the car to the same limits as human vision is stupid.
I think Musk has no choice but to ditch Lidar. He has promised to provide full self driving on cars already on the road, and retrofitting all of those cars with Lidar is incredibly expensive.
Baloney. He had (and has) a choice; he chose to be a snake-oil salesman. Tesla could have focused on affordable (or luxury) electric vehicles. The choice to add self-driving was all Musk.
I agree with Musk and George Hotz that lidar is never going to get there. That's because in Michigan the terrain that lidar has mapped changes drastically every time it snows. For those of us in snowy climates, limiting self-driving to seven months a year isn't going to cut it.
But this doesn't explain why self-driving cars won't succeed in Arizona. They don't need to take over the world at once, they just need to work somewhere and that would be a good start.
Lidar has no problem with snow, only when it's _snowing_. And if visibility is reduced, a pure visual approach is going to be even worse.
Musk made a mistake, and he's spending billions to prove that he never made it.
Snow makes HD maps somewhat less useful, since the terrain changes shape. So you need a map that evolves. Sadly for Musk, the only company that can do that using purely visual data is Scape Technologies. Not even Lyft can do that.
Radar is utterly worthless for this sort of application: it has awful angular resolution. It can't tell the difference between a building next to the street and a truck parked in the middle of the street.
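To put rough numbers on the angular-resolution point, cross-range resolution scales linearly with distance. The figures below are illustrative assumptions (a few degrees for a typical automotive radar, around a tenth of a degree for a spinning lidar), not specs for any particular unit:

```python
import math

def lateral_resolution(range_m: float, angular_res_deg: float) -> float:
    """Approximate cross-range resolution: the smallest lateral
    separation two objects need, at a given range, to show up as
    two distinct returns."""
    return range_m * math.radians(angular_res_deg)

# Assumed, illustrative figures.
radar = lateral_resolution(100, 3.0)   # ~5.2 m at 100 m range
lidar = lateral_resolution(100, 0.1)   # ~0.17 m at 100 m range
print(f"radar: {radar:.1f} m, lidar: {lidar:.2f} m")
```

At 100 m, a ~3° radar beam smears everything within a ~5 m lateral band together, which is exactly why it can conflate a parked truck with the building behind it.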
Just because lidar doesn't work in certain places and conditions does not mean it's always useless. It just means that an AV cannot rely only on lidar. Lidar could be used as a supplementary sensor, providing additional sensory data when appropriate and switching to non-lidar mode when conditions require.
This is one of the areas where an AV can have a clear advantage over even an above-average driver: a superior sensory system.
If you switch to non-lidar mode in unfavorable conditions, why not just use the non-lidar mode in all conditions? A system that is safe in unfavorable conditions is likely to be safe in all conditions.
Because lidar allows faster travel in the conditions where it works. When there is heavy rain or snow, the only safe speed is very slow anyway. When roads are clear and dry, I really want my car to go faster than an elderly person walks (I walk faster than that normally), and lidar may be required for that mode.
That's false. The main reason Waymo is held up is because the accepted standard of driving is unsafe, and it has data and simulations that prove it.
For example, an unprotected left turn during rush hour. The gap between cars that people drive into in that situation often leaves no room for chance: if another car speeds up or changes lanes, the result can be an unavoidable accident. But the standard for human drivers is to take those chances during heavy traffic, and drivers who try to avoid them are not accepted.
The issue is that people will incorrectly hold Waymo's computer responsible for some unavoidable accidents in those scenarios, while at the same time expecting it to make those maneuvers even though the computer knows there isn't enough margin.
That's not a weakness of the AI. That's an inability of people to understand the nature of traffic and the relationship between AI and people.
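The margin argument can be made concrete with a back-of-envelope gap-acceptance check. All numbers here are illustrative assumptions, not anyone's actual thresholds:

```python
def gap_is_safe(gap_s: float, crossing_time_s: float,
                reaction_margin_s: float = 1.0) -> bool:
    """A left turn across traffic is 'safe' only if the time gap to
    the oncoming car exceeds the time needed to clear its lane, plus
    a margin for that driver speeding up or changing lanes."""
    return gap_s > crossing_time_s + reaction_margin_s

# Illustrative: assume clearing the intersection takes ~4 s.
print(gap_is_safe(4.5, 4.0))  # the kind of gap humans routinely take
print(gap_is_safe(5.5, 4.0))  # the kind of gap a cautious AV waits for
```

With a 4 s crossing time and a 1 s margin, a 4.5 s gap fails the check even though human drivers accept it constantly; that is exactly the "no room for chance" situation described above.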
There were scenarios in which this LIDAR technology would effectively blind detectors on other cars. I can understand how this puts the use case for this technology in question, since ideally, you'd want passive sensors, or at the very least sensors that don't put other people's lives at risk.
If only there were some way to compute entailments from unlikely combinations of events :-) But seriously, Gary Marcus and Ernie Davis make a good case in their new book Rebooting AI that logic has to play a role in any real AI.
The goal is not to achieve perfection, just to be demonstrably safer than an average driver.
I'm curious how we will evaluate the average driver. Clearly, including drunk-driving accidents in the mix is not a helpful metric for safety-conscious drivers.
In utilitarian terms, self-driving cars just need to be better than the drivers they replace. (Or an even stronger claim: they only need to be not much worse, because the added convenience is worth something.) I.e., we should start replacing the worst drivers already.
In practice, the target seems to be not just the average driver but the best human driver (or at least the 99th percentile). Given the trajectory of improvement we often see with AI, surpassing the average human and surpassing all humans shouldn't be too far apart.
You're imagining a scenario where everyone has a self-driving car and completely ignoring cost and social factors.
Who among us would admit, "yes, I am a bad driver, I need this more expensive thing to take control in the event of an accident"? Very few is my guess. In 2019, people use these for convenience, not because they think the system is a better driver. I bet that's true even for Waymo's taxis.
As for getting drunk drivers to use self-driving cars: that's about as easy as asking them to either stop drinking or stop driving. Then there is the cost; I wouldn't be surprised if the unemployed drink and drive more often.
If self-driving cars exist, courts will be more likely to take away the licenses of those who drive drunk. So I'd expect 10% of cars being self-driving to result in something like 20% fewer drunk drivers, because such drivers are more likely to end up in self-driving cars: not because they want them more than anyone else, but because they are given no choice.
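The "10% of cars, 20% fewer drunk drivers" intuition amounts to assuming AVs are disproportionately steered toward DUI offenders. A minimal sketch of that arithmetic, with the targeting factor as a purely hypothetical parameter:

```python
def drunk_driver_reduction(av_share: float, targeting_factor: float) -> float:
    """Fraction of drunk drivers taken off the road, assuming AVs
    replace the cars of DUI offenders `targeting_factor` times more
    often than uniform adoption would."""
    return min(1.0, av_share * targeting_factor)

print(drunk_driver_reduction(0.10, 1.0))  # uniform adoption: 10% reduction
print(drunk_driver_reduction(0.10, 2.0))  # courts steer DUIs to AVs: 20%
```

The next comment's objection cuts the other way: if early adopters skew wealthy and employed, the targeting factor for drunk drivers could just as easily be below 1.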
> If 10% of the cars are self-driving, to a first approximation there will be 10% less drunk drivers.
It seems unlikely that people who buy or use early self-driving cars drive drunk at the same rate as the general population. Surely employment rates would be a factor, for example.
I'm not sure whether or not there is a correlation here. Anecdotally, it seems to me there are plenty of wealthy, employed people who buy new cars and crash them after drinking.
But suppose it is true that those who drive cheaper, older cars are more likely to drive drunk. Today's new cars with high-tech self-driving features are tomorrow's affordable used cars. You have to start somewhere.