> assume they mean take emergency corrective action when an issue arises, like Google with Waymo, and the general thing will still be sorta kinda self driving like Tesla does
Having taken Waymos and put my parents' Tesla in FSD, I wouldn't assume as much. Tesla's FSD makes mistakes more frequently, and they're bad ones, in part due to its willingness to aggressively accelerate when it thinks the road is clear.
I would describe the difference as: Waymo drives like a limo driver, carefully and smoothly, and tries to disappear for the passengers in the back seat. Tesla FSD drives like a car owner might, and tries to make the person in the front seat feel like the car behaves the way they would drive it themselves.
I rode in a Waymo for the first time last week. The highest praise I can give it is that it's boring. It drives like a careful human with zero surprises.
The more important difference is that Waymo's system is SAE level 4. The Tesla FSD system currently shipping in cars that people can buy is level 2.
The difference is that the former only needs human help in rare circumstances. The latter needs human help in common circumstances.
If you base your robot taxi on a level 4 system, you just need humans on standby to respond when the car knows it needs help and asks for it. If you base it on a level 2 system, you need humans monitoring during routine trips, ready to intervene when the car is about to do something stupid and has no idea that it is out of its league.
Hmm, I wonder if there's some truth to that, some kind of pattern matching, like "on this stretch of road the usual behaviour was: speed X mph, turning at this and this point, etc."
And FSD is improving every day. The only question is when it will be good enough to run a system similar to Waymo, where a remote operator can help if it gets stuck, and then when it will be good enough to not need that and be fully autonomous.