Neither of those things was a "learned adaptation" from human driving, and this is one of the biggest fallacies around FSD: people letting their FSD do stupid things and intervening only at the last moment (if at all), in the belief that it will learn from its mistakes.
That it might use the direction of the car in front of it as a guidance track doesn't mean it understood the human gestures of the cop telling it to do that. The same goes for a concert parking lot, where an attendant might be alternating cars between the left and right lots.
> Because it copies human drivers, FSD actually was doing slow rolling stops for a while.
No, this was programmed behavior, with an interface/config setting, to perform a "rolling stop". People "freaked out" because Tesla was literally allowing the car to commit illegal traffic infractions, and if they'd do it for stop signs, what else would they do it for?
But none of this is some Tesla "swarm" learning to do rolling stops. There is no on-the-fly adaptive learning happening; it's all inference from static models trained according to parameters Tesla chose.
Version 11 had lots of hand-coded behavior. Version 12 is entirely a neural network, trained on human driving. Partly the data comes from people running FSD and intervening sometimes, and partly from passive observation of people doing their own driving. When FSD runs, video feeds into the neural net and it outputs vehicle controls, that's it.
Learning to respond to gestures is just more training on video and car control data. It shouldn't be hard to believe given all the other things we're doing these days with large neural networks.
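The training setup being described is basically behavior cloning: fit a policy to map what the camera sees to the controls a human applied in the same situation. Here's a toy sketch of that idea with a linear "policy" on synthetic data; all the names, dimensions, and the linear model are illustrative assumptions, not Tesla's actual architecture.

```python
import numpy as np

# Toy behavior cloning: learn controls from logged human driving.
# FRAME_DIM stands in for flattened camera pixels; CTRL_DIM for
# outputs like [steering, throttle]. Purely illustrative.
rng = np.random.default_rng(0)
FRAME_DIM, CTRL_DIM = 64, 2

# Synthetic "human driving" log: the human's behavior is some
# unknown function of what they see, plus a little noise.
true_policy = rng.normal(size=(FRAME_DIM, CTRL_DIM))
frames = rng.normal(size=(5000, FRAME_DIM))
human_controls = frames @ true_policy + 0.01 * rng.normal(size=(5000, CTRL_DIM))

# Fit a linear policy by gradient descent on mean-squared error:
# minimize ||frames @ W - human_controls||^2. No hand-coded rules.
W = np.zeros((FRAME_DIM, CTRL_DIM))
lr = 0.5
for _ in range(200):
    pred = frames @ W
    grad = frames.T @ (pred - human_controls) / len(frames)
    W -= lr * grad

# Inference: video (frame) in, controls out. That's the whole loop.
test_frame = rng.normal(size=FRAME_DIM)
controls = test_frame @ W
print(controls.shape)  # (2,)
```

The point of the sketch is that "learning to respond to gestures" doesn't require a gesture-recognition module: if the logged human drivers responded to gestures, a policy trained this way is pushed toward the same responses.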