It seems like you're not thinking this through. You would at least need some kind of certification that your car remains "street legal". You can't seriously expect to be allowed to run any autonomous driving software you want in your car?
Obviously the software should come with some guarantees, just like cars have safety standards. But those should not be so strict that they stifle competition, nor should companies be allowed to say "it's illegal to modify your car's firmware".
And how would you guarantee that the combination of parts you are putting together is safe? That would require standards throughout the entire car.
If the whole construction were modularized with open interfaces, I could imagine something like this working. So I must admit it seems theoretically possible to set this up, but we are currently nowhere near it. Every car is a black box that only the manufacturer really knows; the same goes for the software. What you demand would require all software modules to be "standardized" (think an API with safety requirements and guarantees) so that automatic verification could take place.
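To make the "API with safety requirements" idea concrete, here is a minimal sketch of what such a standardized module contract could look like. Every name in it (BrakeModule, SafetyEnvelope, certify) is hypothetical; no such automotive standard exists today:

```rust
// Hypothetical sketch of a standardized, machine-checkable module contract.

/// Guarantees a module declares about itself.
pub struct SafetyEnvelope {
    /// Worst-case time between command and actuation, in milliseconds.
    pub max_actuation_latency_ms: u32,
}

/// The contract every third-party brake module would have to implement.
pub trait BrakeModule {
    /// Declared guarantees, checked at install time rather than taken on trust.
    fn envelope(&self) -> SafetyEnvelope;
    /// Apply braking force in [0.0, 1.0], within the declared latency.
    fn apply(&mut self, force: f32);
}

/// An install-time check a regulator-approved loader could run:
/// refuse any module whose declared envelope violates the vehicle's limits.
fn certify(module: &dyn BrakeModule, vehicle_max_latency_ms: u32) -> bool {
    module.envelope().max_actuation_latency_ms <= vehicle_max_latency_ms
}

/// Dummy vendor module, just to make the sketch runnable.
struct AcmeBrakes;

impl BrakeModule for AcmeBrakes {
    fn envelope(&self) -> SafetyEnvelope {
        SafetyEnvelope { max_actuation_latency_ms: 20 }
    }
    fn apply(&mut self, force: f32) {
        println!("braking at {:.0}% force", force * 100.0);
    }
}

fn main() {
    let mut module = AcmeBrakes;
    // A vehicle requiring actuation within 50 ms would accept this module.
    if certify(&module, 50) {
        module.apply(0.3);
    }
}
```

The point is that the safety guarantees become declared, machine-checkable properties verified at install time, instead of living inside a manufacturer's proprietary black box.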
But what is the appropriate "module" size? Do we regulate every function call? Just whole applications? What if Apple sells the iPhone plus its software as one application? What generic rules could you use to decide what the "right" unit to regulate is? And by what mechanisms can we come to good decisions about this?
It’s an interesting vision... but also totally different from the world we live in today. It’s not as simple as you make it sound.