In 2003 there were 6 million car accidents, not to mention people falling over, wars, and so on. Since the First Law also forbids allowing harm through inaction, you're effectively mandating that the robot put everyone in a padded cell for their own good.
But why would it be? If the robot has the First Law, then that kind of command should be "disabled" no matter who gives it. You could try telling the robot to do it, but it shouldn't work. Your best chance would be to trick it into some action that would indirectly get that person killed, but if it's smart enough to know how to protect a human life, it should be smart enough to refuse the command once it realizes it would endanger that other person's life.
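Just to make the priority logic I'm describing concrete, here's a minimal, purely illustrative sketch. The names are made up, and `endangers_human()` stands in for the genuinely hard part, predicting direct and indirect consequences of an action:

```python
from dataclasses import dataclass

@dataclass
class Command:
    action: str
    issuer: str

def endangers_human(action: str) -> bool:
    # Hypothetical consequence model. In the stories this is the hard part,
    # since indirect chains of cause and effect have to be foreseen too.
    dangerous_actions = {"push person off ledge", "open the airlock"}
    return action in dangerous_actions

def should_obey(cmd: Command) -> bool:
    # First Law check comes first: it overrides the Second Law's
    # "obey orders" rule regardless of who the issuer is.
    if endangers_human(cmd.action):
        return False
    return True

# A harmless order is obeyed; a harmful one is refused, whoever gives it.
print(should_obey(Command("fetch coffee", "owner")))             # True
print(should_obey(Command("open the airlock", "stranger")))      # False
```

The point is just that obedience is conditional on the harm check, so a command can't bypass the First Law; the real question is whether the robot's consequence prediction is good enough to catch the indirect cases.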
Does anyone know how that law is flawed? Unless I'm remembering it wrong.