But why would it be? If the robot is given the First Law, then any such command should be "disabled" for anyone outside. You could try telling a robot to kill someone, but it shouldn't work. Your best chance would be to trick the robot into performing some action that would indirectly cause that person's death. But if it's smart enough to know how to protect a human life, it should be smart enough to refuse your command, knowing it would endanger that other person's life.