
I don't remember now, but I think I heard that that law is flawed and can be easily bypassed, even when used together with the other two.

Does anyone know how that law is flawed? Unless I'm remembering it wrong.



In 2003 there were 6 million car accidents, not to mention people falling over, wars, etc. You're effectively mandating that the robot put everyone in a padded cell for their own good.


Here's the paradox: a human tells a robot to kill another human; at that point, Law #2 is in direct opposition to Law #1.


But why would it be? If the robot is given the First Law, then that kind of command should be "disabled" for anyone. You could try to give a robot that order, but it shouldn't work. Your best chance would be to trick the robot into some action that would indirectly get that person killed. But if it's smart enough to know how to protect a human life, it should be smart enough to refuse the command, knowing it would endanger that other person's life.


Law #2 makes mention of Law #1:

A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
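If you wanted to see that precedence as code, a minimal sketch might look like the following. Everything here (the Robot class, the Order structure, the harms_a_human flag) is invented for illustration; it only shows the ordering rule that the Second Law yields to the First.

    # Hypothetical sketch: Second Law obedience, gated by the First Law.
    # All names and the order format are made up for illustration.
    from dataclasses import dataclass

    @dataclass
    class Order:
        issued_by: str       # the human giving the order
        action: str          # e.g. "fetch the toolbox"
        harms_a_human: bool  # whether carrying it out would injure a human

    class Robot:
        def obey(self, order: Order) -> str:
            # First Law takes precedence: refuse any order whose execution
            # would injure a human being, regardless of who issued it.
            if order.harms_a_human:
                return f"Refused: '{order.action}' conflicts with the First Law."
            # Second Law: otherwise, obey the human's order.
            return f"Executing: '{order.action}' as ordered by {order.issued_by}."

    if __name__ == "__main__":
        robot = Robot()
        print(robot.obey(Order("Alice", "fetch the toolbox", harms_a_human=False)))
        print(robot.obey(Order("Mallory", "push Bob off the ledge", harms_a_human=True)))

Of course, the hard part in the stories is exactly the bit this sketch hand-waves: deciding whether an action harms_a_human, especially indirectly.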



