The artificial morality of the robot warrior (roughtype.com)
9 points by timf on Feb 21, 2009 | 6 comments



I don't buy it. This isn't sci-fi where we're in danger of our soldier robots developing consciousness and turning on their creators.

The robots are complex pieces of software, no different from many other pieces of mission-critical military or healthcare software whose failure can lead to human deaths.

All we need is a hard-wired kill switch.


In the near future, I doubt the accidental deaths from military robots will exceed the accidental deaths from other pieces of military hardware.

Helicopter, plane, and jeep malfunctions kill people all the time. Not to mention human error! We already have control devices installed in aircraft (pilots) that accidentally bomb civilian weddings.

War is a messy business.


You have to account for the fact that there have been numerous IFF errors, including the US military firing on British troops. IIRC, at the beginning of the Afghanistan invasion a British plane was shot down by the US because the IFF incorrectly identified it as hostile.

With so many accidental deaths of soldiers and civilians already, I don't see how accidental firings by robots would exceed them. There have been roughly 95,000 civilian deaths and about 4,500 coalition military deaths in Iraq alone. Most of those civilian deaths wouldn't be reduced, since they're the result of suicide attacks, but those ~4,500 coalition soldiers would all have survived if they'd been replaced with robots. In fact, the military presence in Iraq could be ten times what it is without any protests over soldiers dying.


Not to mention that robots can operate with a higher threshold for opening fire than soldiers, since they can afford to take more risk.

Soldier logic: "Is that a suicide bomber approaching the checkpoint? Holy shit I'm scared. Better not take the chance. BOMB!" [Shoots the possible suicide bomber.]

Robot logic: Possible bomb sighting! P(Bomber) = 0.6, 0 civilians in blast radius. Might as well take the chance; worst case, he blows himself up and causes $50,000 in property damage (one dead robot).
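
A rough sketch of that expected-cost comparison, in Python. The 0.6 probability and the $50,000 robot come from the example above; the dollar value assigned to a casualty is a made-up placeholder, not from any real targeting system.

    # Sketch of the expected-cost reasoning in the "robot logic" example above.
    # All numbers are illustrative; the casualty cost is a hypothetical placeholder.

    def cost_hold_fire(p_bomber, civilians_in_blast, casualty_cost, robot_cost):
        # If the robot waits and the person really is a bomber, it loses itself
        # plus any civilians in the blast radius.
        return p_bomber * (civilians_in_blast * casualty_cost + robot_cost)

    def cost_shoot(p_bomber, casualty_cost):
        # If the robot shoots, there is a (1 - p) chance it has killed an innocent.
        return (1.0 - p_bomber) * casualty_cost

    p_bomber = 0.6                 # P(Bomber) from the comment
    civilians = 0                  # "0 civilians in blast radius"
    casualty_cost = 10_000_000     # hypothetical value placed on one human life
    robot_cost = 50_000            # "$50,000 property damage (dead robot)"

    hold = cost_hold_fire(p_bomber, civilians, casualty_cost, robot_cost)   # $30,000
    shoot = cost_shoot(p_bomber, casualty_cost)                             # $4,000,000
    print("hold fire" if hold < shoot else "shoot")                         # -> "hold fire"

With no civilians in the blast radius, holding fire only risks the robot, so the expected cost of waiting is far lower than the expected cost of shooting a possibly innocent person; a scared soldier can't afford to reason that way.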


The great irony in all of this, in my opinion, is that if we ever truly succeeded in creating a "moral" robot, it would throw down its arms and refuse to fight.


For some value of morality.



