Annalee Newitz over at io9 raises an interesting ethical question when she asks, "Will we hold robots accountable for war crimes?"
Here's a bit of her post:
This isn't idle speculation. An automated anti-aircraft cannon's friendly fire killed nine soldiers in South Africa last year, and computer scientists speculate that as more weapons (and aircraft) become robot-controlled, we'll need to develop new definitions of war crimes.
So how will justice be served in these cases? Presumably, we'll punish a guilty robot by smashing it flat or refabricating it into a Kia Sportage owned by someone who commutes into DC.
But who else would we punish, especially if these robots are autonomous? The programmer who came up with the algorithm? And would the programmer, in turn, try to prove a software glitch caused by the manufacturer? Of course, this is the same question raised when 2001: A Space Odyssey hit screens almost forty years ago, and more recently in the Will Smith vehicle I, Robot.
If you want to weigh in on this issue on the ground floor, you should make it a point to attend the Technology in Wartime conference. Your conference fee even gets you a free T-shirt (bonus).
(Gouge: CM)
(Photos: iRobot's bomb disposal robot in action, courtesy iRobot; still from the film "I, Robot" starring Will Smith, courtesy 20th Century Fox)
-- Ward