New Laws of Robotics Proposed for US Kill-Bots 373
jakosc writes "The Register has a short commentary about a proposed new set of laws of robotics for war robots by John S. Canning of the Naval Surface Warfare Center. Unlike Asimov's three laws of robotics, Canning proposes (pdf) that we should 'Let machines target other machines and let men target men.' Although this sounds OK in principle, 'a robot could decide, under Mr Canning's rules, to target a weapon system such as an AK47 for destruction on its own initiative, requiring no permission from a human. If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear.'"
Robot laws (Score:5, Insightful)
Three rules... (Score:5, Insightful)
2) The military will program their toys to kill anything and everything, and to hell with Asimov (right up until they turn on us)
3) Humans already count as collateral damage in warfare. Damn the men, spare the oilfields!
huh (Score:4, Insightful)
Re:Robot laws (Score:5, Insightful)
Re:huh (Score:3, Insightful)
How is a robot supposed to know the difference?
Sounds more like RoboCop laws of robotics... (Score:5, Insightful)
Honestly though, I see value in a policy that no human life should be risked by automated death systems, including land mines and other traps. Loopholes like these, though, make that policy as useless as some RoboCop parody.
Ryan Fenton
Worthwhile pursuit (Score:4, Insightful)
Protects against political problems, not sentience (Score:3, Insightful)
The article summary doesn't give the right impression... the proposed policy would allow machines to target military machines. (see p.15-16 of the PDF) Page 23 is the most interesting, saying that anti-personnel landmines are looked down upon in the international community because they linger after war and kill civilians, whereas anti-tank mines aren't looked down upon so much, because they can only misfire during an armed conflict. So the policy is mostly intended to address international political responses to war, not to prevent sentient machines from taking over the human race.
Though, it would limit somewhat the extent to which machines could enslave the human race... if humans never took up arms, machines could never take lethal action against humans. That doesn't mean machines couldn't control humans politically/economically/socially (e.g. deny food, deny housing), but it does mean they couldn't take up a policy of overt extermination of all humans, unless humans decided to fight to the last.
And how would these "laws" be programmed? (Score:4, Insightful)
Re:Robot laws (Score:4, Insightful)
Re:Robot laws (Score:2, Insightful)
Re:Three rules... (Score:2, Insightful)
Premature (Score:5, Insightful)
Re:Robot laws (Score:3, Insightful)
Re:Robot laws (Score:5, Insightful)
Re:Robot laws (Score:3, Insightful)
Bullshit and liberal psycho-babble claptrap.
You get in fight, the other guy is bleeding more than you are and down for the count - You Win!
You get sued, the other guy loses more money than you - You Win!
You get into a war, you nuke the other guy into submission - You Win!
Yes, in each of these situations you lose something, blood, money, time, people, and equipment, but the other guy is worse off? You Win!
The only place your philosophy works is also the only place pacifism works: in a theoretical la-la world of perfect situations where everyone else thinks like you (god forbid that ever happens). The pacifist says, "I will not let you make me fight. Not even to defend myself." In your la-la world, the opposition says, "Gee, golly, gosh, he really means it, how could we ever think of carrying on in our evil plots? Let's sing kumbaya. Sorry." In the real world, the opposition says, "Great, kill this guy first. He's just a trouble maker. Now, let the tanks roll." The problem with pacifists is that, for them to continue existing and trying to make their philosophy work and propagate, people like me, willing to carry a gun, willing to sign up and deploy, willing to kill the other guy and break his stuff, must defend their sorry asses even while they decry me for doing so.
Wars are not only won, but spectacularly so.
Re:Robot laws (Score:3, Insightful)
On the contrary, winning at any cost is often far worse than losing. A Pyrrhic Victory [wikipedia.org] often invites an even greater disaster in the future, but simply losing a fight means you can continue fighting in other ways, or fight again later when you've marshalled your strength and more carefully evaluated the enemy's weaknesses.
I'd draw parallels to current world events, but anyone willing to shred the Constitution just to be able to kill a few Al Qaeda members is probably not interested in learning real political or military history.
No, the original three laws work too. (Score:3, Insightful)
This reflects how real wars are fought, too. Name me one war in all of recorded history that did NOT involve first dehumanizing the enemy, by all sides involved. We see it even today, with German soldiers posing for pictures with the skulls of defeated enemies, or American soldiers posing beside naked and shackled prisoners. Do you think these soldiers would be capable of such flagrant human rights violations if they first pictured their opponents as human? This isn't about a few bad apples; it's a product of training.
(As the character of Travis put it in Blake's 7, "I reacted as I was trained to react. I was an instrument of the service. So if I'm guilty of murder, of mass murder, then so are all of you!")
It's also an inescapable product of training. Like I said, dehumanizing isn't limited to a few people or a few wars; it has included ALL combatants in ALL wars, in as much of history as we have records enough to comment on. If you want a totally humanized nation, you simply cannot have an armed forces. Likewise, if you have an armed forces, you simply cannot have a totally humanized nation. I don't run the country, so which is "better" is not my problem. What I can be sure of is you can't have it both ways.
Re:Robot laws (Score:1, Insightful)
We already do this (Score:1, Insightful)
For instance, we can't target a person with an M2
Who needs fancy new robots? The Marine Corps makes robots the old-fashioned way: with brainwashing!
Re:Robot laws (Score:0, Insightful)
Re:Robot laws (Score:2, Insightful)
Re:Robot laws (Score:2, Insightful)
How's this set of principles:
Automated systems can fire on other automated systems willy-nilly, assuming no people are likely to be in the fire zone.
Automated systems can fire on people, but only if that person is pointing a weapon at the machine or at the people the machine is there to protect.
Automated systems cannot fire on people they cannot readily identify as carrying a weapon, and must fall back on human operators in such cases.
The weapons system must use two forms of verification to identify friend or foe.
All these laws are flexible depending upon the situation, such that an automated system trying to protect an area from a careening car bomb could open fire on the wheels of the vehicle, or a ship-to-ship missile defence system could fire on a careening object, be it manned or unmanned.
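The rule set above can be read as a simple decision procedure. Here is a minimal sketch of what that logic might look like in Python; every name, field, and threshold here is hypothetical and invented for illustration, not part of any real weapons-control system:

```python
from dataclasses import dataclass

@dataclass
class Target:
    is_automated: bool      # another unmanned system
    is_person: bool
    pointing_weapon: bool   # aiming at the machine or its protectees
    weapon_identified: bool # sensors readily identify a weapon
    friend_foe_checks: int  # independent IFF confirmations so far

def engagement_decision(t: Target) -> str:
    """Return 'fire', 'hold', or 'escalate' per the commenter's rules."""
    # Rule 4: require two forms of friend-or-foe verification first.
    if t.friend_foe_checks < 2:
        return "hold"
    # Rule 1: machine-on-machine fire is permitted freely.
    if t.is_automated and not t.is_person:
        return "fire"
    # Rules 2-3: fire on a person only with an identified weapon
    # pointed at the machine or its protectees; otherwise hand the
    # decision to a human operator.
    if t.is_person:
        if t.weapon_identified and t.pointing_weapon:
            return "fire"
        return "escalate"
    return "hold"
```

Even this toy version makes the commenter's escape hatch visible: the "flexible depending upon the situation" clause would amount to overriding these branches case by case, which is exactly where the loopholes creep in.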
Re:Robot laws (Score:4, Insightful)
The only purpose REMOTELY achievable by US military activity at the moment is to (forcefully) create states that are NOT dangerous enemies of western civilization. If we followed your logic, even that last hope would be lost.
Re:Robot laws (Score:1, Insightful)
Personally I'd rather be hit in the head with
Re:Robot laws (Score:3, Insightful)
Poorly?
No (Score:5, Insightful)
You're confusing governments with peoples. Yes the Irish are still around. So are the Italians, so, in fact, are the Germans, Japanese, and Brits. Winning or losing wars rarely affects that, with notable exceptions like the Native Americans, for whom I think it's pretty obvious losing was a bad thing. What aren't still around are governments. And while winning might not make one last forever, I think Hitler and Hirohito would tell you losing is much worse.
Seriously, the only way winning would not be a virtue is if it led to complacency, arrogance, and ultimately weakness. But even then, you would have to _lose_ a war for it to matter. And really, with the exception of the Native Americans, most peoples have survived, and there's really no one to outlast. You are thinking of governments, and trust me: just because you can't name the governments that disappeared (fair enough, since winners write history) doesn't mean they didn't.
Re:Robot laws (Score:3, Insightful)
Say you're sitting at a bar and some guy gets rowdy; you get into a fistfight, and the other guy spends the rest of the night bleeding on the floor. Sure, you "won", but you'd much rather have not had to spill your beer and bruise your knuckles, to say nothing of the black eye you'll have in the morning.
War is the breakdown of diplomacy. Every single time the United States has entered a military conflict, it has been because we had already lost our attempt at an acceptable non-military solution.
(And, for the historical record, liberals are far more likely to go loaded for bear at a real threat than to stage the conservative "oh what a wonderful army we have" pseudo-exercises.)
Re:And how would these "laws" be programmed? (Score:3, Insightful)
Note: this point was entirely lost in the movie remakes of these books.
So isn't it a little scary that we're actually comparing some policy to the three laws of robotics? To repeat myself: The whole point of the books was that the three rules didn't work.
Comment removed (Score:3, Insightful)
Re:Robot laws (Score:1, Insightful)
In the context of the USA alone, I must agree with you. However, unless you believe that a woman subjected to the Taliban mullahs participated in an "inclusive" society, GWB took an important step *forward* to a more just society globally.
If one wants to believe in things like justice, one should accept the existence of some global laws. The existence of laws implies the existence of law enforcement and, for better or worse, when it comes to things like dictators, the only global law enforcement we have are the armies of the bigger countries.
Unfortunately, the bigger countries usually don't attack unless they are attacked first. GWB only invaded Afghanistan and Iraq because of 9/11; meanwhile the Islamic government of Sudan is performing genocide in Darfur. However, if GWB's doctrine that a powerful country is entitled to perform global law enforcement were accepted, things like WWII and the genocide in the former Yugoslavia wouldn't have happened.
Re:Robot laws (Score:3, Insightful)
"However, unless you believe that a woman subjected to the Taliban mullahs participated in an "inclusive" society, GWB took an important step *forward* to a more just society globally."
It's not a simple binary choice; objecting to GWB's arrogance does not mean I support the oppression of women.
"GWB's doctrine that a powerful country is entitled to perform global law enforcement"
And here I was thinking GWB would never submit to international law?
"meanwhile the Islamic government of Sudan is performing genocide in Darfur"
The US has been on both sides there as well; 20 years ago the tribes in the south were the "terrorists" and the govt death squads were the "good guys". What is happening in Sudan is just another proxy war between members of the UNSC, most notably the US and China.
War is Hell (Score:4, Insightful)
"War is Hell"
Ever read All Quiet on the Western Front? Ever talked to someone who was there, or to a civilian who lived through WWII in Europe?
War sucks. It's supposed to suck. Without the pain and suffering that war brings to all sides of the battle, winners and losers alike, there is nothing left to deter the next one. Perhaps the generals should go watch Star Trek episode 23, "A Taste of Armageddon", from 1967.
The society in that episode has done such a nice job making war "clean" that they have decided to keep fighting for 500 years rather than just figure out how to make peace.
In most societies, people are taught that violence against others is fundamentally bad. This becomes a moral thread that binds all the people within that society. It also motivates those same people to find ways around doing violence.
If you study anything about the Nazi camps in WWII, you see a growing pattern in which the soldiers in the concentration camps knew what they were doing but absolved themselves of any responsibility by hiding behind the statement "I was just following orders", thereby removing themselves morally from the activities. After WWII this was judged a war crime, a position backed by hundreds of trials across the world.
Fast forward 60 years, and we are at a point where the soldiers operating the computer screens that control these killer robots can absolve themselves of any moral responsibility, because the Laws will simply allow them to say, "I was just operating a computer program." And while this is going on, there is no one left to come back from the battlefield to serve as a reminder of just how bad war really is and how important it is to avoid it.
At the same time, if we are going to commit to a war, we had better be willing to see it through even when it gets ugly. I'm pretty pissed at the news for giving us daily body counts of 4 or 10 soldiers over a five-year battle. In contrast, WWII cost hundreds to thousands a day, and everyone stuck to the plan; everyone was committed to it and knew why they were fighting. Vietnam wasn't so clear-cut: it was rather vague why we were there, and even on day one not everyone was convinced we needed that war. And now we are in the Middle East without a convincing and clear-cut plan as to what we are doing, why we are there, or what we hope to accomplish, and not enough people in the States give a shit. Perhaps in New York City, but nowhere else.
They'll get their killer robots and their legal loopholes to kill anything they want, and no one will really do much, because it's clean and doesn't interfere with "Dancing with the Stars", and the sheep continue to bleat.
Re:Robot laws (Score:3, Insightful)
The Geneva conventions take NO stance on 12.7mm/.50cal ammunition and its use on humans. For that matter, shotguns loaded with slugs are larger in diameter. The whole "aiming at equipment, such as a belt buckle" thing is most likely the result of somebody classifying the M2 as an "anti-equipment" weapon, and the resulting tortured logic needed to make the system useful against charging troops again.
The Geneva conventions only restrict weapons that cause unnecessary harm and suffering: stuff like glass rounds, which aren't any more effective at removing a soldier's combat capability, but make treatment of his wounds (assuming he survives) a huge difficulty.
Expanding bullets (hollowpoints) are a big question mark. They're more effective at enacting a "stop" on an unarmored target than FMJ of the same caliber, but to do it they trade off penetration. The Hague conventions (which the US didn't sign) banned them, yet they're the standard load for police and self-defense today, so the argument that they cause unnecessary suffering is shaky.
The "certification" of HP match-grade bullets for sniping went like this: 1. The HP bullet was the most accurate bullet found, out of several dozen candidates, both HP and non-HP. 2. Wound characteristics between the non-expanding HP and non-HP versions were essentially the same. Therefore it was approved. I'd argue that even if it were more wounding (i.e. more likely to quickly kill the target), it would still fit within the definitions. Take 9mm FMJ and 9mm HP: 9mm FMJ has something like an 80% stop rate* with a single chest hit, while 9mm HP is around 90%. It's outright more effective. Survival is slightly lower with HP rounds, but disability is about the same. Something nasty like a glass bullet would be more expensive and have a lower stop rate, yet a higher death and disability rate. Result: unnecessary suffering, banned.
*rate at which the person hit is no longer able to pursue aggressive actions within moments of being shot.
Re:Robot laws (Score:2, Insightful)
In WW2 we were attacked, and a genocidal nutjob had a real capacity to destroy the world. I'm happy we stopped Hitler, but I couldn't care less that we won "dominance" over the world.
As an American, I don't want "dominance" of the world, just prosperity. We don't need military bases on every continent, or to spend more than every other nation in the world combined, to be prosperous. As long as I and those in my community are doing well economically, I couldn't care less how much soft power I wield over foreign officials. And before you tell me that our economic success is contingent on our military strength, look at Ireland. They have a higher GDP per capita than ours, faster growth, and far better foreign relations. Somehow, they managed to do this without a giant military.