New Laws of Robotics Proposed for US Kill-Bots
jakosc writes "The Register has a short commentary about a proposed new set of laws of robotics for war robots, by John S Canning of the Naval Surface Warfare Center. Unlike Asimov's three laws of robotics, Canning proposes (pdf) that we should 'Let machines target other machines and let men target men.' Although this sounds OK in principle, 'a robot could decide, under Mr Canning's rules, to target a weapon system such as an AK47 for destruction on its own initiative, requiring no permission from a human. If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear.'"
Robot laws (Score:5, Insightful)
unless ... (Score:2, Funny)
Re:Robot laws (Score:5, Insightful)
Re: (Score:2, Interesting)
Re: (Score:2, Insightful)
How about this set of principles:
Automated systems can fire on other automated systems willy-nilly, provided no people are likely to be in the fire zone.
Automated systems can fire on people, but only if that person is pointing a weapon at the machine or at the people the machine is there to protect.
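Just to make the ambiguity the replies poke at concrete, here's a minimal sketch of those two rules as code, assuming a perception layer that could supply the inputs (all names here are hypothetical):

from dataclasses import dataclass

@dataclass
class Contact:
    is_machine: bool            # automated system vs. human
    aiming_at_robot: bool       # weapon pointed at the machine itself
    aiming_at_protectees: bool  # weapon pointed at the people it guards

def may_engage(contact: Contact, people_in_fire_zone: bool) -> bool:
    # Rule 1: machines may fire on machines, but only if no people
    # are likely to be caught in the fire zone.
    if contact.is_machine:
        return not people_in_fire_zone
    # Rule 2: fire on a person only if they are pointing a weapon at
    # the machine or at the people it is there to protect.
    return contact.aiming_at_robot or contact.aiming_at_protectees

# A rifleman aiming at the robot is engageable; a bystander is not.
print(may_engage(Contact(False, True, False), people_in_fire_zone=True))   # True
print(may_engage(Contact(False, False, False), people_in_fire_zone=True))  # False

Note that unlike Mr Canning's proposal, rule 1 here refuses the shot when a person is in the fire zone, which closes the AK47-as-collateral loophole from the summary. Every boolean input, though, hides a hard perception problem, which is rather the point of the thread below.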
Re: (Score:3, Interesting)
How does it work with un-uniformed combatants?
Re: (Score:3, Insightful)
Poorly?
Re:Robot laws (Score:5, Informative)
It knows which is which because all of the friendly aircraft have IFF systems that identify themselves.
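For the curious: cryptographic IFF (the idea behind Modes 4 and 5) is essentially a challenge-response handshake. A toy sketch of the concept, not the actual military protocol; the key handling, waveforms, and message formats here are all made up:

import hmac, hashlib, os

SHARED_KEY = os.urandom(32)  # distributed to friendly aircraft beforehand

def interrogate() -> bytes:
    return os.urandom(16)  # fresh random challenge, so replays don't work

def transponder_reply(challenge: bytes, key: bytes) -> bytes:
    return hmac.new(key, challenge, hashlib.sha256).digest()

def classify(challenge: bytes, reply: bytes) -> str:
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return "friend" if hmac.compare_digest(reply, expected) else "unknown"

c = interrogate()
print(classify(c, transponder_reply(c, SHARED_KEY)))  # friend
print(classify(c, os.urandom(32)))                    # unknown

Note the failure case is "unknown", not "foe": a missing or bad reply could just be a broken or mis-keyed transponder, which is one reason IFF alone isn't a firing criterion.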
Re:Robot laws (Score:5, Interesting)
Re: (Score:3, Interesting)
Get over the idea that "terrorists" are new; they have been with us since we started forming tribes and throwing rocks at each other.
As for machines that autonomously kill humans, landmines and other such traps have been around for a long time. At one time landowners were allowed to set mantraps to catch poachers; now British troops have standing orders not to shoot at a fleeing enemy. On the whole I think we are becoming more civilized out of necessity, since it h
Re:Robot laws (Score:5, Interesting)
In times of old a commoner's life was less secure. The lords of old had their own armies, and armies are used to control territory regardless of who "owns" it (as in many places in the world today). Although the lords eventually lost their private armies, mantraps remained a legal symbol of disregard for commoners well into the 1800s.
** caution rant ahead **
Edison's father tied him up and gave him a public arse flogging in the center of town when he was a child; that same act in the same town today would land his father in jail. When I was a kid black people couldn't vote, homos deserved any beating they got, living together out of wedlock made you a social outcast, young and pregnant meant you had to give up your child for adoption so as not to shame your family, and being a woman (or black or Asian) meant I could pay you peanuts, harass you at work, and sack you for not sucking my dick.
In a lot of ways we treat each other with more respect than we did even 50 years ago. IMO the reason is that nation states are not that different from the feudal lords of England and Europe, who eventually worked out that beating the shit out of each other for the right to ransom each other's nobility was counter-productive. I think GWB took us a step backwards, but the historical trend toward a more and more "inclusive" society is hard to deny.
Re: (Score:3, Insightful)
"However, unless you believe that a woman submitted to the taliban mullahs participated in an "inclusive" society, GWB took an important step *forward* to a more just society globally."
It's not a simple binary choice; objecting to GWB's arrogance does not mean I support the oppression of women.
"GWB's doctrine that a powerful contry is entitled to perform global law enforcement"
And here I was
Re:Robot laws (Score:5, Informative)
If someone is controlling it, it is at best a telerobot (semi-autonomous) or, at worst, a telemanipulator.
A robot, by definition, is autonomous and does not have or require human control.
http://en.wikipedia.org/wiki/Telerobotics [wikipedia.org]
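The parent's distinction amounts to points on an autonomy scale; a rough sketch (the labels are mine, not a standard taxonomy):

from enum import IntEnum

class ControlMode(IntEnum):
    TELEMANIPULATOR = 0  # a human commands every motion directly
    TELEROBOT = 1        # a human sets goals; the machine fills in details
    ROBOT = 2            # the machine senses, decides, and acts on its own

def needs_human_in_loop(mode: ControlMode) -> bool:
    # Only a robot proper operates without human control.
    return mode < ControlMode.ROBOT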
Re: (Score:2)
I.e., we are fully capable of building robots that control themselves in order to carry out this task.
Re:Robot laws (Score:4, Insightful)
Re: (Score:3, Funny)
Re: (Score:2, Insightful)
Re:Robot laws (Score:5, Insightful)
Re:Robot laws (Score:4, Insightful)
The only purpose REMOTELY achievable by US military activity at the moment is to (forcefully) create states that are NOT dangerous enemies of western civilization. If we followed your logic, even that last hope would be lost.
Re:Robot laws (Score:4, Interesting)
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
Re:Robot laws (Score:5, Informative)
Re: (Score:3, Interesting)
Re: (Score:3, Interesting)
If what you state were true, the Norwegian special forces would not use the MP5. The MP5 fires a 9mm or similar round, which causes more trauma than the steel-jacketed 7.62mm or 5.56mm rounds.
If the Norwegian soldiers are
This is my pointy stick there are many like it but (Score:2)
"This is my pointy stick, there are many like it but this one is MINE. Without it I am useless, without me it is useless"........
oh.. and i can field strip my pointy stick faster than you!
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:3, Insightful)
The Geneva conventions take NO stance on 12.7mm/.50cal ammunition and its use on humans. For that matter, shotgun slugs are larger in diameter. The whole 'aiming at equipment, such as a belt buckle' business is most likely the result of somebody classifying the M2 as an 'anti-equipment' weapon, and of the tortured logic needed to make the system useful against charging troops again.
Now,
Re: (Score:2)
Re: (Score:2)
Geesh, get over it already.
In regard to your question: in war the goal is to win. Eliminating the enemy is an effective way of doing this.
Re: (Score:2)
But blackjack and hookers are okay.
Re: (Score:3, Insightful)
Oh yeah? (Score:4, Interesting)
They're not the only ones. The Afghans - even with legally-dubious US support - never defeated the Russians; they merely lasted longer than the Russian bank accounts. The Celts were amongst the worst European fighters who ever lived, getting totally massacred by everyone and their cousin Bob, yet Carthage stands in ruins, the Angles and Saxons survive only in tiny isolated communities in England and America (Oppenheimer's "The Origins of the British" shows that W.A.S.P.s exist only in their own minds; they have no historical reality), but the Celtic nations are actually doing OK for themselves at the moment.
Arguably, Serbia won the Balkans conflict, having conquered most of the lands belonging to its neighbors and slaughtered anyone who might claim them back. Uh, they're not doing so well for having won, are they? Kicked out of EU membership talks, Montenegro calling them a bunch of losers, Kosovo giving them the finger...
Hell, even the United States won in Iraq, as far as the actual war went.
Winning is the easy part. Anyone can win. Look how much of the world the British conquered. The British won far more than most nations could ever dream of. Yet contemporary accounts (I own several) describe the Great Exhibition as a PR stunt to create a delusion of grandeur that never existed. The Duke of Wellington, that master of winning, was described as a senile buffoon who was dribbling down his shirt and had to be propped up by others to stay on his horse. What's left of the Commonwealth shows you all too well that those descriptions of delusion were the reality, not the winning and not the gloating.
History suggests that those who come second in a war usually outlast those who come first.
No (Score:5, Insightful)
You're confusing governments with peoples. Yes, the Irish are still around. So are the Italians; so, in fact, are the Germans, Japanese, and Brits. Winning or losing wars rarely affects that, with notable exceptions like the Native Americans, for whom losing was pretty obviously a bad thing. What aren't still around are governments. And while winning might not make one last forever, I think Hitler and Hirohito would tell you losing is much worse.
Seriously, the only way winning would not be a virtue is if it led to complacency, arrogance, and ultimately weakness. But even then, you would have to _lose_ a war for it to matter. And really, with the exception of the Native Americans, most peoples have survived, and there's really no one to outlast. You are thinking of governments, and trust me: even if you can't name the governments that disappeared (fair enough, since winners write history), disappear they did.
Re: (Score:3, Insightful)
On the contrary, winning at any cost is often far worse than losing. A Pyrrhic Victory [wikipedia.org] often invites an even greater disaster in the future, but simply losing a fight means you can continue fighting in other ways, or fight again later when you've marshalled your strength and more carefully evaluated the enemy's weaknesses.
I'd draw parallels to current world events, but anyone willing to shred the Constitutio
people always get this wrong (Score:2)
Most people don't know that even now we have a pretty hefty problem with neural networks: it is impossible to train a behaviour into a neural network without also embedding the inverse behaviour. There is also no way to be 100% sure that the neural net won't ever access the region that contains the i
Killing Machines (Score:2)
Actually, there are weapons that civilized countries agree not to use. Landmines. Chemical and biological weapons. Some suggest atomic weapons. SciFi writers have been recommending that various nanotechnologies and automated robots should join the ranks...
And it's easy to assume that weapons will be used by their developers against the people they worry about defending themselves against
Re: (Score:3, Insightful)
Bullshit and liberal psycho-babble claptrap.
You get in a fight, the other guy is bleeding more than you are and is down for the count - You Win!
You get sued, the other guy loses more money than you - You Win!
You get into a war, you nuke the other guy into submission - You Win!
Yes, in each of these situations you lose something: blood, money, time, people, and equipment. But the other guy is worse off? You Win!
The only place your philosophy works is also the only place pacifism w
Re: (Score:2)
Yeesh. That being the case, I can think of places other than
Re: (Score:3, Insightful)
If you nuke someone (Score:3, Interesting)
If you beat someone in court, you win? Oh, then the Sioux own the Black Hills. Hey, they won their Supreme Court battle to reclaim them, and by your rules that makes them the winner, right? Uh, no.
If someo
Re: (Score:2)
Wrong.
From the Texas Statutes online [state.tx.us], Penal Code [state.tx.us], chapter 9 [state.tx.us]:
Re: (Score:2)
(B) to prevent the other's imminent commission of aggravated kidnapping, murder, sexual assault, aggravated sexual assault, robbery, or aggravated robbery.
If person A knocks out person B for no discernible reason, can you tell me why a reasonable witness would NOT conclude they were about to witness murder, assault or robbery? In fact, I see nothing in the section that you quoted that even requires a reasonable person. It merely requires that you prevent the imminent commission of those cr
Re: (Score:3, Interesting)
The last bit you said is the disturbing part regarding your kind, and is really revealing. You really believe that it would be BAD if we lived in a world where this worked? You actually want war, and thrive on it? This sort of stuff just makes me want the ability to screen embryos for conservatism all the more (and don't you
Re: (Score:2)
It would be bad, boring, and lacking humanity. But it would be peaceful.
Re: (Score:2)
It would be bad, boring, and lacking in humanity. But it would be peaceful. And peace without fulfillment and challenge is Hell on Earth.
Besides, neither of us is in danger of it actually happening in our lifetime, or in the lifetimes of our children for the next 100 generations, and probably 100 generations after that.
Re: (Score:2)
By your definition winning isn't necessarily good. I don't call it a victory unless the payoff is more than the investment. If you get in a bidding war on ebay and end up paying $100 for something you can get anywhere else for $50, ebay will still send you a congratulatory email calling you a "winner." But guess what, you're still a loser.
Not true (Score:2)
Re: (Score:3, Insightful)
You can't win a war either.
Bullshit and liberal psycho-babble claptrap.
No, basic understanding of war. There are no "winners" in a war -- there are only those who lose worse.
If you're sitting at a bar, and some guy gets rowdy, you get into a fistfight, and the other guy spends the rest of the night bleeding on the floor. Sure, you "won", but you'd much rather have not had to spill your beer and bruise your knuckles, to say nothing of the black eye you'll have in the morning.
War is the breakdown of diplomacy. Every single time the United States has entered military conflict
Re: (Score:3, Interesting)
You get in a fight, the other guy is bleeding more than you are and is down for the count - You Win!
A simplistic, limited, idealistic model. It falls apart as soon as you add a few more situational components: 1) you win the fight, the cops put you in jail, you maybe lose your job or get sued, so you lose your home, etc.
You get sued, the other guy loses more money than you - You Win!
You go down in multiple databases forever as having been in a law
Three rules... (Score:5, Insightful)
2) The military will program their toys to kill anything and everything, and to hell with Asimov (right up until they turn on us)
3) Humans already count as collateral damage in warfare. Damn the men, spare the oilfields!
Re: (Score:2)
Zonk, hand in your geek badge immediately.
Re: (Score:2, Insightful)
*yawn* (Score:2)
Re:Three rules... (Score:4, Informative)
Oh no! (Score:2)
I will be losing a lot of sleep over this in about 300 years.
Worthwhile pursuit (Score:4, Insightful)
Re: (Score:2)
We'll send wave after wave of our own men after them and once the robots have reached their pre-determined kill limit they'll shut down, and we'll return victorious. I see medals in our future.
huh (Score:4, Insightful)
Re: (Score:3, Insightful)
How is a robot supposed to know the difference?
Re: (Score:2)
Nice line of thought.
At what point should the Borg have their autonomy taken away?
Re: (Score:2)
How having an human killed by a robot is worse ... (Score:2)
Re:How having an human killed by a robot is worse (Score:2)
Re:How having an human killed by a robot is worse (Score:2)
Humans are likely to remain far better than robots, well into the foreseeable future, at distinguishing the features and behaviors that mark other humans as individuals who should not be targeted with lethal force. It is quite arguable that using robots programmed to attack humans would, in many circumstances, inevitably be an indiscriminate application of force endangering noncombatants that are protected under inte
Oblig. Robocop quote (Score:2)
OMG! THEY KILLED KINNEY! (Score:4, Funny)
Sounds more like RoboCop laws of robotics... (Score:5, Insightful)
Honestly though, I see value in a policy that no human life should be put at risk by automatic death systems - including land mines and other traps. These loopholes make that policy as useless as some RoboCop parody, though.
Ryan Fenton
No, the original three laws work too. (Score:3, Insightful)
This reflects how real wars are fought, too. Name me one war in all of recorded history that did NOT involve all sides first dehumanizing the enemy. We see it even today, with German soldiers posing for pictures with the skulls of defeated enemies, or American soldiers posing by naked and shac
Protects against political problems, not sentience (Score:3, Insightful)
The article summary doesn't give the right impression... the proposed policy would allow machines to target military machines. (see p.15-16 of the PDF) Page 23 is the most interesting, saying that anti-personnel landmines are looked down upon in the international community because they linger after war and kill civilians, whereas anti-tank mines aren't looked down upon so much, because they can only misfire during an armed conflict. So the policy is mostly intended to address international political responses to war, not to prevent sentient machines from taking over the human race.
Though it would limit somewhat the extent to which machines could enslave the human race: if humans never took up arms, machines could never take lethal action against them. That doesn't mean machines couldn't control humans politically/economically/socially (e.g. deny food, deny housing), but it does mean they couldn't adopt a policy of overt extermination of all humans, unless humans decided to fight to the last.
And how would these "laws" be programmed? (Score:4, Insightful)
Re: (Score:3, Insightful)
Note: this point was entirely lost in the movie remakes of these books.
So isn't it a little scary that we're actually comparing some policy to the three laws of robotics? To repeat myself: The whole poi
Armagedroid Strikes Again! (Score:2)
What's interesting about this concept, is what would prevent an AI system with t
Reminds me of a story (Score:3, Interesting)
Reminds me... (Score:2)
Illegal weapons (Score:2)
The understanding was that you couldn't shoot it at any person.
But you could shoot at their helmets, boots, canteens, etc.
He also said that before they left, the c.o. established an amnesty box where items could go, no questions asked, as long as they weren't brought on the plane. Following the detailing of the contraband, souvenirs that
it's ASIMOV (Score:2, Informative)
One of the greatest sci-fi writers ever.
what are you?
a Microsoft Word Spellchecker user?
Killbots? A trifle. (Score:4, Funny)
Killing by proxy, "collateral damage" (Score:3, Informative)
'a robot could decide, under Mr Canning's rules, to target a weapon system such as an AK47 for destruction on its own initiative, requiring no permission from a human. If the person holding it was thereby killed, that would be collateral damage and the killer droid would be in the clear.'"
The geneva convention frowns upon collateral damage [spj.org], though someone is not a civilian if they're holding a weapon (see the "spontaneously takes up arms" bit). That's not a good enough excuse. A person holding a gun is not necessarily a soldier. They could be a homeowner defending their property from looters, for example. That's why you are supposed to give a chance of surrender. Will a robot do that, reliably? Will a robot properly identify and treat hors de combat people?
Here's a bigger, related question: a robot is a) not a person and b) maybe more durable. A human soldier is allowed to fire in self-defense. Picture a homeowner in wartime, guarding his house. A robot trundles by, x-rays the house, sees the weapon, and charges in. He sees it heading for him, freaks out, and fires at it. How can the robot possibly be justified in killing him? Even if he represents a threat, he's only threatening a machine!
Second point: this is really just "killing by proxy." Regardless of whether you pull a trigger on a machine gun, or flip a switch on the General Dynamics Deathmachine 2000: if you knew your actions would cause injury or death, you're culpable. It's not the robot that is responsible when a civilian or hors de combat soldier is killed: it's the operators. Robots don't kill people: people build, program, and operate robots that kill people.
whoops! (Score:2)
The geneva convention frowns upon collateral damage [spj.org], though someone is not a civilian if they're holding a weapon (see the "spontaneously takes up arms" bit). That's not a good enough excuse. A person holding a gun is not necessarily a soldier. They could be a homeowner defending their property from looters, for example. That's why you are supposed to give a chance of surrender. Will a robot do that, reliably? Will a robot properly identify and treat hors de combat people?
Whooooops. The first s
mostly harmless? (Score:2)
How binding would this dichotomy be on human soldiers? Would we see a war-crimes tribunal in which a human soldier is charged with targeting an autonomous machine?
And you know, as long as we're going to base protocols of engagement on superficial semantics, why not be more specific and only let generals target gener
Premature (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
These aren't ethics laws for self-willed, scifi-like robots. These are ethics laws for users of fairly stupid (by scifi standards) autonomous armed combat systems. And since such systems are currently in active development, it is not at all premature to discuss what, if any, limits ought to be placed on their application.
Re: (Score:2)
The discussion is not premature. Land mines already kill humans and machines indiscriminately, and therefore would be in violation of this treaty. The interesting thing is that guided cluster bombs could violate the law if they were
Re: (Score:2)
Which would be horrible. But you miss my point: We may as well debate how much lead alchemists are permitted to transmute into gold in order to avoid destabilizing the world economy. The argument is moot: there is no reliable, cost-effective way to transmute lead to gold, nor are we on the verge of creating one.
Discussing whether or not to allow ED-209 to shoot at humans is silly. ED-209 can't tell the difference in t
Yet again missing the point. (Score:2)
In the way most people think of them, they cannot be practically implemented within an artificial intelligence of human-level ability. This is simply because such an artificial brain would be massively parallel by design, and detecting the breaking of the laws
would require something just as complex: a second artificial brain, which would itself be subject to the same problem. There are ways t
Sure... (Score:2)
Basic principle of warfare: apply your strength to the enemy's WEAKNESS. Humans are weak. Robots should actually target humans; they are far more effective that way. If there are no more humans, who is going to tell the enemy robots what to do?
John S Canning fails for not understanding the nature of war. Go ahead and keep building battleships, and ignore those aircraft...
This would make more sense to me (Score:2)
1. Riot control microwave pain gun.
2. Taser
3. Machine gun
4. Small missile launcher
5. Anti-tank Grenade thrower
6. Mine dropper
1 is for a crowd of humans. 2 is for a single human. 3 is for a small weapons outpost or another robot. 4 is for a flying vehicle or robot. 5 is for an armored vehicle. 6 is to help defend an area or when the robot is being chased.
The idea is of course, robots are extremely good at being given an ar
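The parent's threat-to-weapon table maps neatly onto a straight rule lookup, which is exactly the sort of thing robots are good at. A sketch under that assumption (the keys and the least-lethal fallback are my own invention):

# Choosing the table entry is trivial; correctly classifying the
# threat in the first place is the unsolved part.
WEAPON_FOR = {
    "crowd_of_humans": "riot-control microwave pain gun",
    "single_human":    "taser",
    "weapons_outpost": "machine gun",
    "other_robot":     "machine gun",
    "flying_vehicle":  "small missile launcher",
    "armored_vehicle": "anti-tank grenade thrower",
    "area_denial":     "mine dropper",
    "being_chased":    "mine dropper",
}

def select_weapon(threat: str) -> str:
    # Default to the least-lethal option when the classifier is unsure.
    return WEAPON_FOR.get(threat, "riot-control microwave pain gun")

print(select_weapon("armored_vehicle"))  # anti-tank grenade thrower
print(select_weapon("no_idea"))          # falls back to least-lethal

The fallback is the real design decision: a system that escalates when uncertain is precisely the kill-droid problem this story is about.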
Get it right (Score:2)
That's the entire point (Score:2)
Forgive my dramatic capitalisation, but Asimov's entire point seemed to be that these 3 laws, despite being pretty obvious, were deeply flawed and not at all thought through. Even in the movie (spoiler follows, even though the movie spoiled itself well enough) the whole point was that the computer interpreted enslavement as being better than the possibility of self-harm, from a s
War is Hell (Score:4, Insightful)
"War is Hell"
Ever read All Quiet on the Western Front? Ever talked to someone who was there, or to a civilian who lived through WWII in Europe?
War sucks. It's supposed to suck. Take away the pain and suffering that war brings to all sides of the battle, winners and losers alike, and you take away the reason to end it. Perhaps the generals should go watch Star Trek episode 23, "A Taste of Armageddon", circa 1967.
That society did such a nice job of making war "clean" that they decided to keep fighting for 500 years rather than just figure out how to make peace.
In most societies, people are taught that violence against others is fundamentally bad. This becomes a moral element that entwines all the people within that society. It also motivates the same people to find ways around doing violence.
If you study anything about the Nazi camps in WWII, you see a recurring pattern: the soldiers in the concentration camps knew what they were doing, but absolved themselves of any responsibility by hiding behind the statement "I was just following orders", thereby removing themselves morally from the activities. After WWII that defense was rejected and the acts treated as war crimes, a position backed by hundreds of trials across the world.
Fast forward 60 years and we are at a point where the soldiers operating the computer screens that control these killer robots can absolve themselves of any moral responsibility, because the Laws will simply allow them to say, "I was just operating a computer program." And while this is going on, there is no one left to come back from the battlefield to serve as a reminder of just how bad war really is and how important it is to avoid it.
At the same time, if we are going to commit to a war, we had better be willing to see it through even when it gets ugly. I'm pretty pissed at the news for giving us daily body counts of 4 and 10 soldiers in a 5-year battle. In contrast, WWII was hundreds to thousands a day, and everyone stuck to the plan. Everyone was committed and everyone knew why they were fighting. Vietnam wasn't so clear cut; it was rather vague why we were there, and even on day one not everyone was convinced we needed that war. And now we are in the Middle East without a convincing and clear-cut plan as to what we are doing, why we are there, or what we hope to accomplish, and not enough people in the States give a shit. Perhaps in New York City, but nowhere else.
They'll get their killer robots and their legal loopholes to kill anything they want, and no one will really do much because it's clean and doesn't interfere with "Dancing with the Stars", and the sheep continue to bleat
Please won't somebody think of the kill droids? (Score:4, Funny)
Honestly this strikes me like those people who adopt dogs that are prone to barking and then put collars on them to shock them when they do it. It's unfair, I say, and wrong to force something to act contrary to its nature. If you don't want a dog that barks much, you should adopt a breed which is not given to barking, or maybe just get a cat.
The same is true for robots. If you don't want a robot that runs around killing everything it detects (especially you), then you should forgo adopting or building a kill droid. Maybe go get yourself one of those friendlier industrial breeds that enjoys welding steel panels onto cars, or, if space is a concern, some types of robots are very small and even enjoy roaming around your home vacuuming the floor as they go. There is an appropriate type of robot for almost every situation. Please be responsible and only adopt a kill droid if you have adequate supplies of victims for it to kill.
Re: (Score:3, Funny)
Re: (Score:2)
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)