AI expert calls on colleagues to take a stand on autonomous killer robots

Artificial intelligence expert Stuart Russell, a professor at the University of California, has published a Comment piece in the journal Nature calling on colleagues to take a stand on the development of lethal autonomous weapons systems (LAWS): armed robots that would enter the battlefield without human masters and make decisions about whom to kill. He suggests the technology is now developed enough that it is time for those in the field, such as roboticists, AI experts and computer scientists, to step up and take a stand on the development of such systems.

Currently, drones on the battlefield are armed and used to kill and cause damage, but all of them are merely extensions of a human being: a human pilot sits remotely at the controls, steering the craft and pressing the buttons that deploy munitions. That could change very soon, Russell notes; a variety of scientists are working on projects that could very well lead to LAWS, and thus far little has been said about whether this is something that should happen.

From a purely military perspective, LAWS make sense: why send in human-piloted craft or foot soldiers when a robot could do the job, saving human soldiers from possibly being killed? But of course, it is not as simple as that. Every country has rules about humans killing humans, whether they cover civilians killing one another or soldiers killing as an act of war. In all cases, humans are expected to be able to tell the difference and to behave according to some sort of established rules. In private life, people are restricted by laws; in warfare, soldier behavior is governed by commanders, international law and, oftentimes, the court of worldwide opinion. But allowing LAWS to make such decisions, to choose whom to kill, would, Russell argues, violate fundamental principles of human dignity. He suggests that the AI community has a duty to take a position on such technology, presumably against it, noting how physicists around the world have banded together to state their opposition to nuclear weapons, and how other scientific groups have worked together to help outlaw chemical and biological weapons and laser weapons that could blind soldiers. Such a movement needs to be started, he argues, at all levels of the scientific community, to debate the merits or failings of LAWS and to make those views public; failure to do so, he insists, would be akin to nodding silently as such weapons are developed and eventually deployed.


More information: Robotics: Ethics of artificial intelligence, Nature, www.nature.com/news/robotics-e … intelligence-1.17611
Journal information: Nature

© 2015 Tech Xplore

Citation: AI expert calls on colleagues to take a stand on autonomous killer robots (2015, May 28) retrieved 20 July 2019 from https://techxplore.com/news/2015-05-ai-expert-colleagues-autonomous-killer.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.

User comments

May 28, 2015
While he's at it, can we please also take a stand against HUMANS being armed and used to kill and cause damage?

Thank you very much.

(I think if we don't get that one down then keeping people from developing killer AI bots is a lost cause)

noting how physicists around the world have banded together to state their opposition to nuclear weapons,

Didn't work so well, did it?

have worked together to help outlaw chemical and biological weapons

Hasn't prevented their research, use and stockpiling, either.

and laser weapons that could blind soldiers.

Only successful because development of such a weapon - if a country/organization were in dire need - would take mere hours.

So I'm not holding out much hope that this will result in any effective prevention of killer AI.

May 28, 2015
While he's at it, can we please also take a stand against HUMANS being armed and used to kill and cause damage?

Thank you very much.


As long as Islam exists that will never be possible, unfortunately.

Only successful because development of such a weapon - if a country/organization were in dire need - would take mere hours.


The author is apparently unaware that we are already using fighter-scale and even capital-scale laser weapons on the battlefield, in bomb-sweeping missions, and as part of anti-missile defenses. An anti-missile laser such as the THEL or COIL can cut a human being's arm right off with virtually no resistance. There is a weaker version called ZEUS which is used for mine sweeping and safely disarming unexploded ordnance, which usually means heating it until the detonator is destroyed or until it explodes spontaneously at a safe distance.


May 28, 2015
Then there will be SKYNET!

Lethal autonomous weapons systems (LAWS) are essentially mobile re-loadable land-mine technologies. Consider that a land mine is autonomous, and indiscriminate.

So how well is the anti-land-mine campaign going?

People make weapons, and consider the consequences after the fact.

Perhaps these weapon makers qualify for the Darwin Award.

May 28, 2015
There is another moral problem arising too.

If you create a robot capable of knowing good from evil well enough to identify an enemy from a civilian, you've just created a free moral agent with human-like intelligence, and it now has inherent "human-like" rights. This has been explored in science fiction with Data and the Doctor in Star Trek, but really, when does it stop being a toaster oven and start being a "being"? Near as I can tell, the potential to distinguish friend from foe and good from evil makes it a "being", and a "being" has rights; see the Declaration of Independence for the moral and logistical problem this creates. A weapon isn't very useful if it has a right to live and a right to happiness and liberty.

You can be assured that anything smart enough to distinguish friend from foe is also smart enough to make amoral decisions and rebel in some way. That is a real problem, and it's not science fiction. Proof? Look at any murderer among the human race.

May 28, 2015
Maybe he should simply start a call that anyone working on killer AI should actually make them work - but put in a backdoor so that the AI will turn on their own generals/politicians if ever used (outside of test scenarios).

May 28, 2015
Of course, it doesn't meet the biological definition of "Life" if it doesn't meet the definition of "inclusive fitness", which is to say it can't reproduce on its own. It's more like an intelligent virus in that it needs a central production facility to copy itself, and moreover the robots might not have the code to copy themselves, though it seems nothing prevents them from learning the code by chance or by inspection.

Anyway, the lack of self-reproduction is the one thing that would prevent it from meeting the biological definition of life.

May 28, 2015
Maybe he should simply start a call that anyone working on killer AI should actually make them work - but put in a backdoor so that the AI will turn on their own generals/politicians if ever used (outside of test scenarios).


That's awful.

Conspiracy to commit murder; like 500 counts. Conspiracy to commit treason. Unlawful access to a computer; as many counts as the number of robots affected. Reckless endangerment. War crimes; targeting/killing unarmed civilians.

May 28, 2015
Maybe he should simply start a call that anyone working on killer AI should actually make them work - but put in a backdoor so that the AI will turn on their own generals/politicians if ever used (outside of test scenarios).


That's awful.

Conspiracy to commit murder; like 500 counts. Conspiracy to commit treason. Unlawful access to a computer; as many counts as the number of robots affected. Reckless endangerment. War crimes; targeting/killing unarmed civilians.


Doesn't seem so bad compared to your proposed genocide: "As long as Islam exists that will never be possible, unfortunately."

May 28, 2015
As long as Islam exists that will never be possible, unfortunately.
@Returners
absolutely true, but a little skimpy
it should read: as long as RELIGION exists

not faith, mind you, but the actual codified rules used to control others and create friction and judgement: IOW - religion

will turn on their own generals/politicians
@AA_P
given the fact that code and programming can mutate over time and the fact that we can also utilize this method of mutation (polymorphism), isn't this pretty much a very likely scenario in the long run anyway?

May 28, 2015
While he's at it, can we please also take a stand against HUMANS being armed and used to kill and cause damage?

Thank you very much
This only means that other humans who want what they have would arm themselves and kill them. As they are doing right now all over the place.

Or we could pretend that we live in heaven already where everybody is good and never gets hungry.

You do like to pretend, dont you aa?

Re the article I would rather trust an AI that never gets confused, angry, tired, sick, hungry, etc to make these sorts of decisions.
https://www.youtu...GoIxoVGk

May 28, 2015
As long as Islam exists that will never be possible, unfortunately
I hate religionist bigots.

"Even though this movement failed, Kony used a similar spiritual base. He believed that he was a prophet sent from God to purify the people of Uganda and to create a bastion of peace. Kony had been a soldier with the Uganda People's Democratic Army (UPDA), which got him involved in military affairs. The leaders of the UPDA signed an agreement with the Ugandan government called the Gulu Peace Accord of 1988 in which most of the former rebels were integrated into the government's army. Kony refused to go along with the agreement and splintered off with other soldiers. With the combination of his military background and religious beliefs he created the Uganda Christian Democratic Army and began fighting against the government. In 1991 he changed the name of the group to the Lord's Resistance Army."

-Youre all unspeakably evil because you believe that unbelievers cant be good.

May 28, 2015
Re the bradley cooper vid I posted, an AI would have instantly identified the grenade and shot the woman before she could hand it off to the kid. Humans cant think that fast.

Or it could have read the mans lips as he was talking on the phone and shot him. Done.

May 29, 2015
Gullible enthusiasts and supporters of autonomous cars argue that, if widely implemented, they would almost eliminate accidents, since machines do not make mistakes.

While that may be conditionally true for traffic accidents, it smuggles into our collective mind a dangerous notion, especially in the context of killer machines: that machines are "morally" superior to man by default since they are flawless in their moral judgments. This is an absurd notion, since machine judgments are human judgments but without human doubt and compassion.

It should be called for what it is: an attack on basic human rights. "To err is human; to forgive, divine" (A. Pope), and we should fight and even die for the right to be wrong; otherwise that would mean the end of humanity, including those insane Silicon Valley militarists who push it. Killer machines are already among us. Choose your side now.

May 29, 2015
But as winthrom already pointed out: there are researchers with no compunctions about working on better anti-personnel mines. So I have no doubt there will be researchers working on killer AI despite any kind of majority-consensus paper by scientists/engineers. If all else fails, the work can be cloaked so the one working on it doesn't actually know what he's working on.

Told this story before: In my uni years I was working as a "data typist" in a project designed for speech recognition in cars (phone calls, controlling on-board systems like the radio, etc.) Turns out the project was partially funded by the NSA. The algorithms invented there now are used to catch all of us speaking 'suspicious' words on the phone.
We really thought we were doing something worthwhile at the time. I'm still rather angry about that one.

May 29, 2015
When I think about laser targeting AI being added to this beast ( http://phys.org/n...deo.html ) , I really start to worry.
Where do I sign up for the Mars colony?

May 29, 2015
Without military funding, AI will go nowhere.

May 29, 2015
"Turns out the project was partially funded by the NSA. The algorithms invented there now are used to catch all of us speaking 'suspicious' words on the phone."
------------------------------------------

AA, I got the same feelings after helping put together, deploy, and operate the Electronic Battlefield in Southeast Asia, 1967-68. It was reinforced by being drafted onto a Federal Criminal Grand Jury as Deputy Foreperson in almost 70 sessions over 24 months.

I no longer trust my government. Bush and Reagan gave it to the filthy rich and powerful.

Now, they OWN us.

May 29, 2015
Our violence-addicted Christians seem to have "forgotten" the hundreds of thousands who were burned alive slowly by their Loving Christians. And the Witch hangings as well.

And the Holocaust, perpetrated by more loving Christians, which is more recent, showing it is not all in the past. They are just as blood-thirsty as before: "Bring 'em on!".

May 29, 2015
machines are "morally" superior vs. man by default since they are flawless in their moral judgments. This is an absurd notion since machine judgments are human judgments but without human doubt and compassion
They would be programmed with all the morals and good judgement that the best of humanity has, and would apply it far faster and more consistently than any human ever could. This would make their judgement and dependability better than 99% of the soldiers we already trust to make these decisions. And they would ALL have this. No William Calleys in the lot.

You doves are under the mistaken impression that if we make war painful enough it would simply cease to happen. This is because youve fallen for the fairy tale that greed is the cause of war.

But war happens because humans inevitably overpopulate. And people who are watching their children starve will do anything to feed them, including taking what you need to feed yours.

Only zero growth can end war.

May 29, 2015
It should be called for what it is, an attack on basic human rights. "To err is human and to forgive is divine" (A.Pope) and we should fight and even die for the right to be wrong otherwise that would mean the end of humanity including those insane Silicon Valley militarists who push it. Killer machines are already among us
Ahaahaaaa 'to lose and be annihilated is divine'
Choose your side now
We already have. The surviving side. Your side was extincted long ago.
AA,I got the same feelings after helping put together, deploy, and operate the Electronic Battlefield in Southeast Asia 1967-68
... And you did this as a what? A noncom? You were in your 20s and never went to OCS did you?

May 29, 2015
We really thought we were doing something worthwhile at the time. I'm still rather angry about that one
What, angry that you werent working for the enemy who was doing the same thing?

May 29, 2015
Those who have squandered their lives on the sidelines always have nasty comments for those of us who have actually worked in the Real World, not relying on Wiki.

Never having served, otto does not understand who knows what they are doing, and who does not. If you think our Commander (a major with an EE) understood what we were doing, you are in error. Send me your address, and I'll send a copy of my performance reviews.

May 29, 2015
"Send me your address, and I'll send a copy of my performance reviews"

-Nah post your proof here that as a 20+ year old noncom (at best) you were

"helping put together, deploy, and operate the Electronic Battlefield in Southeast Asia 1967-68"

-Or put it on your website. I am sure the smithsonian as well as all your many fans would be interested.

Funny, none of them appear to be here because no one here has expressed the slightest interest have they? Everybody here is busy questioning your pedigree in light of all the verifiable garbage that you post.

I can imagine that as a noncom you were busy stringing wires and plugging things in. Yes?

May 29, 2015
Your case of the otto Syndrome is probably being passed around right now in some VA clinic. The psychologists there need a laugh, after dealing with the real people, those who have served.

I sent one to review otto's posts, for the fun of it.

For the rest of you, "otto" here has already bragged in these fora about being a lurker, a sniper hiding behind pseudonyms, playing "games". His words.

May 30, 2015
I wonder why George is so buddy buddy with VA psychiatrists?

The extent of your delusional nature is becoming clearer with every post. You can't hide your sickness here George. Here, most people are smarter, more knowledgeable, more accomplished than you.

And so every time you pretend to be an engineer, or profess to be an expert in this or that, and then proceed to post your made-up facts, you are easily exposed.

This is true whether you choose to, or are constitutionally able to, acknowledge it or not.

May 30, 2015
Explain please how stringing wires and plugging things as a 20+ year old noncom, under the direction of officers and specialists, gives you special insight into the nature of the NSA and 'big brother'.

EXPLAIN.

May 30, 2015
Would somebody explain to the lurker who hides behind the pseudonym "otto" how the world works? Not having served, and not having worked in the Real World, this goober thinks you need a paper with the degree of "Engineer" to actually be one. I taught engineers for about 30 years, otto, and could even teach YOU how it all works together.

That is what Power Quality is, otto, the resultant of interactions between complex systems. You have to understand many things to do it, otto, as you have seen in my website. Go back and check out the graphs of non-linear currents again, and see how they are expressed in the change of the Voltage waveform. Then, come back and I will explain how the different frequencies of harmonics have different effects in certain systems based on configuration, and the frequencies of the signals.

I will teach you about thirds and fifths, otto, and what they do to motors, to wiring, to your facility, and how YOU produce them yourself.

Ready?

May 30, 2015
Okay, I will explain to otto and those without real-life experience how the system works. Commissioned officers are managers. Except for professions such as pilots and medical staff, none of them really understand what the enlisted specialists know. I had to bail my EE-degreed Commander out twice.

Your concepts, like Reagan's, apparently come from movies.

Yes, I see VA folk for PTS, the residue of serving my nation in rough times. You do not get it, otto: we lost 55 of us "non-combatants" while you were cowering at home. I watched as too many aircraft crashed and burned, sometimes with folk I knew inside. Yeah, I am opening myself up here, otto, to slap you out of your adolescent revenge at being shown up.

Now, how do we get these anonymous snipers like otto off this forum?

May 30, 2015
@TheGhostOfOtto1923: Please refrain from casting aspersions on non-coms. I was an Air Force non-com with a degree in math writing S/W for Air Weather Service. I got a commission as a communications officer and lost all my authority! Non-coms make things work. All degreed officers know and rely on that. I later got an MS degree in computer science, hardware. As an officer I wrote a few patents for the AF, and NEVER would have done that without the help of a retired non-com in the engineering shop I was working in. His name is on my first patent along with mine. He didn't think he did enough to warrant inclusion; I knew better.

May 30, 2015
winthom, when were you in? We had the U-2's and the WB-57's working "weather" in my day.

May 30, 2015
1965-1985. Non-com from 1969-1974. Commissioned 1974. Started out in 6th Mob, radiosonde. Saw WB-47s on Okinawa, JP, that relocated during a typhoon in the Philippines. Kadena runway clogged with them. I was supporting the B-52 refueling squadron (KC-135) for Vietnam bombers. No Wx U-2, but saw some SR-71. Later was at Reese AFB (Lubbock, TX). Then to AFGWC, Omaha, NE. After commission went to Keesler AFB (ATC) Biloxi, MS, for Communications-Electronics, and stayed as an instructor. Later an inventor.

http://patft.uspt...,406,627

http://patft.uspt...,761,762

May 30, 2015
My retired non-com friend went on to make another patent for himself:
http://patft.uspt...,761,762

correction, another one I did was:
http://patft.uspt...emory%29

May 30, 2015
Cool stuff. I got my comm at Keesler, 1965-66, then went to Edwards for the time of my life, but with scuffs and scratches. We lost more pilots in my year at Edwards than we did in my year at Korat, 1967-68.

While at Eddie's, got orders in the middle of the night to be offbase and go to an outfit which did not exist, and helped start Igloo White, the Electronic Battlefield, for McNamara. Cool stuff, too!

We have too much in common to discuss it here. May I send an email or other by looking you up by the name on the patent? Or, I am george@kamburoff.com, if you are interested.

May 31, 2015
1. There will always be scientists out there who will take the paycheck over the morals. Nuclear, biological, and chemical weapons were still developed and deployed.

2. Who would have to worry about war weariness in a nation that sent no soldiers into battle?

3. Trying to prove a criminal act in war is nigh impossible. It is well known that militaries the world over deploy "illegal" weaponry. The problem with convicting is how to prove who did what to whom in a combat environment. So these banned weapons are deployed regardless. That laser that blinds soldiers? It's still out there, and many a nation is willing to use it.

Sorry Stuart this battle has already been lost.

May 31, 2015
-Youre all unspeakably evil because you believe that unbelievers cant be good.


It seems you're just as guilty of believing that all religious people are inherently bad the way you put it.

May 31, 2015
Surely it is possible that these super weapons could be hacked - and turned against their masters? I can imagine a super high tech arms race - where the fear would be that our army of super bots - would suddenly turn around - and start nuking their home land. Would an e.m.p. bomb suddenly disable your whole army? Sometimes it seems like our stupidity knows no bounds. But if the Chinese or Ruskies have better bots than we do - kiss our asses good bye. So many questions????


There are so many ways to counter an EMP weapon that I don't know why people keep mentioning them.

May 31, 2015
Our violence-addicted Christians seem to have "forgotten" the hundreds of thousands who were burned alive slowly by their Loving Christians. And the Witch hangings as well.

And the Holocaust, perpetrated by more loving Christians, which is more recent, showing it is not all in the past. They are just as blood-thirsty as before: "Bring 'em on!".


I must say what I find interesting is that no organized group has achieved the killing rate of non-religious states such as communist China and the USSR. Nobody. China in the early '50s outdid all the combatants in WW2 in two years less time. That was some leap forward.

May 31, 2015
As long as Islam exists that will never be possible, unfortunately.
@Returners
absolutely true, but a little skimpy
it should read: as long as RELIGION exists

not faith, mind you, but the actual codified rules used to control others and create friction and judgement: IOW - religion

will turn on their own generals/politicians
@AA_P
given the fact that code and programming can mutate over time and the fact that we can also utilize this method of mutation (polymorphism), isn't this pretty much a very likely scenario in the long run anyway?


I don't see religionist or goat herding muslims developing AI killer robots. THAT is all your scientists of "civilized" nations brainstump. Religion my ass, technology and scientists will lead to the EXTINCTION of mankind PERIOD. You want to save mankind then you better start praying.

May 31, 2015
"Please refrain from casting aspersions on non-coms. I was an Air Force non-com with a degree in blah"

-What weve done is irrelevant to what we post here.

Gkam continues to post outrageous nonsense like:

•Fallout is the MAJOR cause of lung cancer
•Dried manure, which he mistakenly calls 'volatile solids', is a principal component of 'high' air pollution in the Central Valley
•H2 cannot detonate but can still cause molten Pu to go critical and throw imaginary reactor parts 130km, about 100x farther than a real nuke

-etc. From a long list.

He also claims to be an engineer and to know more about batteries than Elon Musk.

-I don't know, maybe you think that military service gives one the right to make up bullshit like this? Personally I think it's an insult to all the honest, decent vets out there with integrity.
we lost 55
-So you're saying that THIS means you know more about the systems you allegedly assembled than the real engineers who designed them?

May 31, 2015
"It seems you're just as guilty of believing that all religious people are inherently bad the way you put it."

-The central tenet of all religions is the idea that goodness comes exclusively from their deities. This is written in all their books. And devoutness is measured by the strength of their belief in this steadfast rule.
Evangelists may talk about ecumenicism but they do not preach it. Listen to them on the radio. Doubt and criticism are the devil's work.

I suppose there were some good Nazis who were casual in their belief that it was wrong to persecute and murder Untermenschen. Perhaps progressivism would have been forced upon the Third Reich had it survived the war, just as it had been forced on xian-dominated cultures which previously considered slaves, women, Amerinds, and yes, Jews, as somewhat less than human.

But since the bigotry remains in the books we must understand that it can reemerge at any time.

Just listen to any evangelist on the radio.

May 31, 2015
"There is'?

I can get through your Faraday cage.

Jun 01, 2015
C/K=V
K= Average kills per unit
C= Cost of each unit
V= Value of someone's life

If you define it as a function you can get a rough estimate on how much a total genocide will cost you

Jun 01, 2015
"you define it as a function you can get a rough estimate on how much a total genocide will cost you"

-Right now genocide is being committed in many places around the world by hordes of religion-fueled zealots, themselves the product of religions which force women to bear children until it kills them.

This mechanism was designed to create large armies for the purpose of outgrowing and overrunning their less prolific counterparts. It is and always has been the cause of the most horrible wars and revolutions and pogroms and genocides of history.

Currently the only protection we have against these hordes is our technology, which gives a few people the ability to kill the enemy in far greater numbers. A dispassionate, consistent, and dependable AI imbued with highest moral judgement that programmers can give it, may be the most humane way of doing this.

But the scourge will never be over until the cultures which produce it, are destroyed.

Jun 01, 2015
"I don't see religionist or goat herding muslims developing AI killer robots. Al Qaida and ISIL are adept at using the Internet for hacking and recruiting. Iran, a fundy religionist state, is flying drones throughout the region. And they don't need AI to build nukes. We may need AI however, to resist them.
THAT is all your scientists of "civilized" nations brainstump. Religion my ass, technology and scientists will lead to the EXTINCTION of mankind PERIOD
Our tech is currently the only thing keeping us from extinction. Without it most of the people in the world would starve. And we must continuously improve it or fall behind all the pathogens and infestations which are evolving in response to it.

Jun 01, 2015
Techxplore really should get to know how physorg uses quotes; it is very frustrating to switch between one and the other, and it makes for illegible posts.

Jun 01, 2015
The heart of the argument over autonomous killing machines is that they lack compassion. They do not have the capacity to disagree with their orders, and therefore are capable of committing acts of pure evil. The one issuing the commands is so far removed from the actual horrors they create that a line has been crossed, one where we are devalued as humans. There have been many sci-fi stories on the subject; the recent Robocop movie would be a good example. There were autonomous robots that fought alongside human soldiers. But what of the boy who approached the robot with a knife, and the robot that was about to shoot him? That shocks us because there is no compassion there. A human life cannot be judged on such a small set of criteria.

The same goes for unfeeling humans, as well. In the movie "Universal Soldier", the killing machines were human, but had lost their ability to feel emotions. A soldier needs compassion to some degree, or he is just a monster.

Jun 01, 2015
"I must say what I find interesting is no organized group has achieved the killing rate of non-religious states such as communist China and the USSR. Nobody. China in the early '50's outdid all the combatants in WW2 in 2 years less time. That was some leap forward."

-Communism is a pseudoreligion with all the requisite characteristics. A chosen people (the working class), godmen saviors (Mao, Lenin, pol pot), holy books full of references to a divine mandate and an immortal soul, and the promise of a heaven on earth when all the heretics are annihilated. And surrender of personal property and political control to the state.

Functionally they are indistinguishable.

Jun 01, 2015
They do not have the capacity to disagree with their orders, and therefore are capable of committing acts of pure evil.

I would rephrase that. 'Evil' is a moral/ethical qualifier. Machines aren't capable of that kind of decision qualification. (In your sense a bomb or a land mine is 'pure evil')

The thing that is problematic is that soldiers, since the Nuremberg trials, cannot argue that they were "just following orders". However, machines cannot act any other way.

The one issuing the commands to it is so far removed from the actual horrors they create, that a line has been crossed.

That line was crossed a long time ago with any kind of remote/scattershot/area-destruct weapon (from multiple rocket launchers to nuclear bombs to minefields)

A soldier needs compassion to some degree

First thing that is broken in any kind of bootcamp training is compassion for anyone but your comrades. Otherwise you get ineffective soldiers.

Jun 01, 2015
I would rephrase that. 'Evil' is a moral/ethical qualifier. Machines aren't capable of that kind of decision qualification


That's correct, it is a moral qualifier. And land mines (in unmarked areas) are pretty darn close to pure evil.

The thing that is problematic is that soldiers, since the Nuremberg trials, cannot argue that they were "just following orders". However, machines cannot act any other way.


Therein lies the issue, you can hold multiple people responsible for things that they do, but when one person sends a battalion of Robots to do questionable things, who is to stop them? There are fewer compassionate minds involved, fewer links of accountability. The problem with Robot Minds is that they operate under absolutes.

At some point people will desire to stop fighting (not worldwide, but in regions). Will robots ever stop fighting? Will they ever try to "just get along"?

Jun 01, 2015
"First thing that is broken in any kind of bootcamp training is compassion for anyone but your comrades. Otherwise you get ineffective soldiers."
------------------------------------------

Is that what you got? I didn't.

Jun 01, 2015
"The heart of the argument over autonomous killing machines is that they lack compassion. They do not have the capacity to disagree with their orders"

-'Compassion'. In war, people need to be killed. A machine which is imbued with all the factors which the very best soldier possesses, which are studied and agreed to beforehand, is far better able to decide who needs to be killed and who doesn't in the heat of battle than a scared, angry, tired, wounded human.

A machine will provide far more 'compassion' on the battlefield than a human ever could because a machine is not able to disregard the 'compassion' it has been programmed with, while a human is.

'Compassion' is also the reduction of the battlefield confusion which leads to accidents and needless casualties. Machines can make decisions far faster and more dependably than any human and are far less susceptible to the fog of war.

Greater efficiency equates to more 'compassion'.

Jun 01, 2015
Here's an interesting example which proves the point: what this site calls the German soldier's 10 commandments.
http://www.jewish...and.html

-Machines which were programmed with these edicts could not refuse to follow them, no matter what their commanders ordered them to do.

Jun 01, 2015
"Is that what you got? I didn't"

-Who knows what you 'got'? You've proven yourself to be so full of shit that no one can believe a word you say.

Jun 01, 2015
I like the way someone who can't tell one engineer from another, and who has never served in the military, becomes an expert in both.

But he hides behind a pseudonym, having bragged about how he plays games with those of us here who want to discuss the Real World.

Jun 01, 2015
"That's correct, it is a moral qualifier. And land mines (in unmarked areas) are pretty darn close to pure evil."

I would argue that the people who build them and those who order their deployment are the ones who are evil. The mines are just bits of machinery. In any case it doesn't help to saddle them with an ethical qualifier, as they don't really feel the impact of that. Humans do (well, obviously the people who build/deploy that stuff don't... or at the very least they can come up with a rationalisation for why they think it's OK).

"There are fewer compassionate minds involved, fewer links of accountability. "

As I said: as soon as we started using weapons of mass destruction (from poisoning wells to nuclear bombs)..that became an issue. It's not one new/particular to killer AI. Not even to a new order of magnitude.

I agree it is a problem, but one that we haven't been able to deal with in the past. So I'm not expecting something like what the article proposes to work in the present.

Jun 01, 2015
The concept of programming "compassion" into autonomous war robots is interesting, for instance if it were adept enough to merely injure and disable enemy combatants, and employ non-lethal force as often as possible. That could work, and yes it would be better than people killing each other.

I have to wonder if we imagine this system being fully developed, and robots fight other robots on our behalf, would there be a point? You might as well just play StarCraft 2 against each other.

I seriously doubt that Autonomous War Machines would be made to be more compassionate than other humans, by all parties involved in a conflict. That's the scary part. But there are worse things in the world, like biological, chemical and nuclear warfare.

I think, deep down, no one wants to create a Terminator World. That's what the debate and the fear is all about.


Jun 01, 2015
"The concept of programming "compassion" into autonomous war robots is interesting, for instance if it were adept enough to merely injure and disable enemy combatants, and employ non-lethal force as often as possible. That could work, and yes it would be better than people killing each other."

-Of course. Watch the American Sniper vid.
https://www.youtu...3u9ay1gs

-An AI could have read the man's lips as he talked on the phone. It could have recognized the grenade as soon as it appeared, and WOUNDED the woman before she had the chance to hand it to the boy.

It would have recorded the incident for future analysis. And it could have identified all the actors to compare with a database for future use. They might have already been known, and the machine could have acted as soon as it recognized them.

Humans can't do these things. Letting a sniper cope with situations like this when the tech is available to let a machine do it would be inhumane.

Jun 01, 2015
This featurette from the same film shows the trauma inflicted on the people who have to make these decisions.
https://www.youtu..._PdW1xkg

-Using machines is also humane in this respect because the people who are forced to make these decisions are often damaged as a result.

Jun 01, 2015
"someone who has never served in the military, becomes an expert in both"

-Your lies are a disgrace to everyone who has served his country honorably and then chosen to lead decent, honest lives.

There ought to be a way to arrest people like you.

Jun 01, 2015
Can I get help in getting this chronic malcontent and character assassin off this forum?

He has no opinions, only scorn for having been shown up.

It's time to go back and find his bragging about being a sniper hiding behind phony names, playing what he calls his "games" with the Decent Folk.

Meanwhile, he has seen me in my Vietnam outfits, and that I was Airman of the Month for the Air Force Flight Test Center, so pay no attention to his accusations and bluster.

otto served nowhere, for nobody but himself. He cowered at home, hid while the rest of us served. He has no right to discuss "decent, honest lives".

Jun 01, 2015
TheGhostofOtto1923 is a troll. Lucid but ludicrous. I ignore this character. Recommend all do the same.


Jun 02, 2015
"I have to wonder if we imagine this system being fully developed, and robots fight other robots on our behalf, would there be a point? You might as well just play StarCraft 2 against each other. "

As in StarCraft, the goal would be control over resources and area (or simply extermination of the opposition) - not the AI vs. AI slugfest.

"I seriously doubt that Autonomous War Machines would be made to be more compassionate than other humans"

Probably not. But they could be made to follow the Geneva Conventions to the letter (whether a completely literal interpretation is good or bad is another question). If anything, expending AI soldiers instead of humans would lower the threshold to go to war. It's hard to campaign against an ever increasing number of war widows and orphans. It's easier to (politically) keep a war going if it's fueling the AI-replacement economy.

Jun 02, 2015
How many billions of dollars have been wasted on this? We do such research on the idea that OUR lives are valuable and must be preserved while those at the receiving end of a drone attack are apparently worthless and can be claimed in cold blood by a machine. I know that if I were a villager in a remote region of Afghanistan and a drone killed my family, I would call that evil and fight it with all the power at my disposal.

Jun 02, 2015
"If anything, expending AI soldiers instead of humans would lower the threshold to go to war"

-And this is your real objection isn't it? The delusion that making wars more costly will make them go away.

Religion makes war inevitable. Without our technology we would eventually be overrun.

And please do consider that, because of your social status, you and your family would be among the first to be marched off to the camps. This is also inevitable.

It has happened before and rest assured, it would happen again.

Jun 02, 2015
Although they appear similar, Islamism and national socialism are different in that Islamism has a much longer history of conquest by annihilation, with a holy book that describes in no uncertain terms why people like you and me should be exterminated.

Jun 02, 2015
A true intelligence would see the effects we have on the Earth and get rid of us.

Jun 02, 2015
I don't see religionist or goat herding muslims developing AI killer robots
@stevepig
kinda hard when they're busy killing each other and other religions because of differences in opinion, eh?
ever been to Afghanistan? Iraq? Iran? you should take a vacation there... make sure to wear your religious affiliations proudly
technology and scientists will lead to the EXTINCTION of mankind PERIOD
you have the right to your own opinion
from what i see: computers/internet spread information so that even the religious people can't hide reality anymore (see Dark Ages)
plus all that criminal medicine keeping people alive past the typical 40yrs of history... sure. what a horrible thing to do, right?

i mean... all those horrible cheaper goods from manufacturing ensuring that we will never lack the stuff we need to live... and those horrible advancements in agriculture too

shame on science for making the world so horrible
(satirical hyperbole)

Jun 02, 2015
You want to save mankind then you better start praying
@stevepig
it will have the exact same effect as you doing nothing
there is a difference between a faith and a religion
a faith is a belief without evidence, whereas a religion is the codified rules or tenets normally associated with a faith that cause friction, prejudice and division
religion is used for controlling the weak minded who cannot think critically- it is by definition used to judge others and either accept or reject them based upon the tenets
THAT is why religions are dangerous

also- as Otto pointed out: ISIS and religions ARE using technology to kill people who simply believe another way

you call that good?

good and evil are moral constructions usually developed by cultures and used by religions for control
that is why killing is BOTH good and evil even in the xtian bible: good to kill because god said so, but bad to kill tribal members

neither is true

Jun 02, 2015
And land mines (in unmarked areas) are pretty darn close to pure evil
Krundoloss
like AA_P, i disagree with this
this is also the crux of the american 2nd amendment debate, by the way

the land mines are a tool that is used, not an evil or good anything

the use (and thus the moral attachment of good or evil) comes from the people who use them, therefore the use of unmarked land mines etc and the morality assignment of said use should fall squarely upon the people who ordered them used

Same with firearms, or any weapon, really (which would include AI under the control of a gov't, military or humans in general)

the issue really isn't the tool itself, but the underlying violence and pathological need to destroy

the fight or flight/self defense is built into humans (anything living, really)
the extension of this to cultural and tribal defense was a human adaptation which gave us war
violence is the underlying cause, not the weapon or tool (or AI)

Jun 02, 2015
An AI could have read the man's lips as he talked on the phone. It could have recognized the grenade as soon as it appeared, and WOUNDED the woman before she had the chance to hand it to the boy.

It would have recorded the incident for future analysis. And it could have identified all the actors to compare with a database for future use. They might have already been known, and the machine could have acted as soon as it recognized them.

Humans can't do these things. Letting a sniper cope with situations like this when the tech is available to let a machine do it would be inhumane
I gotta agree with this

AI could (as pointed out by AA_P) be programmed to follow the Geneva Conventions to the letter
Also, as Otto points out, AI could far easier shoot to wound (or inflict a wound that typically doesn't kill in the west) rather than kill

and as much respect as i have for snipers and the combat arms, i would prefer to see fewer mistakes as well

Jun 02, 2015
"I don't see religionist or goat herding muslims developing AI killer robots"

-Yah they've got an unlimited supply of flesh and blood robots to do their bidding.

Jun 02, 2015
I guess it all comes down to -
Would you rather the clean kill of a robot, or a fallible human leaving you crippled and alive for the rats?

Jun 02, 2015
"Also, as Otto points out, AI could far easier shoot to wound (or inflict a wound that typically doesn't kill in the west) rather than kill"

-Because of much faster sensory processing, decision-making, accurate deployment of weapons, and lack of distraction, the actual incidence of situations where 'compassion' would be a factor is greatly reduced. I offered a few possible responses to the situation in the trailer which would avoid the compassion issue entirely.

And if the situation didn't suggest a ready solution, the machine could always stop and ask a human what to do while providing him or her with a much better audiovisual representation of events than Bradley Cooper, nervous, sweaty, and confused, ever could.

Jun 02, 2015
A robot may not injure a human being or, through inaction, allow a human being to come to harm.

A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Jun 02, 2015
"It's hard to campaign against an ever increasing number of war widows and orphans. It's easier to (politically) keep a war going if it's fueling the AI-replacement economy"

-The first Gulf War was over when Saddam's armies were obliterated. We carpet-bombed them into mush and then plowed it into the sand.

AI machines could be just as effective and certainly more surgical in destroying an enemy. They could have probably brought the 2nd war to a conclusive end.

And afterwards there would be no reason to continue.

Jun 03, 2015
A robot may not injure a human being ...does not conflict with the First or Second Law.
@winthrom
this is for robots- not war machines whose purpose is specifically to inflict damage on the enemy

which is the crux of the problem, i think
Thanks for reminding us of the laws


Jun 03, 2015
And afterwards there would be no reason to continue
@otto
i disagree
as long as there are religions and cultures that despise other cultures/religions then there will always be a need to continue

IMHO - introduction of AI in that war would only push the religious and anti-US/west people into a harder stand against technology and science as well as modern living (western life)- it would solidify their hatred against technology

it very WELL might have led to a better conclusion on our part
but given the prevalence of the religious fundamentalism in the region, it would only exacerbate their anti-western stance and cause more strife/terrorism
&
it gives them a specific target for their hate (Technology), which is far more ambiguous in real life

talk about tribalism and moving backwards... it would trigger worse conditions than even now

