Russia’s advanced military combat robot — which has drawn alarmed comparisons to Hollywood’s Terminator — will be able to run and clear an obstacle course by late this year, according to reports that surfaced this week.
That’s just the latest news fueling the already-fiery debate over what to do about killer robots.
Artificial intelligence and robotics experts must decide whether to support or oppose the development of lethal autonomous weapons systems, or LAWS, urged Stuart Russell, a professor of computer science at the University of California at Berkeley, in a paper published last month in Nature.
The technologies to create LAWS, such as artificial intelligence, robotics and sensors, already exist, and all that remains is to put them together, he pointed out.
For example, the technology used in self-driving cars could be combined with the tactical control of Google’s DeepMind DQN algorithm, which earlier this year demonstrated its prowess in video game play, to support search-and-destroy missions in urban areas.
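For readers unfamiliar with DQN, it is essentially Q-learning with a deep neural network as the value function, stabilized by experience replay and a periodically refreshed target network. The sketch below illustrates only that core loop, under heavy simplification: the five-state corridor environment, the linear Q-function standing in for a deep network, and the hyperparameters are all hypothetical stand-ins, not DeepMind's actual system.

```python
# Toy illustration of the Q-learning loop at the heart of DQN.
# Everything here is a hypothetical stand-in: a tiny corridor
# "environment," a linear Q-function instead of a deep network,
# and made-up hyperparameters.
import random
import numpy as np

N_STATES, N_ACTIONS = 5, 2           # corridor: move left (0) or right (1)
GAMMA, ALPHA, EPSILON = 0.9, 0.1, 0.2

def step(state, action):
    """Hypothetical environment: reward 1 for reaching the right end."""
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

def q_values(weights, state):
    """Linear Q-function over one-hot states (a deep net in real DQN)."""
    return weights @ np.eye(N_STATES)[state]   # shape: (N_ACTIONS,)

weights = np.zeros((N_ACTIONS, N_STATES))      # "online" network
target_weights = weights.copy()                # frozen target network
replay = []                                    # experience replay buffer

for episode in range(300):
    state, done = 0, False
    for _ in range(100):                       # cap episode length
        # Epsilon-greedy action selection, breaking Q-value ties randomly.
        q = q_values(weights, state)
        if random.random() < EPSILON:
            action = random.randrange(N_ACTIONS)
        else:
            action = int(random.choice(np.flatnonzero(q == q.max())))
        nxt, reward, done = step(state, action)
        replay.append((state, action, reward, nxt, done))
        state = nxt

        # Train on a random minibatch drawn from the replay buffer,
        # bootstrapping from the frozen target network (DQN's key trick).
        for s, a, r, s2, d in random.sample(replay, min(8, len(replay))):
            target = r if d else r + GAMMA * np.max(q_values(target_weights, s2))
            td_error = target - q_values(weights, s)[a]
            weights[a] += ALPHA * td_error * np.eye(N_STATES)[s]

        if done:
            break

    if episode % 20 == 0:                      # periodically sync target net
        target_weights = weights.copy()

print("greedy action per state:", np.argmax(weights, axis=0))
```

Run to completion, the learned greedy policy should choose "right" in every corridor state; the point of the sketch is only that the same generic reward-maximizing loop is indifferent to whether the reward signal is a video game score or a military objective.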
International humanitarian law, including the 1949 Geneva Conventions, contains no specific provisions for the use of LAWS, but it may still apply, Russell theorized. Complying with it, however, requires subjective judgments about matters such as military necessity, who is and isn’t a combatant, and proportionality, and those are judgments current AI systems can’t make.
The U.S., the UK and Israel, which lead the development of LAWS technology, have taken the position that a treaty on the use of LAWS is not necessary, Russell added, because they already have internal weapons review processes that ensure compliance with international law.
However, existing legal restrictions “are inadequate to really deal with this fundamental change in technology and warfare,” said Peter Asaro, an assistant professor at The New School, who cofounded the International Committee for Robot Arms Control and is spokesperson for the Campaign to Stop Killer Robots.
Why LAWS Are a Problem
Letting machines choose whom to kill would violate fundamental principles of human dignity, Russell warned. Killer robots could be deployed in the millions, and their agility and lethality would leave humans defenseless.
Next-generation armed drones, including the X-47B and the Taranis, are being developed for full flight autonomy and “might be designed to select and engage targets without supervision,” the Campaign to Stop Killer Robots’ Asaro told TechNewsWorld.
Proponents of LAWS, including the U.S. Department of Defense, contend that killer robots will minimize civilian casualties. Opponents counter that LAWS will lower the barrier to starting a war, because attacks could be launched without putting the attacker’s own forces at immediate risk.
Whether people accept the arguments in favor of LAWS “will depend on how much faith you have in modern AI and its ability to distinguish people, places and things on a noisy battlefield with enemies intentionally trying to spoof it,” remarked John Sullins, chair of philosophy at Sonoma State University and author of an ethical analysis of the case for controlling LAWS.
Deeper Problems With LAWS
The technology will make it “more difficult to hold people accountable for the mistakes and atrocities caused by machines, because they will be unintentional in the legal sense,” Asaro pointed out. Further, it will devalue human life.
“I have given talks to military and policy professionals and … a number of them … are deeply concerned with [the use of LAWS],” Sullins told TechNewsWorld. “However, the main focus is on law and treaty compliance.”
Not enough attention is given to the fact that a country might comply completely with international law but still carry out unethical actions, he pointed out.
For example, a military power could use LAWS in a legal way, but it “could still be unethical if those weapons let it engage in decades-long low-level conflicts,” said Sullins, “since the ethical choice is to seek peaceful relations — not settle for the more pragmatic solution of just less-lethal (to one side) wars.”
Situations like that could arise, because “with any new weapons system, the focus initially is on the offering’s primary goal,” noted Robert Enderle, principal at the Enderle Group. “Safety and security come later.”
However, it’s critical to design safety and security into autonomous weapons from the start, he told TechNewsWorld, “or the probability of these weapons turning on their masters becomes unacceptably high.”
LAWS “should be outlawed” because of the threat they would pose if a terrorist were to get hold of them, or if they were hacked or went out of control for any reason, Enderle cautioned. In such cases, “stopping them would be very difficult, and the death toll would be exceedingly high.”