GENEVA — A United Nations expert called Thursday for a global moratorium on the testing, production and use of armed robots that can select and kill targets without human command.
“War without reflection is mechanical slaughter,” said Christof Heyns, the United Nations special rapporteur on extrajudicial, summary or arbitrary executions.
“A decision to allow machines to be deployed to kill human beings worldwide, whatever weapons they use, deserves a collective pause,” he told the Human Rights Council in Geneva.
No countries use such weapons, but the technology is available or soon will be, Mr. Heyns told the council.
The United States, Britain, Israel and South Korea already use technologies that are seen as precursors to fully autonomous systems. Little is known about Russian and Chinese progress in developing them.
“My concern is that we may find ourselves on the other side of a line, and then it is very difficult to go back,” Mr. Heyns said in an interview. “If there’s ever going to be a time to regulate or stop these weapons, it’s now.”
Mr. Heyns urged the council to set up a high-level panel to report within a year on advances in the development of “lethal autonomous robotics,” to assess whether existing international laws are adequate for controlling their use.
Preparations to introduce armed robots raise “far-reaching concerns about the protection of life during war and peace,” Mr. Heyns said in a report on lethal autonomous robotics he submitted to the council. “This includes questions of whether robots will make it easier for states to go to war.”
Some states active in developing such weapons have committed to not deploy them for the foreseeable future, Mr. Heyns said. He pointed to a United States Defense Department directive issued in November that banned use of lethal force by fully autonomous weapons for up to 10 years, unless specifically authorized by senior officials, and that identified possible technology failures. Mr. Heyns said that provided important recognition of the need for caution.
Addressing the council, however, he said, “It is clear that very strong forces, including technology and budgets, are pushing in the opposite direction.”
His initiative comes as nongovernmental organizations and human rights groups are campaigning to ban fully autonomous weapons pre-emptively, before they are deployed, much as blinding laser weapons were banned before they were fielded. Discussions are under way with a number of governments that may be willing to take the lead in drafting a treaty to outlaw the weapons, Stephen Goose, director of Human Rights Watch's arms division, told journalists in Geneva this week.
Supporters of the robots say they offer a number of advantages: they process information faster than humans, and they are not subject to fear, panic, a desire for revenge or other emotions that can cloud human judgment. Robots can also gather more accurate battlefield data, helping to direct fire more precisely and, in the process, potentially saving lives.
A report by Human Rights Watch and the Harvard Law School cites a United States Air Force assessment that “by 2030 machine capabilities will have increased to the point that humans will have become the weakest component in a wide array of systems and processes.”
Human rights groups dispute the ability of robots to meet the requirements of international law, including the ability to distinguish between civilians and combatants or to assess proportionality — whether the likely harm to civilians during a military action exceeds the military advantage gained by it. Moreover, if a killer robot were to breach international law, causing civilian casualties, it is unclear who could be held responsible or punished.
“It is possible to halt the slide toward full autonomy in weaponry before moral and legal boundaries are crossed,” Mr. Goose said in a statement this week, “but only if we start to draw the line now.”