A United Nations special rapporteur is urging the United States to halt the development and use of what he terms “fully autonomous robotic weapons” by its military and law enforcement agencies, according to a statement released on Tuesday by Human Rights Watch, a non-governmental organization (NGO) that monitors human rights.
“These weapons, once activated, can select and engage targets without further intervention by a human,” according to the NGO, Human Rights Watch.
Beginning on May 29 in Geneva, nations attending the United Nations Human Rights Council will debate the challenges posed by fully autonomous weapons, sometimes called “killer robots.”
“The U.N. report makes it abundantly clear that we need to put the brakes on fully autonomous weapons, or civilians will pay the price in the future,” said Steve Goose, arms director at Human Rights Watch. “The US and every other country should endorse and carry out the U.N. call to stop any plans for killer robots in their tracks.”
The U.N. special rapporteur, Professor Christof Heyns, prepared a 22-page report on robotic weapons to be presented to the Human Rights Council at its session opening May 29. The council will then consider how to act on the report’s recommendations, including its call on nations “to institute an immediate moratorium on these weapons and work for an international agreement that addresses the many concerns identified in the report.”
According to Human Rights Watch, President Barack Obama and his Defense Department have acknowledged the problems posed by fully autonomous weapons. As a result, and almost unnoticed, the U.S. Department of Defense issued a directive on Nov. 21, 2012, which requires a human being to be “in-the-loop” when decisions are made about using lethal force.
For up to a decade, Directive Number 3000.09 allows the Defense Department to develop or use only fully autonomous systems that deliver non-lethal force, unless department officials waive the policy at a high level. In effect, the directive constitutes the world’s first moratorium on lethal fully autonomous weapons, HRW claims.
“While a positive step, the directive is not a comprehensive or permanent solution to the potential problems posed by fully autonomous systems,” Human Rights Watch said. “The policy of self-restraint it embraces may also be hard to sustain if other nations begin to deploy fully autonomous weapons systems.”
“Over the past decade, the expanded use of armed unmanned vehicles, or drones, has dramatically changed warfare, bringing new humanitarian and legal challenges,” Human Rights Watch officials said. The U.N. report acknowledges that “robots with full lethal autonomy have not yet been deployed,” despite a lack of transparency surrounding their research and development.
These future weapons, sometimes called “killer robots,” would be able to choose and fire on targets without human intervention, according to HRW.
“Losing Humanity: The Case Against Killer Robots,” a 50-page report released last year by Human Rights Watch, outlines concerns about these fully autonomous weapons, which would inherently lack human qualities that provide legal and non-legal checks on the killing of civilians. In addition, the obstacles to holding anyone accountable for harm caused by the weapons would weaken the law’s power to deter future violations.
“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, Arms Division director at Human Rights Watch.