Why the CCW had to take on the Killer Robot question

This month the Meeting of States Parties to the Convention on Certain Conventional Weapons (CCW) convened to discuss the operations of the Convention and consider new issues. At the forefront of the new issues is the question of Lethal Autonomous Weapons or, colloquially, Killer Robots. We are not in the realm of science fiction here, either. Unmanned Aerial Vehicles (UAVs or, colloquially, drones) are used throughout the world and in many combat situations. Equipped with missiles, drones have been used in Pakistan, Yemen and Somalia in recent months to attack and kill terrorists. The US military's drones are operated by soldiers thousands of miles from the theater of war, at bases as far away as Arizona, using video game-style controllers and large computer monitors that display high-resolution images captured by cameras on board the drones. When the pilot identifies a threat, he or she can fire a missile at the threat (if the drone is so equipped) or call in armed air support.

As of today, a drone cannot initiate an attack; only the human controllers ("pilots") can do so. However, with companies, including seemingly benign ones like Facebook, developing ever more accurate facial recognition software, there is the possibility that, in the not-too-distant future, a drone could be programmed with a particular individual's face and set loose in the sky. The drone would fly around and, upon finding its target, attack and kill that person with a missile. Technologically, we're probably only a couple of years from being able to do this: pre-program a drone with a pre-authorized kill list, have the drone identify individuals on that list, and then let the drone launch a missile strike. No warnings, no confirmations. Just push "start," sit back and wait for the bad guys to get blown up.

A little more than a year ago, Human Rights Watch published its report, Losing Humanity, which laid out the human rights and legal arguments in support of a ban on autonomous weapons. The report provided the spark for the April 2013 launch of the Campaign to Stop Killer Robots, a coalition of 45 organizations from 22 countries, including the International Committee for Robot Arms Control, Human Rights Watch and other civil society organizations, calling for a ban on lethal autonomous weapons.

The argument against killer robots is that they are indiscriminate weapons. Unlike a human, a killer robot could not make a judgment call about a particular situation: it might launch an attack against a target who is in the presence of non-combatants, or it might falsely identify a target and kill an innocent person. These are all valid points, but I'm sure programmers would say that nearly flawless facial recognition software is possible, eliminating the chance of a robot targeting the wrong person. Also, drones under the control of human operators have killed hundreds of non-combatants, both by mistake and when non-combatants were in the presence of a verified target. Humans are prone to mistaken identification and could order an attack on an innocent. Armed drones have been used in "signature" strikes, in which a human operator watched individuals on a computer monitor and judged that their activities fit a terrorist "signature." So while the arguments against killer robots are valid, they could, and should, also be made about the current state of drone warfare, which I am certain is part of the issue. By seeking a ban on killer robots, the disarmament community can start to rein in the use of drones, or at least define some limits on their use.

The question of a ban or regulation on lethal autonomous weapons was referred by the United Nations to the CCW as the "appropriate venue" for discussions. But the CCW was not the only possible venue. The issue of killer robots was first raised at the United Nations by the Special Rapporteur for Extrajudicial Killings within the context of the UN's Human Rights Council, and some states would have supported the Council taking up the issue (the Council will be one of the frameworks for addressing drone warfare more broadly, as some strikes, especially those in the circumstances mentioned above, fall under the definition of extrajudicial or arbitrary killings). The other alternative would have been to take the issue entirely outside the United Nations framework, just as had been done when member states saw the CCW as having failed on the issues of landmines and cluster munitions.

The Nobel Committee awarded the Nobel Peace Prize to the International Campaign to Ban Landmines for finding an alternative means of achieving international disarmament outside of the UN system. The ICBL and several countries went outside the UN system when the CCW's Amended Protocol II on landmines was seen by the ICBL and those countries as insufficient protection from the humanitarian harm of landmines. A decade later, when the CCW produced Protocol V on Explosive Remnants of War, civil society (represented by most of the ICBL membership) partnered with supportive states to draft, sign and ratify the Convention on Cluster Munitions. Both the Mine Ban Treaty and the Convention on Cluster Munitions have more signatories than the corresponding CCW Protocols, making the Protocols increasingly irrelevant. Why would a country sign on to, and be bound by, a weaker standard when it has already agreed to the higher standards of the Mine Ban Treaty or the Convention on Cluster Munitions?

In addition to the threat of pursuing a ban on killer robots through the Human Rights Council, campaigners suggested that another negotiating framework could have been pursued outside of the UN system had the CCW failed to take up the issue. The Campaign to Stop Killer Robots, with the support of many of the ICBL's members, would have been able to partner with supportive states, including France, Egypt and the Holy See (think about how many times you would put those three in a room together), to draft a multilateral treaty banning lethal autonomous weapons. That option remains should the CCW fail to achieve a ban in the coming year.

The success of the Mine Ban Treaty and the Convention on Cluster Munitions has put significant pressure on the CCW to remain relevant. Under the leadership of the French delegation, the CCW has agreed to address the emerging issue of killer robots, which maintains the CCW's position as a relevant and timely negotiating forum. But should talks drag on (and considering the speed with which the issue has developed and earned a spot on the agenda, even a short delay by the negotiators could be seen as too long), a civil society-led process could be in the offing.

Michael P. Moore

November 27, 2013
