Prof Christof Heyns placed the issue of autonomous weapon systems on the agenda of the UNHRC in May 2013
25 June 2014

Last year, a global debate erupted around the development of increasingly autonomous weapon systems – sometimes called killer robots. This development means that computers, and not humans, will decide whom to target and when to release force during war and also during law enforcement. The issue was placed on the agenda of the United Nations Human Rights Council in May 2013 by University of Pretoria law professor Christof Heyns, in his capacity as the UN Special Rapporteur on extrajudicial, summary or arbitrary executions.

Since his appointment at the UN in 2010, Heyns has presented reports to the world body on a wide range of topics dealing with the protection of the right to life, including the legal rules on the use of force by the police during demonstrations, the death penalty, the protection of journalists, and the use of armed drones.

Heyns states: ‘By and large, these earlier reports were picked up and taken further by the international system, but none of them elicited a response similar to the autonomous weapons one. That report clearly hit a raw nerve. Most of us find the idea that machines will have the power to decide who will live and die frightening. And yet that is the inherent logic of the way technology is going. Human beings have in many ways become the weakest link in modern warfare. We are simply too slow, so we are not only being physically removed from the battlefield, but our decision-making functions are also being taken over by robots.’

In preparation for the report, Heyns first consulted with roboticists, the military, philosophers and other international lawyers at the Institute for International and Comparative Law in Africa at UP, where he is based, in 2012. ‘I needed to get up to speed on how these robots work and what [kind of issues they raise], and got some of my best ideas here,’ he says. The New York University Law School then arranged a consultation for Prof Heyns with US specialists, and the European University Institute in Florence, Italy, convened a meeting of experts from other parts of the world.

In his 2013 report, Heyns noted that the new technology raised concerns ‘about the protection of life and human dignity during war and peace’ and the question whether machines could meet the requirements of international humanitarian law. Yet, according to Heyns, we cannot discount the increasing role that technology in all its manifestations plays in our society, and the fact that technology can also save lives and help us achieve some of our objectives. ‘The real question,’ Heyns says, ‘is not whether technology should play a role in the battlefield and even [target] decision-making, but rather how big a role it should be allowed to play.’

Heyns called on states to establish national moratoria on the use of such weapons, and on the international community to initiate a process to establish where the line should be drawn.

According to Thompson Chengeta, a doctoral student of Prof Heyns who studies the same topic and has attended a number of the key meetings, the international response has been immediate and comprehensive. Less than a month after Prof Heyns’s presentation to the Human Rights Council, the United Kingdom’s House of Commons held a meeting to discuss the implications of the issue. The European Parliament adopted a resolution on autonomous weapon systems, and the UN General Assembly, as well as the Disarmament Advisory Panel of the Secretary-General, took up the issue. The United Nations Institute for Disarmament Research convened a panel, of which Prof Heyns is a member, to study the problem. The International Committee of the Red Cross also held a meeting of experts to debate the issue. Chengeta remarks that the report and subsequent inputs from the mandate form a central point of reference in all these debates. Prof Heyns has also lectured on increased weapon autonomy at the universities of Oxford and Cambridge, and at Yale University.

Perhaps most significantly, the United Nations Convention on Conventional Weapons (CCW) placed the issue of autonomous weapons on its 2014 agenda. From 13 to 17 May 2014, the CCW held an experts’ meeting in Geneva on autonomous weapon systems. According to Chengeta, who also attended, the meeting was extremely well attended by representatives of states from around the world, many of whom sent their military experts. Heyns and 17 other experts addressed the meeting. The CCW will decide how to take the issue further in November of this year.

‘This has been a hugely interesting process – an opportunity to experience first-hand how international law develops at the cutting edge,’ says Heyns. ‘I think the responsible thing to do for the international community is to draw a line at how much control [in respect of the use of force] humans may delegate to machines. Perhaps the most promising line of thinking at the moment is to require that states must ensure “meaningful human control” over every individual attack or use of force. Some of us are currently trying to work out exactly what that would mean in practice.

‘The process has again illustrated to me how important it is to see law in its broader, social context. The Vice-Chancellor and Principal of UP, Prof Cheryl de la Rey, attended a lecture I gave at the [Faculty of] Engineering and later remarked that it [was] clear that the world [had] moved on from the development of the nuclear bomb, which was done largely by engineers working on their own. That is no doubt right – the current process regarding new weapons is multi-disciplinary and I think we are all seeing the benefits. At the same time I must admit I have – perhaps surprisingly for someone coming from a human rights background – learned most from the military people. They have a focus that I admire.’

Asked about the ups and downs of the experience, Heyns responded as follows: ‘There were times – notably the first day of the CCW meeting and also with the Red Cross experts’ meeting – when I was holding my breath to hear whether the military experts from around the world were going to say this is not a real issue, or not yet a real issue. But that fortunately did not happen. On the contrary, I think I may have underestimated the speed with which we are going to see high levels of autonomy in weapons also being deployed in the area of law enforcement. I hope in my next report to the General Assembly to set out some of the elements of the term “meaningful human control”, and also to comment on the use of such weapons by the police, because it is coming.’
