The fourth industrial revolution – with automation as its key feature – is in full swing. Militaries around the globe intend to benefit from this development, and so-called “autonomy” in weapons systems is on the rise. In a new article, Elvira Rosert and Frank Sauer compare the international humanitarian disarmament processes on blinding laser weapons, anti-personnel landmines and lethal autonomous weapons systems (LAWS), aka “killer robots.” Emphasizing that weapon autonomy differs substantially from past issues, the authors argue that the international campaign against LAWS cannot rely on simply modeling its effort after past successes. Instead of aiming to define and ban LAWS as a category of weapons, the use of autonomy in weapons should be regulated through codifying a positive obligation to retain human control.
Since 2013, the international community has been discussing LAWS at the United Nations in Geneva. The main venue of this debate is the Convention on Certain Conventional Weapons (CCW), a framework convention tasked with restricting or prohibiting weapons deemed to have indiscriminate effects or to be excessively injurious. This diplomatic process is owed in large part to a global coalition of 160 non-governmental organizations (NGOs) in 66 countries, coordinated under the joint “Campaign to Stop Killer Robots” (KRC), which has tirelessly raised awareness of the legal, ethical, and security concerns accompanying weapon autonomy.
In its effort, the campaign is employing tried-and-tested strategy elements successfully applied in previous humanitarian disarmament processes that resulted in the bans on blinding laser weapons, anti-personnel landmines and cluster munitions. These include public awareness-raising, the dissemination of expertise to the general public as well as to the diplomats working on the issue, and building coalitions with powerful voices in the CCW such as the International Committee of the Red Cross. However, while these strategy elements are indeed conducive to the campaign’s goal of creating new, binding international law on weapon autonomy, others are not.
A key problem is the campaign’s framing of the issue as one of “killer robots.” Every successful humanitarian disarmament campaign needs a simple, powerful, and dramatic message (like “blinding is cruel” or “landmines maim civilians”). By invoking images of the Terminator, the “killer robots” label resonates with the public and conveys an existential threat. However, it also inevitably renders the issue futuristic and thus much less urgent. This “sci-fi” feel stifles progress in the CCW, where ban opponents use it to dismiss the negotiations as a premature, speculative discussion about future military technologies.
More importantly, the “killer robots” frame obscures the complex and polymorphous nature of weapon autonomy that sets the issue apart from both blinding lasers and landmines, creating several challenges. First, the variations of what “killer robots” might look like are endless. Every conceivable future tank, plane, boat, submarine, or swarm of such systems could potentially be deemed a lethal autonomous weapons system. Second, no system would even be discernible as autonomous by looking at it – in fact, whether a weapons system is remotely piloted, and thus under human control while in operation, or whether it is autonomous, that is, finding, fixing, tracking, selecting, and engaging targets without human intervention, is impossible to know from the outside. The difference will eventually be nothing but a checkbox in its software’s user interface. Third, future weapons systems will increasingly be spatially distributed, raising the tricky question, “where and when [a LAWS] begins and ends”, as Maya Brehm puts it.
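To make the “checkbox” point concrete, consider the following deliberately simplified sketch. It is our own illustration, not a description of any real system: all names and classes are invented, and the find-fix-track-select-engage cycle is abstracted into a single loop. The only difference between the remotely piloted and the autonomous mode of this hypothetical system is one boolean flag.

```python
# Deliberately simplified, hypothetical sketch -- all names are invented and
# do not describe any real weapons system.
from dataclasses import dataclass

@dataclass
class Track:
    kind: str  # e.g. "incoming munition" or "person"

def human_confirms(track: Track) -> bool:
    """Stand-in for a remote operator's decision; here it always declines."""
    return False

def targeting_cycle(tracks: list[Track], autonomous_mode: bool) -> list[Track]:
    """Find, fix, track, select, engage -- the engage step gated (or not) by a human."""
    engaged = []
    for track in tracks:                  # find / fix / track (abstracted away)
        selected = track                  # select
        if autonomous_mode or human_confirms(selected):
            engaged.append(selected)      # engage
    return engaged

# Two externally identical "systems" that differ only in a single flag -- the
# "checkbox" in the user interface:
print(targeting_cycle([Track("incoming munition")], autonomous_mode=True))   # engages
print(targeting_cycle([Track("incoming munition")], autonomous_mode=False))  # waits for a human
```

Nothing about the hardware, the sensors, or the outward behavior in a showroom would reveal which branch of that condition is active – which is precisely why autonomy resists verification by inspection.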
Consequently, LAWS, in contrast to other weapons like blinding lasers or landmines, do not constitute a clearly definable category, or at least not one that is both inclusive and exclusive. Stigmatizing LAWS is thus much harder, and it is further complicated by the fact that some applications of weapon autonomy, for instance in terminal defense systems against incoming munitions, protect human life and raise hardly any humanitarian concerns.
Nevertheless, the legal, ethical, and security concerns raised by campaigners are valid – but finding some common “definition of LAWS” that aims at categorically separating them from “non-LAWS” is not the way to go. Instead, to get a regulatory grasp on weapon autonomy, campaigners and the international community must collectively stipulate how future targeting processes should be designed so that the use of military force remains under meaningful human control, that is, control that amounts to more than mindlessly pushing buttons.
It is therefore encouraging that the CCW deliberations have begun shifting from the futile search for a categorical definition of LAWS toward gauging the role of the “human element,” that is, the creation of conditions to retain meaningful human control over weapons systems. One of our suggestions to the campaign is to explicitly acknowledge this shift and adjust its messaging accordingly: away from “banning killer robots” and towards “codifying meaningful human control” as a principal requirement of international humanitarian law. The goal is to regulate which functions in the decision-making cycle of finding, fixing, tracking, selecting, and engaging a target are performed by a machine and which by a human. The answers will undoubtedly differ depending on the operational context and on the target, which might, for instance, be an incoming missile or a human being. But while banning killer robots this way is tricky, it is at least feasible.
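What such context- and target-dependent rules could look like once codified is sketched below. This is purely our notional illustration, not a proposal from the CCW debate or from the article; the rule set and all names are assumptions made up for this example.

```python
# Notional sketch of context-dependent human-control rules for the "engage"
# step. The rule set and all names are invented for illustration only.

REQUIRES_HUMAN_DECISION = {
    # (target type, operational context) -> must a human decide to engage?
    ("incoming munition", "terminal defense"): False,  # machine-speed defense
    ("vehicle", "contested battlefield"): True,
    ("person", "any"): True,                           # human life at stake
}

def human_must_decide(target_type: str, context: str) -> bool:
    """Look up the rule; default to requiring human control if none matches."""
    for (rule_target, rule_context), required in REQUIRES_HUMAN_DECISION.items():
        if rule_target == target_type and rule_context in (context, "any"):
            return required
    return True

print(human_must_decide("incoming munition", "terminal defense"))  # False
print(human_must_decide("person", "urban patrol"))                 # True
```

The default of requiring a human decision whenever no explicit rule applies mirrors the positive obligation the authors argue for: human control as the rule, machine delegation as the narrowly codified exception.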
Elvira Rosert is a Junior Professor for International Relations at Universität Hamburg and the Institute for Peace Research and Security Policy in Hamburg. Frank Sauer is a Senior Researcher at Bundeswehr University Munich. They are the authors of “How (not) to stop the killer robots: A comparative analysis of humanitarian disarmament campaign strategies”, Contemporary Security Policy, and of “Prohibiting Autonomous Weapons: Put Human Dignity First”, Global Policy 10(3): 370-375.