Friday, April 17, 2015

Wagner: The Dehumanization of International Humanitarian Law: Legal, Ethical, and Political Implications of Autonomous Weapon Systems

Markus Wagner (Univ. of Miami - Law) has posted The Dehumanization of International Humanitarian Law: Legal, Ethical, and Political Implications of Autonomous Weapon Systems (Vanderbilt Journal of Transnational Law, Vol. 47, 2014). Here's the abstract:

In the future, a growing number of combat operations will be carried out by autonomous weapon systems (AWS). At the operational level, AWS would not rely on direct human input. Taking humans out of the loop will raise questions about the compatibility of AWS with the fundamental requirements of international humanitarian law (IHL), such as the principles of distinction and proportionality, as well as complicate the allocation of responsibility for war crimes and crimes against humanity.
This Article addresses the development toward greater autonomy in military technology along three dimensions: legal, ethical, and political. First, it analyzes the potential dehumanizing effect of AWS with respect to the principles of distinction and proportionality, as well as criminal responsibility.
Second, this Article explores, from an ethical perspective, the advantages and disadvantages of deploying AWS, independent of legal considerations. Authors from various fields have weighed in on this debate, but often without linking their arguments to the legal questions. This Article fills that gap by bridging these disparate discourses, and it suggests that there are important ethical reasons that militate against the use of AWS.
Third, this Article argues that the introduction of AWS alters the risk calculus of whether to engage in or prolong an armed conflict, making that decision politically more palatable and less risky for decision makers.