Wednesday, October 5, 2022

Bode & Huelss: Autonomous Weapons Systems and International Norms

Ingvild Bode (Univ. of Southern Denmark - Center for War Studies) & Hendrik Huelss (Univ. of Southern Denmark - Center for War Studies) have published Autonomous Weapons Systems and International Norms (McGill-Queen's Univ. Press 2022). Here's the abstract:

Autonomous weapons systems seem to be on the path to becoming accepted technologies of warfare. The weaponization of artificial intelligence raises questions about whether human beings will maintain control over the use of force. The notion of meaningful human control has become a focus of international debate on lethal autonomous weapons systems among members of the United Nations: many states have diverging ideas about various complex forms of human-machine interaction and the point at which human control stops being meaningful.

In Autonomous Weapons Systems and International Norms, Ingvild Bode and Hendrik Huelss present an innovative study of how testing, developing, and using weapons systems with autonomous features shapes ethical and legal norms, and of how such standards manifest and change in practice. Autonomous weapons systems are not a matter for the distant future: some autonomous features, such as those in air defence systems, have been in use for decades. They have already incrementally changed use-of-force norms by setting emerging standards for what counts as meaningful human control. As UN discussions drag on with minimal progress, the trend towards autonomizing weapons systems continues.