2023 Conference on International Cyber Security | 7-8 November 2023

Ingvild Bode

Dr Ingvild Bode is Associate Professor at the Centre for War Studies, University of Southern Denmark. Her research focuses on processes of normative and policy change, especially with regard to the use of force. She is the Principal Investigator of the European Research Council-funded project AutoNorms: Weaponised Artificial Intelligence, Norms, and Order (08/2020-07/2025). The AutoNorms project investigates how practices related to autonomous weapon systems change international norms, examining military, transnational, political and dual-use practices in China, Japan, Russia, and the USA. Ingvild serves as the co-chair of the IEEE Research Group on AI and Autonomy in Defence Systems. Her work has been published in the European Journal of International Relations, Ethics and Information Technology, Review of International Studies, International Studies Review and other journals. Ingvild’s most recent book, Autonomous Weapons and International Norms (co-authored with Hendrik Huelss), was published by McGill-Queen’s University Press in 2022. Previously, Ingvild was Senior Lecturer in International Relations at the University of Kent, Canterbury (2015-2020) and a Japan Society for the Promotion of Science International Research Fellow with joint affiliation at United Nations University and the University of Tokyo (2013-2015).



How practices make norms: Autonomous and AI technologies in weapon systems

In March 2021, a United Nations report argued that the Kargu-2, a one-way attack drone, had been used to strike militias in Libya autonomously, that is, without direct human supervision and authorisation. In the war in Ukraine, both sides have used similar types of drones that appear to have the latent technical capability to identify, track, select, and strike targets autonomously. Such drones are examples of a growing number of weapon systems incorporating autonomous and AI technologies in targeting. This list includes air defence systems, active protection systems, guided missiles, stationary sentries, and counter-drone systems – demonstrating that the prospect of autonomous warfare is no longer science fiction. Although often discussed via the umbrella term autonomous weapon systems (AWS), most existing weapon systems cannot be neatly classified as AWS because they appear to be operated with humans ‘in the loop’ to authorise attacks. But the quality of control that human operators can exercise is already compromised. This is due to the complexity of the tasks human operators need to perform and the demands placed on them, for example in terms of speed and of overseeing multiple, networked systems. In the practices of warfare, algorithmic rather than human decision-making may therefore prevail, raising humanitarian, legal, ethical, and security concerns.

The keynote examines how this diminishing quality of human control results from a governance gap on AI in the military domain. The only international forum to discuss the topic of governance (since 2017) – the Group of Governmental Experts (GGE) on lethal AWS under the UN Convention on Certain Conventional Weapons (CCW) – has not produced substantive progress, for three reasons I will outline. In the absence of top-down governance, I argue that the use of autonomous and AI technologies in weapon systems makes norms. Practices of designing, of training personnel for, and of using such weapon systems shape a social norm, an “understanding of appropriateness” of what counts as the requisite form of human control over the use of force. These practices are typically performed at sites outside of the public eye. I will demonstrate how this emerging norm accepts a diminished, reduced, uncertain form of human control when interacting with autonomous/AI technologies as “normal” and “appropriate”. At the GGE debates, states have not scrutinised this emerging norm but have rather ignored it, distanced discussions from it, or even positively acknowledged it. I close by arguing that the emerging norm of diminished human control is both a societal challenge and a public policy problem because it undercuts human agency in warfare.