Will REAIM live up to its…aims?
Upcoming Summit provides platform for states to discuss commitment to responsible use of military AI in practice, but is unlikely to deliver.
The Responsible AI in the Military Domain (REAIM) Summit 2024 will take place in Seoul, Republic of Korea, from 9 to 10 September, co-hosted by the Kingdom of the Netherlands, the Republic of Singapore, the Republic of Kenya, and the United Kingdom of Great Britain and Northern Ireland. Last year’s inaugural REAIM Summit in The Hague saw the parallel launches of the ‘REAIM Call to Action’ and of the United States’ ‘Political Declaration on Responsible Military Use of AI and Autonomy’. The latter ostensibly outlined guidance on “responsible behaviour” for “states’ development, deployment, and use of military AI”, but fell woefully short of calling for the negotiation of new international law to prohibit and restrict autonomous weapons systems.
Since the last REAIM Summit, there have been numerous reported uses of military AI tools beyond autonomous weapons systems, including AI-powered ‘decision support systems’ in various contexts, notably by Israel in Gaza, where a system was used to suggest human targets to strike. Other states, including the United States in Iraq and Syria, have also employed such ‘decision support systems’ to identify military objects for targeting. In a context in which military use of AI has likely contributed directly to civilian death and suffering, the key questions for states seeking to shape norms on ‘responsible military AI’ are whether the current direction of reported practice is acceptable, and what the criteria for ‘responsible’ use are. If states cannot meaningfully address these questions, there seems to be little utility in these discussions.
Stop Killer Robots is deeply concerned by all uses of technology that erode meaningful human decision-making and control and that entail digital dehumanisation in the use of force. Alongside clear legal rules on autonomous weapons systems, which we have consistently called for, the international community must adequately address all developments in military AI and autonomy that threaten our safety, security, and humanity, and prevent unacceptable practices.
The REAIM Summit is also an industry event. It will feature an exhibition providing ‘a visual demonstration of how artificial intelligence can be applied in the military’, where attendees ‘will be able to see a wide range of technologies, from those currently available to those that will revolutionise the military in the future’. With participation from companies that are actively developing military AI technologies and AI-based weapons systems, the REAIM Summit foregrounds the perspectives and concerns of the military, defence, and tech industries: those with a vested interest in encouraging the further adoption of automation and artificial intelligence in the military space. The Summit’s slogan of ‘Responsible AI for a safer tomorrow’ seems to offer a narrow interpretation of both ‘responsible’ and ‘safer’. It also raises the question: safer for whom?
As the Convention on Certain Conventional Weapons (CCW) continues to stall and with the UN General Assembly (UNGA) on the horizon, states should not be distracted by initiatives that don’t critically and meaningfully engage with the very real consequences of real-world uses of AI and automated decision-making in the military. They must commit to urgently addressing unacceptable uses of AI and autonomy, including autonomous weapons systems, through legally binding prohibitions and regulations.
At the upcoming session of UNGA, states must use this inclusive forum – where progress cannot be blocked by a small minority – to adopt a resolution mandating negotiations on a legally binding instrument on autonomous weapons systems, building on all the progress that has been made on this topic. In his report to the General Assembly, the UN Secretary-General again called on states to conclude such negotiations by 2026: this is the key next step that states must focus on, and there is no reason to delay further.