
Statement by Stop Killer Robots to the GGE on lethal autonomous weapons systems, 3-7 March 2025

Read the Stop Killer Robots statement to the Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS)

The following statement was delivered on Tuesday, March 4, 2025, by Stop Killer Robots Executive Director Nicole Van Rooijen to delegates participating in the meeting of the Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS), which took place on 3-7 March 2025.

Thank you Chair.

Stop Killer Robots welcomes the attempt to address anti-personnel systems in Box III, paragraph 6, section C, point v, on restricting use to objects that are military objectives by nature, but urges States to discuss and adopt explicit rules on anti-personnel systems in order to effectively meet many of the core legal and ethical challenges, including those around bias.

Stop Killer Robots is aligned with the position of the International Committee of the Red Cross on this issue.

From a technical perspective, distinguishing valid human targets is immensely more difficult than identifying objects that are military objectives by nature, such as tanks, munitions or military facilities, as valid targets. This is because there is no stable and universal ‘combatant’ target profile: identifying a combatant is a contextual, human judgment, which makes it impossible to build a software system that can accurately distinguish combatants from civilians. There is also the further problem of recognizing combatants who are hors de combat, injured or surrendering. The risk of IHL violations in the context of the use of autonomous anti-personnel weapons systems is therefore high, as is the risk to friendly military personnel who may be present where such weapons systems are in use.

Moreover, the intentional killing of a human being requires legal and moral justification. No machine, computer or algorithm is capable of recognizing a human as a human being, nor can it respect humans as inherent bearers of rights and dignity, understand what it means to be in a state of war, much less what it means to have, or to end, a human life. Decisions to end human life must be made by humans in order to be morally justifiable.

The current paragraph of the text (Box III, Paragraph 6, Section C), which addresses limiting the types of targets, duration, geographical scope and scale of the operation of AWS, could be extended to explicitly prohibit the targeting of personnel, and we urge States to consider making this addition.

Thank you Chair.

