Gender norms and patriarchal power structures affect the way we view, use, and engage with weapons, war, and violence. Autonomous weapons could be used to commit acts of gender-based violence, and could deepen inequality through algorithmic bias and target profiling.
New international law regulating autonomous weapons and banning systems that target people or operate without meaningful human control would support feminist foreign policy goals by focusing on human security and preventing the militarisation of emerging technologies.
What’s gender got to do with it?
Gender-based violence could easily be enacted by autonomous weapons that select and engage targets on the basis of target profiles. We have already seen target profiling based on gender in the use of semi-autonomous weapons such as armed drones, which have been used to target militants (or count them as legitimate targets in casualty recording) based on their appearance as “military-aged males”. In this case, assumptions about men as potential or active combatants reinforce gender norms regarding male violence, which in turn legitimises them as targets – feeding into the cycle of gender-based violence.
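To make the mechanism concrete, here is a minimal sketch in Python of how a “military-aged male” assumption becomes an automated rule once encoded as a target profile. Every name and threshold here is hypothetical, invented for illustration and drawn from no real system:

```python
from dataclasses import dataclass

@dataclass
class DetectedPerson:
    estimated_age: int      # inferred from sensor data, with unknown error
    estimated_gender: str   # inferred from outward appearance

def matches_target_profile(person: DetectedPerson) -> bool:
    """Hypothetical 'military-aged male' target profile.

    Note what is absent: nothing here checks whether the person is
    actually a combatant. The gendered assumption *is* the rule.
    """
    return person.estimated_gender == "male" and 16 <= person.estimated_age <= 65

# The rule misclassifies in both directions:
print(matches_target_profile(DetectedPerson(30, "male")))    # True, even for a civilian
print(matches_target_profile(DetectedPerson(30, "female")))  # False, even for a combatant
```

Once written down this way, the assumption stops being a judgment a human might revisit and becomes a condition a machine applies at scale.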
Proponents of killer robots argue that autonomous weapons wouldn’t get hungry or tired, wouldn’t feel pain, fear, or anger, and wouldn’t act in self-defence or make rash decisions in the heat of the moment. But as inanimate objects, such weapons systems would also lack empathy, conscience, emotion, and any understanding of human rights and human dignity.
These faculties of human judgment are crucial for making the complex ethical and moral decisions required of soldiers in combat.
The development and use of autonomous weapons would only further dehumanise warfare and killing, and perpetuate patriarchal structures of military violence.
Autonomous weapons would select and engage targets on the basis of sensor processing, rather than immediate human command. In essence, this would reduce humans to patterns of data or lines of code. This becomes even more dangerous when considering the bias that could be programmed into autonomous weapons.
Emerging technologies like facial and vocal recognition have been shown to fail at markedly higher rates when recognising women, people of colour, and persons with disabilities. Autonomous weapons that rely on these technologies would therefore pose a higher risk to these groups, and to anyone who does not fit within the ‘norm’ determined by the programmer.
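To illustrate how unequal failure rates translate into unequal risk, here is a small simulation sketch. The accuracy figures are invented for illustration; only the direction of the disparity mirrors published audits of commercial facial-analysis systems:

```python
import random

random.seed(0)

# Invented per-group recognition accuracy, echoing the pattern audits have
# found (higher error rates for under-represented groups); the specific
# numbers are illustrative only.
accuracy = {"well_represented": 0.99, "under_represented": 0.80}

def count_misidentifications(group: str, encounters: int = 10_000) -> int:
    """Count how often the model wrongly flags members of a group."""
    return sum(random.random() > accuracy[group] for _ in range(encounters))

for group in accuracy:
    errors = count_misidentifications(group)
    print(f"{group}: {errors} misidentifications per 10,000 encounters")

# Roughly 100 errors for the well-represented group versus roughly 2,000
# for the under-represented one: a twentyfold difference in risk. When the
# downstream decision is the use of force, that gap is borne by real people.
```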
Killer robots would not end sexual violence in conflict; they would more likely perpetuate it. Rape and sexual violence are already used as weapons of war, ordered by states and armed groups as a matter of strategic policy and a means of inflicting terror. Autonomous weapons, devoid of human compassion or doubt, would be even less likely than human soldiers to disobey an order to rape, lacking any conscience, empathy, or understanding of the act and consequences of sexual violence.
Securing a Feminist Future
In recent years, a small – but growing – number of governments have adopted feminist foreign policies, including Canada, France, Mexico, and Sweden. While these policies are being implemented to varying degrees and in various ways, ensuring meaningful human control over the use of force would support the feminist foreign policy approach and strengthen global peace and security.