
As Big Tech accelerates military uses of AI, civil society raises the alarm over digital dehumanisation
Stop Killer Robots at RightsCon Taipei 2025
Stop Killer Robots was in Taiwan from 24–27 February 2025 for RightsCon Taipei. This was Stop Killer Robots’ fifth year attending the conference, which brings together stakeholders working at the intersection of human rights and technology. RightsCon gathered thousands of activists, journalists, human rights defenders, whistleblowers, government officials, and industry stakeholders for a series of lectures, discussions and workshops.
With digital dehumanisation on the rise, Stop Killer Robots calls for new international law on autonomous weapons to regulate their use in conflict, policing, and border control, and to ensure that life or death decisions are never delegated, in part or in whole, to a machine.
In the months leading up to the event, multiple reports detailed how big tech companies, including Google and Microsoft, have been involved in developing AI systems for use by the Israel Defense Forces in Gaza. Further, on 5 February 2025, Alphabet, which owns Google, reneged on its pledge not to use AI for weapons. Despite the company’s stated approach to AI, which it claims is “grounded in understanding and accounting for its broad implications for people”, this policy decision increases the risks of automated harm in military, law enforcement, border control and surveillance contexts. These developments, along with the fact that a number of big tech companies – including Meta, Microsoft and Google – sponsored RightsCon 2025, led participants to express growing discontent with the military-industrial complex accelerating the development of dehumanising precursor technologies to autonomous weapons systems.
On 26 February, we hosted an online dialogue that delved further into the issue of digital dehumanisation as it relates to autonomous weapons. The session, titled ‘Don’t use my data against me: Taking action against digital dehumanisation and automated decision making’, highlighted the multi-billion dollar funding directed at tech companies to accelerate the development of automated and autonomous systems for military use. Speakers raised concerns about the disproportionate impact of datafication and war on the global majority. The hour-long session was moderated by researcher Sai Bourothu of Automated Decision Research in conversation with expert panelists Elke Schwarz (Professor of Political Theory at Queen Mary University of London and member of the International Committee for Robot Arms Control), Ishmael Bhila (Doctoral Researcher and Research Associate at Paderborn University), and Heramb Podar (Director, India Chapter of Encode).
The Stop Killer Robots team attended sessions on AI governance, technology-facilitated gender-based violence, internet shutdowns and censorship, and the urgent need for regulation, transparency, accountability, and human-rights-centred design. Given RightsCon’s focus on the intersection of human rights and technology, we highlighted the campaign’s key policy position: a prohibition on antipersonnel autonomous weapons – autonomous weapons should not target people. Visitors to our booth responded positively to this. They also voiced an overarching concern, across various themes, about the impacts of reducing people to data – or digital dehumanisation – and the need to act now. Through these fruitful discussions, we learned about other forms of digital dehumanisation within and beyond the military sphere, and found connections between our work and the breadth of issues RightsCon participants are seeking to address.
RightsCon participants also shared our hope that these technological threats to human rights can be averted. We highlighted the important work that states, in particular those from the Global South, have recently undertaken in the form of conferences and communiqués to reject digital dehumanisation and call for legal safeguards on autonomous weapons. Given the international community’s interest in and support for new international law, it is now time to take meaningful action to address autonomous weapons and ensure meaningful human control over the use of force.
Our time at RightsCon demonstrated that civil society – working across a range of issues – shares our concern about the extreme digital dehumanisation that autonomous weapons systems threaten. In 2025, with autonomy in weapons systems growing and Big Tech helping to accelerate it, the time to act is now.