National campaigning against killer robots
This round-up of recent national-level actions supporting the call to preemptively ban fully autonomous weapons systems covers developments in Australia, Belgium, Canada, Denmark, the Holy See, Italy, the Netherlands, the UK, and the US in November-December 2017.
Australia
On 2 November 2017, more than 120 members of the Australian AI research community wrote to Prime Minister Malcolm Turnbull to ask Australia to endorse the call to ban lethal autonomous weapons systems and commit to working with other states to conclude a new international agreement that achieves this objective. The signatories called on Australia to “take a firm global stand against weaponizing” artificial intelligence by prohibiting weapons systems that remove meaningful human control from determining the legitimacy of targets and deploying lethal force.
Professor Toby Walsh of the University of New South Wales organized the sign-on letter and made the case in The Conversation for adding autonomous weapons to the list of morally unacceptable weapons, alongside chemical and biological weapons. The letter to Turnbull received widespread coverage, including in ABC News, news.com.au, the Independent, the Guardian, and the Sydney Morning Herald. In July 2017, Australia’s Defence Industry Minister Christopher Pyne announced $50 million in funding for a seven-year research project into military applications of autonomy and artificial intelligence.
Belgium
On 6 December 2017, 116 scientists working in fields including artificial intelligence, robotics and computer science issued an open letter expressing serious concern at the development of weapon systems lacking meaningful human control over the critical functions of targeting and engagement in every attack. The letter, organized with the support of Campaign to Stop Killer Robots member Pax Christi Flanders, calls on the government and parliament of Belgium to support an international ban on such weapon systems and to resolve as a nation never to develop, acquire or deploy them.
The Belgian letter was issued on the same day as the Belgian parliament’s first hearing on the topic of fully autonomous weapons. Among others, parliamentarians heard from the defense industry and a Ministry of Defense representative, who dismissed the need to ban fully autonomous weapons. Three Belgian scientists and Daan Kayser of PAX addressed the hearing’s second session, and all called on Belgium to support the demand for a ban. The open letter and hearing received extensive media coverage, including in De Standaard, VTM Nieuws, Nieuwsblad, De Morgen, and VRT NWS.
A draft resolution that is expected to be voted on in the coming weeks asks the Belgian government to impose a national ban on killer robots and to join international efforts to negotiate a new legally binding instrument. This Green Party proposal has received support from the Social-Democratic group, while a Christian-Democratic member of the parliamentary committee on defence has also expressed support for a ban on weapons systems that lack meaningful human control. Back in 2015, Belgium’s current Minister of Defence, Steven Vandeput, declared that killer robots give him “cold shivers” and stated that he does not favor such weapon systems.
Canada
On 2 November, more than 200 Canadians working in the field of artificial intelligence, including AI pioneers Geoffrey Hinton and Yoshua Bengio, issued an open letter to Prime Minister Justin Trudeau urging Canada to support the call to ban lethal autonomous weapons systems and to commit to working with other states to conclude a new international agreement that achieves this objective. The letter received significant media attention, including on national radio and television as well as in print outlets such as the Canadian Press wire service, Ottawa Citizen, and Toronto Sun.
Professor Ian Kerr, Canada Research Chair in Ethics, Law and Technology at the University of Ottawa, co-authored the letter. In a 6 November article for Toronto’s Globe and Mail, Kerr welcomed the unprecedented $125 million the Canadian government has committed to investing in AI research, but cautioned that “Leading in AI also means acting responsibly about it.”
On 15 November 2017, Kerr joined the Canadian Red Cross, Mines Action Canada and various Canadian AI experts in the first civil society consultation on lethal autonomous weapons systems convened by Global Affairs Canada, the country’s foreign ministry. Officials from Global Affairs and the Department of National Defence told the consultation that Canada believes “it is premature to ban a technology that doesn’t exist” and said, “we cannot draw the line without knowing what we are dealing with.” They pointed to page 73 of Canada’s 113-page 2017 Defence Policy Review, which states that “The Canadian Armed Forces is committed to maintaining appropriate human involvement in the use of military capabilities that can exert lethal force.”
Denmark
On 5 December 2017, Denmark’s largest newspaper Politiken published an article by Campaign to Stop Killer Robots coordinator Mary Wareham of Human Rights Watch, asking why Denmark is not actively working to address the serious concerns raised by fully autonomous weapons. A few days before, Danish academic Rune Saugmann published an article in foreign policy magazine Ræson describing Denmark’s “glaring absence” from the recent UN meeting on lethal autonomous weapons systems. Denmark last spoke on the topic in April 2015.
Wareham addressed a Danish Institute for International Studies (Dansk Institut for Internationale Studier, DIIS) seminar on “the politics of lethal autonomous weapons systems” in Copenhagen on 27 November 2017. This was the institute’s second seminar on the topic since 2016. In March 2017, DIIS published a four-page policy brief recommending that the Danish government support international efforts to ban or strictly regulate the development and use of lethal autonomous weapons. Danish television and radio outlets also covered the call for Denmark to take a strong stance on killer robots. In response, the Danish defense minister told Zetland Magazine that Denmark is unable to take a stand on the challenge “independently of what will happen internationally.”
Holy See
On 11 November, six Nobel Peace laureates attending a high-level symposium on nuclear disarmament at the Vatican provided His Holiness Pope Francis with a two-page statement that includes a call to ban fully autonomous weapons before they appear on the battlefield. In the statement, Nobel laureates Mohamed ElBaradei, Mairead Maguire, Adolfo Pérez Esquivel, Jody Williams, and Muhammad Yunus affirm that “it is imperative to ask ourselves how ethical and moral human beings can possibly believe that it is fine to give machines the ability to kill humans.”
Since 2014, the Holy See has supported the call to preemptively ban lethal autonomous weapons systems. In November, the Caritas in Veritate Foundation published a working paper entitled “The Humanization of Robots and the Robotization of the Human Person” that contains all the Holy See’s UN statements to date on the topic of killer robots.
Italy
On 6 December 2017, the lower house of the Italian parliament debated concerns over fully autonomous weapons systems. The Chamber of Deputies voted to approve a resolution proposed by the Democratic Party (PD), which forms the majority government, calling on Italy to work “towards” a moratorium on lethal autonomous weapons systems and to cooperate with like-minded states in the international discussions. The debate coincided with the publication of a VICE Italia article by Philip di Salvo.
Four other motions on the topic were debated and dismissed during the session. “Rete Italiana per il Disarmo” or the Italian Network for Disarmament, a member of the Campaign to Stop Killer Robots, called on parliamentarians to support a resolution introduced by Deputy Stefano Quintarelli of Scelta Civica Group in May 2017 that demands the government support a moratorium on lethal autonomous weapons systems. Another motion proposed by right-wing party Lega Nord suggested that the Italian defense sector set aside space for future production of lethal autonomous weapons systems.
Netherlands
On 13 November, Campaign to Stop Killer Robots co-founder PAX published new research on autonomous weapons. The 22-page report “Where to draw the line” by Frank Slijper documents the trend towards increasing autonomy in weapon systems by identifying systems, such as loitering munitions, autonomous fighter aircraft, and automated ground systems, in which the ‘critical’ functions of selecting and attacking targets are automated with varying levels of human control. The 58-page report “Keeping Control” by Daan Kayser provides an overview of the positions of European states on lethal autonomous weapon systems, including on the call for a ban and on how to ensure weapons systems remain under meaningful human control.
United Kingdom
On 28 November, Professor Noel Sharkey, chair of the International Committee for Robot Arms Control, addressed a House of Lords select committee hearing on artificial intelligence. The committee is looking at whether the United Kingdom should support a ban on the development and deployment of fully autonomous weapons. Sharkey highlighted the opportunity for the UK to take a leadership role in seeking a prohibition on autonomous weapons systems or, at a minimum, to develop a national policy explaining what the UK considers to be legitimate human control of weapons systems.
On 9 November, Campaign to Stop Killer Robots members Article 36 and the United Nations Association – UK wrote to UK Foreign Secretary Boris Johnson to urge the government to shift its position on the call to ban killer robots by working to define a global standard for the level of human control necessary in weapons systems. Article 36 also published a discussion paper that considers how national-level weapons review processes could provide an implementation mechanism for an international legal commitment to ensure adequate human control over weapons systems.
On 2 November, Sharkey and other members of the Campaign to Stop Killer Robots spoke on a panel on lethal autonomous weapons at a WIRED Live conference in London. Machine learning company DeepMind’s co-founder Mustafa Suleyman introduced the panel and affirmed his company’s support of the call to ban these weapons systems.
United States
“Slaughterbots,” a fictional film of 7 minutes and 47 seconds by artificial intelligence expert Professor Stuart Russell of the University of California, Berkeley, has been watched more than two million times since its release on 12 November. Russell first screened the film at a Campaign to Stop Killer Robots briefing for delegates attending the Convention on Conventional Weapons (CCW) meeting on lethal autonomous weapons systems in Geneva last month. The film has been translated into multiple languages, generating a slew of media coverage around the world. The Boston-based Future of Life Institute funded production of the film and has created a new website to encourage other actions in support of the ban call: http://autonomousweapons.org.
On 15 November, Campaign to Stop Killer Robots representatives met with the US delegation to the CCW meeting on lethal autonomous weapons systems. The delegation said that the US supports the CCW process on this topic, but believes a legally binding instrument or political declaration would be premature at this time. Department of Defense representatives confirmed that the department’s 2012 policy on autonomy in weapons systems (Directive 3000.09) has been renewed for the next five years with no substantive amendments and a few “administrative updates.” The policy will expire in November 2022. The adjusted policy is available online with tracked changes showing where amendments were made.