Davos considers killer robots
Concerns over fully autonomous weapons or “killer robots” have been raised at the 2015 World Economic Forum in Davos, Switzerland, with various publications and speakers addressing the matter. This is the first time that the annual meeting of world leaders from government, business, and civil society has considered the challenges posed by weapons systems that select targets and use force without further human intervention.
Several publications issued in advance of the 2015 World Economic Forum looked at the challenge, including the Forum’s 2015 Global Risks report, which includes a brief section under “artificial intelligence – rise of the machines” (page 40). According to the report, “several nations are working towards the development of lethal autonomous weapons systems that can assess information, choose targets and open fire without human intervention” which raises “new challenges for international law and the protection of noncombatants.” The report finds a lack of clarity as to who would be accountable if an autonomous weapons system violates international law and notes that “proactive and future-oriented work in many fields is needed to counteract ‘the tendency of technological advance to outpace the social control of technology.’”
These findings are repeated in a 15 January post published by the Forum and written by Professor Stuart Russell of the University of California, Berkeley, who earlier this month signed an open letter urging research on how to reap the benefits and potential of artificial intelligence “while avoiding potential pitfalls.” Russell co-authored a research priorities memo, issued together with the open letter, that lists questions over autonomous weapons among the concerns to be studied.
At a 22 January World Economic Forum panel on how advances in artificial intelligence, smart sensors, and social technology will “change lives,” Russell warned that fully autonomous weapons could change the nature of warfare completely and asked the audience, “is this the direction we want to go? I’m not so sure.”
The hour-long, all-male panel was moderated by Hiroko Kuniya, an anchor for the Japan Broadcasting Corporation (NHK).
Another speaker on the panel was Ken Roth, the executive director of Human Rights Watch, a co-founder of the Campaign to Stop Killer Robots. He addressed legal, ethical, and proliferation concerns and asked whether machines have “the refinement of judgment to decide on who is a combatant.” Roth described the “key impediment to war” as “human empathy” and expressed skepticism that “better programming” could address the concerns raised, warning, “there are moral judgments at stake.”
Roth observed that for certain technologies “it’s much better to keep the genie in the bottle,” because “once it’s out, it’s much harder to get back in,” giving the example of nuclear weapons. He urged a preemptive ban on fully autonomous weapons and pointed to the positive example set 20 years ago by the Convention on Conventional Weapons, when nations preemptively banned blinding lasers because they found the prospect of weapons that would permanently blind so appalling.
One World Economic Forum participant who attended the panel was Norwegian businessman Johan H. Andresen, chair of the ethics council of the Norwegian Petroleum Fund. In an interview with the Norwegian business daily DN (Dagens Næringsliv), Andresen said the prospect of fully autonomous weapons concerns him. On the recommendation of the ethics council, the Fund has divested from companies involved in the production of antipersonnel landmines, cluster munitions, and nuclear weapons. When asked if the council should propose excluding producers of autonomous weapons, Andresen said he hopes such a decision will not need to be taken if efforts to preemptively ban these weapons are successful.
For more information:
- World Economic Forum “A Brave New World?” panel and media coverage
- Human Rights Watch publications on fully autonomous weapons