Thousands of scientists and engineers pledge not to help build killer robots

Some 2,400 individuals and 160 companies have signed a pledge promising not to assist in the development of lethal autonomous weapons systems and calling on governments to restrict these weapons.

The scientists and engineers – including leading tech figures – promised to “neither participate in nor support the development, manufacture, trade or use of lethal autonomous weapons”. This refers to machines which use artificial intelligence (AI) to identify, target and attack people without human oversight.

“We the undersigned agree that the decision to take a human life should never be delegated to a machine. There is a moral component to this position, that we should not allow machines to make life-taking decisions for which others – or nobody – will be culpable,” the pledge stated.

The pledge says that lethal autonomous weapons could be “dangerously destabilising” for countries and risk becoming “powerful instruments of violence and oppression, especially when linked to surveillance and data systems” or becoming part of a dangerous and unprecedented arms race.

The pledge was coordinated by the US-based Future of Life Institute and announced at the International Joint Conference on AI in Stockholm, Sweden. The 2,400 individual signatories represent 90 countries and include celebrity industrialist Elon Musk, Google DeepMind’s Demis Hassabis and Skype founder Jaan Tallinn. A further 160 companies and other institutions working with AI have also signed the pledge, including University College London, Google DeepMind and the Xprize Foundation.

“I’m excited to see AI leaders shifting from talk to action, implementing a policy that politicians have thus far failed to put into effect,” said Max Tegmark, president of the Future of Life Institute and a signatory, in a statement. “AI has huge potential to help the world if we stigmatise and prevent its abuse.

“AI weapons that autonomously decide to kill people are as disgusting and destabilising as bioweapons, and should be dealt with in the same way.”

The pledge calls on governments to move towards restricting the development of autonomous weapons; this could include refraining from funding the creation of these systems or introducing regulations to halt their development. Last year, the Future of Life Institute organised an open letter calling on the UN to ban the development and deployment of lethal autonomous weapons.

Internationally agreed restrictions on these weapons – much like those on chemical and biological weapons, blinding lasers and landmines – have been under discussion at UN meetings, although the government of Russia has already indicated that it would ignore these restrictions.

While nothing approaching the fictional Terminator robot is likely to be seen soon, autonomous military technology already exists and is under active development in a number of countries.

At the Farnborough airshow this week, defence secretary Gavin Williamson announced a multibillion-pound project to develop a new RAF fighter – the Tempest – which will be capable of flying unmanned, autonomously hitting targets and using concentrated energy beams to inflict damage. The government has stated that human operators will always have oversight over all weapons systems.

Earlier this year, Google came under internal and external fire over its Pentagon contract (Project Maven) to provide machine-learning technology to help the armed forces identify drones. Internal rebellion led to more than 4,600 Google employees signing a petition calling for an end to the involvement, while more than a dozen employees resigned in protest. In response to the critics, Google later announced that it would not be renewing the contract and released a set of company guidelines on AI ethics.

Amazon and Microsoft have also been criticised for working with defence and security agencies, while an international boycott by academics of KAIST University in South Korea forced the university leadership to pledge not to assist in the development of lethal weapons that could kill without human oversight.
