Threat of killer robots imminent, warns Musk

Scout robot: This machine could operate in places too perilous for humans. © Sergio Moraes

How can humanity control the terrifying prospect of autonomous weapons? Elon Musk has joined over 100 artificial intelligence experts in asking the UN to ban the use of “killer robots”.

“We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

These are the ominous words in a letter sent to the UN in recent days by 116 robotics experts, including Elon Musk and the head of Google’s DeepMind. They were warning about the dangers of “lethal autonomous weapons” — technology which is designed to kill without human supervision, most likely using artificial intelligence (AI).

The technology does not exist in this form yet, but countries including the USA, Russia and China are developing autonomous weapons — also known as killer robots. Some experts fear they could lead to a dystopian future, in which major countries compete in an arms race and governments feel forced to use mass surveillance on their own people.

Picture, for example, a quadcopter, just one inch in diameter, carrying a small charge. Last year Stuart Russell, a professor of computer science, said these could exist within two or three years; a few trucks full of them could kill millions. Robotics professor Noel Sharkey has suggested another “nightmare” scenario: a nuclear deterrent could be placed on an unmanned system and “go completely ballistic”.

This may sound like science fiction. But unmanned drones already exist. The next generation could use AI to target people without human action. “Really, no technological breakthroughs are required,” said Russell. “Every one of the component technologies is available in some form commercially.”

Hoping to avoid this, Musk and his colleagues have asked the UN to add killer robots to the list of banned weapons under the UN Convention on Certain Conventional Weapons (CCW).

The UN has agreed to debate it. But securing agreement could prove difficult. Many military leaders think the weapons could provide a cheap, effective way to protect their people, minimising the risk to their own troops. Russell has warned that discussions of a ban could soon become “academic”.


Secure a global ban now, some say. Once some countries develop these weapons, others will get scared and make their own. The best way to ensure these awful weapons are never used is to get rid of them. It is in everyone’s interests to invest in this process. And it is possible to reach an effective agreement: look, for example, at the ban on biological weapons in 1972.

A dangerous waste of time, retort realists. Organised criminals, terrorists and tyrants would ignore a ban, just as they ignore the ban on chemical weapons. The answer lies in strong nation states: democracies such as the USA and UK must invest their energy in developing the best weapons available, so they can defend people against them and deter those prepared to use them for ill.

You Decide

  1. Do you see autonomous weapons as a threat or an opportunity?
  2. Should killer robots be banned internationally?


Activities
  1. Think of a way a robot could help to address a problem you care about. Draw a sketch of the robot you would invent. Then compare with the rest of your class and explain five key features of your robot.
  2. Find out how the Chemical Weapons Convention (CWC) was agreed and how effective it has been. Then write a one-page memo explaining whether it works, and whether a ban on killer robots would work.

Some People Say...

“Peace only comes through strength.”

What do you think?

Q & A

What do we know?
Semi-autonomous weapons are already in use under human supervision. The United States, in particular, has launched drone strikes in countries such as Pakistan, Afghanistan and Syria since 2001 (it first used them in Afghanistan after 9/11). More such systems are being developed, but no weapon currently in use relies entirely on AI.
What do we not know?
It is impossible to predict how quickly the technology will evolve, and some of the more outlandish scenarios outlined here could take a while to become reality. We also do not know what the UN will conclude once it begins its debate. The last time it discussed this issue, in 2014, only five countries supported an outright ban. (Cuba, Pakistan, Egypt, Ecuador, and the Vatican, if you were wondering.)

Word Watch

Pandora’s box
In Greek mythology, Pandora’s box contained all the world’s evils, which were released once it was opened.
Elon Musk
The inventor and entrepreneur is chief executive of several technology companies, including Tesla and SpaceX. He has often warned about the dangers of AI.
DeepMind
An AI company based in London, which is currently leading research in the field.
The technology is evolving rapidly. South Korea has an autonomous gun on its border with North Korea. The USA has autonomous ships; the UK is developing a supersonic stealth drone.
Individuals could adopt the technology to defend their property. Police forces may also use it: in Dallas in July, a robot killed a gunman who was murdering police officers.
Unmanned drones
At least ten countries already use weaponised drones.
Several countries are resistant to a ban. Israel intends to move towards fully autonomous weapons. BuzzFeed’s Sarah Topol says Russia and China have “expressed little interest in a ban” and the USA “is only a little less blunt.”
