New fighter plane heralds era of robot wars
Should robots be used in war? The US Air Force wants to stage a battle between a robot fighter plane and a piloted one. But many warn that any use of AI in war could lead to disaster.
It would be just like a video game. A high-stakes spectator sport.
Armies of autonomous machines would face off on a pre-arranged battlefield. Countries would be invaded and history rewritten, all without any violent loss of life.
Why put a person in harm’s way when you could use a machine?
Such real-life robot wars are not out of the question.
Just this month, Lieutenant General Jack Shanahan of the Pentagon announced what he called a “bold, bold idea”.
The US Air Force plans to pit an autonomous fighter jet against one piloted by a human in 2021, a huge step towards a new age of warfare.
But for many, the prospect of robots that could kill represents the opposite of progress.
“Why do people think it is okay to create machines that on their own can target and kill?” asks Nobel Prize winner Jody Williams, a supporter of the Campaign to Stop Killer Robots.
Until now, there has always been a human in the “kill chain” – the series of decisions that must be taken before a trigger is pulled.
For example, drones controlled from the US are regularly used to fire missiles at suspected terrorists on the other side of the world.
Paul Scharre, an expert on autonomous weapons, argues that militaries should always keep humans involved. People can spot errors, understand context and get “inside the mind of an enemy commander”.
Nonetheless, automation is already everywhere and the military is no different. It helps focus cameras, steer missiles, and identify enemies.
Some AI workers are trying hard to stop that slide towards full autonomy. Laura Nolan resigned from Google after being asked to work on a new military drone. She warns that autonomous weapons “have to be banned because they are far too unpredictable and dangerous”.
It is this uncertainty about what a killer robot could one day do that alarms many researchers.
Military scholars in China have even written about “battlefield singularity”, a point at which warfare happens faster than humans can notice or even understand.
“If technology grows at a faster rate than our wisdom,” warns MIT Professor Max Tegmark, “it’s kind of like going into a kindergarten and giving the kids a bunch of hand-grenades.”
Despite these risks, many countries continue to pursue research into automated weapons.
In 2016, then US Deputy Secretary of Defence Robert Work explained why.
“If our competitors go to Terminators,” he said, “and it turns out the Terminators are able to make decisions faster, even if they’re bad, how would we respond?”
So, should robots be used in war?
Yes. Using autonomous technology in battle could keep soldiers safer and allow them to be more precise and effective. Ultimately, one could imagine armies of droids confronting each other while humans, like the generals of historical battles, watch from a safe distance. A safer, quicker form of warfare is inevitable. Those who fail to embrace it will lose the battles of the future.
No. We do not understand artificial intelligence well enough to use it without enormous risk. That is why so many normally tech-friendly experts are calling for killer robots to be banned – and we should listen to them. Even in the idealistic scenario of robots fighting robots, the smartest robots will know that the key to victory is to destroy those who design and control their opponents.
- Would you prefer sending real people or robots into battle to defend your country?
- Is there a difference between being killed by a robot or by another human? Why?
- Make a list of all the artificial intelligence that makes life easier for you right now.
- Write three laws that you would pass in order to limit the danger posed by robots.
Some People Say...
“If AI has a goal and humanity just happens to be in the way, it will destroy humanity as a matter of course without even thinking about it. No hard feelings.” – Elon Musk, technology entrepreneur
What do you think?
Q & A
- What do we know?
- According to the Bureau of Investigative Journalism, drones have killed between 2,400 and 3,900 people in Pakistan since 2004, at least 400 of whom were civilians. Killer robots are already in development. The US Navy is developing a gunboat described as a “completely autonomous watercraft equipped with artificial intelligence capabilities”. Meanwhile, Russia’s T-14 Armata tank will be unmanned and autonomous.
- What do we not know?
- We do not know how killer robots might be regulated. As Paul Scharre points out, “They can’t be observed and quantified in quite the same way as, say, a 1.5-megaton nuclear warhead.” Indeed, the difficulty of defining exactly what constitutes an autonomous weapon in a world of algorithms and software is one of the main stumbling blocks to banning them.
- Autonomous
- Something that can function and make decisions without any human control. As of now, all robots are pre-programmed to make certain decisions. Artificial general intelligence (a machine that can truly think and act for itself) is yet to be reached.
- The Pentagon
- The headquarters of the US military. A huge five-sided complex in Arlington, Virginia, just outside Washington DC, it is the world’s largest office building and takes its name from its shape.
- Campaign to Stop Killer Robots
- A coalition of non-governmental organisations and leading individuals who seek to enact a ban on lethal autonomous weapons – even before these are developed. Thirty countries so far support its demands.
- Singularity
- In artificial intelligence theory, the singularity is the moment that an AI overtakes human intelligence and rapidly exceeds what we can think of or imagine. At that moment, technological growth would become uncontrollable and would likely transform the world.
- MIT
- Massachusetts Institute of Technology, a private research university in Cambridge, Massachusetts.
- The Terminator
- A 1984 movie about a powerful humanoid robot sent from the future to assassinate someone. Starring Arnold Schwarzenegger, it is seen as a classic of the sci-fi genre.