Science and Tech
Should We Ban Killer Robots?

David Rodgers | 6 May 2015 13:00

A man in Colorado Springs recently made headlines across the internet by venting his frustrations on a defenseless personal computer, emptying his pistol into the device. Just days prior, the international community came together to ask: would, or should, we ever allow the machines to reply in kind, either in self-defense or acting on human orders?

On April 14th, leading military, scientific and legal experts met in Geneva for a five-day round of talks, the second United Nations meeting of experts held under the Convention on Certain Conventional Weapons, involving 90 member states. The talks were called to establish how best to regulate the potential future development and deployment of Lethal Autonomous Weapons, or LAWs. In other words, should we create robots to be used as soldiers? And if so, how should we control their use?

What is a Lethal Autonomous Weapon?

Unlike current remote-controlled drones, LAWs are armed, autonomous machines capable of independent action without a controlling operator.

Professor Noel Sharkey, a roboticist and leading member of the Campaign to Stop Killer Robots, explains: "According to the US Department of Defense, a LAW is a military weapon that, once activated, will engage targets without any further intervention. And what that means is that it will track and select its own targets to kill them or apply violent force without meaningful human control. In essence, a killer robot."

Organizations such as the Campaign to Stop Killer Robots, Amnesty International and Human Rights Watch have been appealing to member states to ensure that future autonomous weapon systems are always subject to "meaningful human control."

To be clear, we're not talking about sentient killing machines, either. A Roomba vacuum-cleaning robot, for instance, is autonomous: it can change its own cleaning route when something like your foot is in its way, or stop if it encounters a steep drop like a flight of stairs. Autonomous military machines could similarly make independent decisions in the field based on sensory input and pre-planned objectives, without constant supervision and guidance from expensive and inefficient human operators.
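As a rough illustration of what "autonomous" means here, the short Python sketch below reduces such a machine to a sense-decide-act loop. Everything in it is hypothetical; the sensor names and probabilities are invented for the example, not taken from any real robot's software. The point is that fixed rules acting on sensory input, not sentience, are enough to produce independent behavior.

```python
# A minimal, purely illustrative sense-decide-act loop.
# None of these names belong to a real robot's API.

import random


def read_sensors():
    """Stand-in for real sensor input: a bump detector and a cliff detector."""
    return {
        "bump": random.random() < 0.10,   # e.g. someone's foot in the way
        "cliff": random.random() < 0.05,  # e.g. the top of a staircase
    }


def decide(sensors):
    """Map sensory input to an action, with no human in the loop."""
    if sensors["cliff"]:
        return "stop"      # steep drop ahead: halt entirely
    if sensors["bump"]:
        return "turn"      # obstacle: pick a new route
    return "forward"       # nothing unusual: continue the planned path


if __name__ == "__main__":
    for step in range(10):
        action = decide(read_sensors())
        print(f"step {step}: {action}")
```

Swap the cleaning route for a patrol route and the bump sensor for a targeting system, and the same loop describes the machines under debate in Geneva.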

How close are we to the killer robots seen in movies?

The subject of armed automatons has captured the public imagination in both heroic and villainous guises throughout modern science fiction, from Hollywood feature films such as the Star Wars prequels to, most recently, Marvel's Avengers: Age of Ultron. But one of the most pertinent current examples of killer machines on film comes from last year's RoboCop (2014). During the film's introductory vision of 2028, a patrolling platoon of US military robots deployed in a fictional Iranian suburb is confronted by a team of insurgents; the machines use target recognition and threat evaluation software to identify and destroy their targets.

However, Prof. Sharkey doubts that recognition software sophisticated enough to accurately discern legitimate targets will be developed at the same pace as other aspects of LAW development. He said, "At the moment, these weapons could not be discriminate, and the principle of distinction is the cornerstone of the Geneva Conventions. The ability to recognize civilians, surrendering or wounded soldiers, we just don't have that at the moment. It doesn't seem likely that'll happen quickly... Current smart targeting systems are designed so you don't waste ammunition."

"These systems can see and accurately identify a tank by its silhouette if it's in an open desert, but as soon as you put it near trees or woodland, it can't tell the difference between the tank and a civilian truck."

"Another problem is the concept of proportionality. When deciding what is a target, and what amount of force to apply, human commanders must make decisions in complex situations, weighing humanitarian needs with military advantage. You can't make an algorithm for that; it requires a human to make that call."
