
A.I. experts say killer robots are the next 'weapons of mass destruction'

The choice of despots everywhere.


A former Google software engineer is sounding the alarm on killer robots.

Laura Nolan resigned from Google last year when the tech giant started working with the U.S. military on drone technology, and since then, she has joined the Campaign to Stop Killer Robots, warning that autonomous robots with lethal capabilities could become a threat to humanity.

Discussions about a possible ban on autonomous weapons fell apart on August 21 during a United Nations meeting in Geneva, when Russian diplomats reportedly objected to the language used in a document meant to begin the process of establishing a ban.

“If you’re a despot, how much easier is it to have a small cadre of engineers control a fleet of autonomous weapons for you than to have to keep your troops in line?” Nolan tells Inverse. “Autonomous weapons are potential weapons of mass destruction. They need to be made taboo in the same way that chemical and biological weapons are.”

Even if it’s not a despot or a terrorist group taking advantage of this technology, a lethal robot that’s merely tasked with guarding a facility, for example, could unintentionally cause major problems.

Through her work, Nolan has often seen that the kind of software that would control autonomous weapons “behaves in ways we don’t intend.” Because these are lethal machines operating in an open environment, Nolan says they could “cause significant harm if they malfunction.”


Nolan isn’t alone in fearing the possible consequences of developing autonomous robots that are capable of killing. Elon Musk has repeatedly warned about the dangers of killer robots, and back in 2017, Musk and Alphabet’s Mustafa Suleyman led a group of robotics and AI experts who called on the United Nations to ban the use of this technology. They referred to such robots as “weapons of terror.”

“We do not have long to act,” the experts wrote in an open letter. “Once this Pandora’s box is opened, it will be hard to close.”

Google committed to not developing artificial intelligence for weapons last year after public outcry over its work with the U.S. military.

As depicted in the 2017 short film Slaughterbots, we’re not necessarily talking about humanoid robots with guns for hands. A killer robot might be a flying quadcopter, an autonomous tank, or something along those lines. If a weapon operates autonomously, there is an inherent risk in using it. These robots could be hacked by dangerous groups, or they could simply start behaving abnormally and wreak havoc on a town or city.

Nolan believes all nations should sign a treaty pledging not to develop killer robot technology, much as 193 nations have agreed to ban the use of chemical weapons since the late 1990s.

“We need to get these countries to understand the reality that autonomous weapons are not a strategic advantage: they would soon be developed by or sold to a plethora of nations,” Nolan says. “This is similar to drone warfare: they provided an advantage to a small number of states initially, but now have proliferated widely.”


We’re truly only beginning to imagine the implications this kind of technology could have. Nolan says one thing that keeps her up at night is research like a study published in WIREs last year, which focuses on how facial recognition is getting better at identifying people from certain ethnic groups and suggests this could be useful for “border control, customs check, and public security.” Applied to killer robots, it’s not hard to imagine how such capabilities could be abused, with specific ethnic groups targeted by these killing machines.

The Campaign to Stop Killer Robots has been working on this issue since it formed in 2012. The group says autonomous weapons “would lack the inherently human characteristics such as compassion that are necessary to make complex ethical choices.” Thus far, 28 countries are backing its call to ban the use of killer robots, including Mexico, Brazil and China.

Mary Wareham, global coordinator at the Campaign to Stop Killer Robots, tells Inverse that the main obstacles to a ban on killer robots have been Russia and the United States. She says talks about a ban have been “held hostage” by the two countries, which claim it is “premature” to be considering one.


“Diplomacy to deal with the killer robots threat is, unfortunately, going nowhere at the moment,” Wareham says. Even so, she says the campaign continues to gain support outside of the U.N.

Wareham says that even if the U.S. elects a president who agrees these robots need to be banned, Russia will still need to get on board. Countries might adhere to a ban without joining the treaty, she says; the U.S. never signed the landmine treaty but has largely given up using landmines. Still, it would be better if every country signed on. As things stand, the situation may be moving in the wrong direction.

“I think it’s pretty obvious, in terms of the developments that you see today, that the money is being sunk into autonomous weapons systems,” Wareham says.

Back in February, the U.S. military began seeking out vendors that could help it develop machine learning and artificial intelligence technology so that ground vehicles can autonomously target enemies. Current military policy dictates that a human must be involved in any decision to fire on an enemy, but that policy could soon change.

Countries will almost certainly start using killer robots if a ban is not put in place. Wareham says she worries that countries might not act until something terrible has already happened.

“Every day states delay moving to regulate is a day we get closer to pointing at a weapons system and saying, ‘That’s the killer robot we’ve been talking about,’” Wareham says. She says she doesn’t want to wait until there’s been a “mass casualty event” to get a ban done.
