Stephen Hawking: Weapons/AI Like Terminator Need A Ban – Is He Right?


According to Stephen Hawking, weapons driven by Artificial Intelligence, the kind that could produce killer robots like the Terminator, need to be banned. But are offensive autonomous weapons really likely to turn on us in the short term?

In a related report by the Inquisitr, Stephen Hawking’s weapons ban recommendation was part of an open letter signed by other notables like Elon Musk and Steve Wozniak.

“Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity,” they wrote. “We believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”

This is not the first time Hawking has warned against AI-based weapons. The scientist has compared Artificial Intelligence to nuclear weapons, and back in January he warned that it might be impossible to stop AI from quickly getting out of control.

“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand,” Hawking said. “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”

In short, Hawking fears humanity will be treated like the hapless protagonist of Ex Machina, a movie that explored the moral questions surrounding AI development. But the reason the Artificial Intelligence known as Ava could be dangerous in the first place was that it was housed in a completely autonomous humanoid body.

The first issue with Stephen Hawking’s weapons ban recommendation is that no one is even close to building such a system. Hawking is right that heading off the scenario is good foresight, but the idea that it might happen very soon seems unlikely.

All autonomous robots currently used for warfare are controlled remotely by humans. In addition, any supercomputer that even approaches the processing power of the human brain weighs hundreds of tons, occupies thousands of square feet of warehouse space, and demands massive cooling along with something close to a power plant’s output to fully simulate the human brain. This means the “offensive autonomous weapons” feared by Hawking are simply not possible… as of now.
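A rough back-of-envelope comparison shows the scale of that gap. The sketch below uses commonly cited ballpark figures, not measurements: roughly 10^18 operations per second is one frequently quoted estimate for brain-scale simulation, while a 2015-era flagship supercomputer such as Tianhe-2 delivered on the order of 34 petaFLOPS while drawing around 18 megawatts, against the brain’s roughly 20 watts.

```python
# Back-of-envelope scale comparison. All figures are rough, commonly
# cited estimates, used here purely for illustration.
BRAIN_FLOPS = 1e18        # assumed: ~1 exaFLOP/s for brain-scale simulation
BRAIN_POWER_W = 20        # the human brain runs on roughly 20 watts

SUPER_FLOPS = 3.4e16      # assumed: ~34 petaFLOP/s (Tianhe-2 class, 2015)
SUPER_POWER_W = 18e6      # assumed: ~18 MW facility power draw

compute_gap = BRAIN_FLOPS / SUPER_FLOPS
joules_per_op_machine = SUPER_POWER_W / SUPER_FLOPS
joules_per_op_brain = BRAIN_POWER_W / BRAIN_FLOPS
efficiency_gap = joules_per_op_machine / joules_per_op_brain

print(f"Raw compute shortfall: ~{compute_gap:.0f}x")
print(f"Energy efficiency gap: ~{efficiency_gap:,.0f}x in the brain's favor")
```

On these assumptions the raw compute shortfall is only tens of times, but the energy-efficiency gap runs to tens of millions, which is what the warehouse-and-power-plant description is really pointing at.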

Hawking’s warning requires that we reach the singularity, the point at which machine intelligence surpasses all the human brains in the entire world, not only in processing power but in true consciousness. The test conceived by Alan Turing requires that computer AI “produce more ideas than those with which it had been fed.” In addition, these AI-driven autonomous weapons would need to arrive at the idea that turning on their human masters is legitimate, while caring nothing for those humans’ continued existence.
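Turing’s original formulation was the imitation game: a judge converses blindly with a human and a machine and must tell them apart. The sketch below illustrates only the structure of that protocol; every function name in it is a placeholder, and the genuinely hard part, a machine_reply that fools the judge, remains the open problem.

```python
import random

def imitation_game(judge_ask, judge_guess, human_reply, machine_reply, rounds=5):
    """Run one imitation game. The judge questions respondents "A" and "B"
    (one human, one machine, assignment hidden) and then guesses which
    label is the machine. Returns True if the machine fooled the judge."""
    respondents = [human_reply, machine_reply]
    random.shuffle(respondents)                      # hide who is behind each label
    machine_label = "AB"[respondents.index(machine_reply)]
    transcript = {"A": [], "B": []}
    for _ in range(rounds):
        for label, reply in zip("AB", respondents):
            question = judge_ask(label, transcript[label])
            transcript[label].append((question, reply(question)))
    return judge_guess(transcript) != machine_label  # fooled if the guess is wrong
```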

If all of these factors do come together, then Hawking has a good point. It seems very unlikely that the simplistic AI tricks currently used in autonomous robots will suddenly become self-aware and turn into homicidal berserkers, but it may still be a good idea to ban any plans for combining all of these factors into one Terminator-like package.

At this point, some might claim that Isaac Asimov’s “Three Laws of Robotics” could save us. Rodney Brooks of iRobot, the company behind the real-life PackBot military robot and the Roomba vacuum cleaner, says such a programming trick is not a real-world solution to this problem.

“People ask me about whether our robots follow Asimov’s laws,” Brooks said, according to Gizmodo. “There is a simple reason [they don’t]: I can’t build Asimov’s laws in them.”
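To see why, consider what even the First Law (“a robot may not injure a human being”) would look like in code. The sketch below is hypothetical and is not anything iRobot ships: the rule itself is one line of trivial control flow, while everything difficult hides inside a predicate nobody knows how to implement.

```python
def would_harm_human(action, world_state):
    """First Law precondition. Answering it would require detecting every
    human in range, predicting the direct and indirect consequences of
    `action`, and recognizing "harm" in all its forms. No general solution
    to that perception-and-prediction problem exists, which is Brooks' point."""
    raise NotImplementedError("the hard part is this check, not the rule")

def first_law_filter(action, world_state):
    # The "law" itself is one line; the difficulty is entirely hidden
    # in the unimplementable check above.
    if would_harm_human(action, world_state):
        return None   # refuse the action
    return action
```

In other words, Brooks’ objection is not about the wording of the laws but about the missing perceptual machinery any such law presupposes.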

In the end, most Americans are not worried about killer robots running down their streets. A 2015 poll about the end of the world found that most people fear nuclear war or climate change. In the long term, Stephen Hawking’s weapons warnings are valid, but in the short term even zombies or an alien invasion rank higher than the Terminator.

[Image via YouTube]
