Stephen Hawking Talks About A.I.


Professor Stephen Hawking recently viewed the new Johnny Depp movie, Transcendence, and it gave the renowned physicist plenty of food for thought. In fact, Hawking has concluded, according to an interview in The Independent, that artificial intelligence could prove to be the worst thing ever to happen to mankind.

The idea of artificially intelligent machines attempting to take over the world is nothing new to sci-fi fans. Over the years, movies like the Terminator series, the Matrix series, WarGames, Johnny Mnemonic, and many others have shown free-thinking machines causing mayhem.

According to Hawking, the potential for artificial intelligence to destroy humanity is far greater than many of us realize. A.I. has progressed substantially over the years, and the closer we come to a completely sentient machine, the closer we may come to disaster. Self-driving cars such as the ones legally allowed on California roads, and computers that win games using deductive reasoning, are just the beginning, he states.

Hawking went on to acknowledge the potential benefits, such as an end to disease, war, and poverty, but also an equal potential that the machines we are building to end wars could one day start them instead. In spite of efforts at the United Nations to ban them, some countries are attempting to create autonomous weapons even now. The primary concern? Hawking says that a thinking machine could figure out how to improve itself, even to the point of making humans obsolete. And, as plenty of sci-fi shows will tell you, once machines realize we’re not needed, they could decide to get rid of us.

Now, I know some of you may be thinking that Stephen Hawking is indulging in a bit of fantasy, and to some extent that may be true. But don’t forget that this is one of the most brilliant men of our time. He is one of the top mathematicians at Cambridge University, where he continues to contemplate physics, robots, and the fate of humanity. In his opinion, the only way we’re likely to thrive is to venture off into outer space and spread across the universe.

So if we do travel beyond this world as Hawking suggests, I’m thinking we probably won’t want to name the onboard computer something like HAL… just in case.