Will a Terminator takeover, led by a self-aware Skynet-like artificial intelligence, be the future in 2045? Some experts believe we'll reach that so-called singularity within our lifetimes.
In a related report by The Inquisitr, Terminator 5 is dropping the "Genesis" subtitle, although whether that means the plot has changed is unknown. What we do know is that Emilia Clarke is playing Sarah Connor, and that Doctor Who star Matt Smith and Hunger Games star Dayo Okeniyi are joining Terminator 5.
Louis Del Monte, the author of The Artificial Intelligence Revolution, believes the combined artificial intelligence of all the computers in the world will eventually become a singularity, a point in time when machine intelligence surpasses the combined intelligence of every human brain on Earth. He's also concerned that government officials are treating this potential scenario no more seriously than a popcorn movie:
"Today there's no legislation regarding how much intelligence a machine can have, how interconnected it can be. If that continues, look at the exponential trend. We will reach the singularity in the timeframe most experts predict. From that point on you're going to see that the top species will no longer be humans, but machines."
Del Monte believes the singularity will be reached sometime between 2040 and 2045. He's worried an AI may view humans as a kind of "harmful" insect that is "unstable, creates wars, has weapons to wipe out the world twice over, and makes computer viruses." The good news is that he believes we will not face a Terminator 5 scenario in real life:
"It won't be the 'Terminator' scenario, not a war. In the early part of the post-singularity world, one scenario is that the machines will seek to turn humans into cyborgs. This is nearly happening now, replacing faulty limbs with artificial parts. We'll see the machines as a useful tool. Productivity in business based on automation will be increased dramatically in various countries. In China it doubled, just based on GDP per employee due to use of machines. By the end of this century, most of the human race will have become cyborgs [part human, part tech or machine]. The allure will be immortality. Machines will make breakthroughs in medical technology, most of the human race will have more leisure time, and we'll think we've never had it better. The concern I'm raising is that the machines will view us as an unpredictable and dangerous species."
The biggest issue hinges on whether machines possess actual intelligence or merely cleverly programmed mimicry. It's noted that, in 2009, an experiment resulted in robots learning to lie to each other in order to keep precious resources for themselves, but that may have been just a side effect of the particular genetic algorithm used, not true intelligence. But Del Monte believes that distinction does not matter:
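To see how "lying" can fall out of a genetic algorithm without any intelligence at all, consider this toy sketch. It is loosely inspired by the 2009 experiment, but the payoffs, parameters, and function names here are invented for illustration, not taken from the actual study:

```python
import random

def fitness(signal_prob):
    # Honestly signaling a food source means sharing it with others;
    # staying silent keeps the full payoff. The 0.5 sharing cost is
    # an arbitrary, illustrative number.
    found = 1.0
    return found - signal_prob * 0.5 * found

def evolve(generations=200, pop_size=50, mutation=0.05, seed=0):
    rng = random.Random(seed)
    # Each genome is a single gene: the probability of signaling food.
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]  # truncation selection
        pop = [
            min(1.0, max(0.0, rng.choice(parents) + rng.gauss(0, mutation)))
            for _ in range(pop_size)
        ]
    return sum(pop) / len(pop)  # mean signaling probability

# Selection alone drives the signaling gene toward zero: "deception"
# emerges with no consciousness or intent anywhere in the system.
```

Because silence simply pays better than honesty in this setup, the deceptive behavior is a statistical artifact of selection pressure, which is exactly why critics hesitate to call it intelligence.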
"The implication is that they're also learning self-preservation. Whether or not they're conscious is a moot point."
That point was arguably put to the test recently, when media outlets widely reported that the Turing test had been passed by an AI simulating a 13-year-old Ukrainian boy named Eugene Goostman. But critics point out that Alan Turing really wanted a computer AI that could "produce more ideas than those with which it had been fed," yet "Eugene" managed to fool real humans with cleverly programmed tricks only about 30 percent of the time. It's also believed Eugene would not have fooled nearly as many judges if the AI had posed as an older person instead of a teenager, whose odd quirks could be passed off as immaturity.
The other issue is that current supercomputers are a far cry from the power of the human brain. To put things into perspective, Japan's K supercomputer can produce 10.5 PetaFLOP/s (a PetaFLOP being a quadrillion floating-point calculations per second), and its 83,000 CPUs took 40 minutes to simulate one percent of one second's worth of human brain activity. The K supercomputer is not the world's fastest supercomputer, but even the current champion, China's Tianhe-2, "only" produces 33.86 PetaFLOP/s. Some computer scientists talk about eventually simulating the human brain at full speed, but they're also talking about building a massive facility with huge cooling requirements and a nuclear power plant to provide the necessary energy. Meanwhile, human brains operate at full speed at much lower temperatures, fit in a relatively small box, and can be powered by beer while still beating out a machine AI.
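The size of that gap can be made concrete with a rough back-of-the-envelope calculation from the article's own figures. This sketch naively assumes simulation cost scales linearly with the fraction of the brain modeled, which is almost certainly too generous:

```python
# Figures from the article; everything below is a rough estimate.
simulated_brain_fraction = 0.01   # 1% of the brain's activity was modeled
simulated_time_s = 1.0            # one second of brain activity
wall_clock_s = 40 * 60            # 40 minutes of K-computer time

# How much slower than real time the 1% simulation ran:
slowdown = wall_clock_s / simulated_time_s              # 2,400x

# Naive linear scaling to a whole brain:
full_brain_slowdown = slowdown / simulated_brain_fraction  # 240,000x

# Compute implied for a real-time whole-brain simulation at this rate:
k_pflops = 10.5
required_pflops = k_pflops * full_brain_slowdown        # 2,520,000 PetaFLOP/s

print(slowdown, full_brain_slowdown, required_pflops)
```

Under these assumptions, real-time whole-brain simulation would demand on the order of millions of PetaFLOP/s, tens of thousands of times beyond even Tianhe-2, which is why the energy and cooling numbers get so extreme.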
Are you concerned the so-called singularity may cause supercomputers to attempt a Terminator takeover in our lifetimes?