Intel this week announced a collaborative research initiative to encourage experimentation with its new self-learning research chip, codenamed Loihi. The chip mimics how the brain functions, learning to operate and make inferences based on data from its environment.
On March 1, the company hosted the Neuro Inspired Computational Elements (NICE) workshop at its Oregon campus. Researchers from different scientific disciplines attended the workshop to explore the development of next-generation computing architectures, including neuromorphic computing.
Intel has now invited applications from research collaborators who are working on a variety of applications including sensing, motor control, and information processing.
Neuromorphic computing, besides raising the bar for artificial intelligence, aims to solve real-world problems that normally take months or even years to compute.
The Loihi chip, which is made up of digital circuits, mimics the brain’s basic mechanics and uses less computing power to accelerate machine learning.
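To give a flavor of the brain mechanics the chip imitates: Loihi is built around spiking neurons, which integrate incoming signals over time and fire only when a threshold is crossed, staying silent (and consuming little power) otherwise. The toy sketch below illustrates that basic idea with a leaky integrate-and-fire neuron; the parameter values are arbitrary illustrations, not Loihi’s actual design.

```python
# Illustrative sketch only: a toy leaky integrate-and-fire (LIF) neuron,
# the basic building block of spiking architectures like Loihi's.
# The neuron leaks voltage each step, integrates its input current,
# and emits a spike (then resets) when it crosses a threshold.
# All parameter values here are made up for illustration.

def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return a spike train (list of 0/1) for a sequence of input currents."""
    v = 0.0          # membrane potential
    spikes = []
    for current in inputs:
        v = v * leak + current   # leak, then integrate the new input
        if v >= threshold:       # threshold crossed: fire and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)     # no spike -- effectively no activity
    return spikes

# A steady sub-threshold input produces occasional spikes, not constant output:
print(simulate_lif([0.4] * 10))  # → [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Because computation happens only when spikes occur, such event-driven designs can sit mostly idle, which is one intuition behind the chip’s efficiency claims.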
For instance, it could enable first responders to use image-recognition applications to analyze streetlight camera footage and quickly resolve cases involving missing or abducted persons.
Neuromorphic computing, which makes decisions based on patterns and associations, could also be used in controlling traffic on the road and reducing gridlock. In short, it aims to organize unstructured data through computing, making it easier for stakeholders to collect, analyze and make decisions.
Software development tools remain one of Intel’s key focus areas, and the company intends to run much larger-scale applications in conjunction with research collaborators.
Fabrication and packaging of Intel’s Loihi test chip was completed last November, and the chip maker began power-on and validation. “We were pleased to find 100 percent functionality, a wide operating margin and few bugs overall. Our small-scale demonstrations that we had prepared on our emulator worked as expected on the real silicon, though, of course, running orders of magnitude faster,” said Dr. Michael (Mike) C. Mayberry, chief technology officer, Intel Corporation.
Last month, Mike Davies, Intel’s neuromorphic computing program leader, showcased the potential of the self-learning research chip.
The chip mimics how the brain learns based on feedback from the environment. Shown a series of photos, Loihi was able to distinguish a rubber duck, an elephant figurine and a bobblehead of scientist Rosalind Franklin. “It’s a small but exciting example of how neuromorphic computing could deliver more efficient artificial intelligence,” Davies said. “While this is a proof-of-concept that uses less than 1 percent of the chip’s resources, it shows that the architecture works, and we expect to see orders of magnitude gains in efficiency as the networks are scaled up to larger problems,” he added.