Intel unveils self-learning ‘neuromorphic’ AI chip

It might not have totally nailed AI yet, but with its new research project Loihi, Intel claims to have crossed a bridge which many are still trying to build: on-chip learning.

In its search for a new growth area, Intel has focused on the burgeoning world of AI, but it isn’t alone. The microprocessor world is becoming big, confusing and congested. Fortunately, if you believe the hype, there will be plenty of business to share out. Considering Intel’s ill-fated venture into mobile, this latest bet on a replacement for the PC cash cow could be a more secure one.

So what is the key difference here? The deep-learning processes, critical to the success of any intelligence ambitions, are done on the chip itself. You don’t have to wait for an internet connection or an update from the cloud; the learning, reinforcement training and adaptation are done there and then.
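Intel hasn’t published Loihi’s learning rules in this piece, but the general principle of on-chip learning is that a synapse adjusts its own weight using only locally available information, with no round trip to a cloud-hosted trainer. A minimal, purely illustrative sketch of one such local rule, spike-timing-dependent plasticity (STDP), might look like this; the function name and all constants are hypothetical, not Intel’s:

```python
from math import exp

def stdp_update(weight, t_pre, t_post, lr=0.01, tau=20.0):
    """Illustrative spike-timing-dependent plasticity rule: strengthen the
    synapse if the presynaptic spike arrived before the postsynaptic one,
    weaken it otherwise. All parameters are hypothetical."""
    dt = t_post - t_pre
    if dt > 0:
        # pre fired before post: potentiate, more strongly for closer spikes
        weight += lr * exp(-dt / tau)
    else:
        # post fired first (or simultaneously): depress
        weight -= lr * exp(dt / tau)
    # keep the weight in a bounded range, as hardware synapses would be
    return max(0.0, min(1.0, weight))
```

The point of the sketch is the locality: the update depends only on the two spike times at that synapse, which is what makes learning feasible on the chip itself rather than in a remote data centre.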

Should the research project progress successfully to the real world it could mean the introduction of genuinely autonomous robots or devices. Cord cutting could be given a new meaning, as the machine would learn on the go, without a need to connect to the cloud.

“As part of an effort within Intel Labs, Intel has developed a first-of-its-kind self-learning neuromorphic chip – codenamed Loihi – that mimics how the brain functions by learning to operate based on various modes of feedback from the environment,” said Dr Michael Mayberry, managing director of Intel Labs.

“This extremely energy-efficient chip, which uses the data to learn and make inferences, gets smarter over time and does not need to be trained in the traditional way. It takes a novel approach to computing via asynchronous spiking.”
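The “asynchronous spiking” Mayberry mentions is the territory of spiking neural networks, where neurons communicate in discrete events rather than continuous values. A generic textbook building block is the leaky integrate-and-fire neuron, sketched below; this is not Intel’s design, real neuromorphic hardware is event-driven rather than stepped in a loop, and all constants here are hypothetical:

```python
def lif_run(inputs, threshold=1.0, leak=0.9):
    """Minimal leaky integrate-and-fire neuron: integrate the input current
    each step, leak a fraction of the membrane potential, and emit a spike
    (1) whenever the potential crosses the threshold."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = v * leak + current    # leaky integration of input
        if v >= threshold:        # threshold crossed: fire a spike
            spikes.append(1)
            v = 0.0               # reset the membrane potential
        else:
            spikes.append(0)
    return spikes
```

Because a neuron like this only produces output when it actually fires, a chip built around spikes can stay idle, and hence energy-efficient, when nothing interesting is happening, which is the efficiency argument behind the approach.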

The official name for this area of artificial intelligence is neuromorphic computing, a field kicked off by Caltech professor Carver Mead. Mead combined chip expertise, physics and biology to explore territory which has since been taken on by the microprocessor giants of the world. The idea is to replicate the structure and performance of the human brain, to improve the efficiency and performance of chips. How the brain actually works is still largely a mystery, but applying the understanding we do have could be the foundation of the next breakthrough in the chip world.

Intel might be shouting about its own breakthrough in neuromorphic computing, but it isn’t the only one with a finger in the pie. IBM has been playing around with its TrueNorth chips and Qualcomm has Zeroth. There will be numerous other projects around the world, but these are the ones which seem to be making the most progress.

So how does it actually work? We’ll let Mayberry explain this one.

[Image: Intel chip]

It is worth emphasising that this is a research project at the moment, but it is a pretty exciting one. Right now the power of artificial intelligence is limited to areas with an internet connection, as information has to be relayed to the learning machine in the cloud. Should a self-learning chip prove commercially feasible, the limits on where AI can be applied would be essentially removed.

It would be foolish to try to predict a winner in the AI arms race this early on, but Intel looks to have made a good start. It is certainly more promising than its efforts in mobile.