
Intel Labs and TU Graz achieve big efficiency gains thanks to neuromorphic hardware

Joint research by Intel Labs and the Institute for Theoretical Computer Science at Graz University of Technology (TU Graz), in Austria, has shown that neuromorphic computing can be up to sixteen times more energy efficient than conventional hardware when running deep learning networks, an impressive result that has not gone unnoticed.

The basis of this research is the Intel Loihi chip, which we already had the chance to tell you about in this article a few years ago, and which received a generational renewal a few months ago, as we covered in this other article. Intel's Loihi neuromorphic chips are inspired by the structure and functioning of the human brain and specialize in deep learning workloads. That design, and that specialization, are the two key factors that allow them to deliver superior performance and efficiency.
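To get an intuition for the event-driven principle this kind of hardware borrows from biology, here is a minimal sketch of a generic leaky integrate-and-fire neuron in Python. It is purely illustrative and has nothing to do with Intel's actual Loihi programming model: the point is simply that a spiking neuron only does work (fires) when its accumulated input crosses a threshold, and stays silent the rest of the time.

```python
# Generic leaky integrate-and-fire (LIF) neuron: a toy model of the
# event-driven computation used by neuromorphic chips. Illustrative
# sketch only; this is not Intel's Loihi API.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    membrane = 0.0
    spikes = []
    for t, current in enumerate(input_current):
        membrane = membrane * leak + current  # integrate input, leak over time
        if membrane >= threshold:             # fire only when the threshold is crossed
            spikes.append(t)
            membrane = 0.0                    # reset after the spike
    return spikes

# Sparse input: the neuron only does work when there is activity.
print(simulate_lif([0.0, 0.6, 0.6, 0.0, 0.0, 1.2, 0.0]))  # -> [2, 5]
```

That sparsity of activity is, in essence, where the energy savings of neuromorphic designs come from.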

This efficiency addresses precisely one of the most important challenges facing intelligent machines and computers. They can work with a certain degree of autonomy, they are able to recognize objects and people, and thanks to deep learning the artificial intelligence systems they are built on can keep improving. However, power consumption is one of the biggest obstacles we currently face in expanding the reach of artificial intelligence and deep learning.

That’s where this research, and the efficiency gains that Intel Loihi-based systems can achieve, comes in. Improving efficiency by up to sixteen times means that neuromorphic hardware is capable of offering the same deep learning performance as traditional hardware while consuming up to sixteen times less energy.
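A quick back-of-the-envelope calculation makes that figure easier to digest (the 160 W baseline is a made-up example, not a number from the research): doing the same work sixteen times more efficiently means drawing roughly one sixteenth of the power, a saving of about 94%.

```python
# Back-of-the-envelope check of what "16x more efficient" means in energy terms.
# The 160 W baseline is a hypothetical example figure, not from the research.

baseline_watts = 160.0                      # hypothetical conventional hardware
neuromorphic_watts = baseline_watts / 16    # same work at 1/16 the power
savings = 1 - neuromorphic_watts / baseline_watts

print(f"{neuromorphic_watts:.1f} W instead of {baseline_watts:.0f} W")  # 10.0 W instead of 160 W
print(f"Energy saving: {savings:.1%}")                                  # Energy saving: 93.8%
```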

Impressive, right? Well, wait, there is still a key figure that will leave you speechless: according to the source of the news, neuromorphic chips scaled up to hundreds of billions of neurons could process huge amounts of information while consuming only 20 watts, roughly the same as a light bulb.

The joint research by Intel Labs and the Institute for Theoretical Computer Science at TU Graz focused on algorithms that work with temporal processes. For example, the system had to answer questions about a story it had been told previously, and work out for itself the relationships between objects and people from the context of the story. To carry out this research, the team used a Nahuku board with 32 Intel Loihi chips, with which it was possible to emulate human short-term memory.
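To picture the kind of task involved, here is a toy sketch in plain Python: a short invented mini-story is presented first, a tiny hand-written "short-term memory" keeps track of what matters, and only afterwards comes the question that has to be answered from that memory. The story, the question and the lookup logic are all made up for illustration; the actual experiment ran spiking neural networks on the Nahuku board, not code like this.

```python
# Toy illustration of a "story first, question later" task that relies on
# short-term memory. All the data and logic here are invented for illustration.

story = [
    ("Mary", "picked up", "the apple"),
    ("Mary", "went to", "the kitchen"),
    ("John", "went to", "the garden"),
]

# Short-term memory: remember the last known location of each person.
memory = {}
for subject, action, target in story:
    if action == "went to":
        memory[subject] = target

question = "Where is Mary?"
person = question.split()[-1].rstrip("?")
print(memory.get(person, "unknown"))  # -> the kitchen
```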

The parallels between neuromorphic hardware and the human brain are striking, and the excellent ratio between performance and power consumption that this type of hardware achieves only goes to show that, deep down, our brain is a wonderful thing. On this subject, Mike Davies, director of Intel's Neuromorphic Computing Lab, commented:

“Intel’s Loihi research chips promise significant improvements in AI, especially by reducing their high power cost. Our work with the TU Graz provides further evidence that neuromorphic technology can improve the energy efficiency of today’s deep learning workloads by rethinking their implementation from the perspective of biology.”
