
Memory points way to synapse chip

Researchers attempting to emulate the operation of the human brain have switched from trying to run their experiments on ever-bigger supercomputers to developing new types of chip that they think will do a much better and more power-efficient job.

Dharmendra Modha, founding manager of the cognitive computing centre at IBM Almaden Research Labs, is confident that, based on current experience with algorithm scaling, it will be possible to run a simulation with as many virtual neurons as the human brain by the end of the decade. But such a simulation could consume 100MW of power and still not exhibit brain-like behaviour. Experiments on Blue Gene supercomputers have so far simulated neurons linked in networks as big as those found in the brain of a cat, but with no evidence of cognition, he explained in a keynote at the Design Automation Conference (DAC) in San Diego on Thursday (9 June 2011).
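A rough back-of-envelope calculation shows why simulation at that scale is so punishing. The sketch below uses commonly cited approximations for human-brain scale and an assumed 4 bytes of state per synapse; these are illustrative figures, not numbers from the keynote.

```python
# Back-of-envelope arithmetic (not from the keynote): why brain-scale
# simulation strains a supercomputer. Neuron and synapse counts are
# commonly cited approximations; bytes-per-synapse is an assumption.
NEURONS = 10**11          # ~10^11 neurons in the human brain
SYNAPSES = 10**14         # ~10^14 synapses, the scale the roadmap targets
BYTES_PER_SYNAPSE = 4     # assume a 4-byte weight per synapse

print(f"Average fan-out: {SYNAPSES // NEURONS} synapses per neuron")
memory_tb = SYNAPSES * BYTES_PER_SYNAPSE / 10**12
print(f"Synaptic state alone: ~{memory_tb:.0f} TB")
# ~400 TB before any neuron state or spike traffic is counted, which is
# why the synapses, not the neurons, dominate the cost of simulation.
```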

This power consumption is orders of magnitude higher than what the DARPA SyNAPSE project, on which Modha is working, calls for. "The DARPA goal is 1kW of power and fits in a shoebox," he said.

A further, more fundamental drawback of the supercomputer-based approach is that it cannot fully account for the behaviour of synapses – connections in the brain that are individually simple but so numerous that they do not fit well into a simulation.

"Synapses dominate the number of connections in the brain. The idea of the neural network is wrong. It's really a synaptic network," said Modha.

However, the behaviour of the synapse is very simple. Electrical pulses arriving at the synapse are carried across the junction chemically and continue electrically on the other side. The team hit upon the idea of using a memory cell-like structure – very similar to crossbar memories such as DRAM, magnetic or oxide memories. "We are equating the synapse with the bitline and wordline of a memory. We figured out how to implement learning in an efficient way in this simple translation from biology to electronics," said Modha. "A pulse in one direction along the bitline turns it on; a pulse in the other along the wordline turns it off."
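A minimal sketch of that memory-cell analogy, assuming binary synapses at each wordline/bitline crossing; this is an illustrative model, not IBM's actual circuit:

```python
import numpy as np

# Binary synapses sit at the crossings of wordlines (rows) and
# bitlines (columns); paired pulses set or clear them.
class SynapticCrossbar:
    def __init__(self, rows, cols):
        self.state = np.zeros((rows, cols), dtype=bool)  # one synapse per crossing

    def pulse_set(self, wordline, bitline):
        """A pulse pairing in one direction turns the synapse on."""
        self.state[wordline, bitline] = True

    def pulse_clear(self, wordline, bitline):
        """A pulse pairing in the other direction turns it off."""
        self.state[wordline, bitline] = False

    def fan_out(self, wordline):
        """A spike on a wordline reaches every 'on' synapse in its row."""
        return np.flatnonzero(self.state[wordline])

xbar = SynapticCrossbar(1024, 1024)   # ~10^6 synapses: the phase-one scale
xbar.pulse_set(wordline=3, bitline=42)
print(xbar.fan_out(3))                # -> [42]
```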

"It provides a completely new computer architecture that breaks the mould of von Neumann computing where to flip a bit you have to fetch it, flip it and then store it. This becomes the von Neumann bottleneck. But here the bit doesn't move. This puts data and processing together in a different way where data is processed in place," Modha added, claiming: "We have demonstrated learning in the synapse."

"We have built two integrated chips and they are now back from the fab. The roadmap for phase one was to have one million synapses. Then we will move to 10E+10 and then eventually 10E+14. Our system will be an SoC that will emulate the wiring diagram of the mammalian brain. The next step will be to build learning systems," Modha explained.

Modha tagged the approach being taken by the team as "right-brain computing" in contrast to the "left-brain computing" of today's von Neumann architecture machines. "There is an enormous opportunity in this nascent area of cognitive computing."

"I'm not claiming that we have discovered the operation of the brain. But we have found going through this project that when we were searching for brain-like function we were making no progress. But the moment we brought in the need to design a low-power implementation, the very process of incorporating technological constraints into our work of thinking led to a new direction," said Modha.

"With the chips we are building we are clearly heading down an engineering path. Our next goal is quite different from what we have done before," Modha explained. "We want to build a chip that is ten thousand times more powerful and then use it: take a right-brain chip and put it on a patient to monitor their health or pepper the world's oceans with sensors to detect rogue waves or fish traffic. That will be the next challenge."

From a chip-design point of view, Modha intends to take a similar path to that of Professor Steve Furber of the University of Manchester. "The brain is clockless. It's event-driven. It does what is necessary, when it's necessary, and only that which is necessary. We need to learn to do nothing better. The problem we face in designing our chips is that the key thing we need in the next generation is the ability to do synchronous and asynchronous codesign. We need to build very large event-driven systems. Asynchronous design is one of the ways to get to low power, as well as low-leakage devices. Our chips don't need to run at more than 100MHz at best."
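A toy event-driven simulation illustrates the "do nothing better" principle: neurons are only touched when a spike event arrives, never polled on every clock tick. The three-neuron network, delay and firing threshold below are invented purely for illustration.

```python
import heapq

# Event-driven spiking sketch: work happens only when an event exists.
fanout = {0: [1, 2], 1: [2], 2: []}   # which neurons each neuron drives
DELAY = 1.0                           # uniform spike propagation delay
THRESHOLD = 1                         # input spikes needed to fire
charge = {n: 0 for n in fanout}

events = [(0.0, 0)]                   # (time, neuron): seed spike at t=0
while events:
    t, n = heapq.heappop(events)      # process only what is necessary...
    print(f"t={t:.1f}: neuron {n} spikes")
    for target in fanout[n]:          # ...and only when it is necessary
        charge[target] += 1
        if charge[target] >= THRESHOLD:
            charge[target] = 0
            heapq.heappush(events, (t + DELAY, target))
```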
