Electronic brain

Neuromorphic computing: bringing brain to bear

Computers inspired by brain functionality have been with us for a while. Now a number of academic research projects are exploring circuit-level innovation based on the inner workings of the old grey matter.

When Alan Turing developed his famous test for machine intelligence, he constructed it to avoid reliance on any particular type of computer. The machine's answers just had to be indistinguishable from those of a human. But in the intervening years, attempts to create artificial intelligence using the computer architectures devised in Turing's time have met with limited success. We have search engines that can collate text and offer results that sometimes match what we are looking for, but often those results seem little better than random.

Maybe we need computers that look more like brains. This, at least, is the focus of major research projects on both sides of the Atlantic: to simulate the mechanics of the brain in the hope that the demonstrations will point the way to smarter computers and improve our understanding of what goes on inside our own heads.

There are two strands in the brain-simulation research under way today. One is to make computers behave more like brains. Dharmendra Modha, founder of IBM Research's cognitive computing group based at Almaden, California, points out that brains fare better than computers when "faced with noisy, ambiguous data". The other strand is intended to carry over lessons from brain structure to deal better with noisy or simply unreliable hardware as manufacturing precision heads further into the realm of quantum mechanics.

Professor Karlheinz Meier of the University of Heidelberg explained at the recent International Electronics Forum organised by Future Horizons: "Computers are fast and reliable but they need software and are fault sensitive. The brain is slow and unreliable but it is fault tolerant and energy efficient. It also has the advantage of being able to work software out for itself.

"Eventually designs based on the brain are likely to be very attractive for future technologies because they can use unreliable elements," Meier adds.

Meier sees a further split in the Human Brain Project between Europe and the US in terms of focus. The US he sees as being more focused on understanding the neuroscience, with the simulations used to help drive medical research as much as computing. "Ours is more directed at computing architectures. It has a large neuroscience component," he says, "but it is about designing new computer architectures."

So little is known about the way in which neuronal behaviour translates into thought that the projects underway take radically different approaches to emulating the brain. They do however pay attention to several clear properties that have been uncovered by neuroscientists over time.

Early attempts to model the brain focused on artificial neural networks, which in effect treated neurons as 'logic gates', explains Jeff Hawkins, founder of the Redwood Center for Theoretical Neuroscience, who turned to neuroscience with his start-up Numenta - now named Grok - in the mid-2000s after leaving mobile device maker Palm. Neural networks found their way into commercial products on the basis that they could self-learn from well-prepared training data. "The good thing is they are learning systems. They are good for classification problems but they are limited and not brain-like at all," argues Hawkins. "I was working on these in the mid-1980s and I was disappointed because they were not like neuroscience."
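
As an illustration of the kind of learning system Hawkins describes, the sketch below trains a single artificial neuron on labelled examples and then uses it as a classifier. The toy data, learning rate and NumPy implementation are ours, chosen purely for illustration; they are not drawn from Numenta's work.

```python
# Minimal sketch of a classic artificial neural network 'unit': a single
# logistic neuron trained on labelled data and used as a classifier.
# The toy data set and learning rate are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Two clusters of 2-D points with labels 0 and 1 (well-prepared training data).
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(+1.0, 0.5, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

w = np.zeros(2)   # synaptic weights
b = 0.0           # bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(500):                     # gradient-descent training loop
    p = sigmoid(X @ w + b)               # neuron output for every example
    grad_w = X.T @ (p - y) / len(y)      # gradient of the cross-entropy loss
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")   # close to 1.0 on this separable toy set
```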

Later work has taken a closer look at the way that the brain wires itself. Neurons don't form themselves directly into networks; instead, dendrites and synapses, which are far more numerous than neurons, provide the bulk of the wiring. This presents an immediate problem for computer architects.

In classic computer processor architecture, which is still heavily based on the ideas proposed by renowned mathematician John von Neumann (1903-1957), the core of the machine is an arithmetic unit with a comparatively narrow connection to a large bank of memory. There is one way in and one way out. Architectures such as the Harvard design improved throughput by allowing multiple paths into memory, but it is still a single, local memory.
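
To make the bottleneck concrete, here is a deliberately toy model - our own illustration, not a description of any real machine - that counts memory-channel transfers when every instruction needs one instruction fetch and one data access. A von Neumann design pushes both through a single shared channel, while a Harvard-style split lets the two accesses overlap.

```python
# Toy illustration (not a model of any real machine) of why a single path to
# memory throttles a processor: every instruction fetch and every data access
# must share one channel, whereas a Harvard-style split lets them overlap.

def von_neumann_cycles(n_instructions: int) -> int:
    # One shared memory port: instruction fetch and data access queue up.
    return n_instructions * 2

def harvard_cycles(n_instructions: int) -> int:
    # Separate instruction and data paths: the two accesses happen in parallel.
    return n_instructions * 1

n = 1_000_000
print("von Neumann bus cycles:", von_neumann_cycles(n))   # 2,000,000
print("Harvard bus cycles:    ", harvard_cycles(n))       # 1,000,000
```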

Interconnect in computers, and even within chips, is relatively expensive. In the 1960s, IBM engineer Ed Rent found a relationship between the number of pins on an IC and the number of logic functions inside. Plotted on a graph, the relationship followed a power law and became known as Rent's Rule - but there were always many more logic blocks than connections between them. Attempts to boost the Rent exponent have proved expensive because chip wiring takes up more space than the transistors that form the logic.
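
The power law can be written as T = t * G^p, where T is the number of terminals, G the number of logic blocks, t the terminals per block and p the Rent exponent. The short sketch below uses typical textbook values for t and p; these are our assumption rather than figures from the article.

```python
# Rent's Rule sketch: terminals (pins) grow as a power law of the number of
# logic blocks, T = t * G**p.  The coefficient t and exponent p below are
# typical textbook values, not figures from the article.

def rent_terminals(gates: int, t: float = 4.0, p: float = 0.6) -> float:
    return t * gates ** p

for gates in (1_000, 100_000, 10_000_000):
    print(f"{gates:>10,} blocks -> ~{rent_terminals(gates):,.0f} terminals")
# Pins grow far more slowly than logic: a 10,000x increase in blocks needs
# only about a 250x increase in terminals when p = 0.6.
```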

Synapses in the brain, on the other hand, outnumber the neurons by five orders of magnitude. This imbalance presents a major challenge to conventional computer design. "The challenge is to replicate the very high connectivity of the brain," says Professor Steve Furber at the University of Manchester's School of Computer Science. "This connectivity is very difficult to replicate in silicon systems."

Furber's group is using an extensive battery of ARM processors to simulate brain activity, as previously reported in E&T (see Vol 6 No 12). The SpiNNaker (Spiking Neural Network architecture) machine at Manchester combines custom ARM processors with a grid of switches optimised for frequent but extremely short messages, copying and switching packets around a much more extensive virtual network of software synapses. The ARM processors run software to emulate the behaviour of multiple neurons apiece.
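
A much-simplified sketch of the packet idea is given below: a spike is represented by nothing more than the address of the neuron that fired, and a multicast routing table fans that tiny packet out to every core holding synapses from that neuron. The wiring, core names and weights are invented for illustration; this is not the real SpiNNaker router or its software stack.

```python
# Much-simplified sketch of SpiNNaker-style spike routing.  A spike packet
# carries only the address of the neuron that fired; a multicast routing
# table fans it out to every core that holds synapses from that neuron.
from collections import defaultdict

# Hypothetical wiring: source neuron id -> cores holding its target synapses.
routing_table = {
    0: ["core_A", "core_B"],
    1: ["core_B"],
    2: ["core_A", "core_C"],
}

# Each core keeps its own synapse table: (source neuron, local target) -> weight.
synapses = {
    "core_A": {(0, 10): 0.4, (2, 11): 0.9},
    "core_B": {(0, 20): 0.2, (1, 21): 0.7},
    "core_C": {(2, 30): 0.5},
}

def deliver_spikes(firing_neurons):
    """Route each spike packet and accumulate the input arriving at each target neuron."""
    inputs = defaultdict(float)
    for src in firing_neurons:
        for core in routing_table.get(src, []):          # multicast fan-out
            for (pre, post), weight in synapses[core].items():
                if pre == src:                           # only synapses from the firing neuron
                    inputs[(core, post)] += weight
    return dict(inputs)

print(deliver_spikes([0, 2]))
# {('core_A', 10): 0.4, ('core_B', 20): 0.2, ('core_A', 11): 0.9, ('core_C', 30): 0.5}
```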

"Each board contains 864 ARM processors," says Furber. "Our ultimate aim to produce a machine with a million ARM cores." Ten cabinets will be needed to house the million-core array. "It looks a lot like the Manchester Baby," University of Heidelburg's Professor Meier notes, referring to the first stored-program computer built at the Victoria University of Manchester in June 1948. Furber adds: "It's humbling to recognise that, even with one million ARM cores that will represent only 1'per cent of the human brain, and that is based on existing assumptions of how the brain works."

Conceptually, the approach is similar to that taken by IBM's Blue Brain project. In this case, however, the brain is simulated on a more conventional supercomputer architecture. So far, the team claims to have built a software model of a brain with a complexity equivalent to that of a cat's.

The project has attracted some criticism for its bold claims of modelling the behaviour of a mammal brain, on the basis that its existing models may not be detailed enough to capture the subtleties of the many processes that go on in and around each neuron. Even with the simplifications, Blue Brain consumes a lot of energy.

A cortical column, which contains around 10,000 neurons, consumes just 3W on average. The Blue Gene supercomputer simulation of this column requires 8,000 cores and 100kW of power.

"For a human brain you would need one trillion watts to simulate it. It would consume five times more power than the total generating capacity of Germany and still run 1,000 times slower than biology," says Meier. "To simulate a day's thinking, it would take years. Although simulations are extremely important, it is clear that current computing technology has a problem."

One option is to simplify the neuron further: but what is not clear is how much of a simplification is too much. "A criticism of the Human Brain Project is that it is too early," says Meier. "If we don't understand how the brain works, how can we simulate it?"

Meier points to other simulation work, such as the models used to predict the formation of galaxies. These come up with results that may differ in detail but produce broadly similar answers. If such models have some predictive power, they have some use. "A simulation is always a cartoon of reality," says IBM Research's Dharmendra Modha.

The other option, which drives the second strand of European neurocomputer research, is to build different types of computer entirely. In contrast to the SpiNNaker system being assembled at Manchester, the University of Heidelberg-led Brainscales (brain-inspired multiscale computation in neuromorphic hybrid systems) project uses analogue rather than digital processing.

By using analogue circuits to emulate neurons, the system can reduce the amount of energy needed to perform each computation and run up to 10,000 times faster than real time.
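
To give a flavour of what 'emulating a neuron' means here, the sketch below steps a plain leaky integrate-and-fire model through time in software. The real Brainscales circuits implement a richer neuron model in analogue hardware, so this is only an illustration, with invented parameters.

```python
# Illustration of the kind of dynamics a silicon neuron emulates: a plain
# leaky integrate-and-fire model with invented parameters, stepped in time.
import numpy as np

dt, tau = 0.1, 10.0                           # time step and membrane time constant (ms)
v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0   # membrane voltages (mV)
v = v_rest
spike_times = []

rng = np.random.default_rng(1)
for step in range(2000):                      # 200 ms of simulated time
    i_syn = 20.0 + rng.normal(0.0, 5.0)       # noisy input current (arbitrary units)
    v += dt / tau * (v_rest - v + i_syn)      # leaky integration of the membrane voltage
    if v >= v_thresh:                         # threshold crossing -> emit a spike
        spike_times.append(step * dt)
        v = v_reset                           # reset after the spike

print(f"{len(spike_times)} spikes in 200 ms of model time")
```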

Building brain-like redundancy

As it uses analogue circuits, Brainscales cannot turn to the packet-switching scheme used by Furber's group - the connections are programmable but are configured as permanent until rerouted. Placing integrated circuits (ICs) on the surface of a printed circuit board (PCB) would increase the energy needed for communication too much. So the Brainscales machine keeps its 384 ICs on their original wafer, which is mounted on a high-density panel to provide the links between each chip. Each individual die on the wafer contains 128,000 synapses and up to 512 neurons, although at the maximum number the ratio of synapses to neurons is less than 300.

"It looks like a science-fiction approach, but this is ready," Meier explains. "We are assembling 20 such wafers to provide four million neurons and one billion synapses." A significant proportion of the chips will not be fully functional - but the mechanisms used to route around those failures will help inform the development of future conventional computers in anticipation of high-density nano-fabricated machines that will themselves contain numerous broken or only partially functional elements.

Some of the choices made for simulating neuron behaviour are based on estimates that may be wrong. "Our analogue precision is equivalent to four bits - but that's a pure guess. I hope this can be systematically studied in the Human Brain Project. That is something that you can do with computers: you can change the resolution and see the impact," says Meier. "With a digital simulation, you can try out different neuron models, because we really do not know the best model for a neuron yet. With analogue you are stuck with the architecture."
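
What 'four bits of precision' means for a synaptic weight can be illustrated by snapping values to one of 2^4 = 16 levels, as in the sketch below. This is a generic illustration of resolution, not the Brainscales calibration procedure.

```python
# Illustration of what 'four bits of precision' means for a synaptic weight:
# the value is snapped to one of 2**4 = 16 available levels.
import numpy as np

def quantise(weights, bits=4, w_max=1.0):
    levels = 2 ** bits - 1
    return np.round(np.clip(weights, 0.0, w_max) * levels / w_max) * w_max / levels

w = np.array([0.02, 0.52, 0.71, 0.98])
print(quantise(w, bits=4))    # each value snapped to the nearest of 16 levels
print(quantise(w, bits=8))    # finer resolution for comparison
```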

Taking a hybrid path

A second IBM project has chosen a hybrid path between general-purpose computing and specialised chips to model the neurons and synapses. Led by Dharmendra Modha, the aim, he says, is "to build a brain-like computer as quickly as possible; for cognitive chips to be everywhere, in everything".

The team has designed several distinct types of chip that use simple digital processors for the neurons and a crossbar memory array to provide the synaptic connections. A programming language, based on the syntax used by MathWorks' MATLAB, is used to define how groups of neurons are wired together. These neuron groups will then be let loose on different problems. Modha sees it as essential that the systems learn in a real environment and are not kept as a "brain in a black box". By arming the brain-like machines with sensors and actuators, it should be possible to train them in solving problems such as navigating a maze: "The idea is, in a stylised fashion, to demonstrate perception, cognition and action," according to Modha.
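
The crossbar idea itself can be sketched as a weight matrix: input spikes select rows, and each column sums the weights at its crosspoints into the state of one digital neuron. The sizes, weights and threshold below are invented for illustration and do not describe IBM's chips or its MATLAB-like language.

```python
# Sketch of the crossbar idea: synapses sit at the crosspoints of a weight
# matrix, input spikes select rows, and each column's sum drives one digital
# neuron.  Sizes and weights are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 8, 4

weights = rng.integers(0, 2, size=(n_inputs, n_neurons))   # binary crossbar: 1 = synapse present
spikes = rng.integers(0, 2, size=n_inputs)                  # which input lines fired this tick

membrane = spikes @ weights          # each neuron sums the weights on its active rows
threshold = 2
output_spikes = membrane >= threshold
print("membrane potentials:", membrane)
print("neurons firing:     ", output_spikes.astype(int))
```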

Using a more abstract model of the neocortex part of the brain, Jeff Hawkins of the Redwood Center for Theoretical Neuroscience has turned his company's work into a commercial piece of software. The architecture has been pared down considerably from that of a mammalian brain to make it run on standard computers. "We spent a lot of time making Grok run efficiently," he says. Grok is the flagship product of Hawkins' company of the same name; it ingests data streams and creates actionable predictions in real time.

As the simulations become increasingly accurate, scientists and engineers hope that they will uncover the key pieces of the brain's wiring that unlock intelligence - and point the way to doing what Grok is attempting: building a much simpler thinking machine.
