
Heat may be the answer to energy-efficient computing, scientists claim.
Is information real? Silly question. Of course it is. According to Einstein's special theory of relativity, information cannot be sent faster than the speed of light so it must have some physical reality. In the 1940s, the American electrical engineer Claude Shannon also inferred that information must obey the laws of physics by showing its close relationship to the concept of entropy - or disorder - in thermodynamics.
This raises the question of why, when calculating the energy efficiency of computers, scientists have assumed that computation can be carried out without energy loss, turning information into a mathematical abstraction. Those in any doubt need only look at the work of IBM's Charles Bennett from the 1970s and 1980s, based on this idea - called 'reversible computing' - which remains the status quo.
Dr Mike Parker and Professor Stuart Walker from the University of Essex's School of Computer Science and Electronic Engineering started to explore this anomaly a couple of years ago in an effort to map the environmental impact and carbon footprint of future computers and communications. When they went back to first principles and factored in the need for a computer to obey exactly the same physical laws as any other machine, their results indicated, counter-intuitively, that you will waste less energy if you can run computer chips hotter.
The efficiency of a machine, whether it is a steam engine or an information processor, rests on the second law of thermodynamics, which is all about entropy: that is, the energy 'lost', usually in the form of heat, whenever one kind of energy is transformed into another.
In their analysis, Parker and Walker revisited the heat engine concept that the 19th-century French physicist Sadi Carnot developed and which provided the foundation for the second law of thermodynamics. They applied this to a computer, which is not a new idea, but their view of the entropy of the system, inspired by Shannon's information theory, moved in a different direction from previous work.
Some people find the connection Shannon made between entropy and information slightly mind-scrambling. He said that a well-shuffled pack of cards has more entropy and also contains more information than one in which the cards are grouped by suit. Why? Because when you pick a card from a well-shuffled deck you are more uncertain about the outcome than when the card is taken from an ordered deck. The more surprised one is about the card - the greater the uncertainty - the more information it conveys. In information theory, news that the sky above the clouds is blue has little information content. But telling someone in a basement that it is cloudy outside contains much more.
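As a rough illustration of Shannon's measure (a sketch for this article, not part of the Essex analysis), the entropy of a draw is H = -Σ p log2(p), summed over the possible outcomes. For a well-shuffled 52-card deck every card is equally likely, giving about 5.7 bits of uncertainty per draw; if the arrangement is already known, the draw carries no information at all.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Well-shuffled deck: any of the 52 cards is equally likely on the next draw.
shuffled = [1 / 52] * 52
print(f"Shuffled deck: {shannon_entropy(shuffled):.2f} bits per draw")    # ~5.70

# A deck whose order you already know: the next card is certain.
known_order = [1.0]
print(f"Known order:   {shannon_entropy(known_order):.2f} bits per draw")  # 0.00
```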
Shannon also showed that when information is transmitted, noise causes errors to creep in with distance, increasing the uncertainty at the receiving end. Because computers cannot be infinitely small, this will happen inside them too, generating some entropy as heat. For a given processing power, therefore, the smaller the machine, the less heat it will lose. Moore's law has contributed to efficiency improvements along those lines for decades.
Conventional Carnot analyses regard computers as 'reversible' - they work at maximum efficiency. So, no entropy is released into the external environment. But as Shannon showed, information is intrinsically associated with entropy. For a computer to be of any use it has to output entropy. 'People have made the analogy between Carnot engines and computers but not the connection that entropy is an intrinsic part of the useful output of a computer,' says Parker.
Carnot proved that for any heat engine to do 'work' there must be a temperature difference between a working fluid, such as steam, and a heat sink that absorbs the waste. The greater the difference, the better the efficiency. In a modern power station, for instance, steam is superheated to spin a turbine to produce electricity. By using the steam discharged from the turbine for a secondary purpose, say warming cold water to heat nearby offices, the process can reach as much as 85 to 90 per cent efficiency.
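For reference, Carnot's limit on the work a heat engine can extract is 1 - T_cold/T_hot, with both temperatures in kelvin; the 85 to 90 per cent figure is only reached because combined heat and power schemes count the heat delivered to the offices as useful output as well as the electricity. The sketch below uses illustrative temperatures, not figures from the article.

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible to work: 1 - T_cold / T_hot (kelvin)."""
    return 1 - t_cold_k / t_hot_k

# Illustrative figures: superheated steam at ~550 C, cooling water at ~30 C.
t_hot, t_cold = 550 + 273.15, 30 + 273.15
print(f"Carnot limit on work output: {carnot_efficiency(t_hot, t_cold):.0%}")  # ~63%
# Real turbines fall short of even this; the 85-90 per cent headline figure
# comes from also counting the discharged steam's heat as useful output.
```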
According to the Essex researchers, the bits written to and from memory as an unavoidable part of a processor's calculations are, in essence, the 'steam' that drives the machine. As in a power station, the key to efficiency is to run at as high a temperature as possible, while ensuring that the cooling water is ultimately discharged at as low a temperature as possible.
The 2008 paper in the journal Optics Communications that described this analysis came to attention last year, when Parker and Walker used its results in a formula that combines chip temperature, bit rate and power consumption to compare the absolute energy efficiency of any computer-based system on a logarithmic scale (see 'Formula for Efficiency', above).
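The article does not reproduce the formula itself, but one plausible reading of a logarithmic metric built from chip temperature, bit rate and power consumption is an energy-per-bit figure measured against the Landauer minimum of kT ln(2) joules per bit. The sketch below assumes that form; it is an illustration, not necessarily the authors' exact expression.

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def efficiency_above_landauer_db(power_w, bit_rate_bps, chip_temp_k):
    """Hypothetical metric: how far (in dB) the energy spent per bit sits above
    the Landauer minimum kT*ln(2) at the chip temperature. 0 dB would mean the
    machine runs at the thermodynamic limit."""
    energy_per_bit = power_w / bit_rate_bps
    landauer_limit = K_BOLTZMANN * chip_temp_k * math.log(2)
    return 10 * math.log10(energy_per_bit / landauer_limit)

# Illustrative numbers only: a ~100 W processor handling ~1e12 bits/s at 350 K.
print(f"{efficiency_above_landauer_db(100, 1e12, 350):.0f} dB above the Landauer limit")  # ~105 dB
```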
Their calculations showed, to no great surprise, that today's IT equipment is shockingly wasteful. Modern microprocessors are four orders of magnitude less efficient than human brains. But to declare that entropy is intrinsically part of the useful output of a computer and that processing efficiency is temperature dependent is more controversial. Electronics engineers have been working hard to reduce the amount of heat their processors generate. Are they wrong?
Full circle
Entropy is a notoriously difficult concept. When we ran Parker and Walker's findings past several physicists, twice we were referred to Charles Bennett's work: back to where we started.
Parker says computer scientists familiar with the work of Shannon haven't properly understood the implications of combining his insights with the more loosely formulated principles of IBM's Rolf Landauer, who showed that information is physical in the sense that erasing a single bit requires at least kT ln(2) of energy, where k is Boltzmann's constant and T the temperature. 'Computer scientists have taken Landauer at his word and applied his theories to erasing information, but not to transmitting it from A to B,' Parker explains.
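Putting a number on Landauer's bound is straightforward (the calculation below uses published constants, not figures from the article): at room temperature the minimum is a few zeptojoules, or a few hundredths of an electronvolt, per bit erased.

```python
import math

K_BOLTZMANN = 1.380649e-23       # Boltzmann's constant, J/K
ELECTRON_VOLT = 1.602176634e-19  # J

T = 300  # room temperature, kelvin
landauer_j = K_BOLTZMANN * T * math.log(2)  # minimum energy to erase one bit
print(f"Landauer limit at {T} K: {landauer_j:.2e} J "
      f"({landauer_j / ELECTRON_VOLT * 1000:.0f} meV) per bit erased")
# ~2.87e-21 J, about 0.018 eV - many orders of magnitude below what today's
# logic actually dissipates per bit operation.
```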
Running chips hotter, even if it is more efficient, is not entirely practical. 'Any real engine has a maximum temperature before it starts to melt,' points out Dr Neil Broderick, a scientist at the University of Southampton's Optoelectronics Research Centre. 'But what we can take from this is that we could make more efficient computers if we had materials that worked better at higher temperatures.'
Examples of such materials are wide-bandgap semiconductors such as diamond, gallium nitride and silicon carbide. They can run hotter than silicon, which is why some of these materials are being pushed into the power-conversion systems of electric and hybrid cars. In servers, says Walker, swapping to these technologies could reduce the need for air conditioning. Harvesting and recycling the waste heat could create further energy savings.
Parker and Walker's theory doesn't explicitly consider sources of noise and assumes that computation is always error-free. A more realistic model would include a probability of an error being made during computation that would depend on temperature, suggests Broderick. 'Noise would decrease the efficiency since more resources would need to go into error correction,' he says.
But because noise is dependent on the bandgap of the semiconductor materials, a sufficiently high bandgap would still allow overall improvements in operational efficiency. 'Error-correction codes offer exponential improvements in accuracy/confidence limits with only linear increase in complexity. While thermal energy effects also increase exponentially with temperature, they are dependent on the ratio of thermal energy to the bandgap barrier. An ever larger electronic bandgap causes an exponential decrease in possible errors,' says Parker.
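A crude way to see Parker's point (a simple Boltzmann-factor sketch, not the researchers' own model) is that the probability of a thermally induced error scales roughly as exp(-Eg/kT), so even at elevated temperatures a wide-bandgap material keeps that factor vanishingly small compared with silicon.

```python
import math

K_BOLTZMANN_EV = 8.617333262e-5  # Boltzmann's constant in eV/K

def thermal_excitation_factor(bandgap_ev, temp_k):
    """Rough Boltzmann factor exp(-Eg / kT) for thermally induced errors."""
    return math.exp(-bandgap_ev / (K_BOLTZMANN_EV * temp_k))

T = 500  # kelvin - far above normal silicon operating temperatures
for name, eg in [("silicon", 1.12), ("silicon carbide (4H)", 3.26),
                 ("gallium nitride", 3.4), ("diamond", 5.47)]:
    print(f"{name:>20}: exp(-Eg/kT) ~ {thermal_excitation_factor(eg, T):.1e}")
# Each extra electronvolt of bandgap suppresses the factor by exp(1 eV / kT),
# around ten orders of magnitude at 500 K.
```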
So how do we regard the human brain as an information processor in the light of these arguments? It doesn't seem to run all that hot. Brains use 20 to 30W to process around 40 terabits per second in an extremely small space. The brain generates a great deal of heat, says Professor Walker, but it is kept in temperature equilibrium by a large blood flow.
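Taking the article's own figures at face value (a quick sanity check, not a rigorous estimate of neural computation), 25 W spread over 40 terabits per second works out at well under a picojoule per bit, though still far above the Landauer minimum.

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

power_w = 25      # midpoint of the quoted 20-30 W
bit_rate = 40e12  # ~40 terabits per second, as quoted
energy_per_bit = power_w / bit_rate
landauer_37c = K_BOLTZMANN * 310 * math.log(2)  # Landauer limit at body temperature

print(f"Brain energy per bit: {energy_per_bit:.1e} J")  # ~6.3e-13 J
print(f"Roughly {energy_per_bit / landauer_37c:.0e} times the Landauer limit at 37 C")
```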
Parker and Walker's work may seem to confound our expectations of how to make an efficient computer but it might provide a new direction in electronics engineering where, instead of fighting heat generation, we harness it.