Scalable data centre chip boosts speeds, lowers energy usage

A computer chip that promises to boost the performance of data centres that lie at the core of online services - from email to social media - has been revealed by Princeton University researchers.

Data centres - essentially giant warehouses packed with computer servers - enable cloud-based services, such as Gmail and Facebook, as well as the storage of vast quantities of website data.

The computer chips at the heart of the biggest servers that route and process information often differ little from the chips in smaller servers or everyday personal computers.

The Princeton team has therefore designed a chip with massive computing systems in mind, which they say can substantially increase processing speed while slashing energy needs.

The chip architecture is scalable; designs can be built that go from a dozen processing units (called cores) to several thousand.

The architecture also enables thousands of chips to be connected together into a single system containing millions of cores. The chip is called Piton, after the metal spikes rock climbers drive into mountainsides to aid their ascent.

"With Piton, we really sat down and rethought computer architecture in order to build a chip specifically for data centers and the cloud," said David Wentzlaff, an assistant electrical engineering professor at Princeton University. "The chip we've made is among the largest chips ever built in academia and it shows how servers could run far more efficiently and cheaply."

The chip represents the culmination of years of work by Wentzlaff and his students. Mohammad Shahrad, a graduate student in Wentzlaff's Princeton Parallel Group, said that creating "a physical piece of hardware in an academic setting is a rare and very special opportunity for computer architects."

The current version of the Piton chip measures six by six millimetres and contains over 460 million transistors, each as small as 32 nanometres - too small to be seen by anything but an electron microscope.

The bulk of these transistors are contained in 25 cores, the independent processors that carry out the instructions in a computer program.

Most personal computer chips have four or eight cores. In general, more cores mean faster processing times, so long as software ably exploits the hardware's available cores to run operations in parallel. Therefore, computer manufacturers have turned to multi-core chips to squeeze further gains out of conventional approaches to computer hardware.
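
To picture that parallel speed-up in software terms, the short Python sketch below splits the same CPU-bound workload across all available cores and compares the timings. It is purely illustrative - the function names are invented for the example and have nothing to do with Piton itself:

# Toy illustration of multi-core speed-up: the same workload run
# serially and then in parallel. Generic Python, not Piton-specific.
import time
from concurrent.futures import ProcessPoolExecutor
from os import cpu_count

def busy_task(n):
    # A deliberately CPU-bound computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8

    start = time.perf_counter()
    serial = [busy_task(n) for n in jobs]
    t_serial = time.perf_counter() - start

    # The same jobs, spread across every available core.
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=cpu_count()) as pool:
        parallel = list(pool.map(busy_task, jobs))
    t_parallel = time.perf_counter() - start

    assert serial == parallel
    print(f"serial: {t_serial:.2f}s  parallel: {t_parallel:.2f}s")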

In recent years, companies and academic institutions have produced chips with many dozens of cores. Wentzlaff said the readily scalable architecture of Piton can enable thousands of cores on a single chip - and as many as half a billion cores across a data centre.

"What we have with Piton is really a prototype for future commercial server systems that could take advantage of a tremendous number of cores to speed up processing," he said.

The Piton chip's design focuses on exploiting commonality among programs running simultaneously on the same chip. One method to do this is called execution drafting. It works very much like the drafting in bicycle racing, when cyclists conserve energy behind a lead rider who cuts through the air, creating a slipstream.

At a data centre, multiple users often run programs that rely on similar operations at the processor level.

The Piton chip's cores can recognize these instances and execute identical instructions consecutively, so that they flow one after another, like a line of drafting cyclists. Doing so can increase energy efficiency by about 20 per cent compared to a standard core, the researchers said.
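
Execution drafting happens in hardware, but the general idea can be sketched in software. The toy Python model below is an invented illustration - the instruction labels and scheduling function are made up for the example - that simply interleaves two programs' instruction streams so identical instructions run back-to-back:

# Toy model of execution drafting: interleave instruction streams from
# several programs so identical instructions run consecutively, like
# cyclists drafting in a line. A real core does this in hardware.
from collections import defaultdict
from itertools import zip_longest

def draft_schedule(streams):
    """Group matching instructions from parallel streams to run consecutively."""
    schedule = []
    for slot in zip_longest(*streams):          # walk the streams in lockstep
        groups = defaultdict(list)
        for program_id, instr in enumerate(slot):
            if instr is not None:
                groups[instr].append(program_id)
        for instr, programs in groups.items():  # identical instructions back-to-back
            schedule.extend((instr, p) for p in programs)
    return schedule

streams = [
    ["LOAD", "ADD", "STORE"],  # program 0
    ["LOAD", "ADD", "MUL"],    # program 1
]
for instr, program in draft_schedule(streams):
    print(f"{instr:<6} (program {program})")
# Both LOADs run consecutively, then both ADDs, like a line of cyclists.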

A second innovation incorporated into the Piton chip manages when competing programs access computer memory that sits off the chip.

Called a memory traffic shaper, this function acts like a traffic cop at a busy intersection, weighing each program's needs and metering memory requests, waving them through so they do not clog the system. This approach can yield an 18 per cent performance jump compared with conventional allocation.
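
The traffic-cop idea can likewise be pictured as a simple arbiter in software. The Python sketch below is an invented analogy, assuming a round-robin model with a per-program quota; it is not the chip's actual mechanism:

# Toy analogy for a memory traffic shaper: a round-robin arbiter that
# waves through a bounded number of requests per program each cycle,
# so no single program monopolises the path to off-chip memory.
# The quota model and all names are invented for illustration.
from collections import deque

def shape_traffic(queues, quota_per_cycle=1):
    """Wave through at most 'quota_per_cycle' requests per program each cycle."""
    order = []
    pending = {pid: deque(reqs) for pid, reqs in queues.items()}
    while any(pending.values()):
        for pid, q in pending.items():
            for _ in range(min(quota_per_cycle, len(q))):
                order.append((pid, q.popleft()))
    return order

queues = {
    "A": ["read 0x10", "read 0x18", "read 0x20", "read 0x28"],  # memory-hungry
    "B": ["read 0x90"],                                          # light user
}
for pid, req in shape_traffic(queues):
    print(f"program {pid}: {req}")
# Program B's lone request is served early rather than queueing
# behind all of program A's traffic.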

The team intends to release its chip design as open source, which would allow anyone to develop and adapt it without paying the developers.

The energy costs associated with data centres can be phenomenal, with companies such as Facebook building facilities that run on renewable electricity to lower running costs. In February, Microsoft trialled an underwater data centre that could be powered by the tide.
