Graphical future

The CPU is dead, long live the GPU. E&T talks to nVidia CEO Jen-Hsun Huang to find out why he thinks the end is nigh for the regular processor.

The future of computing is very much on the mind of nVidia CEO Jen-Hsun Huang, not least because he thinks his company is going to have a hand in it. As a maker of graphics processing units (GPUs), nVidia has so far played more of a walk-on part in the PC: if you want to run games, you need a fast GPU, but for everything else the x86 processor rules.

Things, however, have been changing. Last year, Apple said it would place much more emphasis on the GPU in its plans for the Snow Leopard update to Mac OS X. At the same time, Apple launched an effort to standardise a programming interface to GPUs so that they could be used more easily to offload tasks from the host processor.

The Khronos Group developed and approved the OpenCL standard at breakneck pace - nVidia produced the first implementation at the beginning of this year - with a view to the software turning up not just in Macs but in PCs, mobile phones and media players.
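
To see what OpenCL actually standardised, consider the shape of a typical offload: the host CPU compiles a small kernel and hands a data-parallel job to whatever GPU the driver exposes. The vector-addition program below is a minimal sketch of that flow using the OpenCL 1.0 C API; it illustrates the programming model rather than any vendor's implementation, the kernel name vadd is an arbitrary choice, and error handling is trimmed for brevity.

```c
/* Minimal OpenCL offload sketch: add two vectors on the GPU.
   Illustrative only - error checking omitted for brevity. */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Pick the first platform and the first GPU it offers. */
    cl_platform_id plat;  clGetPlatformIDs(1, &plat, NULL);
    cl_device_id dev;     clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Compile the kernel at run time - the key portability trick. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    /* Copy inputs to the device, reserve space for the result. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    /* Launch N work-items in parallel, then read the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f\n", c[10]);  /* expect 30.0 */
    return 0;
}
```

The point of the standard is that the same source should run unchanged on any conformant device, which is precisely what makes the GPU usable as a general-purpose offload engine rather than a games-only part.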

Microsoft has played a quieter role, but aims to roll out a similar GPU-acceleration layer, called DX11 Compute, as part of Windows 7.

Huang has his sights set further than simply having the GPU act as an accelerator unit for the host processor, however. At the beginning of the year, nVidia hired a scientist whose work has influenced a number of the new generation of multicore processors that are vying for dominance in markets as diverse as cellular communications, graphics and supercomputing. Bill Dally's stream-processing concept could lead to the replacement of the venerable but problematic von Neumann architecture that lies at the core of processor architectures such as the x86.

That's why Huang makes this claim right at the start of the interview: "I think it's a foregone conclusion that GPU computing is the future of computing."

As the only standalone GPU maker with any clout in the mass market - AMD bought ATI in 2006 and Intel is spawning a GPU business off the back of its dominant position in PC processors - Huang has much more of a vested interest in making the GPU not just an effective accelerator but the successor to the von Neumann processor. So, he is not going to rely on Apple and Microsoft to drive programmers towards writing code for GPUs.

Sticking with the Atom

"The second part of our strategy is putting GPUs everywhere," Huang claims as he pulls out two boards.

One of the boards is the reference design for the Ion - nVidia's take on the cheap PC. It combines an Intel Atom processor with a 9400M GPU made by nVidia. Both devices use flip-chip packaging that clearly shows the relative sizes of the chips. The 9400M is made on a slightly older process - 55nm - than the Atom, which uses Intel's 45nm process. Yet the GPU dwarfs the 25mm² Atom: the graphics unit is four times its size. A shift to a 40nm or 45nm process might come close to halving the size of the GPU, but it would still be the bigger device.
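
The halving claim is simple area scaling. Taking the article's figures - a 9400M at roughly four times the 25mm² Atom, so about 100mm² - a shrink from 55nm to 40nm scales each linear dimension by 40/55, and die area by the square of that ratio:

```latex
A_{40} \;\approx\; A_{55}\left(\tfrac{40}{55}\right)^{2}
       \;\approx\; 100\,\mathrm{mm^2} \times 0.53
       \;\approx\; 53\,\mathrm{mm^2}
```

Close to half the original area, but still roughly double the Atom's 25mm².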

From a cost perspective, Intel has the upper hand. It can sell its device for $25, and although that price includes the two support chips, those are simple devices made on older, cheaper processes. Manufacturers using the Ion buy them and simply leave them off the PCB. And Intel can cut thousands of Atoms from one 300mm wafer.

Although analysts have criticised Intel for producing the Atom at all, when it could perhaps be selling pricier Core 2 Duos instead, Huang is in no doubt that Intel is still doing well out of the arrangement.

For its part, nVidia gets far fewer 9400M devices from the same-sized wafer. But, in terms of the compute power deployed, the GPU on an Ion board is now doing more of the work.

Parallel processors

A GPU deploys many processors able to run in parallel. By running at comparatively slow clock speeds, they use less energy than a single high clock-speed processor. Those that are not in use can be put in a low-power mode, helping to save power overall.
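
The arithmetic behind that claim is the standard dynamic-power relation for CMOS logic - a rough sketch rather than anything nVidia publishes. Switching power grows linearly with clock frequency but with the square of supply voltage, and a lower clock usually permits a lower voltage:

```latex
% Dynamic switching power: capacitance C, supply voltage V, clock frequency f
P_{\mathrm{dyn}} \approx C\,V^{2}f

% Spread the same throughput across n cores clocked at f/n; the slower
% clock allows a reduced supply voltage V' < V, so total power falls:
P_{\mathrm{total}} \approx n \cdot C\,V'^{2}\,\frac{f}{n} \;=\; C\,V'^{2}f \;<\; C\,V^{2}f
```

Add Huang's point about gating off idle cores, and the many-core design can come out ahead on energy even before it wins on throughput.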

"The common belief is that if you add another processor, you increase power," Huang argues. "But that's not the case. When you are finished with that processor, you can turn it off."

Computer maker Acer has put the Ion to use in its latest small, cheap "family living room" PC, the AspireRevo. It is cooled by a small fan - it does not need the large, loud fans that accompany many bigger PCs. For nVidia, the AspireRevo is a prime example of what you might call processor inversion: the graphics processor becoming potentially more important than the central processor in consumer products. The 9400M is in there to run HD video as well as games, and to offload tasks such as video compression and photo editing, in the hope that no one will notice that the x86 is just an Atom and not a Core 2 Duo or a Core i7.

The argument from nVidia is that, for around £200 to £300, you can buy a PC that consumes less power and is about as fast on games as a £500 Core 2 Duo machine with a discrete graphics card. At a demonstration in London, the AspireRevo happily played 1080p HD video as well as PC games - the uses to which Acer reckons a living-room PC will be put - at a price that might tip the buying decision from a Nintendo Wii to a PC. Acer plans to bundle a 'Wiimote'-like controller with the higher-end machines for use with games.

"My favourite CPU is the Atom: because it's small and good enough," says Huang. "Suppose this is the future of the PC. Why shouldn't it be? Think what it will be like in 25 years' time."

Gesturing towards the sliver of silicon that is the Atom, he asks: "How could they be the biggest in 25 years' time? Nobody is going to care what CPU you have because it will be offloaded to GPUs and audio processors."

Huang believes the idea of relieving the host processor of most of the processing burden of media-rich applications can extend further down. In place of the pocket-calculator-sized Ion board, Huang holds up a board the shape of a memory module. This is the reference design for the Tegra: a combination of an nVidia GPU and an ARM11 processor core.

Time for the Tegra

"We believe the mobile computing era is here. It is dying to explode. But somebody has to build something that delivers a complete computing experience," Huang says.

To show how the Tegra might be used - nVidia claims device makers will have Tegra-based units on the market later this year - the company has made a limited number of demo units. Running Windows CE, the demo software is meant to show what a GPU-accelerated machine can do. The user interface uses physics-based graphics similar to the iPhone's.

Flick the screen and videos spin round as though mounted on a wheel; stop the wheel and zoom in, and the video plays full-screen. It is a demonstration that much of what manufacturers can do to make their designs more attractive depends more on graphics processing than on the CPU.

Working on the assumption that the Web browser will be the main application on most mobile devices, nVidia is working to offload much of the browser's function to the GPU. That includes Adobe's Flash, a key component of YouTube and other interactive sites. By doing this, nVidia hopes to overcome one of the big problems with using the Web on mobile devices: sites behave differently from their desktop versions.

Internet as operating system

"It is a full-function computer. A website looks exactly the same whether you are running on Tegra or on a PC," says Huang. "I believe ultimately the Internet will be the operating system. The most important thing for the success of Tegra is a world-class browser, together with Flash."

The company will compete in the same space as other ARM licensees with much longer track records in mobile devices, such as Qualcomm, STMicroelectronics and Texas Instruments. But Huang believes nVidia has an advantage coming from the PC space: "We are going to focus on computing because the computing stack is so complicated. It is not just about the processor but the software stack that you have to create."

Since the summer of 2008, nVidia has been hit hard by the recession. Margins are good on graphics processors and cards for professional users and hardcore gamers, but these are segments that suffered badly as customers slashed their spending.

Recession implications

"The economy is difficult but we had our worst quarter in Q4 2008. The market is stabilising and we are working off our inventory," Huang claims. He reckons the severity of the recession is changing attitudes to computing but notes ruefully: "The market being down doesn't help me either because our own sales are down.

"But this recession comes at an interesting time. Consumer electronics never go up in price: they always come down. The PC has held up in price better than most because the types of usage model have increased over time. Now, however, the PC seems good enough. Transistors and CPUs are good enough and people have discovered what they can do.

"Now, all of a sudden, here comes the recession and people are willing to try something new. Vendors are now trying to convince people that their little processor is good enough. When the economy comes back, consumers will not change their behaviour. You remember when the market discovered the sub-$1,000 PC?" Huang asks, noting that prices for Windows-based PCs, at least, did not recover after that. "Well, we have now discovered the sub-$400 PC."

Huang avoids using the term 'netbook' to describe the latest crop of portable computers based on processors such as the Atom. "It's a non-category. The netbook is just a cheap PC. The netbook inspired two ideas, but one of them fails: the idea that the netbook was created for Internet access. In reality, the netbook is notoriously terrible at Internet access.

"But it is a PC that is extremely cheap. And the day will come when that netbook is always on," claims Huang, nodding to the Tegra, which, although it deploys less compute power than the Ion, consumes one-fiftieth of the power. "When that day comes, I don't think the world will call them netbooks."

They will just be computers. "The future of computing is clearly going to be disruptive," Huang concludes.
