Can a programmable approach to mobile handset design help ease the pain of supporting multiple standards and operating frequencies?
Cellular operators are gearing up for a new round of network upgrades. This year sees continued deployment of an enhancement to third-generation cellular (3G), called HSPA+, which can provide peak data-transfer rates of 21Mbit/s. Other operators are beginning to roll out 4G networks, using the WiMax or Long-Term Evolution (LTE) standards.
The life of RF, antenna and baseband designers would be easier if handsets only had to deal with a narrow set of standards. However, because operators cannot afford to roll out universal coverage for the newest protocols, handsets also have to cope with 2G standards such as GSM. Add in the short-range protocols, such as Bluetooth and Near-Field Communications, and the result, in the words of Gilles Delfassy, president and CEO of ST-Ericsson, is that 'the number of radios on a terminal is sky-rocketing'.
Instead of dedicating logic and analogue circuitry to each family of air interfaces, Delfassy says the approach needed now is software-defined radio, in which a powerful processor can be configured and reconfigured to handle each of the protocols. In terms of chip area, and therefore cost, the tipping point is already upon us. In a 2007 paper, Ulrich Ramacher, head of software radio research at Infineon Technologies, compared, on the then leading-edge 65nm process, the silicon area of a software implementation against that of dedicated hardware for the set of air interfaces needed in a modern digital baseband. Once digital TV, WiMax and Wi-Fi support were added to the cellular protocols, the balance shifted strongly in favour of the software-based approach.
Silicon technology, however, is a moving target. The arrival of 40nm and, soon, 28nm processes makes the silicon area argument less convincing. On a 65nm process, the discrete-circuit architecture for the 4G-capable baseband that Ramacher suggested took up around 50mm² of die space. Move that to a 28nm process and the same baseband design may need as little as 15mm².
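The arithmetic behind that shrink is worth making explicit. To a first approximation, die area scales with the square of the feature size, so a sketch of the ideal scaling (using the article's 50mm² at 65nm as the starting point; the function name and figures below are illustrative, not from Ramacher's paper) looks like this:

```python
def scaled_area(area_mm2: float, node_from_nm: float, node_to_nm: float) -> float:
    """Ideal die-area scaling: area shrinks with the square of the feature size."""
    return area_mm2 * (node_to_nm / node_from_nm) ** 2

# Ramacher's 4G-capable dedicated-hardware baseband: ~50 mm^2 at 65nm.
ideal_28nm = scaled_area(50, 65, 28)
print(f"Ideal 28nm area: {ideal_28nm:.1f} mm^2")  # roughly 9 mm^2
```

Ideal quadratic scaling would predict roughly 9mm²; the article's figure of about 15mm² reflects the fact that real designs scale less than ideally, since analogue blocks, I/O pads and interconnect do not shrink at the same rate as digital logic.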
Professor Gerhard Fettweis of the Technical University of Dresden points out how much scaling can affect trade-offs.
'On a chip that we designed as a research vehicle for LTE using a 130nm process, which is not a high-end process, we could do 4G in 10mm². When we started doing cellular, the digital part was two or three pieces of silicon. There were 650 discrete analogue parts on the Nokia 2110,' he says, talking about a phone that effectively launched the GSM business in Europe in the mid-1990s. 'By 2000 we managed to integrate all that into a single chip. Then along came Andrew Viterbi and his team from Qualcomm, who told us that CDMA was the best thing. It filled a whole die on its own. Now, with LTE, we can fit everything into a quarter chip or less. Maybe a tenth of a chip.'
Although silicon area does not seem a critical issue today, Charles Sturman, vice president of sales and marketing at Cognovo, argues there are still savings to be made with a processor that can run multiple algorithms in place of dedicated hardware. 'There is huge pressure on margin. And, with HSPA+, LTE and LTE Advanced, the whole equation changes. They all call for hugely complex parallel data processing. You end up designing half a processor, so you might as well use one.'
Richard Fry, vice president of business development, adds: 'We are half the size of an ASIC [custom silicon] approach. And we can support the legacy protocols for the price of a little bit of flash memory.'
Another pressure on margin is design cost. LTE and WiMax have to squeeze into whichever parts of the radio spectrum mobile operators can buy. The worldwide frequency map for these services is much more complex than it was with GSM or even 3G, where the allocations were more or less fixed for entire continents. Today's chip designers also have to bring together many pieces of intellectual property to design one baseband processor that can cover all the standards - or face the cost of implementing region-specific parts that each alone may still cost as much to develop as the all-in-one design.
'The impact of the increase in combinations of band and mode requires a higher degree of adaptability,' says Francis Sideco, principal analyst for wireless communications at iSuppli. 'Some might move to software-defined radio, some may use more conventional architectures. The change will come not because the device guys are saying "we must use software-defined radio" but because they are saying they want more adaptability.'
Gerd Teepe, director of digital design at GlobalFoundries, points out tensions between software and hardware implementation. 'We need to be very careful with what we want to run in software and what needs a dedicated hardware engine. As technology gives us more hardware to play with, we see a balance between generic software and the hardware incarnation of a software model. A software modem can be very power-hungry. So, over time, as specifications are fixed, we will want to make this into hardware. We need to have a balanced view.'
Sideco adds: 'The architectures are still maturing: it is a matter of getting the efficiency and performance up.' He reckons that for the LTE rollout, 'this first wave will be based on more traditional architectures. Icera is the only implementation based on software-defined radio out there. Although companies such as ST-Ericsson have done a lot of work in this area, their mainstream parts are not yet full software-defined radio.'
The idea of a 'soft phone' that can download everything it needs to support a new radio interface is still some years away. But the demand for adaptability is pushing chip and handset makers in that direction.