
The porcupine problem

Software could give mobile phone makers a way of packing increasing numbers of standards into the same space.

Gadget makers have spent years talking about convergence - where many different functions are combined in just one piece of hardware. Want to access email? Use your phone. TV on the move? The phone can do that too. Want to know where you are? The top-end phones can now work out their location by decoding signals from Global Positioning System satellites. But it comes at a cost: the phone makers have to deal not just with the radio demands of the phone portion itself, but with all the other radio-frequency interfaces these extra functions need.

"The key challenge for us is that we now have to deal with as many as 11 radios and some of them need to work at the same time. This is really a nightmare for us," said Mikko Terho, vice president of leading phone maker Nokia.

Before you know it, you can build up to more than ten different radio interfaces on a single device. GSM on its own needs potentially four different radio interfaces: two for Europe and two for the US. However, it is, at least, one digital protocol. On top of that, you now need to add EDGE. Then there are the 3G standards. Most phones only support one flavour of 3G but global usage demands support for the others, putting CDMA2000 or W-CDMA on the same phone. Then there are the new wireless access standards such as WiMAX and, further out, the Long-Term Evolution (LTE) upgrade to 3G, not forgetting the high-speed data modes such as HSDPA and HSUPA. For close-range networking, Bluetooth is essential. Even that has spawned variants such as WiBree and a high-speed version based on ultra-wideband (UWB) modulation. Some phones need wireless LAN support - which means handling both the 2.4GHz and 5GHz ranges, as well as interference with Bluetooth, in a fully featured design. New features such as location awareness and mobile TV mean supporting GPS and DVB-H.

The end result is what NXP Semiconductors has termed the "porcupine problem": a proliferation of antennas on the circuit board. They all take up space and add cost to the overall design. Not only that, they make it difficult to optimise the antenna design as they may have conflicting requirements.


There is an alternative, and it's one that the military has been working with for the past ten years. In the late 1990s, contractors such as BAE Systems and Raytheon were faced with the problem of having to cope with as many as 30 radio systems.

The answer was to take advantage of the rapid advances made in processing power by the chipmakers and move much of the processing needed for military radios into software. In most conventional radios, a lot of the processing is carried out using specialised radio-frequency components in the analogue domain. These components have fixed functions, so they can only deal with a small number of frequencies. A software-defined radio instead digitises the signal as close as possible to the antenna. As a result, the system needs to deal with digital samples at gigahertz frequencies. This demands a lot of processing power, but is potentially far more flexible: to deal with different radio interfaces operating up to the sampling limit of the digitiser, you only have to change the software and very little of the hardware.
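The principle can be sketched in a few lines of Python: one digitiser front-end produces the samples, and switching "air interface" is purely a matter of swapping the demodulation routine. The sample rate, test signals and the two toy demodulators below are invented for illustration - a real software radio would run orders of magnitude faster and use proper channel filtering.

```python
import cmath, math

FS = 1_000_000  # sample rate of a hypothetical front-end digitiser, Hz

def digitise(signal_fn, n):
    """Front-end stand-in: sample an 'analogue' signal at FS."""
    return [signal_fn(i / FS) for i in range(n)]

# Two 'radio interfaces' sharing the one front-end: only the software differs.
def demod_am(samples):
    """Envelope detector: the amplitude carries the information."""
    return [abs(s) for s in samples]

def demod_fm(samples):
    """Discriminator: the sample-to-sample phase change carries the information."""
    return [cmath.phase(b * a.conjugate()) * FS / (2 * math.pi)
            for a, b in zip(samples, samples[1:])]

# AM-style test signal: a 1kHz tone modulated onto the amplitude.
am = digitise(lambda t: 1 + 0.5 * math.cos(2 * math.pi * 1000 * t), 1000)
# FM-style test signal: a constant 10kHz frequency offset.
fm = digitise(lambda t: cmath.exp(2j * math.pi * 10_000 * t), 1000)

print(round(max(demod_am(am)), 2))   # 1.5 - peak envelope recovered
print(round(demod_fm(fm)[100]))      # 10000 - frequency offset recovered, Hz
```

The same list of samples could be fed to either routine; adding an eleventh "radio" here is a new function, not a new antenna and RF chain.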

In 1999, work by the Joint Tactical Radio System programme in the US led to the creation by BAE Systems, ITT, Raytheon and Rockwell-Collins of the Software Communications Architecture, a way of formalising how the different modules in a software-defined radio system talk to each other.


In mobile communications, it is much more of a free-for-all as startups and established manufacturers contend for sockets in the phones made by the leading handset vendors. There are no established standards for software-defined radio: the chipset makers are defining their own interfaces and architectures. They are also some way from the ultimate goal of having a fully software-defined radio able to cope with everything from sub-gigahertz GSM and TV broadcasts up to the 5GHz-plus needed for Wi-Fi and UWB, let alone the 20GHz that LTE might use.

Some experimental front-ends can already deal with multi-gigahertz bands. NXP, for example, showed at the International Solid State Circuits Conference in San Francisco last month a receiver front-end aimed at UWB that is able to digitise signals from 600MHz all the way to 10GHz. The University of California at Los Angeles has built receivers that can deal with signals up to 6GHz in frequency for a variety of radio standards. The design uses a comparatively low-speed sigma-delta analogue-to-digital converter sampling at less than 10Msample/s, but pairs it with a programmable mixer to separate out the frequencies of interest and filters to remove unwanted harmonics.
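The mix-then-filter idea behind such receivers can be shown in miniature: a programmable digital mixer shifts the wanted channel down to 0Hz, and a filter rejects everything else. The frequencies below and the crude moving-average filter are illustrative assumptions, not the published UCLA design.

```python
import cmath, math

FS = 1_000_000        # digitiser sample rate, Hz (illustrative)
F_WANTED = 150_000    # the channel we want to receive
F_OTHER  = 300_000    # a neighbouring signal we want to reject

def mix_down(samples, f_lo):
    """Programmable digital mixer: retuning is just changing f_lo."""
    return [s * cmath.exp(-2j * math.pi * f_lo * i / FS)
            for i, s in enumerate(samples)]

def lowpass(samples, taps=32):
    """Crude moving-average low-pass filter standing in for a real one."""
    return [sum(samples[i:i + taps]) / taps
            for i in range(len(samples) - taps)]

# Wideband input containing both tones; only the mixer setting decides
# which one survives the filter.
x = [cmath.exp(2j * math.pi * F_WANTED * i / FS)
     + cmath.exp(2j * math.pi * F_OTHER * i / FS)
     for i in range(2000)]

baseband = lowpass(mix_down(x, F_WANTED))
print(round(abs(baseband[500]), 1))   # ~1.0: wanted tone kept, neighbour suppressed
```

Retuning to a different standard's channel means changing `f_lo` and the filter coefficients - software parameters, not board components.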

Digital change

Where change could come quickly is in the digital part of the receiver. In reality, this does very little to help with the porcupine problem but handset makers are beginning to look at software-radio techniques to deal with the problems of cost and development cycle time.

The first companies into the area were startups such as Morphics and Systemonic, springing up towards the end of the 1990s. But they were badly damaged by the 2001 technology-market slump and disappeared into the arms of large chipmakers such as Infineon Technologies and NXP Semiconductors. The technologies did not die out completely but were absorbed into longer-term projects.

As the technology sector began its recovery in the first half of this decade, a second wave of startups appeared, primarily UK-based Icera and Sandbridge Technologies in the US, claiming that they could displace incumbents such as Infineon, NXP, STMicroelectronics and Texas Instruments with their programmable architecture. The mainstream players have, until recently, stuck with designs that dedicate silicon to each radio standard. However, Infineon and NXP have started to introduce baseband parts that use software-radio techniques, believing that the time is right for a switch.

Crossover point

According to Ulrich Ramacher, head of software-radio research at Infineon Technologies, the key is silicon cost. For a paper published in the journal IEEE Computer, he performed an analysis of how much silicon is needed to implement each radio standard - using Nokia's chips as the benchmark - versus a programmable architecture. Based on the analysis, it does not make sense to go to software radio if the device only needs support for GSM, 3G, Bluetooth and Wi-Fi. Adding TV decoding through DVB-H reaches the crossover point. Moving on to WiMAX or WiBro leans heavily in favour of the software-programmable version.
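The shape of that argument is easy to reproduce with invented numbers: each dedicated modem adds its own silicon, while a programmable engine is a one-off cost that more standards amortise. The area figures below are made up purely to show the crossover; they are not from the IEEE Computer paper.

```python
# Illustrative only: invented silicon-area figures, not Ramacher's data.
fixed_area = {           # hypothetical mm^2 of dedicated silicon per standard
    "GSM": 2, "3G": 5, "Bluetooth": 1, "Wi-Fi": 3, "DVB-H": 7, "WiMAX": 8,
}
SDR_BASE = 18            # hypothetical one-off cost of a programmable engine

def dedicated(standards):
    """Total silicon if every standard gets its own hardwired modem."""
    return sum(fixed_area[s] for s in standards)

stack = []
for s in ["GSM", "3G", "Bluetooth", "Wi-Fi", "DVB-H", "WiMAX"]:
    stack.append(s)
    winner = "dedicated" if dedicated(stack) < SDR_BASE else "software radio"
    print(f"{len(stack)} standards: {dedicated(stack)}mm2 vs {SDR_BASE}mm2 -> {winner}")
```

With these made-up numbers the dedicated approach wins for the first four standards, breaks even when DVB-H is added, and loses clearly once WiMAX joins - the pattern, though not the figures, of the Infineon analysis.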

René Penning de Vries, chief technology officer for NXP, broadly agreed with the Infineon results. "It is always this way with a flexible solution. If you compare to a point solution you always lose out. But when you compare with three, four or five point solutions, it becomes far superior."

De Vries said that, with the introduction of HSDPA, HSUPA and LTE, the seesaw swings dramatically in favour of software radio as "we have to carry the legacy standards with us".

However, for de Vries, the key to software radio is not so much the silicon cost as timing. The company has folded work performed by Systemonic into its EVP programme, which has resulted in its first software-radio baseband processor, launched late last year. But, were it not for the Beijing Olympics, software radio at NXP might have had to wait a little longer for its market breakthrough.

Although handset makers expect the first TD-SCDMA networks to go live ahead of the Olympics, letting the country showcase its largely home-grown technology, uncertainty continues to surround the protocol. Trials of the basestations continue and network access permits were provided to six vendors at the start of February.

One of the vendors favoured with a permit is Samsung: NXP's primary customer for the first commercial EVP implementation. De Vries claimed that Samsung opted for the software-radio design because it would let the Korean handset maker tune the design quickly.

"It provides the ability to work on a relatively unknown standard without having to wait until the standard is finalised. On the fly, it can adapt to the latest version of the standard," said de Vries. "The Olympics will be held in the summer of 2008, so we know what the real deadline is. But the real final decision on the specification is not there yet. That is an issue for the semiconductor vendors, the handset companies, operators and content providers. In this case, not being dependent on a piece of hardware is vital."

Hardwired circuits

But software cannot do everything efficiently: that is why most baseband processors have some form of programmable DSP inside but devolve much of the high-bandwidth work to hardwired circuits. Terho said research performed at Nokia found that some elements of baseband processing map well to a programmable engine.

Others are more problematic. Nokia's aim was to reuse complete modules in different radio standards that, on the face of it, looked similar. Many radio protocols use Viterbi and Reed-Solomon codes to provide a degree of error correction. They effectively add redundant bits to the bitstream that help a receiver decode the signal more accurately.
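To see what such an error-correction module does, here is a toy rate-1/2, constraint-length-3 convolutional encoder with a hard-decision Viterbi decoder - the sort of bit-level trellis search that, per Nokia's findings, is usually cheaper in dedicated gates than in software. The generator polynomials are a textbook pair, not those of any particular phone standard.

```python
# Toy convolutional code: each input bit produces two output bits (redundancy),
# and the Viterbi decoder searches the trellis for the most likely input.
G = (0b111, 0b101)  # textbook rate-1/2 generator polynomials

def encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                     # new bit plus 2-bit history
        out += [bin(reg & g).count("1") & 1 for g in G]  # parity per generator
        state = reg >> 1
    return out

def viterbi(received):
    # One (cost, survivor-path) entry per 2-bit encoder state.
    metrics = {0: (0, [])}
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        nxt = {}
        for state, (m, path) in metrics.items():
            for b in (0, 1):                       # try both possible inputs
                reg = (b << 2) | state
                expect = [bin(reg & g).count("1") & 1 for g in G]
                cost = m + sum(x != y for x, y in zip(expect, r))
                ns = reg >> 1
                if ns not in nxt or cost < nxt[ns][0]:
                    nxt[ns] = (cost, path + [b])   # keep the cheaper survivor
        metrics = nxt
    return min(metrics.values())[1]                # lowest-cost path overall

msg = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(msg)
coded[3] ^= 1                        # the channel corrupts one bit
print(viterbi(coded) == msg)         # True: the redundancy lets Viterbi repair it
```

Even in this tiny form, the decoder's inner loop is branchy, bit-level work over every state at every step - exactly the kind of load that maps poorly onto a general-purpose programmable engine.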

"We found that, for many of the pieces, it was not possible to reuse anything," said Terho. The error-correction modules were key culprits: it turned out that it was better to have dedicated silicon for each one in each radio interface that needed them.

Conversely, with the scramblers used to spread the power spectrum of the transmitted signal across the full channel, software-programmable techniques proved to be much more effective. "It is possible to have reusable algorithms with different parameter sets," Terho claimed.
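Terho's point about scramblers translates directly into code: a single linear-feedback shift register routine can serve every standard, with only the polynomial taps and seed changing. The two parameter sets below are hypothetical stand-ins, not the real GSM or W-CDMA values.

```python
# One reusable scrambling algorithm, parameterised per radio standard.
def lfsr_scramble(bits, poly_taps, seed):
    """XOR the bitstream with the output of a Fibonacci LFSR."""
    state = seed
    out = []
    for b in bits:
        fb = 0
        for t in poly_taps:                 # feedback = XOR of the tap bits
            fb ^= (state >> t) & 1
        out.append(b ^ (state & 1))         # whiten one data bit
        state = (state >> 1) | (fb << max(poly_taps))
    return out

# Hypothetical parameter sets - NOT the real standards' polynomials.
GSM_LIKE   = {"poly_taps": (4, 0), "seed": 0b10101}
WCDMA_LIKE = {"poly_taps": (6, 1, 0), "seed": 0b1011011}

data = [1, 0, 1, 1, 0, 1, 0, 0]
tx = lfsr_scramble(data, **GSM_LIKE)
rx = lfsr_scramble(tx, **GSM_LIKE)    # descrambling is the same operation again
print(rx == data)                     # True: self-inverse, only parameters differ
```

Switching the same handset from one standard's scrambler to another's is a table lookup of taps and seed, which is why this block rewards a programmable implementation where the Viterbi decoder does not.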

De Vries agreed: "There are places, such as channel modulation, where you can obtain big benefits from a vector processor. For Reed-Solomon and Viterbi decoders, an efficient hardware implementation is always best."

As a result, there will be parts of the baseband that will remain fully hardwired. But elsewhere, the chips are beginning to go soft. 
