When it comes to superfast broadband, predicting the future may not be as easy as it appears, says Ofcom's chief technology officer, Steve Unger.
"I suppose the main conclusion I drew was that I didn't really know for certain what the answer to the question was," says Steve Unger. Ofcom's chief technology officer is describing a recent evening of calculated futurology at the IET's headquarters in Savoy Place, London, where he delivered this year's Appleton Lecture, during which he squared up to the challenge of answering his own question. It's the question everyone wants the answer to, he says. And so he entitled his lecture: 'Superfast broadband – what will it take to make it happen?'
Ofcom is the UK's regulatory and competition authority for the broadcasting, telecommunications and postal industries, and as its CTO Unger is well placed to know why he doesn't know. "On the positive side, my main point was that you can learn quite a lot from history. If you look at the way in which our sector has evolved in the past 20 years or so, what is striking is the interplay of evolution of networks, devices and services. Essentially, what we're looking at is a virtuous circle linking innovation in all of those areas, often in ways that would have been unimaginable just a few years ago."
Unger says that if you analyse the changes in the sector over the past decade, it would have been hard to predict them beyond the general direction of travel. "When it comes to the specific outcomes in the way new services feed into networks, I think you would have struggled."
Unger opened his IET lecture with an image of the Great Pyramid of Giza, the oldest of the Wonders of the Ancient World. "It seemed like a good way to preface a history lesson about the Internet. We might be tempted to think that 5,000 years is a long time, but in the communications sector 20 years is a significant timeframe. If you go back to the discussions we had in the 1990s and look at how they have played out, the outcomes are positive and surprising, even with the benefit of hindsight. The pyramids reference was made to show how such a short period of time in global history can be an epoch in technology."
Multiple-user future
Unger has been with Ofcom for a decade and is currently working on critically evaluating external market and regulatory developments, as well as leading the process of setting the organisation's strategic priorities. He is also responsible for several specific policy areas, including Ofcom's work on communications infrastructure.
One of Ofcom's responsibilities is management of radio spectrum, and Unger is particularly interested in the extent to which innovation in wireless technologies can drive growth, and the role of spectrum regulators and policy makers in enabling this.
After an academic career that included a degree in physics from the University of Cambridge and a PhD in radio astrophysics from the University of Manchester, Unger worked in industry for two technology start-ups. In 2001 he joined Ofcom's predecessor Oftel as head of network analysis. Two years later he became Ofcom's competition policy director and by 2009 he was CTO. He accepts that when he joined the organisation "we knew that a lot was going to happen, but we didn't necessarily know what".
Projecting forward a further decade is the second part of the story Unger is keen to tell. "There is no doubt that we can have confidence in the virtuous circle continuing across the entire value chain. But I don't believe that there will be a killer app. People talk about HD television, but the idea that superfast broadband will be driven by that - or even 4K television - seems slightly unrealistic."
What interests Unger is the trend towards multiple users per home. He says that a decade ago the key issue was to get broadband to a single PC per home. "But what we see today is a model where the story is about getting fibre to the home so that a number of different devices can have connectivity. So fixed devices - smart TV, PCs - and a whole range of mobile devices, machine-to-machine and so on can benefit from a fat pipe to the home. That feels like the trend that will drive superfast broadband."
Capacity crunch and digital divide
Ofcom says that mobile data consumption will grow by a factor of 80 over the next two decades, leading to a potential capacity crunch. "No one quite knows the exact figure, but it is approaching two orders of magnitude." To meet the demand for such increased capacity, Unger says that there are only three options available.
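As a rough sanity check on the figures quoted above, an 80-fold rise spread over 20 years implies a compound annual growth rate of about 24.5 per cent, and corresponds to roughly 1.9 orders of magnitude ("approaching two", as Unger puts it). A minimal sketch of that arithmetic, using only the numbers given in the article:

```python
import math

# Figures from the article: Ofcom projects mobile data consumption
# to grow by a factor of 80 over the next two decades.
growth_factor = 80
years = 20

# Implied compound annual growth rate: 80^(1/20) - 1
cagr = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # about 24.5% per year

# Orders of magnitude: log10(80) is just under 2
print(f"Orders of magnitude: {math.log10(growth_factor):.2f}")  # about 1.90
```

The point of the calculation is simply that a dramatic-sounding multiple over two decades translates into a steady, sustained annual growth rate rather than a sudden jump.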
The first is to provide more spectrum ("although that in itself is not sufficient"), while the second is to be more efficient in the way in which spectrum is used. But the most important of the trifecta is the deployment of large numbers of smaller cells. "Of course, what that requires is a much more extensive back-haul network, which is about getting fibre closer to the home." One of the key drivers in this is the need to support large numbers of wireless cells that are adjacent to the home.
But it all depends where your home is. Unger says that Ofcom is concerned to avoid a "digital divide" in which urban areas enjoy a better service than rural ones. The idea that the divide straddles the "town vs. country" fault-line is a simplification "because there will always be certain areas within cities (on the edge of exchange areas, for example) that experience a lesser service. Broadly the economics of broadband delivery favour cities, and so when there is a need for intervention it tends to be in rural areas". He says that there is clearly an issue if some citizens are receiving bandwidths of 100Mbit/s in cities, "while a significant number is excluded from those services".
When it comes to access to the Internet the UK citizenry is broadly equal in its entitlement. But inevitably some users are more equal than others. "It is always going to be more expensive to deploy networks in rural areas. What's needed is a practical way of minimising the risk associated with the digital divide. Quite a lot of work in recent years has been to find a way to reduce that risk, but without necessarily delivering fibre to every home."
Unger goes on to explain that commercial providers cover the first two-thirds of the country. After that, and up to about 90 per cent, the current government Universal Service Commitment "should deliver superfast broadband. At the moment the expectation is that the final 10 per cent will benefit from 2Mbit/s service under the Universal Service Commitment. There are questions over whether that's sufficient and affordable".
Despite the fact that coverage is not universal, compared with the rest of Europe the UK has high availability of current generation broadband, low prices and high levels of competition. Added to this, there is "a great deal of effective economic use of broadband. For example the Brits are world-beating e-shoppers. But the story for superfast broadband is more mixed. Every country across Europe is currently undergoing transition and the UK is not the fastest, but there are reasonable grounds for confidence".
Unger is referring to a recently published Ofcom scorecard on broadband availability in the so-called EU5. He says, however, "you have to be careful how you regard the comparability of the data sets. It's clearly interesting to do international comparisons and if we didn't do it then somebody would.
"There are a number of technical considerations about how you gather data," Unger adds. "In preparing the scorecard we went through the data looking at the extent to which we thought it was accurate and where we thought the data wasn't of adequate quality we didn't publish it at all. We have published it with caveats, and those caveats are important."
Although it is tempting to think that with 4G we might have seen the last big broadband auction, there are "other chunks of spectrum that we think need to be released into the market. It's worth noting that the UK government has a target to release 500MHz of spectrum, and at the European level there is a target of 1,200MHz. And these targets really do need to be hit if we are to meet consumer demand for new services.
"At the moment we are working on the release of some spectrum currently held by the public sector. We're also working very actively on the potential release of more spectrum at low frequency – the 700MHz band – which we think could make a further important contribution to mobile".
But in terms of spectrum management policy, Unger says that there is still "quite a lot of work to be done. We are always looking at ways of identifying underused spectrum and making it available for mobile". Ofcom's overarching duty is to consumers in the UK to ensure the widespread availability of broadband services.
"There are some levers here that are under our control. For example, when we run spectrum auctions we can impose coverage obligations, which is what we have just done with the 4G auction. That is one way in which we have intervened to improve mobile coverage." There are other situations, Unger explains, where the only answer is simply to get more funding from the public purse, "which is a matter for government. But in those cases we will advise government and provide technical advice. But when it comes to the amount of money that is affordable for the country, that has to be a decision for government".
For Unger the real practical issue is that of making the future happen. And as we come full circle, back to the title of his Appleton Lecture, he reflects on how compressed the history of digital technology is. "I wrote up my PhD on one of the first proprietary word processors. Things have changed and there are downsides to the way we live today as a result of digital technology. It's possible to point to behavioural changes brought about by the widespread use of digital devices that are regrettable. But I am quite an optimist. I live in a small village in Cambridgeshire where my children have more access to social networks, educational material and entertainment than I did growing up in London. But it does bring risks."