vol 8, issue 1

Broadcast - lessons from London 2012

21 January 2013
By Philip Hunter

Techies meet talent: after the men’s 100m final

For the BBC, the 2012 Olympics presented one of the biggest technological challenges in the corporation's history of sports coverage - so what were the key lessons the broadcaster learned from the experience?

The 2012 London Olympics heralded a new era of maturity for online video streaming and stimulated further growth in viewing on screens other than conventional 'dumb' TVs. For major broadcasters such as the BBC, NBC, and France Télévisions, which owned exclusive rights, the event was both a huge challenge and a huge opportunity, given the massive demand for online streams and the potential to establish a platform that provided a learning curve for future events, where they will compete for hearts, eyes - and ears.

It demonstrated the value both of competition and of clear goals for major ICT projects. Just about all the major broadcasters succeeded in delivering multiple simultaneous streams, integrated with live real-time information about athletes and events, at a quality and scale beyond that of previous events, while introducing untried and untested technologies.

Although the Corporation was not directly responsible for shooting the primary video of the actual events (that was done by Olympic Broadcasting Services), the Games was particularly successful for the BBC, enhancing its reputation as a global leader in online video established by its iPlayer 'catch-up' - and now also live - platform. Perhaps more importantly, its whole online delivery platform, whose development was accelerated by the Olympics, will play a key role in future efforts to generate advertising and subscription revenues outside the UK to compensate for an inevitable real-terms decline in licence fee income.

Its Olympic legacy has already permeated beyond the BBC organisation itself, with the whole project being studied within the wider IT development community as a casebook example of successful project management. It did not baulk at employing novel technologies, and yet delivered on time with only minor hiccups.

It was also (more or less) on budget, although that may partly have been because it over-provisioned resources, such as the bandwidth it would need in order to provide headroom for the possibility of heavier than predicted traffic during peak events such as the 100 metres final.

There were also one or two instances when the project first toyed with and then rejected technologies that will become widely deployed in future, certainly by Rio 2016, but which were deemed too risky or as yet unready. An example of this occurred during the design of the dataflow combining real-time information about events taking place with the live online video.

Yet while avoiding unnecessary risks, the BBC did take some calculated ones in order to deliver the best overall experience possible, according to its head of product Cait O'Riordan, responsible for the BBC's digital streaming development as a whole. "We did everything people tell us not to do in a big project," O'Riordan says. "We had new video technology, new technology for controlling that video, and a new data backend. That's risky - but it was a calculated risk in that we were really robust in terms of delivery."

The use of new technology meant that testing had to be even more rigorous and lengthy than usual, especially as there was an absolute deadline and low tolerance of bugs emerging during the Games.

"We had an enormous testing regime, with eight streams of work and testing teams running alongside each," O'Riordan says. The BBC was fortunate in that there were several big events occurring just before the Olympics, including the Queen's Diamond Jubilee, Wimbledon, and Formula One racing, which offered the technical teams a dry run. The platform was delivered early in May to allow as much time for heavy load testing as possible.

Although important in themselves, the pre-Olympic events did not impose anywhere near such a big load on the infrastructure and there was more tolerance for errors. In fact, a number of significant bugs were found and fixed during this testing phase, which would have been embarrassing had they occurred during the Games.

"We found serious problems in testing outside of the Olympics which, if we had found in the middle of the Games, would have been a big issue," says O'Riordan. These were found not during the pre-Olympic events, but during 'cloud testing' involving the use of a cloud environment to simulate a real event, in this case generating large fake audiences requesting and manipulating live and on-demand streams. "We threw all this at the real suite and gave it a real pounding and as a result found problems we fixed prior to the games," O'Riordan adds. "These were really about the robustness of the code not behaving as expected when you threw data at it under end-to-end testing."

Code freeze

When the testing was over and the resulting changes made, the BBC imposed an absolute code freeze two weeks before the Games - good practice, because edits to one software component can have unanticipated consequences elsewhere, and that close to the Games there might not be time to eradicate all the knock-on effects. As O'Riordan notes: "People weren't digging up the roads during the Olympics, and we weren't making any software changes."

Despite all these precautions, human nature being what it is, O'Riordan and her team were in a high state of anxiety as the first weekend of the Games approached. "I really thought the first weekend would be massively hairy," she recalls; but there was immense relief all round when events passed smoothly with only minor hiccups. "I remember saying to somebody at the end of day one that if I had been told two years ago that we would have only one bit of trouble starting one stream on the first day, I would have bitten my hand off for it."

24 simultaneous streams

Fortunately that sacrifice was not necessary, and this success highlighted not just the importance of rigorous and comprehensive testing, but also how the challenge for the BBC had changed in the years before the Games. The requirement to run 24 simultaneous streams was set early on in the project and was then thought to be the greatest challenge; but over time, as technology matured and the cost of bandwidth and storage plummeted, a new issue became predominant: the need to support not one but four target platforms - PCs, connected TVs, tablets, and mobile devices such as smartphones.

This had not been envisaged at the outset, says O'Riordan: "We are in a completely different media landscape than we were even four years ago, and the massive take away for me was our mobile and tablet coverage. During the week around 33 per cent of the traffic went to a mobile or tablet device, rising to almost half at weekends. This is only going to become more popular and more important in terms of how we deliver stuff."

With connected TVs also likely to become more popular as premium content is delivered increasingly over IP networks rather than traditional satellite, cable, or terrestrial video delivery infrastructures, multi-channel broadcasters like the BBC are seeking to serve all these outlets from a single converged platform.

Catering for different platforms

O'Riordan points out that the same content and data were sent to all the platforms, with the differences lying in the formatting and the extent of the graphics. People use the screens in different ways - the emphasis is on video on the big screen, veering perhaps towards graphics on PCs and tablets, and statistics on the small screen - but these are generalisations.

The BBC does though, like other broadcasters, see the potential for differentiating more 'between screens' to exploit the full potential of each while meeting varying viewer expectations. This will be done partly by harnessing Big Data models - a major priority for O'Riordan now that 2012 is past, requiring if nothing else a change in job title for her. "We did a fantastic job in terms of getting people around those streams, but if we had had better data about how different people were using them, then perhaps we could change what we promoted to them," she reflects. "There are interesting opportunities."

In terms of video quality, the goal is to improve the viewing experience, particularly on big connected TV screens, on which viewer expectations are the same as for traditional screens, with glitches such as pixelation and waiting for buffering much less acceptable than on PCs, tablets, and mobiles. "It can drift away from live," O'Riordan admits. "It is all about the quality of the pipe and the server. I would hope that will get better."

Even so the connected TV picture quality during the Olympic Games was much better than some had anticipated, so the case for online streaming for delivering premium TV is proven, even if further enhancements are needed before it can match the three traditional transmission types.

It seems hard to believe that overall online delivery capability will make as big a leap forward between 2013 and 2016 as it did between 2008 and 2012, even though O'Riordan believes that Big Data has some interesting tricks up its sleeve that can only be guessed at for now.


'Push-pull' delivery: Real-time graphics just-in-time

The BBC needed to find a reliable way of delivering statistics in real-time to the IVP (Interactive Video Player) so that they could be combined smoothly with the video while coping with exception conditions. This had to work for video accessed on-demand after events had taken place as well as for live content, and also cope with the situation where the viewer had stopped or rewound a stream and resumed watching after a break.

The fundamental challenge is that as the latency - the time gap between the event being captured on camera and played out on the user's screen - decreases, so the window for injecting associated statistics becomes even tighter. With online video streaming the latency is generally coming down all the time, although it will always depend on factors such as the geographical distance between the event and the viewer.

The issue is that the statistics required depend on the video being shown, which would seem to call for a 'pull' mode of operation, with the client requesting information from the server. But this would take too long: by the time the server had received the request, processed it, and sent the requested data, the associated video would already have been played.

A push mode, where the server sends the graphics as they become available, or when the time seems right, solves this latency issue; but the problem then lies in knowing what to send, and when, in the absence of instructions from the client.

The streaming world is finding its way towards hybrid solutions that combine the best of both, a leading contender being 'long polling'. With long polling the server accepts requests in advance from clients and then holds them until the data is available, or until it calculates the time is right to send the information. In this way the lines of communication are kept open between client and server giving the latter sufficient information to know what to send and when. This achieves the performance of push, while delivering only what the client wants as in 'pull'.
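The long-polling idea can be shown with a minimal in-process sketch - this is not the BBC's implementation, and all the names and the timeout values here are illustrative; it simply demonstrates a request being held open until data becomes available.

```python
import queue
import threading

class LongPollServer:
    """Toy long-poll server: holds a client's request open until data arrives."""

    def __init__(self):
        self._pending = queue.Queue()

    def publish(self, stats):
        # Called when new statistics become available (e.g. a race finishes).
        self._pending.put(stats)

    def poll(self, timeout=5.0):
        # The client's request blocks here until data is ready or the
        # timeout expires, at which point the client simply re-polls.
        try:
            return self._pending.get(timeout=timeout)
        except queue.Empty:
            return None

server = LongPollServer()

# Simulate an event arriving shortly after the client starts waiting.
threading.Timer(0.1, server.publish,
                args=({"event": "100m final", "winner": "Bolt"},)).start()

result = server.poll(timeout=2.0)
print(result)
```

Because the connection is already open when the data arrives, the server can push it immediately - yet it only sends what the client has asked for, which is the pull half of the hybrid.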

The BBC was hoping to use this hybrid push/pull approach and did prototype it before the Games; but according to Oli Bartlett, product manager for the BBC's Olympic Data Services, it was then deemed too great a technical challenge to solve in time, given the requirement to scale to millions of users. There was no time to test it thoroughly at that scale, so instead the BBC combined two techniques: one for live content, and one for on-demand or delayed viewing.

For live viewing, or where the user had rewound by less than two minutes, a technique resembling live polling had to be used because of the short time available to provide the statistics. The BBC therefore delivered statistics relating to the most recent events in response to polls every three seconds, with the number of events covered chosen to be larger than the maximum number that could possibly occur within one three-second window. For on-demand viewing this level of polling would have imposed an unnecessary processing load, so statistics were instead sent in blocks of time, paced so that each block arrived before the client had to request it. This two-method approach was an effective compromise, but posed its own challenge at the point where one method had to be replaced by the other within a live stream.

This would happen when a user rewound a live stream past the two-minute limit. The BBC came up with a solution involving snapshots of statistics to handle these transitions, or the situation where a user suddenly returned to live after having rewound, say to watch a goal that had been scored 10 minutes earlier. As Bartlett notes, this is only a stop-gap solution, and push techniques such as long polling will take over well before the Rio 2016 Games.
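The mode switch and the snapshot bridge can be expressed as a small decision function - a sketch of the logic described above, not BBC code; the function names are invented for illustration.

```python
LIVE_POLL_LIMIT = 120  # seconds; the article's two-minute rewind boundary

def delivery_mode(seconds_behind_live):
    """Within two minutes of live, poll frequently for statistics;
    further back, fetch them in pre-built blocks of time."""
    return "live-poll" if seconds_behind_live <= LIVE_POLL_LIMIT else "block"

def on_seek(old_offset, new_offset):
    """A seek that crosses the two-minute boundary - rewinding deep into
    the stream, or jumping back to live - is bridged by a one-off
    snapshot of the current statistics."""
    if delivery_mode(old_offset) != delivery_mode(new_offset):
        return "snapshot"
    return None
```

A viewer jumping from 10 minutes behind straight back to live crosses the boundary, so `on_seek(600, 0)` yields a snapshot; a small nudge within the live window does not.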

Delivery system: Simulation helps to meet traffic patterns

Given the emphasis on data integration across multiple servers while avoiding single points of failure, we might have expected the BBC's Olympic architecture to feature a distributed database such as 10gen's open-source MongoDB, or Apache Cassandra, developed at Facebook and used for its inbox search facility until 2010. Instead it deployed the less sophisticated but even more tried and tested MySQL, along with flat XML files, highlighting the emphasis the BBC placed on battle-hardened software.

This relates to the other architectural aspect: the massive amount of testing involved, which shaped the whole design of the system. For a one-off event like the Olympics, which allowed no full-scale dress rehearsal, the BBC had to be as sure as possible that any problems arising during the Games itself could be fixed quickly, and so it proved. Given that the Olympics occurs only every four years, and that the whole online streaming scene had changed greatly since 2008, it was not possible to create theoretical models that would simulate the event realistically or generate loads that mimicked real-world usage; so the BBC contracted data centres around the world to generate the load, simulating the activity of a million users requesting streams concurrently.

The architecture was shaped by four other key design principles apart from the emphasis on testing, which was performed rigorously at every stage. These principles were that the eventual streaming service be easy to use, provide as much information as possible in real-time synchronised to the video, serve not just the Games but the BBC's future evolving online platforms, and support data reuse across multiple services.

There was the original design goal of supporting 24 simultaneous streams, followed later by the requirement to provide for the four principal online platform categories of Internet-connected TV, PC, tablet, and mobile devices. The re-use principle then ensured that as much of the code as possible would serve all four platforms within a common architecture providing access to data through shared APIs.

The whole programme was then split into teams: two focusing on the infrastructure, one delivering the video, and one bringing in the data. The remaining teams dealt with applications: one for the crucial interactive video player (IVP) handling access to and playout of the streams, one for the 2012 portal and Torch Relay, one for the connected TV apps, one for the mobile browser and apps, and one for the red button service.

Red button is a legacy service that has been available for years, and needed to be updated, merged with the online platform and fed by the same data services, avoiding redundant effort and facilities. It was introduced as the digital replacement for the Ceefax service, but was extended to let viewers select multiple video streams via digital terrestrial/satellite TV, starting with the 1999 Wimbledon Championships, where it provided the option of watching different matches. It only allowed up to six channels, however, so had to be upgraded to support 24 for the 2012 Olympics; at the same time the user interface was updated, although still kept simpler than the equivalents for the four online platforms.

Timeline: The BBC at the Games 1920s-2010s

1924 Paris Olympics first to be broadcast on public radio

1928 Amsterdam Olympics again with limited radio

1932 Los Angeles Olympics are first Games with recordings in BBC radio archives

1936 Berlin Olympics first live TV coverage by two German firms, Telefunken and Fernseh with 138 viewing hours, but still only radio on BBC

1948 London Olympics first to be televised by BBC with six cameras and two outside broadcast units

1952 Helsinki Olympics BBC captures Emil Zatopek winning 5,000m, 10,000m and marathon treble in a feat most unlikely ever to be repeated

1956 Melbourne Games coverage rights awarded exclusively to Associated Rediffusion, one of the new commercial (ITV) broadcasters, for £25,000 (£520,000 today)

1960 BBC back on air for Rome Olympics beginning period of fierce rivalry with ITV

1964 Tokyo Olympics first to be broadcast live internationally via geostationary satellite by BBC (and others)

1968 Mexico City Olympics first to be covered by BBC in colour, on the recently launched BBC2

1976 BBC wages war with ITV over Montreal Olympics coverage after failure to agree how to split airtime

1984 Los Angeles Games gives BBC clear run as ITV pulls out due to pay demands for overseas working from unionised technicians

1988 Seoul Olympics. Last to be covered by ITV before it withdrew from its longstanding battle with the BBC

2004 Athens: BBC broadcasts in High Definition for first time, but 20 years after first US HD TV coverage in 1984

2008 Beijing Games. BBC's first digital Olympics, broadcasting six simultaneous online channels to PCs

2012 London, declared by BBC as first truly digital games, scaling up to 24 simultaneous channels and transmitted to four platforms rather than one
