The Langer Road Area of Felixstowe on February 1, 1953, following the East Coast Flood

Computing advances make 1953 flood repeat unlikely

On the anniversary of the 1953 East Coast Flood, experts say advances in forecast computing should prevent a repeat of the disaster.

Sixty years ago today, more than 30,000 people were evacuated and 24,500 houses were damaged as tidal surges inundated coastal communities from Shetland to Kent, with 307 people losing their lives.

A lack of warning was the prime reason for the scale of casualties in 1953, but modern advances in computing mean warnings are now often issued well in advance of potential flooding.

In the wake of the 1953 flood, the UK Coastal Monitoring and Forecasting service, a partnership of public bodies, was set up to provide warnings of impending high water levels.

The body is funded by the Environment Agency and costs roughly £2 million per year, but with the agency recently revealing that 1.3 million people are at risk of coastal flooding in England and Wales, the service is vital.

Data is fed in from a network of tidal level gauge stations at 44 key locations around the UK, buoys measuring wave height run by the Centre for Environment, Fisheries and Aquaculture Science (Cefas), tidal surge models run by the National Oceanography Centre (NOC) and climatic information from the Met Office.

Tim Harrison, senior advisor for monitoring and forecasting at the Environment Agency, said: “You’ve got all these different sets of telemetry gathering different observations out there and feeding into the tidal surge models.

“They provide a forecast of tidal surge at the key 44 locations round the country and all that information feeds into the National Flood Forecasting System.”
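
The kind of data flow Mr Harrison describes can be sketched in a few lines of code. The example below is purely illustrative: the class, field names and figures are assumptions for the sake of the sketch, not the real interfaces of the National Flood Forecasting System.

```python
# A minimal sketch of the data flow described above: telemetry from different
# sources combined into one record per forecast location. Field names, classes
# and numbers are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class SiteObservation:
    site: str                # one of the 44 key tide-gauge locations
    tide_level_m: float      # observed tidal level from the gauge network
    wave_height_m: float     # wave height from the Cefas buoy network
    surge_forecast_m: float  # tide-surge model output for this site

def total_water_level(obs: SiteObservation) -> float:
    """Very crude combined level: observed tide plus forecast surge."""
    return obs.tide_level_m + obs.surge_forecast_m

# Hypothetical example values for one location
felixstowe = SiteObservation("Felixstowe", tide_level_m=2.1,
                             wave_height_m=1.5, surge_forecast_m=0.8)
print(f"{felixstowe.site}: forecast water level {total_water_level(felixstowe):.1f} m")
```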

Improvements in forecasting were evident when the worst storm surge in more than 20 years hit the east coast of the UK in November 2007: precautionary evacuations in East Anglia and the raising of the Thames Barrier meant there were no fatalities or significant financial losses.

However, 2007 also saw widespread flooding throughout the summer, prompting the Government to commission a review of the response from Sir Michael Pitt. In April 2010 the Flood and Water Management Act 2010 was passed, implementing many of his recommendations.

One of the outcomes was the Flood Forecasting Centre, a partnership between the Environment Agency and the Met Office, which provides forecasts for all forms of flooding, from rivers and surface water to coastal and groundwater.

“All this involves a massive amount of computing,” said Mr Harrison. “So really the only practical way to do that is to exploit the benefits of the Met Office supercomputer.”

Another outcome of the Pitt report was a recommendation to provide longer lead times on warnings.

Mr Harrison said: “The only way of providing very long lead times is to provide probabilistic information, so saying there is a 10 per cent chance of the level getting to XYZ. We have been taking steps to build a fully probabilistic service.”

Since the report the NOC has started running its models multiple times, creating ensemble forecasts that can predict conditions up to seven days in advance, though only the five-day forecasts are used operationally at present.

“You perturb the initial conditions of your model and then do multiple runs of the model, maybe 50 runs of the same model, and you get a whole range of outputs,” said Mr Harrison.

“What I call the deterministic run or the most probable run may say the surge will be five metres, then you may have all 49 other runs with a distribution around that five metres. So based on those you can form a probabilistic forecast, say there is a 10 per cent chance the level might get to six metres.”
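
As a rough illustration of the ensemble approach Mr Harrison describes, the sketch below perturbs a starting estimate, runs a stand-in “model” 50 times and counts how many members reach a given level. The toy model, perturbation sizes and thresholds are invented for illustration; the NOC’s operational tide-surge models are far more sophisticated.

```python
# Illustrative sketch of an ensemble surge forecast, not the NOC model.
# The perturbation sizes, 50 members and thresholds are assumptions.
import random

def run_surge_model(initial_level_m: float) -> float:
    """Stand-in for one deterministic surge-model run.

    A real run would integrate a tide-surge model forward in time; here we
    simply add spread so the ensemble has a distribution to work with.
    """
    return initial_level_m + random.gauss(0.0, 0.4)

def ensemble_forecast(best_estimate_m: float, members: int = 50) -> list[float]:
    """Perturb the initial conditions and run the model once per member."""
    forecasts = []
    for _ in range(members):
        perturbed = best_estimate_m + random.gauss(0.0, 0.1)  # perturbed start
        forecasts.append(run_surge_model(perturbed))
    return forecasts

def exceedance_probability(forecasts: list[float], threshold_m: float) -> float:
    """Fraction of ensemble members at or above the threshold level."""
    return sum(f >= threshold_m for f in forecasts) / len(forecasts)

# 'Most probable' run around five metres; what fraction reaches six?
surge_members = ensemble_forecast(best_estimate_m=5.0)
print(f"Chance of reaching 6 m: {exceedance_probability(surge_members, 6.0):.0%}")
```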

But this approach to forecasting is still in its infancy and presents challenges to those making decisions.

Mr Harrison said: “The big issue with probabilistic forecasts is how do you convert probabilistic information into operational decisions? If you just have one model that says you’ve got a five metre surge you have a clear idea. Either you do something or you don’t.

“If you have probabilistic information that says there’s a 30 per cent chance you’re going to get five metres then you’ve got to make a judgement about what to do. Are you going to issue a warning and close a gate, such as the Thames Barrier?”
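
One way such a judgement could be encoded is a simple threshold rule, sketched below. The probability thresholds are hypothetical and do not reflect Environment Agency or Thames Barrier operating procedure.

```python
# Hypothetical decision rule: turning a probabilistic forecast into an action.
# The warning and closure thresholds are invented for illustration.
def decide_action(prob_exceed: float, level_m: float,
                  warn_prob: float = 0.2, close_prob: float = 0.5) -> str:
    """Map an exceedance probability at a given surge level onto an action."""
    if prob_exceed >= close_prob:
        return f"Close defences: {prob_exceed:.0%} chance of a {level_m} m surge"
    if prob_exceed >= warn_prob:
        return f"Issue flood warning: {prob_exceed:.0%} chance of a {level_m} m surge"
    return "Monitor: probability below warning threshold"

# The 30 per cent chance of a five-metre surge from the example above
print(decide_action(prob_exceed=0.30, level_m=5.0))
```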

Forecasters are working on a formal system for using this probabilistic information for operational purposes, but another idea is to publish the forecasts so the public are aware of conditions.

“What we want to do is publish the forecasts on the internet to get them to the wider public. We’re not there yet but we’ve got work in hand to do this, then the step after that is to publish not just the forecasts but the probabilistic forecasts,” said Mr Harrison.

“Of course it’s very complex and some people would be frightened to death to see something like that and trying to interpret it, but in my mind I think this is where we should be heading.”

The advances in forecasting mean experts are even beginning to be able to forecast flash floods in rapid response catchments, like the one experienced in Boscastle, Cornwall, in 2004.

“They are highly dangerous and people can die,” he said. “But with this increase in computing power available to us and the Met Office to provide rainfall forecasts, it is becoming increasingly realistic to provide forecasts in these situations and we are starting to do that.”

But despite the huge amount of technology feeding into the system, Mr Harrison says it is still ultimately the experts calling the shots.

“There is still a human element to the forecast service. We’ve got all the data collection and all the modelling. But then a person collects it, reviews the forecasts, and makes a judgement on whether the forecast is good and whether the information feeding it is good. Then we can contact our colleagues and if necessary issue a flood warning.”
