vol 7, issue 12

Smart grid gathers pace

17 December 2012
By Sean Davies

Highview Power Storage’s Cryo Energy System uses liquefied air, a process that reduces its costs by a third

The smart grid will focus on maximising the use of our existing network, as well as integrating new infrastructure

The smart grid continues to evolve as the network faces the challenges of new patterns of generation.

The transmission and distribution network is creaking, struggling under the onerous challenges of distributed generation, renewable energy and increasing demand. These are all challenges for which the grid was not designed: its strengths lie in the bulk transmission of power from huge, centralised power stations through 400kV lines, stepping down through 132kV and 11kV until it reaches the domestic consumer at 400V. That model is now well past its sell-by date.

Renewable energy enters the grid at a variety of points: local generation connects at distribution level, and combined heat and power (CHP) at consumer level; add domestic photovoltaic (PV) generation and electric vehicle (EV) charging to the mix, and the complexity increases while power quality suffers.

So how is this situation to be remedied? One proposal that has been championed for several decades is the smart grid, an often elusive concept that has remained essentially the same despite numerous rebrands over the years: the intelligent grid, the self-healing grid, and so on.

The essence is always the same: a grid that gives network operators the real-time information to make the most of their assets, allowing them to react to changing demand and fluctuating generation patterns, and to avoid power disruption caused by failures in part of the system. It is a transmission and distribution system that employs fledgling technologies such as real-time monitoring, autonomous control, two-way communications, smart meters and energy storage.

Regulatory framework

At the heart of grid development in the UK is the regulatory and financial framework supplied by Ofgem. The regulator's Project Discovery identified that £200bn needs to be found over the next ten years to guarantee energy supplies. Its Electricity Market Reform programme will address the shortage of generation, but that still leaves a £32bn gap for network investment – a huge amount when you consider that the industry is currently worth only £43bn in total. Ofgem's answer is 'RIIO' (Revenue = Incentives + Innovation + Outputs), heralded as "a new way to regulate energy networks".

"Our interest in smart grids is driven by the benefits they bring rather than the technology: we are technology neutral," Gareth Evans, head of professional services, Ofgem, says. "They are a means to an end, not an end in themselves.

"We are all aware of the challenges that the industry faces in terms of decarbonisation, security of supply, trying to keep costs down and dealing with ageing assets. There are a whole bunch of technologies that can be deployed, but the networks are the unsung heroes of the process who join the whole thing together."

The nucleus of RIIO is outputs: ensuring that the money that companies spend delivers outputs. "In order to deliver those outputs we are encouraging the companies by the core price mechanism itself; but on top of that an incentive is built into that price mechanism, and of course this stresses innovation," Evans continues. "At the heart of the RIIO process is the well-justified business plan. This must convince us that the money that they are spending is going to deliver those outputs in an effective and efficient way and those outputs must be clearly specified."

There is a real stress on innovation in the framework. First, the core price-control incentives encourage companies to perform more efficiently, because efficiency gains transfer straight to their bottom line. More importantly, there is the Innovation Stimulus package, a central part of the RIIO framework.

This has three components. First, the Network Innovation Competition (NIC), which is effectively a development of the Low Carbon Networks Fund (LCNF). Whereas the LCNF only applies to electricity distribution companies, the NIC applies to all four sectors in the network industry. "We are proposing £240m for electrical transmission over the eight-year price control and there will be an additional amount, which is yet to be agreed, for distribution that will top that fund up," Evans says.

Alongside this is the Network Innovation Allowance (NIA), which permits companies to invest money unilaterally, subject to a set of rules and governance: they will be allowed to spend between 0.5 and 1 per cent of their allowed revenue in this way. The third component is the Innovation Roll-out Mechanism (IRM), which allows projects to be rolled out that would otherwise have to wait until the next price control.

Network hierarchy

One major barrier to innovation in the grid is the risk aversion traditionally maintained by the transmission and distribution companies. Tried-and-tested products are still preferred to innovative technologies that may only have tasted action in small-scale pilot schemes. This creates another problem, technology lock-in, as most transmission and distribution (T&D) assets have a service life of upwards of 40 years.

"We have clearly got a power network that has been optimised over many years and, as with any organisation, that creates inertia," says John Scott, director at Chiltern Power. "But it makes good sense to deploy new solutions, not just have them as demonstrations. If we roll them out it will bring a benefit to customers and companies and deliver a return on the investment."

In a sector with long lead-times, long asset lives and a demanding regulatory framework, a strategy is needed to ensure that innovative technologies become part of the planner's tool set. "Innovation always has uncertainty and this may equate to business risk for the company; do you want to be the project manager to try and deliver this if there is still some uncertainty around?" Scott adds.

His belief is that we need a change in the hierarchical thinking of the network to make effective changes. "Today's network model is a hierarchy; this has been very effective and has traditionally met the market's needs," he says. "We are adapting it today to various technologies such as active network management, electric vehicles and so on, but still staying inside the established framework.

"But is this suited for the future? If we take this hierarchy we will be adding technologies such as smart-meter data, distributed storage, EV charging, home automation and storage. At the moment we tend to plug them into the existing architecture. But these weren't conceived for such a volume of multiple real-time data. The complexity and consumer interaction is all going to start to appear. We will have requirements appearing for concurrent processing – close to real time."

He argues that this power system based on a centralised architecture, centred on DNO (distribution network operator) and TSO (transmission system operator) control centres, needs a rethink, and that both operational and asset-management data need to be considered. His idea is to create a coherence engine, an intelligent framework that allows a coherent approach to the systems and data. According to a report published this year by ENA/Telent, DNOs have 54,000 connected sites. The prediction is that by 2030 there will be 700,000 connected sites – roughly a thirteen-fold growth – and therein lies the dilemma.

"How about thinking of a stable core grid system, which we are familiar with today, along with its architecture working with devolved, semi-autonomous sub systems?" he says. "Under this system you would start to devolve the intelligence and control. This allows it to be more scalable and flexible, but of course they still need to be secure, maintainable and fail-safe. But I believe this will be a key enabler for continuous change."

Grid optimisation

Grid optimisation, the ability to utilise the grid to its full potential, is a crucial weapon in the smart-grid armoury. Armed with real-time data, the transmission system can often carry more than its stated capacity, by fully understanding its loading and exploiting often generous safety margins. "There is a certain amount of installed grid capacity which has the ability to deliver a certain amount of power," Dr Bob Currie, technical director and co-founder of Smarter Grid Solutions, says. "What usually happens is that some of that capacity is reserved for the worst-case outage – so there is spare capacity to supply the demand.

"What we are talking about with this new approach to allocating capacity for connections and managing the grid is trying to use as much of that as possible during times when the system is intact. Still focusing on the reliability and security of supply for demand, but recognising that there is more capacity available in real time."

In London, Smarter Grid Solutions is running distributed-generation trials with UK Power Networks under a project called Low Carbon London. The aim is to overcome barriers to connecting more generation to urban networks, and to deploy a system that can autonomously manage clusters of new technology, keeping the network within its limits while giving those new devices sufficient access to energy throughput.

"There is CHP, solar and diesel back-up in London," Currie adds. "It is about constraining them on to the network when we are trying to reduce the amount of power being fed into a particular area, but it's also integrating with aggregators.

"What we are doing is seeing if we can control aggregator resources using an autonomous system for network benefit. We monitor and if we detect an overload on a substation we would normally just interface directly with the generator and ask it to do something, increase or reduce power output. Now we're going to see if we just send an automatic message to the aggregator saying that we need 2MW at this substation in the next minute or seconds."

Western Link

Much of the focus in smart-grid research is on maximising the use and productivity of the existing network, but that is not the sole emphasis. New infrastructure is often required to mould the grid to future energy-generation trends.

In the UK transmission system, power tends to flow from north to south, from Scotland to England, along some of the most congested trunk routes on the network, which have been creaking at the seams for years. Various grid-optimisation projects have augmented their capacity, but with growing offshore wind and potential marine energy coming on line there has been a pressing requirement for more. The latest project added series compensation to enhance the existing connection, but there was no escaping the need for extra capacity. The option selected was the Western Link – or 'bootstrap' – between Scotland and England, which would add an extra 2GW of transmission capacity, bringing the total up to 6GW. After an assessment of the available technologies, HVDC was selected as the preferred solution.

"The reason we did not go down the AC route was cost and the fact that for an overhead line we would have to deal with local authorities for rights of passage which would have delayed us," Vandad Hamidi, system performance manager – network strategy at Transmission Network Services (TNS) explains. "The Western link is 2.2GW with another 200MW overloading capability based on the proven technology of LCC (Line Commutated Converter) using LTT (Light Triggered Thyristors). It offers a short-term rating and has low losses and does not require a black start capability. This is the first project in the world that is using a 600kV DC cable."

The project complements the aims of smart grids in renewable energy by providing an HVDC link to transmit 2GW, connecting Hunterston in the north to Connah's Quay in North Wales and running offshore through the Irish Sea. Even though its route down the west coast crosses several existing services linking Ireland to Scotland and England, a cable route through the Irish Sea was deemed more expedient than securing a 400km route by land. "Overhead lines are hardly an option anymore because of local opposition," Ian Talbot, project director for the Western Link at Siemens, says.

Cables are a viable transmission medium across open water, but two characteristics come into play. The first works against you: the capacitance of the cable and the charging current it draws. With AC, this charging current has to be supplied in every half cycle and grows with the length of the cable – the longer the cable, the larger the charging current. Eventually you reach a limit where the charging current at the feeding end takes up the cable's entire current-carrying capacity, leaving nothing for useful power transfer. With a DC cable, by contrast, charging current is needed only when the cable is energised; thereafter it can carry its full rated current continuously.
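
A back-of-envelope calculation shows why the charging current puts a hard limit on AC cable length. The figures below are assumed, typical values for a 400kV submarine cable, not parameters of the Western Link.

```python
import math

# Illustration of the AC charging-current limit for long submarine cables.
# The cable parameters are assumed "typical" 400 kV XLPE values, not figures
# from the Western Link project.

f = 50.0                 # system frequency, Hz
u_ll = 400e3             # line-to-line voltage, V
c_per_km = 0.2e-6        # cable capacitance per phase, F/km (assumed)
rated_current = 1000.0   # thermal current rating per conductor, A (assumed)

u_phase = u_ll / math.sqrt(3)
omega = 2 * math.pi * f

# Charging current drawn per kilometre of energised cable (per phase)
i_charge_per_km = omega * c_per_km * u_phase            # ~14.5 A/km

# Length at which charging current alone consumes the whole thermal rating,
# leaving nothing for useful power transfer (single-end feed, no compensation)
critical_length_km = rated_current / i_charge_per_km    # ~69 km

print(f"charging current: {i_charge_per_km:.1f} A/km")
print(f"critical AC length: {critical_length_km:.0f} km")
# A DC cable draws charging current only at energisation, so no such limit applies.
```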

The second phenomenon is the skin effect, in which alternating current flows near the surface of the conductor rather than through its core. Stranded conductors improve the ratio of surface area to volume to mitigate this for AC, but DC avoids the effect altogether: the whole cross-section of the conductor is used.
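
The scale of the skin effect at power frequency can be estimated from the standard skin-depth formula; the sketch below uses textbook copper constants and is not specific to the Western Link cables.

```python
import math

# Rough skin-depth estimate for a copper conductor at 50 Hz, showing why only
# the outer layer of a large AC conductor carries most of the current.

rho_cu = 1.68e-8          # resistivity of copper, ohm*m
mu0 = 4 * math.pi * 1e-7  # permeability of free space, H/m


def skin_depth(frequency_hz: float) -> float:
    """Depth at which current density falls to 1/e of its surface value (m)."""
    omega = 2 * math.pi * frequency_hz
    return math.sqrt(2 * rho_cu / (omega * mu0))


print(f"skin depth at 50 Hz: {skin_depth(50) * 1000:.1f} mm")   # ~9.2 mm
# Conductors much thicker than a couple of skin depths gain little extra AC
# capacity; under DC the current spreads over the full cross-section.
```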

The plan is for the southern terminus to use an old power-station site at Kelsterton, although at present there are some planning issues that need to be resolved. The facility would include a filter yard, IAS switching yard, reactive compensation as well as some fire and ancillary services.

Inside the converter halls will be huge, ceiling-mounted thyristor stacks capable of switching 4000A and blocking 8000V, triggered by pulses of light. They are arranged in bridges across the three phases, chopping the AC waveform into DC at the sending station and back into AC at the receiving station. "All this chopping action creates a lot of harmonics on the system, which is why we require so much filtering to tune out specific frequencies before passing back into the system," Talbot adds.
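
The frequencies the filters have to remove follow directly from the converter's pulse number: a p-pulse line-commutated converter injects AC-side harmonics at orders kp ± 1. The article does not state the Western Link converter arrangement, so the 12-pulse case below is assumed purely for illustration.

```python
# Characteristic AC-side harmonics of a line-commutated converter appear at
# orders k*p +/- 1, where p is the pulse number. The pulse number here is an
# assumption for illustration, not a stated Western Link design parameter.

def characteristic_harmonics(pulse_number: int, k_max: int = 4) -> list[int]:
    """AC-side harmonic orders k*p +/- 1 for a p-pulse converter."""
    orders = []
    for k in range(1, k_max + 1):
        orders += [k * pulse_number - 1, k * pulse_number + 1]
    return orders


for p in (6, 12):
    orders = characteristic_harmonics(p)
    freqs = [n * 50 for n in orders]   # 50 Hz system
    print(f"{p}-pulse: orders {orders} -> filters tuned around {freqs} Hz")
```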

Searching for storage

Storage is a vital component of any smart-grid system, all along the T&D value chain. Many technologies are competing in the growing market – pumped hydro, hydrogen, fuel cells and a variety of batteries, from lithium to metal-air and sodium-sulphur to nickel – but one contender based on mature technology is striving for a slice of that very lucrative pie. The company behind it is Highview Power Storage.

The Cryo Energy System uses liquefied air or liquid nitrogen (nitrogen makes up 78 per cent of air), which can be stored in large volumes at atmospheric pressure. Liquefied air has a high expansion ratio between its liquid state, at -196°C, and its more familiar gaseous state, expanding about 700 times when regasified. As with a traditional steam engine, a cryogenic engine relies on a phase change (liquid to gas) and expansion within a confined space such as an engine cylinder or turbine.
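
The 700-fold figure can be checked quickly from standard densities for liquid and ambient air; these are textbook values rather than numbers supplied by Highview.

```python
# Quick sanity check of the ~700x expansion figure quoted above, using standard
# densities for liquid air (~870 kg/m^3 at -196 C) and ambient air (~1.2 kg/m^3).

rho_liquid = 870.0    # kg/m^3, liquid air at atmospheric pressure
rho_ambient = 1.2     # kg/m^3, dry air at roughly 20 C and 1 atm

expansion_ratio = rho_liquid / rho_ambient
print(f"volume expansion on regasification: ~{expansion_ratio:.0f}x")   # ~725x
```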

Since liquid air boils at -196°C, ambient temperature will superheat it, creating regasification and expansion. An engine can therefore use freely available environmental heat as the heat source.

The energy density of cryogenic fluids such as liquid nitrogen compares favourably with alternative energy-storage fluids such as compressed air. Cryogenic storage also has the advantage over compressed gases in that it can be bulk stored above ground in low-pressure tanks.

"We are pursuing this particular technology because we think it has some aspects that differentiate it from deeply embedded storage technologies," Gareth Brett, CEO Highview Power Storage, says. "We are looking for a replacement for a pumped hydro-storage technology or carbon-based compressed-air technology, the problems of which are that they are geographically constrained. We are looking for large scale – tens to several hundred megawatts that can be delivered where you need it without any physical constraints."

Cryogenic liquids are widely used in a variety of industrial applications, but their use as an energy vector is only just emerging. There is plenty of infrastructure around; it is a very mature industry that is already well regulated. In addition, the energy density of liquid air compares favourably with other low-carbon competitors such as compressed air.

"Like a lot of storage systems, it has a charging device, an energy store and a discharging device," Brett explains. "The charging device in our case is an industrial liquefaction plant. The energy store itself is a tank that has the advantage over a battery that you can empty a tank as many times as you like without wearing it out. The power recovery device, you take the liquid air, pump it to high pressure, add the heat that you've just rejected to make it a liquid which gives you high-pressure gas that you expand in the turbine process.

"All of the kit needed to do this is already in the market and benefits from mature supply chains. We have a pilot plant at Slough. It's basically a thermo mechanical storage system, I know that's not very sexy nowadays, but we know how to look after it. It is a mechanical system so it doesn't respond within a cycle like a battery would, but it does respond pretty quickly. You can wind it up in a couple of minutes; carry out big load changes in less than ten seconds. The reason why it's quite responsive is that we're heating the air to ambient temperature so there aren't any fancy changes going on in the turbine so you can change load very quickly.

"Even in our miserable little pilot plant you can convert 47 per cent of the heat you add to it to electricity." The Centre for Low Carbon Futures in its report 'Pathways for Energy Storage in the UK' concluded that LAES is commercially competitive in terms of cost, discharge duration, capacity and lifetime when compared to other technologies. The centre also estimates that the capital cost for LAES per kilowatt can be more than one-third cheaper than Sodium Sulphur (NAS) batteries which are currently commercially deployed.

The move towards an effective smart grid will be an evolution. Incremental changes as assets need replacing, coupled with focused development of new transmission and distribution circuits such as the Western Link, will continue to shape the network, but as always the investment is crucial.
