Suppliers of electricity, gas and water to European homes and businesses are finding ways to analyse the vast volumes of data their new smart systems generate, in order to gain insights into customer trends and operational efficiencies.
The new world of service supply, based on smart meters, smart grids, and enhanced customer relationship management systems, has already started to generate massive data sets that utility companies are keen to analyse. Around Europe, suppliers of electricity, gas and water services are starting to explore how 'big data' analytics can process the vast amounts of information which are either already available, or are shortly to come online from extensive smart-meter rollouts in progress or being planned.
Market analyst GTM Research predicts global utility company expenditure on data analytics will grow from $700m in 2012 to $3.8bn in 2020, with gas, electricity, and water suppliers in all regions of the world increasing their investment. GTM's David J Leeds describes the new emphasis on data analytics software which allows utilities to track, visualise, and predict usage as "a complete reinvention of the utility business", with companies expected to take on data management platforms based on Hadoop and other technologies, as well as massively parallel processing (MPP) big-data appliances.
To date, the most common use of analytics applications has extended little beyond traditional business intelligence and data warehouse tools used for marketing purposes, with dedicated hardware and big-data processing platforms beefing up processing capabilities to filter, analyse, and condense a wide range of larger data sets into meaningful insight much more quickly than was previously possible. However, use-cases are now extending beyond these staple areas.
The prevention of customer churn is a major issue for many utility companies, particularly in deregulated markets where more competitors offer discounted deals to potential customers – especially where those deals are time-limited. Armed with a bucketful of statistics around such things as usage and customer profiles, suppliers find it much easier to put together price packages and special deals optimised to suit individual residential or business consumption patterns. In addition, forays into the maze of information thrown up by social media sites deliver 'trend analysis' – usually defined as an insight into the extent of consumer dissatisfaction with current price levels and customer service.
"This is data that [utility companies] had access to in the past, but now there is just much more of it – the same data in greater volume, but also new data sets that they never had access to before," observes James McLelland, director of utility solutions at software-solutions provider SAP. "We have seen some interesting use-cases starting to come through, mostly smart-meter analytics and smart-grid operations."
All the information being analysed follows a standard pattern. Raw data is collected from the smart meter, on-grid sensor, industrial device, database, collecting point, or other asset, then transmitted over some form of telecommunications, radio-frequency identification (RFID) or wired/wireless network for ingestion onto a server or dedicated appliance whose sole job is to cleanse it. That cleansing was customarily performed in giant data warehouses, but has now been speeded up significantly using, for example, Apache Hadoop clusters and/or in-memory databases (those that store data in some form of RAM rather than on hard disk). Apache Hadoop is an open-source software framework for storage and large-scale processing of data sets on clusters of commodity hardware.
After being cleansed of 'dirty' data – duplicate or incomplete records – the remaining information is sent, sometimes via the data warehouse, to the analytics engine. This runs a series of algorithms – the so-called 'secret sauce' – designed to transform the data into meaningful insight of use to the business, which can then be presented within third-party reporting or visualisation software.
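The cleansing step described above can be sketched in a few lines. The record fields and the definition of "complete" below are illustrative assumptions, not details from any vendor's product:

```python
# Minimal sketch of the cleanse step: drop duplicate and incomplete meter
# records before they reach the analytics engine. Field names are assumed.

REQUIRED_FIELDS = ("meter_id", "timestamp", "reading_kwh")

def cleanse(records):
    """Remove duplicate and incomplete records, preserving input order."""
    seen = set()
    clean = []
    for rec in records:
        # Incomplete: any required field missing or None.
        if any(rec.get(f) is None for f in REQUIRED_FIELDS):
            continue
        # Duplicate: same meter already reported for the same timestamp.
        key = (rec["meter_id"], rec["timestamp"])
        if key in seen:
            continue
        seen.add(key)
        clean.append(rec)
    return clean

raw = [
    {"meter_id": "M1", "timestamp": "2013-06-01T00:00", "reading_kwh": 0.42},
    {"meter_id": "M1", "timestamp": "2013-06-01T00:00", "reading_kwh": 0.42},  # duplicate
    {"meter_id": "M2", "timestamp": "2013-06-01T00:00", "reading_kwh": None},  # incomplete
    {"meter_id": "M3", "timestamp": "2013-06-01T00:00", "reading_kwh": 1.10},
]

print(len(cleanse(raw)))  # 2 records survive
```

In production this logic would run distributed across a Hadoop cluster or inside an in-memory database rather than in a single process, but the filtering idea is the same.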
Smart meters expand data footprint
Accurate assessments of scale are hard to come by, if not impossible to make. SAP's McLelland believes smart meters will help to swell the gigabytes of data now being generated by utility infrastructure to thousands of terabytes in the future, with other estimates suggesting smart meters could generate around 1,000 petabytes of data a year globally once full rollouts are complete (GigaOM). The Centre for Sustainable Energy (CSE) is a UK coalition of public, private, and voluntary organisations formed to explore technologies and processes able to address rising energy costs and climate change.
In 2012, it undertook a project in partnership with Western Power Distribution, Scottish and Southern Energy and the University of Bristol to develop a prototype big-data platform, Smart Meter Analytics, Scaled by Hadoop (SMASH), designed to store, process, and retrieve data from datasets up to 20TB in size – a figure estimated to equate to a year's smart-meter data from 1.3 million households.
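Volume estimates like these can be sanity-checked with back-of-envelope arithmetic. The meter count, reading interval and per-record size below are illustrative assumptions, not figures from the projects mentioned:

```python
# Rough scaling of raw smart-meter data volume; all parameters are assumed.

def annual_volume_tb(meters, readings_per_day, bytes_per_record):
    """Raw smart-meter data generated per year, in terabytes."""
    return meters * readings_per_day * 365 * bytes_per_record / 1e12

# e.g. one million meters taking half-hourly (48/day) readings of ~100 bytes:
print(round(annual_volume_tb(1_000_000, 48, 100), 1))  # ~1.8 TB per year
```

Readings alone stay modest; it is the richer payloads (events, power-quality data, audit metadata) and national-scale meter counts that push totals into the petabyte range.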
Certainly, smart-meter deployments take readings from devices much more frequently than was previously feasible – the CSE estimates that more than half the electricity consumed in the UK is billed from meter readings taken only every six months, for example, despite electricity commodity prices being traded on a half-hourly basis. The CSE also noted that current meter readings deliver no information about the amount of electricity being used at different times of the day – data crucial to understanding and building more accurate customer profiles.
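The kind of time-of-day profiling that half-yearly readings cannot support, and half-hourly readings can, reduces to a simple aggregation. The figures below are invented for illustration:

```python
# Build an average time-of-day consumption profile from half-hourly readings.
from collections import defaultdict

def time_of_day_profile(readings):
    """Average consumption per half-hour slot (0-47) across all days."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for slot, kwh in readings:  # slot 0 = 00:00-00:30, slot 47 = 23:30-00:00
        totals[slot] += kwh
        counts[slot] += 1
    return {slot: totals[slot] / counts[slot] for slot in totals}

# Two days of (slot, kWh) samples: a morning-peak slot 15 (07:30-08:00)
# and an off-peak slot 5 (02:30-03:00).
readings = [(15, 0.9), (15, 1.1), (5, 0.2), (5, 0.2)]
profile = time_of_day_profile(readings)
print(profile[15], profile[5])  # peak averages 1.0 kWh, off-peak 0.2 kWh
```

A profile like this is what lets a supplier price half-hourly tariffs, or tell a customer that shifting the washing to the afternoon would be cheaper.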
"We are now expecting consumers of energy to take part in [utility company] decision-making," explains Mark Osborne, from the UK National Grid's electricity transmission future strategy team. "Historically, you just got a quarterly bill and maybe went to uSwitch.com to swap provider now and again, but now you can make a choice if you want to do your washing early in the morning or in the afternoon when it is cheaper."
E.ON Group, which operates in 30 countries around the world, has contracted Swedish telecommunications equipment vendor Ericsson to install and operate over 600,000 smart meters in Sweden between 2013 and 2018, described as "an important step towards" real-time big-data measurements intended to supply E.ON customers with accurate and fully up-to-date information on their energy consumption.
Though utility companies have tended to focus publicly on the benefits of smart meters for customers themselves – ostensibly as a means of justifying the cost of smart-meter upgrades – all are privately more concerned about using the insight that big-data analytics can provide to improve their own operations, particularly when it comes to forecasting demand in order to optimise supply, predicting likely outages, and identifying leaks and/or fraud. Many are now planning to deploy extensive machine-to-machine (M2M) networks which connect a wide range of industrial devices and sensors across their infrastructure, and which throw information from control centres, virtual power plants, and computerised logs into the mix.
"We are trying to get more information and data to make [the grid] more efficient, data about transformers, overhead lines, cables, substations, engineers; these are the bits of information we need to know," explains Osborne. "How hot is a transformer, how much charge is there on the overhead line, and suchlike – all of this information can be gathered and used to support decision-making."
Meanwhile, Netherlands-based electricity and gas distribution company Alliander, with 3.5 million customers, says it is one of the first in Europe to use SAP's High Performance Analytic Appliance (HANA), an in-memory, column-oriented, relational database management platform, to analyse far more data pulled from sensors and business applications, optimising asset and grid management, for example. The company previously ran load-forecasting once a year, with the processing taking up to ten weeks – a time now shortened to three days, which allows the data to be mined once a month.
Identify leaks and spot fraud
The Arad Group, a company specialising in water-measurement technology, recently signed a deal with IBM to incorporate the latter's analytics algorithms into Arad's City-Mind meter data management (MDM) and Dialog3G software, giving the application the means to process much larger and more diverse information sets pulled from smart meters and sensors with a greater degree of accuracy. Arad supplies MDM software to Southern Water and Welsh Water in the UK, though neither is using its IBM-enabled City-Mind platform as yet.
According to IBM, those algorithms are based on machine learning, data mining and statistical analysis techniques, which allow City-Mind to discern genuinely unusual water-consumption patterns – judged against historical and seasonal demand, and against other sites or properties in the same area – and so help identify leaks and fraud (as opposed to excessive usage).
"The meters are usually configured to take readings every 15-30 minutes, and send the information directly to the City-Mind database, which runs the algorithms, then the utility company decides whether it wants to pass on alerts to the users," says Arad Group vice president of international marketing Rami Ziv. "So we can help the utility company give warnings if that person is consuming more water than similar people in the neighbourhood with the same profile, and we can compare groups of people with the same profile too."
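Neither IBM nor Arad publishes the algorithms themselves, but the peer-comparison idea Ziv describes can be sketched with a simple statistical stand-in: flag any customer whose usage sits far outside that of neighbours sharing the same profile. All names and figures below are invented:

```python
# Peer-group outlier detection: a minimal stand-in for the comparison
# described in the article, using a z-score against same-profile neighbours.
import statistics

def flag_outliers(usage_by_customer, threshold=3.0):
    """Return customers whose usage is > `threshold` std devs above the peer mean."""
    values = list(usage_by_customer.values())
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [c for c, v in usage_by_customer.items()
            if stdev > 0 and (v - mean) / stdev > threshold]

# Daily water use (litres) for customers with the same profile; C7's figure
# is more consistent with a leak than with ordinary heavy usage.
peers = {"C1": 310, "C2": 295, "C3": 305, "C4": 290, "C5": 300, "C6": 298, "C7": 2400}
print(flag_outliers(peers, threshold=2.0))  # flags C7
```

A real system would also weigh historical and seasonal demand for the same property before raising an alert, as the article notes, rather than relying on a single snapshot.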
Matching demand against supply
The UK National Grid's infrastructure is 60 years old in places, and the company also needs to adapt to new regulatory requirements outlined in the Revenue = Incentives + Innovation + Outputs (RIIO) framework, which governs the revenues that the UK's 14 electricity Distribution Network Operators (DNOs) are allowed to collect during the eight-year period spanning 1 April 2015 to 31 March 2023. That means improving efficiency, not just through consolidation but by being more innovative when it comes to delivering performance-based metrics back to managers making everyday operational decisions. More efficient capacity management invariably involves being able to match supply more closely to demand, a particular problem in the electricity industry where the commodity in question is much harder to store than gas or water.
"Traditional capacity management and planning is changing and we need to be more responsive in predicting uncertainty," National Grid's Mark Osborne says. "The big challenge is predicting it: real-time is good, but if you can extract what you need to do in 10 or 12 days' time, that is the key. The important thing about electricity is that you cannot really store it – it has to be generated and then used, and there are always inefficiencies in trying to change it into something else."
Consumption of gas, electricity and water is keenly affected by the weather, and at the top of utility companies' wish lists are analytics platforms able to process and analyse meteorological information from a diverse set of sources. As part of its Smarter Planet initiative, IBM has developed a weather-modelling and power-grid management system, dubbed Hybrid Renewable Energy Forecasting (HyRef), in conjunction with Danish wind-turbine manufacturer Vestas Wind Systems, designed to optimise the supply of wind and solar power.
Running as a pilot with China's Zhangbei National Energy Storage and Transmission Demonstration Project, HyRef pulls in data from weather reports and sensors, tidal phases, satellite images, deforestation maps and weather-modelling research databases, with other IT companies developing or trialling similar systems. The Zhangbei project claims to be the world's first and (to date) only utility-scale hybrid renewable energy plant to integrate wind and solar PV generation with large-scale lithium-ion battery energy storage.
In hot climates, being able to predict whether a temporary cold front is coming in for the weekend and analyse what impact that is going to have on demand and supply for gas, electricity and water allows utility companies to be much more proactive in how they allocate supply or gear-up for additional storage or distribution; for example, rerouting unneeded energy in one region into others to save on generation costs.
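The weather-to-demand link behind this kind of forecasting can be illustrated with a toy model: fit a line to historical (temperature, demand) pairs, then feed in a forecast temperature. Real systems such as HyRef use far richer models; the figures here are invented:

```python
# Toy demand forecast from temperature via ordinary least squares.

def fit_line(xs, ys):
    """Fit y = a + b*x by ordinary least squares; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hot-climate pattern: demand (MW) rises with temperature (air conditioning).
temps = [25, 28, 30, 33, 35]
demand = [500, 560, 600, 660, 700]
a, b = fit_line(temps, demand)

forecast_temp = 22  # a cold front forecast for the weekend
print(round(a + b * forecast_temp))  # anticipated demand: 440 MW
```

With even this crude anticipation of a demand dip, a utility could plan to reroute or store the surplus rather than generate it and waste it.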
"Things like the weather have a big impact on energy demand – people act differently when the sun is out than when it is raining," says Osborne. "Also, as people start putting in more renewable energy equipment, there is a noticeable shift when cloud cover goes over them and that can create big changes in supply, particularly when you go from having surplus energy to when it just dips."
Slow rate of adoption
Despite the advances being made on the technology side, and sweeping predictions around equipment sales and information volumes, there is of course no absolute guarantee that big data and predictive analytics will find their way into utility company infrastructure any time soon. Difficulties in integrating existing systems, capital investment requirements and a lack of familiarity with the technology will all delay or hamper adoption, at least in the short term. However, there are some compelling factors that may accelerate the process, such as pressure from shareholders and customer groups insistent that utilities from all sectors should do as much as they can to derive value from data assets.
"Smart grids have been described as a solution looking for a problem, and we have to identify a business case," National Grid's Osborne continues. "Electricity equipment was installed long before these concepts ever appeared. The energy industry is slow to adopt this, because a lot of it is about legacy, and how to integrate with legacy."
SAP's James McLelland is inclined to agree with National Grid's reading of the situation. "We have had user groups where all those present nod vigorously [when we explain the technology capabilities]," he reports, "but when we ask what they are going to do with it, they are not so sure."