Big data and advanced analytics for energy management

Reviewing the risks and benefits of having access to accurate, real-time data to enable effective and efficient management of our energy grids

The energy world as we know it is changing and changing rapidly.

According to an Electric Power Research Institute report, there may be as much change in the next eight years as there has been in the last 25. This change is being driven by decarbonization of the economy and technology advancements, including big data and advanced analytics that trigger the development of new business models.

A gradual decarbonization of the power sector is being achieved by increasing the share of low-carbon energy sources, driven initially by government incentives and now by falling technology costs. This transition involves shutting down the large, flexible, carbon-intensive power plants that were connected to the transmission system and replacing them with smaller-scale, inflexible renewable generation. Much of this new generation doesn’t sit on the transmission system.

In the UK, National Grid is responsible for ensuring that the system remains in balance (within +/-1% of 50 Hz). It has a wealth of experience in predicting demand, and historically has been able to match this demand easily with large-scale flexible supply, calling on generation to balance the system – sometimes literally by telephone.
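To make the band concrete: 1% of 50 Hz is 0.5 Hz, so the system must stay between 49.5 Hz and 50.5 Hz. The snippet below is a minimal illustration of that check, not National Grid's actual tooling.

```python
# A minimal illustration (not National Grid tooling): 1% of 50 Hz is
# 0.5 Hz, so the grid must stay between 49.5 Hz and 50.5 Hz.
NOMINAL_HZ = 50.0
TOLERANCE = 0.01  # +/-1% of nominal

def in_band(frequency_hz: float) -> bool:
    """Return True if a measured frequency sits inside the +/-1% band."""
    return abs(frequency_hz - NOMINAL_HZ) <= NOMINAL_HZ * TOLERANCE

assert in_band(50.2)       # inside 49.5-50.5 Hz
assert not in_band(49.4)   # outside the band: balancing action needed
```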

Today, National Grid faces the more difficult challenge of understanding how much power will be available at any point in time. Availability depends on how hard the wind blows and how much the sun shines at a given location. Much of this generation is embedded at the edge of the grid, where National Grid has no direct visibility of it.

In order to balance the system, National Grid and others need to be able to call on distributed energy resources (DERs) to provide flexible generation or demand.

The move to distributed energy

DERs are smaller power sources that can be aggregated to provide the power necessary to meet regular demand, as well as deliver services, such as frequency response. As the electricity system continues to modernize, the share of DERs, such as storage and advanced renewable technologies, keeps increasing.

In addition, evolving information and communication technologies make it possible to unlock extra capacity from previously latent and passive resources, such as power load flexibility (i.e. demand side response), small-scale fossil fuel and biomass/biogas generators, and combined heat-and-power plants. 

The opportunities for DERs to create technical and commercial value are becoming widespread and increasingly sophisticated. The complexity and versatility of DER operations now go far beyond basic aggregation, demand response and load-shedding; DERs can perform many functions critical to the power grid’s reliability and can tap into multiple pools of value. For example, multiple companies have been aggregating, managing and placing DERs in Britain’s ancillary services markets, managed by National Grid. These services span short-term operating reserve and static, dynamic and enhanced frequency response.
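As a simple illustration of what basic aggregation involves, the sketch below pools a handful of hypothetical assets until their combined capacity covers a service requirement; the asset names and figures are invented for the example.

```python
# Hypothetical sketch of basic DER aggregation: pooling many small assets
# until the portfolio can cover a frequency-response requirement.
# Asset names and capacities are illustrative, not real market data.
ders_mw = {"battery_a": 2.0, "chp_b": 1.5, "biogas_c": 0.8, "flex_load_d": 1.2}

def aggregate_for_service(assets: dict, required_mw: float) -> list:
    """Greedily commit assets (largest first) until the requirement is met."""
    committed, total = [], 0.0
    for name, capacity in sorted(assets.items(), key=lambda kv: -kv[1]):
        if total >= required_mw:
            break
        committed.append(name)
        total += capacity
    if total < required_mw:
        raise ValueError("portfolio too small for this service")
    return committed

print(aggregate_for_service(ders_mw, required_mw=4.0))
# -> ['battery_a', 'chp_b', 'flex_load_d']
```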

The speed, accuracy and reliability with which a DER can respond to the grid’s frequency variations are approaching those of large-scale conventional power plants. In addition to providing the aforementioned services to the centralized power system, DERs can also create value in medium- and low-voltage distribution networks. This value comes in the form of avoided or deferred investment in wires and transformers in areas where DERs can serve increasing loads.

Cheaper, greener, more reliable electricity and fairer charging

In 2015, the UK government stated that spending of up to £34 billion would be required between 2014 and 2020 to meet current and future challenges for the electricity system.

This spending would upgrade ageing infrastructure and enable it to cope with the changing requirements placed on the energy system. Proposed upgrades include: new, lower-carbon electricity generation connecting in new locations; support for increased use of flexible generation to balance the system; greater interconnection with other countries; and smarter consumers taking more control and an active role in the generation and purchase of electricity.

At present, the electricity system has a lot of redundancy built in to allow for peaks and troughs, safeguarding the system from blackouts and power surges. More efficient use of the existing infrastructure could mean that some of the investment proposed by the government could be avoided or reduced.

Rather than expanding the local network and associated infrastructure to accommodate increased amounts of renewable generation, we could, for example, use big data and advanced analytics, along with connected distributed energy assets, to balance the system locally and allow for peaks and troughs in renewable output.

Technology developments are enabling the digitization of the electricity system. The use of big data and advanced analytics means that systems will become more connected and operators will have a better understanding of both supply and demand in real time. This improved understanding will present opportunities to increase efficiency and reduce investment requirements.

Less investment in infrastructure should mean a lower cost of energy supply for all, and will remove key barriers to the development of new renewable generation in the electricity system.

Big data, technology and analytics

In parallel with developments in the energy industry, there is a boom in the technology industry, driven by cheaper, higher-performance data storage and processing. This is enabling the energy system to become increasingly digitized and is encouraging a proliferation of data.

The roll-out of smart metering at the grid connection, as well as intelligent metering at the asset level (making dumb assets smart), combined with the growth of new generation assets, significantly increases the amount of information that is captured, stored and processed.

This provides an increasingly valuable pool of data available to decision makers. Such data informs decisions about how to change the supply and demand of electricity at a given point in time to keep the system balanced. This data enables more efficient use of the assets connected to the system, and avoids the need to build out new, expensive generation to meet peaks in demand.

Aggregation, optimization and dispatch of hundreds, thousands and, eventually, tens of thousands of DERs presents significant opportunities, as well as an array of challenges, in terms of data and analytics. These can generally be grouped into three categories: big data management; forecasting using machine learning; and optimization of DER fleet operations.

Big data

Big data management is all about scalability and enabling better decision making based on the data. At Origami Energy, we set our data management platform the objective of being able to “scale out” to a practically unlimited number of DERs and of providing the security required by blue-chip organisations that are fundamental to the UK’s critical infrastructure.

We connect our customers’ distributed energy assets (generation, demand and battery storage) to our technology platform using our Energy Router. The Energy Router is a bespoke design, specified to meet current and future requirements (we expect it to remain functional through the next 20 years of software upgrades). It carries out a number of functions, including collecting, storing and transmitting data and controlling the DER. It can continuously stream sub-second metrology data to our centralized Technology Platform, which processes and stores it in our Big Data system. This data is then used offline or in near-real time to solve various operational optimization problems and to derive new insights for medium- and long-term strategy.
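As an illustration of the streaming ingest step, the sketch below shows how a sub-second metrology sample might be published to Kafka from the asset side. The topic name, broker address and message fields are assumptions made for the example, not our actual interfaces.

```python
# A minimal sketch of streaming sub-second metrology readings into Kafka.
# Topic name, broker address and message schema are illustrative
# assumptions, not real interfaces.
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_reading(asset_id: str, frequency_hz: float, power_kw: float) -> None:
    """Publish one metrology sample; in practice this runs at sub-second cadence."""
    reading = {
        "asset_id": asset_id,
        "ts": time.time(),
        "frequency_hz": frequency_hz,
        "power_kw": power_kw,
    }
    producer.send("der.metrology", value=reading)

publish_reading("battery_a", 49.98, 250.0)
producer.flush()  # ensure buffered messages are delivered before exiting
```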

Our Technology Platform is built using market-leading technologies, including Cassandra, Spark and Kafka, to manage the large quantities of data returned from distributed energy assets. We use a blockchain approach to maintain the integrity of audit data.
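To show the idea behind a blockchain-style audit trail, the sketch below chains each audit record to the hash of its predecessor, so that any retrospective edit breaks the chain. It is illustrative only, not our production design.

```python
# Illustrative hash chain: each record commits to its predecessor's hash,
# so tampering with any earlier record invalidates everything after it.
# This is the core idea, not a production audit system.
import hashlib
import json

def append_record(chain: list, payload: dict) -> None:
    """Append a payload, linking it to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    chain.append({"payload": payload, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain: list) -> bool:
    """Recompute every hash; any edited record breaks the chain."""
    prev_hash = "genesis"
    for rec in chain:
        body = json.dumps({"payload": rec["payload"], "prev": prev_hash},
                          sort_keys=True)
        if rec["prev"] != prev_hash or \
           rec["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = rec["hash"]
    return True

audit = []
append_record(audit, {"event": "dispatch", "asset": "battery_a", "mw": 2.0})
append_record(audit, {"event": "settle", "asset": "battery_a"})
assert verify(audit)
```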

We use Cassandra because it provides scalability and high availability without compromising performance, with linear scalability and proven fault tolerance. Some of the largest production deployments include Apple’s (over 75,000 nodes storing more than 10 PB of data) and Netflix’s (2,500 nodes, 420 TB, over 1 trillion requests per day). We use Cassandra to provide distributed storage of our data.
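As a hedged sketch of how sub-second readings might be laid out in Cassandra, the example below partitions by asset and day (keeping each partition bounded) and clusters by timestamp. The keyspace, table and column names are assumptions for illustration.

```python
# A hedged sketch of a possible Cassandra layout for sub-second readings:
# partition by asset and day so partitions stay bounded, cluster by time.
# Keyspace, table and column names are illustrative assumptions, and the
# keyspace is assumed to exist already.
from datetime import date, datetime

from cassandra.cluster import Cluster  # pip install cassandra-driver

cluster = Cluster(["cassandra-node1"])
session = cluster.connect("metrology")

session.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        asset_id     text,
        day          date,
        ts           timestamp,
        frequency_hz double,
        power_kw     double,
        PRIMARY KEY ((asset_id, day), ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

# Prepared statements are parsed once and reused for every sample.
insert = session.prepare(
    "INSERT INTO readings (asset_id, day, ts, frequency_hz, power_kw) "
    "VALUES (?, ?, ?, ?, ?)"
)
session.execute(insert, ("battery_a", date.today(), datetime.utcnow(), 49.98, 250.0))
```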

Kafka is probably one of the most powerful and popular distributed streaming platforms in use globally. It is used by some of the world’s biggest companies: all of the top ten travel companies, and the majority of the top ten banks, insurance companies and telecoms companies – and many more, including LinkedIn, Microsoft and Netflix.

Kafka is designed to allow applications to process records as they occur, and can be used to stream data into data lakes, applications and real-time stream analytics systems. Streaming is an important shift in the way that data is used and transported. Historically, data was generated by computers and transported in batches – at one time literally by post! But the data itself isn’t created in batches – it’s created as a stream and usually put into some sort of storage. Today, many of us are used to streaming content such as video from the internet, so we already have data production and consumption through streaming. It makes sense to move the step in the middle to a streaming process as well, to speed up data transfer, enable real-time decision making and reduce problems with data storage.
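The sketch below illustrates this shift: a consumer processes each record the moment it arrives, rather than waiting for a batch. It assumes the same hypothetical metrology topic as the earlier producer example.

```python
# A minimal sketch of processing records as they arrive rather than in
# batches. Topic name, broker address and message schema are the same
# hypothetical ones as in the producer example above.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "der.metrology",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:  # blocks, yielding each record as it is produced
    reading = message.value
    if abs(reading["frequency_hz"] - 50.0) > 0.5:
        print(f"{reading['asset_id']}: frequency excursion, consider dispatch")
```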

Spark is another key part of our technology infrastructure, used for key data management activities such as data interpretation, cleaning and aggregation. Spark also helps us to develop and implement complex algorithms, where we reuse the same data within one algorithm as well as in repeated database-style querying. Spark is used by leading global organisations such as Netflix, Yahoo and eBay.
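As an example of this kind of cleaning and aggregation, the PySpark sketch below drops implausible samples and rolls each asset’s power up to half-hourly averages; the data path and column names are assumptions for illustration.

```python
# A hedged PySpark sketch of the cleaning-and-aggregation step: drop
# implausible frequency samples, then aggregate each asset's power to
# half-hourly means. The storage path and column names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("metrology-cleaning").getOrCreate()

df = spark.read.parquet("s3://bucket/metrology/")  # hypothetical location

# Remove samples with physically implausible grid frequency.
cleaned = df.filter((F.col("frequency_hz") > 45) & (F.col("frequency_hz") < 55))

# Aggregate each asset's power to half-hourly averages.
half_hourly = (
    cleaned
    .groupBy("asset_id", F.window("ts", "30 minutes"))
    .agg(F.avg("power_kw").alias("avg_power_kw"))
)
half_hourly.show()
```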

Prudent integration and management of the above – and other leading technologies – allows us to effectively pursue the goal of scaling out to an increasing number of DER assets.

Forecasting using machine learning and advanced analytics

At Origami Energy, we use advanced analytics and machine learning to forecast asset availability and behaviour, to design bidding parameters for various ancillary services markets, and to forecast balancing mechanism market prices. As a result, we have been very successful in bidding into the high-value frequency response markets offered by National Grid.

For machine learning, we combine our internal data with external sources such as historical power prices, weather forecasts and the power system maintenance schedules published by the system operator. Our machine learning models are constantly developing, becoming richer and deeper as time passes and data accumulates.
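The sketch below illustrates the general approach on synthetic data: a regression model trained on combined features (invented stand-ins for hour of day, weather and price) to predict available capacity. The feature set and model choice are illustrative assumptions, not our production models.

```python
# A hedged sketch of the forecasting approach on synthetic data: train a
# gradient-boosted regressor on combined features (invented stand-ins for
# hour of day, wind forecast, day-ahead price and recent load) to predict
# an asset's available flexible capacity. Illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = rng.random((n, 4))  # columns: hour, wind forecast, price, recent load
y = 3.0 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0.0, 0.1, n)  # synthetic MW

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.2f}")
```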

Our approaches to big data, advanced analytics and machine learning, alongside extreme uptime performance, enable us to give energy suppliers, energy traders and other electricity system participants the opportunity to offer new products and services to their customers, delivering improved returns for owners of distributed energy resources. We provide our partners with better decision-making capability and the ability to automate decisions about when and how to use an asset, leading to more accurate, more reliable and quicker responses to value opportunities.

In addition, we have built the UK’s only dedicated micro-grid emulation and testing laboratory enabling rigorous testing of new applications, algorithms and asset classes such as batteries and electric vehicles, prior to real-world deployment with customers across the country.

Optimization

As far as commercial and operational optimization is concerned, we rely on the vast body of knowledge and practice from the field of operations research and are also developing our own proprietary frameworks and algorithms for deterministic and stochastic optimization.

We aggregate and optimize our clients’ distributed assets both strategically and operationally. At the strategic level, we optimize decisions about which assets to bid into which submarkets, and with which parameters. These submarkets range from imbalance price chasing and day-ahead trading to monthly frequency response tenders and annual capacity markets. We also model and simulate expected prices in the different submarkets and quantify the associated risks and opportunities.

Operational optimization takes place once assets have been committed to specific submarkets and the objective is to meet the relevant commitments at least cost. These operational optimization problems are usually deterministic and amenable to mixed integer-linear programming. We articulate the optimization frameworks and formulate efficient mathematical solutions in-house, then run our optimization models on powerful, commercially available solver engines. Finally, we integrate all the different components into a single platform.
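As a minimal illustration of such a formulation, the sketch below dispatches a small hypothetical portfolio to meet a committed volume at least cost, using the open-source PuLP library as a stand-in for a commercial solver; all figures are invented.

```python
# A minimal sketch of the operational problem as a mixed integer-linear
# program: meet a committed delivery at least cost. Each hypothetical
# asset has a marginal cost, a capacity and an on/off decision, and must
# run at >= 25% of capacity if switched on (which is what makes the
# binary variables meaningful). All figures are invented.
import pulp  # pip install pulp

assets = {  # name: (marginal cost in £/MWh, capacity in MW)
    "battery_a": (40.0, 2.0),
    "chp_b": (55.0, 1.5),
    "biogas_c": (70.0, 0.8),
}
commitment_mw = 3.0

prob = pulp.LpProblem("least_cost_dispatch", pulp.LpMinimize)
on = {a: pulp.LpVariable(f"on_{a}", cat="Binary") for a in assets}
mw = {a: pulp.LpVariable(f"mw_{a}", lowBound=0) for a in assets}

prob += pulp.lpSum(cost * mw[a] for a, (cost, _) in assets.items())  # objective
prob += pulp.lpSum(mw.values()) >= commitment_mw  # meet the commitment
for a, (_, cap) in assets.items():
    prob += mw[a] <= cap * on[a]          # deliver only if switched on
    prob += mw[a] >= 0.25 * cap * on[a]   # minimum stable output when on

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for a in assets:
    print(a, pulp.value(mw[a]), "MW")  # e.g. battery_a 2.0, chp_b 1.0
```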

In a world where big data is managed securely and effectively, and where forecasts are performing well (and learning), owners and managers of DERs will be able to make better decisions about where value can be derived. Ultimately, it may well be computers that propose value strategies and execute them, with minimal intervention from human managers.

