Utilities stand on the brink of a data revolution. Smart meters, utility infrastructure sensors, weather data, and social media data all offer tremendous insight into and new possibilities for infrastructure operations. Artificial Intelligence (AI) and machine learning help utilities to harvest the potential of this data revolution.
At the same time, aging infrastructure, regulatory changes, and the growing share of distributed energy resources (DER) in the grid are pressuring utilities to become ever more cost-efficient. Big data, AI, and machine learning are three technologies that have already proven to make utilities more cost-efficient. This article will give you a short introduction to how and why this is crucial to future utility infrastructure operations.
Also read our guide to efficient power grid operations for the digital age.
Big Data Revolution and Smart Grids
Over 90 percent of the world’s data was generated during the last two years. As the number of IoT devices increases, the amount of generated data will only grow. According to Gartner, the number of IoT devices will grow from 8.4 billion in 2017 to 20.4 billion globally by 2020. By the same year, Forbes predicts, 1.7 megabytes of new information will be created every second for every person worldwide.
The IoT-enabled, data-driven era we live in will have an impact on utilities. The massive roll-out of smart meters is one example. With automated meter readings on an hourly or sub-hourly basis, these devices generate far more data than utilities have previously been required to manage. Combined with a growing number of IoT devices in electric grids and other utility infrastructure, the volume of data is enormous.
As a result, legacy IT and data analysis systems are incapable of coping with the enormous amounts of data flowing from an increasingly digitalized infrastructure, and they are unable to put that data to use in operations. This is where AI and machine learning come in.
In short, AI and machine learning are techniques that help us intelligently manage large data volumes. Using data-learning algorithms, machine learning enables computers to find hidden patterns and generate new insights without being explicitly programmed to search a particular place in the data. Let me give you a few examples of how this technology will help the utility industry.
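To make “finding hidden patterns without explicit programming” concrete, here is a minimal sketch using synthetic smart-meter data. All numbers and the two customer groups are invented for illustration; a simple k-means clustering recovers the groups from the load profiles alone, without ever being told they exist.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical smart-meter data: hourly load profiles (kWh) for two hidden
# customer groups, one peaking around noon and one peaking in the evening.
hours = np.arange(24)
daytime = 2 + np.exp(-((hours - 12) ** 2) / 18) + rng.normal(0, 0.05, (50, 24))
evening = 2 + np.exp(-((hours - 19) ** 2) / 18) + rng.normal(0, 0.05, (50, 24))
profiles = np.vstack([daytime, evening])

def kmeans(data, k=2, iters=20):
    """Minimal k-means: groups profiles by shape alone, without being told
    which customer belongs to which group (deterministic init for clarity)."""
    centers = data[np.linspace(0, len(data) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Assign each profile to its nearest center, then re-estimate centers.
        dists = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([data[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(profiles)
# labels now splits the 100 customers into the two hidden consumption groups.
```

The algorithm is handed nothing but raw profiles, yet the structure in the data is enough for it to separate the two usage patterns.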
Intelligent Drones, Predictive Maintenance, and Outage Management
AI and machine learning played a significant role in grid repair and electricity recovery after Hurricane Irma ravaged large parts of Florida and left over four million households without power. Intelligent drones, equipped with sensors and intelligent software, flew along the damaged electric grid to identify network faults, helping local authorities restore critical infrastructure to its normal, functioning state.
With the help of AI, these specially equipped drones can fly along the grid, take dozens of images every second, and analyze those images in real time. Machine learning then helps these systems learn from the analyzed data, gradually improving the drones’ ability to identify faults in the grid on their own.
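The “learn from analyzed data to improve” step can be sketched in miniature. The feature vectors and fault/no-fault setup below are entirely hypothetical stand-ins for real image features; the point is only to show online learning, where a simple classifier improves one labelled frame at a time, the way a field-deployed fault detector can keep getting better.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each drone frame is reduced to a small feature vector
# (think edge density, texture statistics). Frames of damaged spans differ
# systematically from intact ones in these synthetic features.
intact = rng.normal(0.0, 1.0, (200, 5))
damaged = rng.normal(2.0, 1.0, (200, 5))
X = np.vstack([intact, damaged])
y = np.concatenate([np.zeros(200), np.ones(200)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Online logistic regression: one stochastic gradient step per labelled
# frame, so the model's accuracy grows as more inspection data arrives.
w, b, lr = np.zeros(5), 0.0, 0.1
for i in rng.permutation(len(X)):
    p = sigmoid(X[i] @ w + b)      # current fault probability for frame i
    w -= lr * (p - y[i]) * X[i]    # nudge weights toward the true label
    b -= lr * (p - y[i])

preds = (sigmoid(X @ w + b) > 0.5).astype(float)
accuracy = (preds == y).mean()
```

Real drone systems use deep neural networks on raw imagery rather than hand-made features, but the feedback loop is the same: every labelled frame makes the next prediction a little better.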
Undoubtedly, intelligent drones will become an essential part of future utility infrastructure operations. On the one hand, they have the potential to reduce outage time by 50 percent. On the other, their analytic capabilities provide grid operators with valuable information that reduces the time spent on the costly inspection methods commonly used today.
Load Forecasting and Reduced CAPEX
With more than 750,000 EVs sold worldwide last year, compared to nearly 550,000 in 2015, the number of EVs is growing quickly. Furthermore, new electrical loads that draw significant power over short time intervals have become widespread. Additionally, more renewable energy is being fed into the electric grid, making it far more difficult to keep the grid in balance.
Utilities have traditionally relied on historical data to predict loads through generic models and systems. Usually, this involves operating with smaller data sets, analyzing them to test a hypothesis, and using the resulting insight to create a load forecasting model for the coming day, week, or month.
Much in the same way the financial sector uses AI and machine learning to refine its mathematical models for predicting market shifts, these techniques will give utilities new, improved load forecasting methods.
AI and machine learning allow utilities to more precisely predict consumption levels during a particular period in a given area. Combining data from the households under a substation with weather data, historical data, and other relevant data can give utilities far greater insight into the capacity levels needed to avoid outages and into whether or not infrastructure reinforcement is necessary. As a result, utilities can significantly reduce their capital expenditures.
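A toy version of such a data-driven forecast can be written in a few lines. The substation, the temperature–load relationship, and all coefficients below are invented for illustration; a plain least-squares fit stands in for the far richer models the article describes, but the principle of combining weather and calendar features to predict load is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data for one substation: daily peak load (MW)
# driven by outdoor temperature (heating demand) plus a weekday effect.
temp = rng.uniform(-10, 25, 365)                 # daily mean temperature, °C
weekday = (np.arange(365) % 7 < 5).astype(float)  # 1 on weekdays
load = 40 - 0.8 * temp + 5 * weekday + rng.normal(0, 1, 365)

# Ordinary least squares on the features [1, temperature, weekday].
A = np.column_stack([np.ones(365), temp, weekday])
coef, *_ = np.linalg.lstsq(A, load, rcond=None)

def forecast(temp_c, is_weekday):
    """Predicted peak load (MW) for a given temperature and day type."""
    return coef @ np.array([1.0, temp_c, float(is_weekday)])

# A cold weekday is the stress case: this figure tells the planner how much
# headroom the substation needs before reinforcement becomes necessary.
peak_cold_weekday = forecast(-10.0, True)
```

In practice the feature set would include historical load, DER output, EV charging patterns, and high-resolution weather forecasts, and the linear model would give way to gradient boosting or neural networks, but the planning question answered is the same.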
Detect Non-Technical Losses
Although I have mainly focused on electric utilities in the above examples, AI and machine learning are of equal importance to utilities in other industries. One example is water utilities. Water leakage is a significant problem in the water industry, with 25-30 percent of a utility’s water being lost in the network. Smart meter data, AI, and machine learning can give water utilities the opportunity to identify these leakages and other non-technical losses, as you can read more about in my colleague Davide Roverso’s blog article on the subject.
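One simple, widely used screening technique here is minimum night flow (MNF) analysis: between roughly 02:00 and 04:00 legitimate demand is lowest, so a raised minimum flow in a district suggests leakage. The flow figures below are invented for illustration; the sketch flags any night whose minimum flow sits well above the district’s normal baseline.

```python
# Minimum night flow (MNF) screening on district meter data.
# All flow values (m3/h) are illustrative, not from the article.

baseline_nights = [4.1, 3.9, 4.0, 4.2, 3.8, 4.0, 4.1]  # a normal week

def night_flow_baseline(flows):
    """Mean and standard deviation of the district's normal night flow."""
    mean = sum(flows) / len(flows)
    var = sum((f - mean) ** 2 for f in flows) / len(flows)
    return mean, var ** 0.5

def is_leak_suspect(night_flow, mean, std, k=3.0):
    """Flag a night whose minimum flow sits more than k standard
    deviations above the district's usual minimum night flow."""
    return night_flow > mean + k * std

mean, std = night_flow_baseline(baseline_nights)
suspect = is_leak_suspect(6.5, mean, std)  # e.g. the night after a pipe burst
normal = is_leak_suspect(4.0, mean, std)   # an ordinary night
```

A machine learning system generalizes this idea: instead of one hand-set threshold per district, it learns normal behaviour per meter and per season from the smart meter stream, catching both slow leaks and non-technical losses such as meter tampering.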
The utility of the future is digital. Data has already become a valuable asset in its own right, and AI and machine learning will become essential tools for reducing both operating costs and capital expenditures. As the utility environment undergoes massive changes and efficiency requirements rise to a whole new level, these technologies are indeed a welcome addition to utility infrastructure operations.