Utilities Improve Grid Reliability with Hadoop

Big Data in the Utility

The utilities industry is historically slow to adopt new technologies, and big data is no exception. However, a handful of utilities are now looking to an enterprise data hub built on open-source Apache Hadoop as a powerful, scalable, and secure platform to derive and deliver greater value from the exponential increase in multi-structured and multi-source data.

As a highly regulated service provider, the utility’s primary obligation is uninterrupted delivery of service without price increases driven by unforeseen demand. The first half of this scenario is tied to grid reliability, which is benchmarked against the system average interruption frequency index (SAIFI) and the system average interruption duration index (SAIDI). With sufficient insight into data from a combination of advanced metering infrastructure (AMI) systems that measure consumption (e.g., smart meters, smart grid), outage management systems that identify and track both momentary and sustained downtime, and external systems tracking weather, roadways, public works, and other influencing factors, utilities can maintain more uptime and respond to changes more quickly.
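Both indices are simple ratios over a reporting period: per IEEE Std 1366, SAIFI is total customer interruptions divided by total customers served, and SAIDI is total customer interruption minutes divided by total customers served. As a minimal sketch, the calculation might look like the following; the event records and field names are illustrative, not taken from any particular outage management system:

```python
# Illustrative sketch of SAIFI/SAIDI from outage event records.
# Field names ("customers_affected", "minutes") are hypothetical.

def saifi(events, total_customers):
    """SAIFI: total customer interruptions / total customers served."""
    return sum(e["customers_affected"] for e in events) / total_customers

def saidi(events, total_customers):
    """SAIDI: total customer interruption minutes / total customers served."""
    return sum(e["customers_affected"] * e["minutes"] for e in events) / total_customers

# Two sustained outages in a reporting period for a 10,000-customer feeder
events = [
    {"customers_affected": 500, "minutes": 90},
    {"customers_affected": 1200, "minutes": 45},
]
total = 10_000
print(saifi(events, total))  # 0.17 interruptions per customer
print(saidi(events, total))  # 9.9 interruption minutes per customer
```

Lower values on both indices mean a more reliable grid, which is why the improvements reported below are quoted as percentage reductions in these measures.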

Better Uptime, More Satisfied Customers

With an enterprise data hub, service providers are able to converge complete streaming data from multiple sources in full fidelity. Where purpose-built systems historically learned of outages only after customer complaints, analytics and visualization built on Hadoop tools provide a real-time view of outages and their causes, so the utility can fully comprehend the extent of a reliability issue and respond accordingly, often before consumers even notice the impact. Not only does this result in better service and higher levels of customer satisfaction, but it also drives efficient secondary resource allocation: reducing maintenance costs, preventing unnecessary truck rolls, and lightening the load on customer service and support.

According to a November 2014 U.S. Department of Energy report, utilities that combined AMI and outage management system data and ran analytics on smart meter streams were able to effectively upgrade the most vulnerable feeders and substations and prioritize the customers whose outage costs were highest. Among the report's case studies, the Electric Power Board in Chattanooga, Tennessee, reported a 40% improvement in SAIDI and a 45% improvement in SAIFI between 2011 and 2014. PECO Energy in Philadelphia, Pennsylvania, reported avoiding 6,000 truck rolls and resolving issues up to three days faster than the historical average during Superstorm Sandy in October 2012. Florida Power and Light Company in Juno Beach, Florida, reported reducing substation transformer customer minutes interrupted by half a million in 2014.[1]

Future-Proof the Utility with Hadoop

As utilities adopt big data strategies and become more proactive about value creation and capture, the most insightful among them are implementing an enterprise data hub built on Apache Hadoop. An enterprise data hub helps utilities transform IT from a cost center to a profit center: it centralizes data of all formats, structures, ages, and origins; brings together multiple users across lines of business for the first time; and provides a single, scalable platform for value-added workloads beyond storage and preparation, including processing, analytics, visualization, and data science.


[1] U.S. Department of Energy, "Smart Grid Investments Improve Grid Reliability, Resilience, and Storm Responses," SmartGrid.gov, November 2014.

The post Utilities Improve Grid Reliability with Hadoop appeared first on Cloudera VISION.
