Big Data Analytics Could Reduce Power Grid Outages

The power grid is one of those things most of us take for granted, but it’s time to acknowledge that it is vulnerable to outages caused by aging infrastructure, the variability of distributed renewable generation, and attacks. Short power interruptions (five minutes or less) cost the U.S. $60 billion annually; in Canada, momentary outages (one minute or less) cost $8 billion a year, while sustained outages cost $4 billion.

To help avoid such outages, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) announced the award of nearly $7 million to explore how big data, artificial intelligence and machine learning technologies can derive more value from the vast amounts of sensor data already being gathered to monitor the health of the grid and support system operations. A Texas A&M University team led by Dr. Mladen Kezunovic, director of the Texas A&M Engineering Experiment Station’s Smart Grid Center, received a $1 million NETL grant to use Big Data Analytics (BDA) to automate the monitoring of synchrophasor recordings.

The DOE projects are expected to inform and shape the future development and application of faster grid analytics and modeling, better grid asset management and sub-second automatic control actions that will help system operators avoid grid outages, improve operations and reduce costs.

Kezunovic, Regents Professor and the Eugene E. Webb Professor in the Department of Electrical and Computer Engineering, will lead the project “Big Data Synchrophasor Monitoring and Analytics for Resiliency Tracking (BDSMART).”

The project will use BDA to automate the monitoring of synchrophasor recordings, improving the assessment of events that may affect power system resilience. The proposed BDA will automatically extract knowledge supporting event analysis, classification and prediction, applied at different stages of grid resilience assessment: operations, operations planning and planning.
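For illustration only, the sketch below shows one way automated synchrophasor monitoring can flag candidate disturbances: a rate-of-change-of-frequency (ROCOF) check over PMU frequency measurements marks likely events and applies a crude rule-based label that later classification stages could refine. The reporting rate, thresholds, labeling rule and synthetic data are all assumptions for demonstration, not details of the BDSMART project.

```python
# Illustrative sketch (not the BDSMART implementation): flag candidate events in a
# synchrophasor (PMU) frequency trace using a rate-of-change check, then attach a
# simple rule-based label. All constants and data below are assumptions.

import numpy as np

SAMPLE_RATE_HZ = 30        # assumed PMU reporting rate
ROCOF_LIMIT_HZ_S = 0.1     # assumed rate-of-change-of-frequency alarm threshold

def detect_events(freq, rate_hz=SAMPLE_RATE_HZ):
    """Return candidate disturbance events found in a PMU frequency trace."""
    rocof = np.diff(freq) * rate_hz                      # Hz per second between samples
    flagged = np.where(np.abs(rocof) > ROCOF_LIMIT_HZ_S)[0]
    events = []
    for idx in flagged:
        events.append({
            "sample": int(idx),
            "time_s": float(idx / rate_hz),
            "rocof_hz_per_s": float(rocof[idx]),
            # crude labeling rule (assumption): falling frequency suggests loss of
            # generation, rising frequency suggests loss of load
            "label": "generation_loss" if rocof[idx] < 0 else "load_loss",
        })
    return events

if __name__ == "__main__":
    # synthetic 10-second trace around 60 Hz nominal, with a frequency sag at t = 5 s
    t = np.arange(0, 10, 1 / SAMPLE_RATE_HZ)
    freq = 60.0 + 0.0005 * np.random.randn(t.size)
    freq[150:] -= 0.2 * (1 - np.exp(-(t[150:] - t[150]) / 0.5))
    for event in detect_events(freq)[:3]:
        print(event)
```

In a production setting the hand-written threshold and label rule would be replaced by trained models operating on many PMU channels at once; the point here is only the pipeline shape: ingest recordings, detect events, label them for downstream analysis and prediction.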

This article originally appeared on today.tamu.edu. To read the full article, click here.

Nastel Technologies uses machine learning to detect anomalies, behavior and sentiment; accelerate decisions; satisfy customers; and innovate continuously. To answer business-centric questions and provide actionable guidance for decision-makers, Nastel’s AutoPilot® for Analytics fuses:

  • Advanced predictive anomaly detection, Bayesian classification and other machine learning algorithms (a generic sketch follows this list)
  • Raw information handling and analytics speed
  • End-to-end business transaction tracking that spans technologies, tiers, and organizations
  • Intuitive, easy-to-use data visualizations and dashboards
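As a generic illustration of the Bayesian classification mentioned above, and not the AutoPilot product or its API, the short sketch below trains a Gaussian naive Bayes model on assumed transaction features and reports the posterior probability that a new observation is anomalous.

```python
# Generic illustration of Bayesian classification for anomaly labeling, using
# scikit-learn's GaussianNB. The features, labels and synthetic data are assumptions
# chosen only to demonstrate the technique.

import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Assumed features per transaction: [latency_ms, error_rate, queue_depth]
normal = np.column_stack([
    rng.normal(50, 10, 500),       # typical latency
    rng.normal(0.01, 0.005, 500),  # low error rate
    rng.normal(100, 20, 500),      # modest queue depth
])
anomalous = np.column_stack([
    rng.normal(300, 80, 50),       # slow transactions
    rng.normal(0.2, 0.05, 50),     # elevated error rate
    rng.normal(900, 150, 50),      # queue backlog
])

X = np.vstack([normal, anomalous])
y = np.array([0] * len(normal) + [1] * len(anomalous))  # 0 = normal, 1 = anomaly

model = GaussianNB().fit(X, y)

# Score a new observation and report the posterior probability of "anomaly"
new_obs = np.array([[280.0, 0.15, 800.0]])
print("predicted class:", int(model.predict(new_obs)[0]))
print("P(anomaly):", float(model.predict_proba(new_obs)[0, 1]))
```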

If you would like to learn more, click here.