While most industries will feel the impact of Big Data analytics in the next few years, electric utilities are already in the throes of the Big Data revolution, writes Wikibon Big Data Analyst Jeff Kelly in his latest Professional Alert, “Big Data in the Utilities Industry”.
The key technology in this case is the “smart meter”, a.k.a. advanced metering infrastructure (AMI), which sends energy usage readings wirelessly from each meter back to the utility every 15 minutes. Instead of the 12 readings a year of manually read meters, this generates 96 readings a day, or 35,040 readings per meter per year. Multiplied by millions of customers, particularly for utilities serving large urban areas where each apartment in each high-rise building is a separate meter, that becomes Big Data.
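The reading-volume arithmetic can be checked in a few lines. The interval and per-year figures come from the article; the 2-million-meter utility is a hypothetical example to show the scale, not a number from the alert:

```python
# Back-of-the-envelope check of the smart-meter reading volumes.

READINGS_PER_DAY = 24 * 60 // 15            # one reading every 15 minutes -> 96
readings_per_meter_per_year = READINGS_PER_DAY * 365
print(readings_per_meter_per_year)          # 35040, matching the article's figure

# For a hypothetical utility with 2 million meters:
meters = 2_000_000
print(readings_per_meter_per_year * meters)  # roughly 70 billion readings a year
```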
And, Kelly says, that data can allow analysis never possible with manual systems. One basic example is a comparative chart showing individual customers their energy use for every 15-minute interval throughout the 24-hour day vs. overall demand levels. It can allow utilities to charge customers on a sliding scale according to overall demand, making it more expensive for them to run their dishwasher or clothes dryer, for instance, during peak hours than during low-demand times. This can encourage demand shifts that result in more efficiency across the electric grid, decreasing cost and environmental impact.
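A sliding-scale (time-of-use) tariff like the one described can be sketched in a few lines. The rates, the peak window, and the appliance usage below are illustrative assumptions, not figures from the alert:

```python
# Hedged sketch of time-of-use billing: the same kWh costs more
# during peak-demand hours. All rates and hours are assumed values.

PEAK_HOURS = range(16, 21)   # assume 4 p.m.-9 p.m. is the peak-demand window
PEAK_RATE = 0.30             # assumed $/kWh during peak hours
OFF_PEAK_RATE = 0.10         # assumed $/kWh off-peak

def interval_cost(hour, kwh):
    """Cost of one interval's usage at the time-of-use rate for that hour."""
    rate = PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE
    return kwh * rate

# Running an assumed 1.2 kWh dishwasher cycle at 6 p.m. vs. 11 p.m.:
print(f"peak:     ${interval_cost(18, 1.2):.2f}")   # three times the off-peak cost
print(f"off-peak: ${interval_cost(23, 1.2):.2f}")
```

The point of the sketch is the incentive: because the rate depends only on when the energy is drawn, the customer can cut the bill for the same cycle simply by delaying it past the peak window.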
To an extent, the same may be true for large users such as data centers. CIOs can see their data center’s electric demand through the day and potentially shift some load generators onto second- or third-shift time slots, taking advantage of lower-cost, off-peak rates.
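The potential saving from that kind of shift is simple to work out. The job size and tariff below are assumptions for illustration, not figures from the alert:

```python
# Illustrative load-shifting arithmetic for a data center batch workload.
# Job energy and both rates are assumed example values.

JOB_KWH = 500          # assumed energy consumed by one batch run
PEAK_RATE = 0.30       # assumed $/kWh during first-shift (peak) hours
OFF_PEAK_RATE = 0.10   # assumed $/kWh during third-shift (off-peak) hours

peak_cost = JOB_KWH * PEAK_RATE
off_peak_cost = JOB_KWH * OFF_PEAK_RATE
print(f"peak: ${peak_cost:.2f}, off-peak: ${off_peak_cost:.2f}, "
      f"saving per run: ${peak_cost - off_peak_cost:.2f}")
```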
This data also can potentially support better business decisions by providers, particularly in terms of investment in the right mix of energy sources as generation shifts from primarily coal, water power, and nuclear to a mix of natural gas, wind, and solar. However, as AMI grows from an estimated 16% of customer meters in the United States today to an expected 50% penetration in 2016 and near-100% by 2020, the first challenge is simply to capture the data and support business analysis of it.
Kelly recommends that utilities first identify the business use cases where AMI data analytics can provide the most value. Second, they should evaluate related business processes to gauge the level of adjustment necessary to implement data-driven decision-making. Finally, they should work with both internal IT and outside providers to evaluate the ability of current data management and IT infrastructure to support Big Data analytics processes.
Latest posts by Bert Latamore