Sunday, February 28, 2010

It's Time to Get Serious about Smart Grid Data Volumes

Ever since October 2009, when Jack ran the numbers and wrote about the dramatic increase in data utilities would soon experience, followed a couple of weeks later by this post responding to some of the initial disbelief, our sensors have been picking up a spike in discussions and debates about Smart Grid data volumes. (One great source is Jesse Berst's webcast "High Performance in Data Management," featuring experts from Accenture, Oracle, and Xcel's Smart Grid City.)

Jack's first post, a wake-up call culled from real data, met with some understandable incredulity. But the second post, titled "That Smart Grid Data Surge We Mentioned Earlier? You Can’t Ignore It", drew more positive and very constructive feedback, like this:
Nicely done, Jack. This is a hard issue to get across, especially the need to set up the data collection with the simple question "What do I want to know?" I think you're correct about the coming data surge - but it won't only come from the customer side. The ability to collect information and automate some decision-making from generation through distribution will also involve enormous data flows. What will make the grid smarter is to integrate the data on many scales; to provide analyses for optimizing all kinds of operations along the value chain; and to enable better planning and resourcing. And how to do all this securely - and with many different legacy systems in the mix - is a huge challenge.
It's gratifying when readers catch the conceptual ball you've thrown and run with it. Even more gratifying when kernels of your thoughts are amplified by industry leaders themselves. Such is the case with the recent article by Austin Energy CIO Andres Carvallo. While "New Epiphany: Smart Grids Require Real-Time all-IP Networks" concerns itself with immense Smart Grid data volumes, its emphasis extends beyond mere storage to collection, transport, analysis and architecture:
If we were collecting real-time data from the 500,000 devices on our network, we would be generating about 40 petabytes of data per year from 100 terabytes today. ...the amount of data that we would need to collect and keep might be close to 10 petabytes annually vs. 40 petabytes. That amount of real-time collection, analysis, and decision making can only be achieved with a real-time all-IP network.
and concludes with some very timely advice from the field:
If you are in the middle of deployment, you will need to find an upgrade strategy sooner rather than later. If you have decided on but not yet deployed your Smart Grid / AMI choice, you still have time to switch to the right technology and partners. If you have already made your decision and deployment, your partner(s) need to give you an upgrade path at a very reasonable price.
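For readers who like to sanity-check such figures, here is a rough back-of-envelope sketch of what Carvallo's numbers imply if taken at face value. The device count and annual volumes come from the quote above; the assumptions of decimal units (1 PB = 10^15 bytes), continuous year-round collection, and an even spread across devices are ours, purely for illustration.

```python
# Illustrative back-of-envelope arithmetic on the figures quoted above.
# Assumptions (ours, not the article's): decimal units, continuous collection,
# and data spread evenly across all devices.

DEVICES = 500_000
SECONDS_PER_YEAR = 365 * 24 * 3600


def yearly_volume_to_rates(petabytes_per_year: float) -> dict:
    """Convert an annual data volume into aggregate and per-device rates."""
    total_bytes = petabytes_per_year * 1e15
    aggregate_bps = total_bytes * 8 / SECONDS_PER_YEAR   # bits/second, whole network
    per_device_bps = aggregate_bps / DEVICES             # bits/second, one device
    return {
        "aggregate_gbit_per_s": aggregate_bps / 1e9,
        "per_device_kbit_per_s": per_device_bps / 1e3,
        "per_device_gb_per_year": total_bytes / DEVICES / 1e9,
    }


if __name__ == "__main__":
    scenarios = [("collect everything (40 PB/yr)", 40),
                 ("keep the useful subset (10 PB/yr)", 10)]
    for label, pb in scenarios:
        r = yearly_volume_to_rates(pb)
        print(f"{label}: ~{r['aggregate_gbit_per_s']:.1f} Gbit/s across the network, "
              f"~{r['per_device_kbit_per_s']:.1f} kbit/s per device, "
              f"~{r['per_device_gb_per_year']:.0f} GB per device per year")
```

Even spread evenly across the year, 40 petabytes works out to a sustained stream on the order of 10 Gbit/s across the network, which helps explain why Carvallo frames this as a real-time, all-IP networking problem rather than just a storage one.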
You heard it here first: the writing was on the wall about the Smart Grid data surge. Now it's front and center, and it's time to start planning accordingly.
