A couple of weeks ago, I took a look at the data
provided by the teams at PGE and Austin Energy, combined it with data provided by DOE, and I arrived at the conclusion that the Smart Grid will create a glut of information
that the utilities had best begin planning for, because it could easily swamp both the utilities themselves and the networks that are expected to carry it.
Unsurprisingly, there was a fair amount of interest in both the conclusions I had reached and in the substantiation of the data I had used. Some of the inquiries were pretty straightforward. My thanks to Editor Katie Fehrenbacher from Earth2Tech
for her thoughtful questioning and for introducing me to some equally reasonable experts from the IEEE.
Others were less open to the concept, and raised two main objections to the data. The first was rooted in existing utility practice: the expectation that a meter read would contain only basic information about the identity of the power meter, the timestamp, and the meter reading itself. Were that the case, each read would amount to a paltry 14 bytes or so, and such a small record would never add up to anything like the avalanche I had described in the piece. The second objection was that such data was unlikely to be stored for long, meaning, I guess, that we could design the system as though it had never arrived at all. Many of the questions came from individuals with long and strong histories in utilities, so I felt it my responsibility to validate my data once again.
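To make the 14-byte objection concrete, here is a sketch of what such a bare-bones read might look like on the wire. The field widths are my own illustrative assumptions, not an actual utility record format:

```python
import struct

# Hypothetical minimal meter read: field sizes are illustrative
# assumptions, not a real utility wire format.
# 6-byte meter ID + 4-byte Unix timestamp + 4-byte reading (kWh) = 14 bytes.
MINIMAL_READ = struct.Struct("<6sIf")  # '<' = little-endian, no padding

packed = MINIMAL_READ.pack(b"\x00\x1a\x2b\x3c\x4d\x5e", 1700000000, 12345.5)
print(MINIMAL_READ.size)  # 14 -- the "paltry" per-read figure
```

If a read really carried only these three fields, the skeptics would have a point; the conversation with Austin Energy below shows why it will not.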
While I consider myself to be relatively well-versed on the core of these topics, it is the nature of this blog to focus on my expectations of the future based on information provided elsewhere, by others more directly in the path of the Smart Grid. That said, credibility is a big deal for us, and I decided to go back to Austin Energy, and understand better the reality of the situation from the folks who are actually doing the job, and who are considering these concerns as fundamental parts of their planning for successfully serving their clients on the new grid in the years to come. Andy and I called Andres Carvallo
and Karl R. Rábago
at Austin Energy, and they generously agreed to help us understand the world and the Smart Grid that they are planning for.

Smarter Grid versus Simpler Meter-Reading
One of the first things I learned was the richness of information gathering and interactivity that these gentlemen expect to coax from the new grid infrastructure. While time, location, and power used are at the heart of a meter read, there is much more to be learned. Investment in the Smart Grid will only see its maximum return when the savings amount to more than a human reader's footwear and gasoline. Some examples are:
- Device Health Information
- By watching for varying temperature, time since last outage, battery level, heartbeat, and other meter variables, it is possible to better predict, and recover from, any failures that may happen.
- Real-Time Monitoring
- As has happened historically with most new technologies, people yearning for more data will only be satisfied by whatever is most current. Demand may not be universal at first, but history suggests that a real-time monitoring feed will be wanted almost immediately, as customers recognize that there is now more information through which they can better manage their energy.
- Energy Services Provision trumps Energy Provision Services
- There are doubtless going to be additional requirements from the newly informed and empowered customer base for functionality that is logically delivered by the provider. This was a real eye-opener for me: power providers are now actively thinking about services that they can offer over the new and smarter infrastructure. Think of profiled energy use: "I am going away, manage my power," or "There is a spike in prices, manage me down by 10%," or "I only want to use power that is generated from renewable resources." These all require data, new interfaces, and a channel over which all of the control and monitoring information can be passed. Winners in the new market will find ways to capitalize on the need for energy-related services, and will not limit their investment to further driving down the costs of simply providing energy.
- Networking Overhead
- Given the complexity, regularity, and importance of this data, it is clear that a protocol (like IP) will probably be adopted to package all of this information into a payload and send it to central systems for analysis, aggregation, storage, and action. Protocols carry their own overhead in describing their content, sources, destinations, and so on. None of this is free from the perspective of the systems carrying or storing the data.
- Other Factors
- We are only just beginning to see the potential for Smart Grid and Soft Grid enablers, leading me to believe that even my estimates are very likely to be low, particularly as we clamor for real-time monitoring and data analysis.
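The networking-overhead point is easy to quantify. Assuming a read is sent as a single IPv4/UDP datagram over Ethernet (my assumption for illustration; the header sizes are standard, and fragmentation of the larger payloads is ignored for simplicity), the fixed headers matter a great deal for tiny payloads and very little for rich ones:

```python
# Illustrative protocol-overhead estimate for a meter read sent over
# Ethernet/IPv4/UDP. Header sizes are standard; payload sizes are the
# article's scenarios. Fragmentation is ignored for simplicity.
ETHERNET, IPV4, UDP = 14, 20, 8      # header bytes per packet
HEADERS = ETHERNET + IPV4 + UDP      # 42 bytes before any payload

for payload in (14, 4096, 16384):    # bare read vs. richer reads
    total = payload + HEADERS
    pct = 100 * HEADERS / total
    print(f"{payload:>6}-byte payload -> {pct:.1f}% header overhead")
```

A 14-byte read spends three-quarters of its bytes on headers; a 16K read spends well under one percent. Either way, the carrying systems pay for every byte.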
Based on all of this, it looks like the numbers are far from a simple 14 Byte read
, and are more likely in the range given by Andres of 4K to 16K
per reading. If we take the maximum case, the numbers are even higher than I had referenced in the earlier article. Let's not think about real-time (the numbers are mind-numbing), but instead look at a simple check every 5 minutes. 12 (reads/hr) × 24 (hrs/day) × 365 (days/yr) × 16K (bytes/read) yields roughly 1.7GB/meter/year. Multiply that by the number of meters (pick your own scope), and I think the challenge is clear. For more reality, take that number and multiply by 5 for readings every minute, or by 300 for readings every second. That's big.
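The arithmetic above can be checked in a few lines. The 16K-per-read figure is the upper end of the range Andres gave; the read intervals are the article's scenarios:

```python
# Back-of-envelope data volume per meter, mirroring the article's arithmetic.
BYTES_PER_READ = 16 * 1024  # 16K per read, the upper end of Andres's range

def gb_per_meter_year(reads_per_hour):
    """Annual volume for one meter, in GB (10^9 bytes)."""
    reads_per_year = reads_per_hour * 24 * 365
    return reads_per_year * BYTES_PER_READ / 1e9

print(f"every 5 min: {gb_per_meter_year(12):.2f} GB")    # ~1.7 GB
print(f"every 1 min: {gb_per_meter_year(60):.2f} GB")    # 5x the 5-minute case
print(f"every 1 sec: {gb_per_meter_year(3600):.0f} GB")  # 300x the 5-minute case
```

The multipliers fall out directly: 60 reads/hr is 5× the 12 reads/hr baseline, and 3600 reads/hr is 300×.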
So, is this a problem because the data is going to cause the Smart Grid to explode like a flawed radiator hose in July? I don't think so. Time has proven that technical advancement has always helped us stay ahead of crushing data or processing burdens by decreasing computing and memory costs, allowing us to paper over our excesses with iron and silicon.
No, this is a problem because rushed, tactical, and incremental hardware adds will not make that data secure. It has to be expected that as organizations run out of room for data, they will simply rush to add more. Caught in a flood of data, the pressures for survival
and successful operation will naturally trump
any meaningful consideration of rearchitecting data storage for adequate and appropriate security
. This planning (and budgeting) needs to happen now
. As Andres said on our call, "You cannot simply build an airplane for passengers who are 5'6" tall and weigh 140, because you can guess that your average passenger, much less your larger passengers, will simply not fit, because they are not that small." In other words, you need to plan for what you can reasonably expect, not for what will make your life, your business, or your CFO, ecstatic.
I think that this is the final insight. For firms that are seeing the Smart Grid as an enabler for cost-savings by transferring operations onto an IP infrastructure, or a wireless metering system, there is little reason to be concerned with a data glut.
For those who recognize that the Smart Grid and the coming Soft Grid will need data, and will need security, and will likely grow to fill whatever space is available, the call is clear. Plan for an avalanche, for a flood. Create systems and segregations that will allow for managing these flows reliably. Characterize what must come through, and what can be dropped, along the way to the back end. Do all of those things and the current systems will be fine, the next systems will not choke, and the ultimate end state will be similar enough to what has been planned to ensure stability, quality, and cost-effective services to all who connect to the grid.
The data surge is coming, and you can either surf it, or be pounded by it. You certainly will not be able to ignore it.