Sunday, February 28, 2010

It's Time to Get Serious about Smart Grid Data Volumes

Ever since October 2009 when Jack ran the numbers and wrote about the dramatic increase in data utilities would soon experience, and then this post a couple of weeks later responding to some of the initial disbelief, our sensors began picking up a spike in discussions and debates about Smart Grid data volumes. (One great source is Jesse Berst's webcast "High Performance in Data Management" featuring experts from Accenture, Oracle and Xcel's Smart Grid City).

Jack's first post, a wake-up call culled from real data, faced some understandable incredulity. But the second post, titled "That Smart Grid Data Surge We Mentioned Earlier? You Can’t Ignore It", drew more positive and very constructive feedback, like this:
Nicely done, Jack. This is a hard issue to get across, especially the need to set up the data collection with the simple question "What do I want to know?" I think you're correct about the coming data surge - but it won't only come from the customer side. The ability to collect information and automate some decision-making from generation through distribution will also involve enormous data flows. What will make the grid smarter is to integrate the data on many scales; to provide analyses for optimizing all kinds of operations along the value chain; and to enable better planning and resourcing. And how to do all this securely - and with many different legacy systems in the mix - is a huge challenge.
It's gratifying when readers catch the conceptual ball you've thrown and run with it. Even more gratifying when kernels of your thoughts are amplified by industry leaders themselves. Such is the case with the recent article by Austin Energy CIO Andres Carvallo. While "New Epiphany: Smart Grids Require Real-Time all-IP Networks" concerns itself with immense Smart Grid data volumes, its emphasis extends beyond mere storage to collection, transport, analysis and architecture:
If we were collecting real-time data from the 500,000 devices on our network, we would be generating about 40 petabytes of data per year from 100 terabytes today. ...the amount of data that we would need to collect and keep might be close to 10 petabytes annually vs. 40 petabytes. That amount of real-time collection, analysis, and decision making can only be achieved with a real-time all-IP network.
and concludes with some very timely advice from the field:
If you are in the middle of deployment, you will need to find an upgrade strategy sooner rather than later. If you decided but not yet deployed your Smart Grid / AMI choice, you still have time to switch to the right technology and partners. If you have already made your decision and deployment, your partner(s) needs to give you an upgrade path at a very reasonable price.
You heard it here first. The Smart Grid data surge writing was on the wall. Now it's front and center, and it's time to start planning accordingly.
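Carvallo's figures are easy to sanity-check. Here is a back-of-the-envelope sketch in Python, assuming only the numbers quoted above (500,000 devices, roughly 40 petabytes per year, decimal units); everything else is plain arithmetic:

```python
# Rough per-device data rates implied by the article's figures:
# 500,000 devices producing ~40 PB/year under full real-time collection.

DEVICES = 500_000
PETABYTE = 10**15                      # bytes, decimal convention
SECONDS_PER_YEAR = 365 * 24 * 3600

annual_bytes = 40 * PETABYTE
per_device_per_year = annual_bytes / DEVICES                 # bytes per device per year
per_device_per_sec = per_device_per_year / SECONDS_PER_YEAR  # bytes per device per second

print(f"{per_device_per_year / 10**9:.0f} GB per device per year")
print(f"{per_device_per_sec / 1000:.1f} KB per device per second")
# -> 80 GB per device per year
# -> 2.5 KB per device per second
```

Two and a half kilobytes per second doesn't sound like much, until you multiply it across half a million always-on endpoints whose output all has to be collected, transported, stored and analyzed.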

Monday, February 22, 2010

An Informed Public and an Informed Grid

"Secrecy is the enemy of efficiency, but don't let anyone know it."

Privacy advocates, forward-thinking utility CIOs and all manner of security folk are getting increasingly charged up over the influx of consumer information required to improve the efficiency and flexibility of the grid. Because there has been so much public scrutiny in cases of accidental or malicious revelation of private data in other industries, it's understandable that people are wary about adding yet another place where their privacy can be invaded.

In the case of banking, retail, and health care, the integration of private information was intended to provide personalized access to information, to trinkets, and to better medical care. This included very sensitive personal details about our bodies and behaviors. And the loss of it is always jarring, particularly when we are required to suffer the consequences of credit monitoring, ID theft, or the knowledge that our illnesses or treatments might become known to complete strangers. It has not been a pleasant road. All of these public exposures have left us feeling that our privacy is no longer truly our own, and we have yet to feel that an industry has taken adequate precautions to protect us.

Unfortunately, the Smart Grid requires even more information to make any sense at all. Without usage and identification information, the new grid cannot interact with us meaningfully. It cannot help us to understand and change our consumption behaviors, and it cannot treat us uniquely in our use or production of power. What's more disconcerting is that this consumption information is as intimately woven with every part of our lives as is our use of power, whether we are talking about our cars, our televisions, our homes, or our laundry. So what can be done differently, this time? Here are a few ideas for you.

Focus on Action, not just Awareness
The Smart Grid is already happening all around us. Historically, emphasis on security has been on creating an informed public, capable of making informed decisions about whether or not to share their records (HIPAA), to visit a website, or to use a bank's online systems. Because the Smart Grid's evolution is driven by information, and because that evolution is underway as we speak, informing the public is necessary, but it is not nearly enough. A good example of disclosure with little recourse can be found in privacy statements everywhere. Here is an example from an actual energy company website. I have redacted the name of the company in question:
Remote Monitoring Information Collected Automatically
The monitoring service itself includes an automated, Internet-based process of receiving transmissions from the XXXXXXX XXXX monitoring equipment about your solar equipment, its output, efficiency, and other variables. This information is recorded and preserved by XXXXXXX XXXX on our company computer storage facilities, and may be accessed by you, if you subscribe to our remote monitoring service, and by us whether or not you subscribe to that Service. The XXXXXXXXXXX Management Unit ("XMU"), once connected to the Internet, immediately begins reporting this information to XXXXXXX XXXX and will continue to do so as long as the XMU is connected to the Internet. By having your XXXXXXX XXXX XMU connected to the Internet, you consent to this automatic information reporting. We retain this information indefinitely, and we may use it for any purpose, in our sole discretion, including but not limited to quality assurance, engineering performance comparisons, and product improvements. If you purchase our remote monitoring service, you may also choose to provide others with access to this information, including the installation company which installed and/or which services your solar energy equipment.
This is not a bad privacy policy, nor is it inappropriate. It tells a story that will be repeated over and over again in the new world of the Smart Grid. Unlike traditional website privacy statements, however, the absolute requirement for customer acquiescence to these conditions removes any real ownership of the decision from the client, and places an enormous responsibility on the providers themselves. By requiring this information, they are committing to do what they must to protect it.

Be Reasonable
While both sides of the privacy debate advance very strong arguments for or against the sharing of data, there is clearly a middle ground to be reached. Rebecca Herold has written a good description of the potential damages resulting from over-exposure of private data. While each of us can consume and understand these issues as raised, they will be most productively considered as scenarios to prevent, rather than as reasons to avoid the sharing itself. As well, each needs to be tempered with the likelihood and potential impact of occurrence in preparing a plan to prevent it.

Similarly, the Smart Grid does not need to know everything, all the time, and does not need to share everything with everyone involved. While consumers may accept the need to share more in order to achieve the benefits described, there are many shades of grey when it comes to how much of that information needs to be stored, tagged, transmitted, or aggregated. Nowhere is this clearer than in the NISTIR 7628 discussion of information sharing. Take a look at this diagram:

As shown in this figure, there are all kinds of systems, with all kinds of data, and all kinds of likely connections. What's needed is a new data-sharing paradigm, much like "least privilege": call it "least sharing".
  • No data element should be shared, at all, unless necessary to a specific function
  • No data element should be tagged with identifying information, unless necessary to a particular function
  • No data element should be stored without a compelling reason; otherwise, it should be destroyed
  • If a data element is stored, the security of that storage should be appropriate to the data's characteristics, and not to some perception of likelihood of attack or compromise
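To make the "least sharing" rules above concrete, here is a minimal illustrative sketch. The element names, functions and policy table are all hypothetical (there is no real schema behind them); the point is only the shape of the first two rules: share nothing a function doesn't need, and strip identity tags wherever they aren't required.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DataElement:
    name: str
    value: object
    identifying: bool     # tagged with customer-identifying information?

# Hypothetical per-function policy: which elements each function needs,
# and whether it needs them tied to a customer identity.
POLICY = {
    "billing":       {"meter_id": True, "kwh_interval": True},
    "load_forecast": {"kwh_interval": False},   # aggregate use only
}

def share(elements, function):
    """Release only what the named function needs (rule 1),
    stripping identity where it isn't required (rule 2)."""
    needed = POLICY.get(function, {})
    released = []
    for el in elements:
        if el.name not in needed:
            continue                                  # rule 1: not shared at all
        if el.identifying and not needed[el.name]:
            el = replace(el, identifying=False)       # rule 2: de-identify
        released.append(el)
    return released
```

Under this sketch, a load-forecasting function receives interval readings stripped of identity and never sees the meter ID at all; the storage and retention rules (the third and fourth bullets) would live in a separate retention layer, not shown here.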
Thinking Smaller to Make Protection Bigger
Because the Smart Grid and its requirements for information are changing so quickly, it would be foolish to think that data privacy can be completely figured out in the next 12 to 24 months. Individual states have varying regulations around ownership of customer data. The final set of information to be gathered or shared has not yet been described, and all of the systems that will be permitted to touch it are far from being designed or even adequately described. As such, draw no conclusions about which data elements can be automatically combined and sent or stored together. The easiest mistake to make in these early days will be to insufficiently separate the data elements. When the security characteristics of individual components are better understood and described, it becomes much easier to tailor and measure the security necessary to protect each element and its particular security needs.

Is it so different?
These privacy challenges are not so different from those that could have been envisioned in other industries, but which were overlooked. On this blog, we often write about taking the opportunity to learn from past IT security mistakes in order to improve the future IT world of the Smart Grid, and there are definitely lessons to learn here about planning, design, and resolution of security concerns early in the cycle.

In the past, when customer profiles or patient records have been treated monolithically, the breach of any accessing system has been enough to expose all. It is not simple to segregate the data, and to assess security policy for all elements. If it is done upfront with consistency, the benefits will definitely outweigh the costs, particularly as these systems and their exposure necessarily become at once more pervasive and more critical in our lives.

Thursday, February 18, 2010

Cyber Shockwave Post Mortem

When the grid goes down, almost everything goes down. Lessons learned, there are plenty. But first, the Bipartisan Policy Center (BPC)'s own summary of the game:
Cyber ShockWave highlighted the immediate, real dangers of cyber-terrorism by bringing together a bipartisan group of former senior administration and national security officials playing the roles of Cabinet members. The simulation envisioned an attack that unfolds over a single day in July 2011. When the Cabinet convenes to face this crisis, 20 million of the nation's smart phones have already stopped working. The attack, the result of a malware program that had been planted in phones months earlier through a popular "March Madness" basketball bracket application, disrupts mobile service for millions. The attack escalates, shutting down an electronic energy trading platform and crippling the power grid on the Eastern seaboard.
By all accounts I've read, it was chaos from start to finish. An overwhelming trio of info problems faced the surrogate executive decision makers: 1) a lack of quality information, 2) a lack of confidence in the information being received and communicated, and ultimately, 3) information overload ... all of which led to paralysis.

The echoes of 9/11 and in particular, the control room confusion depicted in the fantastic film version of "Flight 93", are quite strong. If you can't tell who's attacking you or how or why, how can you decide upon the right courses of action in near-real time? The compulsion to action is great in these situations, but absent the most fundamental situational awareness, almost all actions are futile or worse. And by the time you do begin to understand what's going on, it's far too late for meaningful defense. At best, offense and well-informed reprisal are for another day.

Dark Reading's take, which finds the US response wanting, is here. Dark Reading's CSI blog touches on Shockwave as well. Written by Computer Security Institute (CSI) director Robert Richardson, it makes some points that are definitely worth a look. The first addresses the profound lack of crucial domain knowledge in the crisis room:
The unspoken, unquestioned common assumption on the panel seemed to be that policy about technological infrastructure and the security of that technological infrastructure could be readily decoupled from knowledge of the technology itself. Obviously, policy can't get mired in details. But, on the other hand, digital infrastructure is shaped by how it is implemented and managed--and policy responds to that shaping. So my take is that even at the highest levels, somebody in the room should probably know what he or she is talking about when it comes to, say, how viruses propagate. The Secretary of Defense, somewhere back in time, went through boot camp. Who in the room knows the basics on how packets are routed? Right now, nobody.
While there's little cyber security practitioners can do to address some of the initial Shockwave concerns, Richardson finds two gaps we could begin to help close:
... how we improve attribution of attacks to their perpetrators and the question of how easily subverted software is kept off the networks are two areas that the security community can potentially address.
The first is a cyber forensics master challenge. As to the latter, we're not going to keep software off networks (networks exist to move software and data), but I suggest we can make software much more difficult to subvert, and we should make that a top priority.

And of course, cyber attacks on US and global assets never stop; they only escalate in strength and complexity. Here's the latest, reported by the Wall Street Journal.

What's next? CNN will air the event exclusively as "We Were Warned: Cyber Shockwave" on Saturday, February 20 and Sunday, February 21 at 8:00pm, 11:00pm and 2:00am ET each night.

Wednesday, February 17, 2010

Mainstreaming the Smart Grid

Loved seeing a USA Today front page article this morning on early consumer experiences with the Smart Grid. To me, press like this is an important indicator of the education and mainstreaming process. The piece describes some money saving success stories and some setbacks too (as Jack did earlier here), but overall serves to demystify the Smart Grid.

The article drew over a hundred comments as of tonight, indicating big interest but also continuing big ignorance and paranoia about why the Smart Grid is being built, e.g.:

  • "I would rather spend money on solar panels on my roof"
  • "Surely you realize that if everyone en masse were to save 15%, the power company will need a rate hike to cover that?"
  • "Smart Meters - so smart the utilities can program them remotely to, show increased consumption?"
And there's always this not completely irrational response to consider and address: "Anything that takes control away from the consumer is a threat." 

Sitting back on our skis isn't going to get us where we need to go. As we've said previously (and others have chimed in similarly), before it gets on board, the public's got to get a big dose of openness and confidence from the industry and government. Now would be a great time for all parties to turn up the volume on where we are, where we're going ... and maybe most importantly, why we're on this trip to begin with.

Monday, February 15, 2010

Exercise Notice: Cyber ShockWave will Hit USA on Tuesday

In the continuing and expanding trend of war games looking at cyber threats, this one tomorrow will simulate a major attack on US critical national infrastructure. Here's an excerpt from the press release:
Washington, D.C. - The Bipartisan Policy Center (BPC) announces it will host Cyber ShockWave, a simulated cyber attack on the United States on Tuesday, February 16, 2010. Cyber ShockWave will provide an unprecedented look at how the government would develop a real-time response to a large-scale cyber crisis affecting much of the nation. The event will take place at the Mandarin Oriental Hotel in Washington, D.C. 
The Cyber ShockWave simulation, created by former CIA Director General Michael Hayden and the BPC’s National Security Preparedness Group, led by the co-chairs of the 9/11 Commission, Governor Thomas Kean and Congressman Lee Hamilton, follows the acclaimed series of Oil ShockWave simulations conducted in 2007 by the BPC and Securing America’s Future Energy (SAFE). Oil ShockWave addressed dependence on foreign oil as a national security threat.
Complete press release is here. We'll keep you posted on findings and lessons learned from this exercise as these are made public.

Friday, February 12, 2010

Seeing NERC CIP through a Software Lens

Thinking about the future grid, AMI and Smart Grid systems can get so complicated that they are difficult to conceptualize unless you use a construct that limits the scope of what's being considered. Given that so much of the Smart Grid “smarts” involves new applications and other advances in software, an important way to think about NERC CIP and your organization is to focus on your software assets.

10 Seconds of NERC Critical Infrastructure Protection (CIP)
In 1998, Presidential Decision Directive 63 (PDD-63) introduced the concept of protecting critical national infrastructure across different sectors, from private companies to emergency responders and the DOD. PDD-63 referenced computers and cyber systems a number of times but, as a presidential directive, it was not specific about component requirements; rather, it focused on expected end states and the organizations and initiatives that would make them possible.

In the early part of the last decade there emerged the IntelliGrid, the Modern Grid, and ultimately, in 2006, the Smart Grid. After much deliberation, and the recognition that cyber threats to the grid would loom increasingly large as we moved toward an increasingly networked, info-centric system, NERC’s CIP standards were born. Many of those threats were leveled at, or enabled by, software. The systems that would provide access, control operations, and record all of the activity were moving to software, and were moving onto networks via even more software.

As we enter 2010, utilities’ compliance deadlines for NERC's CIP standards are looming and for some, more stringent deadlines requiring them to be "auditably compliant" are arriving soon. They are required to have a plan for achieving compliance, and by now, utilities must be well along the path towards achieving and maintaining compliance with that plan. What does that mean? As NERC CSO Michael Assante puts it:
“The CIP standards are accompanied by a phased-in implementation plan, designed to give asset owners and operators enough time to become compliant with the standards before they become enforceable. ‘Compliant’ means that the entities are required to comply with the standards and “self-certify” their compliance. ‘Auditably compliant’ means that regular, scheduled audits of compliance with the standards will be conducted.”
The 9 CIP Standards
For your convenience, all of the standards are linked below:
We note that software apps and tools play a role in the day-to-day management of the above domains, and software and software controls themselves are critically assessed in CIPs 2, 3, 5 and 7-9. History has shown that software plus critical infrastructure begets regulation (see PCI for the payment card industry, HIPAA for healthcare, DITSCAP/DIACAP for the DOD, etc.). In preparation, utilities must plan for an uncomfortable amount of new attention to be paid to the ways in which they monitor, manage and demonstrate their compliance. In many cases this will mean certifying the security of their new and existing software, likely via even more software. This is not trivial, and a virtual industry has already sprung up around achieving CIP compliance.

NERC and NIST on Cyber Security
The focus of the NERC CIP has always been easy to see from its own name: it has always attempted to steer utilities toward decisions that would enhance reliability. NIST's current work on Smart Grid cyber security standards is different. As NERC’s own comments to the first NISTIR draft on cyber security called out:
“The CIP Reliability Standards apply to installed equipment and require security controls be applied to manage risk in the operation and maintenance of cyber assets. However, the protection goals of the Smart Grid, on the other hand, are broader, and address component security, integrity of communications, privacy and other cyber security considerations.”
So there’s plenty to consider regarding the acquisition, use and protection of software assets in a NERC CIP context. It’s a little ironic, but we note that many of the controls NERC and NIST are recommending to better secure critical cyber assets are themselves made out of software, and by definition, are susceptible to being manipulated or circumvented by determined assailants.

Focus on Critical Infrastructure Leads to Focus on Software
The Smart Grid is evolving and so are the CIP standards. We’ll be doing a CIP deep dive, one standard at a time, in subsequent posts. In the meantime, where critical and less-than-critical software systems are involved, it’s probably best to imagine what your organization will do if and when those systems are attacked and breached. That’s the nature of the cyber attack and cyber defense world these days. Best to have a Plan B warming up in the bullpen, and Plans C, D & E loosening up as well. Stay tuned.

Monday, February 8, 2010

NERC Insights on NIST's Direction

In a piece today at Smart Planet, John Dodge wrote about the new version of Smart Grid cybersecurity guidance from NIST, and pointed back to an earlier piece I had written here on the first draft of NISTIR 7628, where I referred to that tome as "dense, but readable". As I continue to review the most recent release, out this month, which lives here, I am still impressed by Annabelle Lee and the NIST-led team's ability to synthesize so much information into a digestible document, but I will admit that there is quite a bit to get through. The sheer printed shelf weight of requirement detail has grown roughly 29% (from 236 to 305 pages). Not that I would print it out, but you get the point.

I'm not sure how others will approach the effort to understand the origin and evolution of the new version of requirements, but I thought that one way was to take a look at the comments that were submitted to NIST in response to the initial draft. I figure that the type and urgency of concerns with Draft 1 that find either resolution or rebuttal will give a rough sense for the industry's comfort with the process.

Much to see
NIST provides an open community and process for developing these recommendations, and part of that openness includes the contents and disposition of comments received. You can take a look at them too (and I recommend it) here. It was in reading through these comments, and the responses to them, that it struck me how far we have yet to go if we are to deliver a new grid that is flexible, resilient, and informed.

Andy and I have both spent a fair amount of time discussing the disconnects we have seen between the security experiences and expertise of utility-sector information technologists and those of the residents of more conventional IT and IT-security environments. Most articles you will find in the public arena describe, within utilities, a perceived unpreparedness for the polymorphic and omnipresent attacks that will arrive from the great unwashed networks as the Smart Grid advances the network underpinnings and interconnectedness of our power infrastructure. Reading through these comments, however, and taking the time to digest their meaning, left me with an odd combination of feelings: comfort in the thoughtfulness and thoroughness of some of the legacy-community reviewers (particularly those from NERC), and anxiety that the grids of present and future are not at different positions along a similar path, but are each seeking progress on very different, if parallel, tracks.

There are three comments that really caught my attention, not so much because they uncovered a new area of weakness that I hadn't considered, but because of the straightforward and conclusive manner in which they were posed. The first is Comment #35, and within it is this recommendation:
In an organized and designed way, NIST and the industry need to develop a focus on response and recovery. While the first goal of a cyber security strategy should be on prevention, it also requires that a response and recovery strategy be developed in the event of a cyber attack on the electric system. More planning and investment is needed to develop response and recovery actions, while continuing to develop a strategy for prevention of a cyber security incident.
Bravo! We have said for some time now that the sheer magnitude of the expansion of connectivity, access, services, companies, and personnel, will necessarily make the grid more susceptible to attack, but that sound design and deployment should nonetheless make it far more resilient. Less happily, the comment and recommendation can't get too far in this venue, given the nature of this document and draft. The response?
The NISTIR is a high level document addressing response, recovery, and prevention. Each organization will need to define the core components of their respective Smart Grid deployments.
Not so Bravo-ish. The response is mainly to a second recommendation in the comment regarding critical components, their reliance on technology, and their role in recovering service. It does not evoke support for the idea of a violable but reliable Smart Grid, engineered, like a Bop Bag, to bounce back every time someone tries to knock it down.

A second comment (#40) that attracted me related to the context of the NIST risk assessment, and the relatively static way in which the document described the challenge of securing the Smart Grid.
NIST’s overall risk assessment is flawed because it does not capture the essential idea that Smart Grid is not a point in time. That is, one specific action cannot be taken regarding cyber security that will protect the system as a whole. Because the Smart Grid will evolve in pieces and parts, every time a new piece or part is integrated into the Smart Grid, new system vulnerabilities and variations on consequences could be introduced. Very rarely will the introduction of a new piece or part take vulnerabilities away. Therefore, when they are integrated into the Smart Grid, that piece or part must be customized to ensure that cyber security is integrated into system architectures.
This is exactly right. This is particularly true in our present state, where Smart Grid investments are already well underway, and where new initiatives are more likely to be funded piecemeal than created from whole cloth. Again, though, this comment did not find a home in the document:
Currently, reporting vulnerabilities for controls systems falls under the responsibility of DHS and DOE. We will consider this recommendation in a future draft of the NISTIR.
I guess that if one considers the mode of the system to be one of deployed infrastructure, then the reliance on external expertise to notify of vulnerabilities makes sense. My view of the comment, however, was more that there is a need to consider the characteristics of any component prior to integration, so that augmentations for security can be made if required.

The last NERC comment I wanted to point out is related to the utility of their own approaches and checklists in the new world. Many in the Smart Grid world are shuddering to think of the possibility that the NIST document, or another, will provide some simple "yes/no" set of questions that will invariably lead to a less secure infrastructure, designed to survive the certification, not necessarily the real world. The comment in question is #41, and it calls into question any primary reliance on NERC's own Critical Infrastructure Protection Standards. In NERC's own words:
While the CIP Reliability Standards are designed to shape the behavior of asset owners and operators, they are not designed to shape the behavior of equipment and system designers, manufacturers and integrators. The CIP Reliability Standards apply to installed equipment and require security controls be applied to manage risk in the operation and maintenance of cyber assets. However, the protection goals of the Smart Grid, on the other hand, are broader, and address component security, integrity of communications, privacy and other cyber security considerations.
This recommendation is accepted into the new draft, and while the NERC CIP requirements remain, they are acknowledged as only partial criteria.

Where From Here?
Clearly the NIST effort is delivering real value in terms of illuminating a portion of the concerns regarding the newest parts of the Smart Grid, particularly AMI, and the IT-security heavy areas of network transmission, authentication, reporting, etc. This is the first arena of discovery and recommendation because so much of the operational iron that is early into the mix will rely on some form of standards, or recommendation, or expected best practices, in terms of security.

The arrival of well-informed and broad-based requests from the NERC team, in the form of comments on the first draft, brings to light two important facts that I haven't seen given a lot of press:
  • The Smart Grid is not just for Newbies
    The Smart Grid will ultimately only be secured through the cooperative insight and involvement of those most familiar with the existing, putatively "not Smart" grid, who are bringing to the table a realistic view of the less shiny, less novel aspects of keeping the lights on. From these comments, it seems they are not being dragged into the IT-heavy world of the Smart Grid, but are approaching it aggressively, albeit with understandable compartmentalization and caution.
  • There is a gap in security emphasis between those that are planning and those that are doing
    While much work has gone into the content of the most recent draft of NISTIR 7628, it is intended to describe only a portion of the waterfront. While that definition process continues, there are real decisions being made, and real deployments being undertaken, that are outside the scope of the current NIST effort.
In the coming months, we hope to see this disparity lessen, as the NIST recommendations begin to impact the product and process decisions that utilities make based on those reports. Hopefully then, other more broad concerns, such as those highlighted in the NERC comments, will rise in importance and urgency to the industry.


Tuesday, February 2, 2010

Dawn of a New Day? Cyber Security Attack Disclosure and Implications for US Utilities

First, a Little Anti-Alarmism
There are changes coming, but the sky is not falling. Our cyber defenses are in need of more attention and more focus, but they are generally pretty good. The Smart Grid is clearly a new chapter, and ensuring we get as much of the required security designed and deployed correctly up front will save all of us a great deal of time and trouble later on.

Software and the Smart Grid
Let's start by articulating something that should be obvious: the biggest difference between today's grid and the Smart Grid is software. You may protest and say "That's crazy! Software is a small part. The Smart Grid is growing with millions of new meters, many miles of new high-voltage power lines, innumerable sensors, and of course, fault-current-limiting superconducting transformers." There is no arguing with those additions, but when all is said, done, and deployed, the whole system may actually lose some mass, as this IBEW video, linked from an earlier article here, makes clear.

Now back to the software, the key enabler of the Smart Grid. Over the past 30 years, it has been what separates modern enterprises from their pre-IT ancestors by making them faster, smarter, more efficient and more flexible. However, a well-documented but unintended consequence has been that it has also made them much more vulnerable. It's not just that potential bad guys can cause harm with software tools of their own; the real downside is that even on a good, hacker-free day, a large amount of uncertainty surrounds the consistent operation of this most critical corporate ingredient.

Software Provenance and Security
Most large organizations don't know where their software came from, at least not in a comprehensive manner. Any individual application can come from one or several of the following sources:
  • Internal development teams
  • Outsourced development providers
  • Packaged applications
  • Software as a Service (SaaS)
  • Web services
From a security perspective, none of the above is necessarily more or less secure than the others. Software provenance is often quite opaque to users. Even when you buy software from Vendor X, there's no guarantee that all the code was developed by Vendor X coders. There is usually no guarantee that the software is bug-free, that it doesn't include glaring programmatic weaknesses that make it an easy target, or even that it isn't already harboring malicious code that can be triggered in the future and cause your organization and/or your customers great harm.
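None of this makes provenance fully knowable after the fact, but there are small hygiene steps any shop can take. As one minimal sketch (the function names and the idea of a vendor-published digest are illustrative assumptions, not any particular utility's process), verifying a downloaded package against a checksum the vendor publishes out-of-band at least confirms you received the bits the vendor intended to ship:

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, streaming it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_download(path, published_digest):
    """True if the file's digest matches the vendor-published value."""
    return sha256_of(path) == published_digest.strip().lower()
```

A checksum match says nothing about whether the vendor's own build was clean, of course; it only rules out tampering between the vendor and you, which is exactly why provenance has to be addressed at every link in the chain.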

Approaches to securing software systems vary based on what you have to work with. Knowing where and by whom the software was built is a good start. Other factors, such as access to source code or to architects and subject matter experts who really know their way around an application, can be a big help. Absent these, you'll want analysts trained and experienced in penetration (or "pen") testing: engineers whose job is to think and act like an attacker, find the easy ways into a system, tell the right folks what they've found, and often recommend hardening approaches.

Attacks on Software Source
All of this, however, is mere prologue to the story that began unfolding earlier this month related to published accounts of attacks against Google and a variety of other popular software vendors. The details are a bit sketchy, but the core elements include:
  • US tech companies have recently experienced a series of very serious cyber attacks that appear to have originated in Asia
  • Google admits that a couple of Gmail accounts were partially compromised
  • Firms report that the apparent target of the attacks was source code relating to popular software packages
This is an interesting phenomenon, because it describes an organic growth model for further hostile behavior. The press accounts of the recent attacks are clear on at least two facts: a zero-day vulnerability led to the breaches, and source code for familiar software systems was a major target of the attacks on the multiple vendors. According to Richard Stiennon, as quoted on Dark Reading,
As they get more sophisticated, they are very interested in source code and ways to find new vulnerabilities in software companies' products.
So you see, one feeds the other. Zero-day vulnerabilities are very hard to find. Most popular software packages have been around for a while and have been well wrung out in the market; finding something new and vulnerable in them is neither common nor simple. With the source code, however, it becomes much more straightforward. Looking from the inside out is like having a map to the functionality: weaknesses are revealed that would be very hard to find searching from the surface. The fact that one of these vulnerabilities was found and then used to steal more source code suggests a well-thought-out approach. The attack has been described as sophisticated, and using its spoils to sow the seeds of future attack vectors is equally so.

The Curtain Pulls Back
The big news, however, isn't so much that these events are happening, but rather that they're being discussed so openly. According to Atlantic journalist Marc Ambinder, we have Google to thank for that:
Google's revelation that they'd been hit was deemed a "watershed" moment by security industry analysts, but the other 32 companies who were hit have not followed suit and have begged the government to keep their identities a secret. The government has no choice but to protect their identities -- even as policy encourages greater transparency about the scope of such attacks.
Two weeks ago events reached a fever pitch when Secretary of State Clinton spoke out in Washington against nation-supported (if not sponsored) cyber attacks by China and Iran, among others. Basically, she is calling out a new opposition axis, only this time it isn't an Axis of Evil; it's an Axis of Cyber Threats.

On the Cyber Defensive
In case you didn't know it, US companies and government organizations have long been targets and victims of cyber attack. That doesn't make the US unique, by any stretch, but the recent increase in the frequency of damaging attacks is surprising, given the presence of some excellent cyber security defense programs on our side and the increasing public regulation and legislation on the topic. The main culprit appears to be the seemingly innumerable Internet connection points that present attackers with unexpected access to software flaws and system configuration errors; these deliver the necessary openings for getting to other applications and to sensitive data. For US companies, there is little recourse, little ability to hit back. That's our policy. Again, Marc Ambinder:
[These are] the U.S. network security rules of engagement. Defend, don't attack.... For example, if a U.S. site comes under attack [from a foreign site], the victim -- assume it's an intelligence agency -- can defend it by trying to block the attacks, and it can offensively attempt to figure out who's behind them -- but once that threshold is crossed, it cannot attack the sites. [Most attackers] have no such rules. In fact, [some governments] teach attack techniques to a large group of state-sponsored hackers, and part of the classroom work is for them to conduct actual attacks on sites around the world, including the U.S.
US companies are obligated only to disclose the loss of customers' private information, and they don't have to be very specific about how the loss occurred, so understanding how a successful attack transpired rarely translates into improved protection.

Take Aways for Utilities

Smart Grid initiatives are driving a huge increase in Web connectivity for utilities at this very interesting point in the evolution of cyber offense and defense. A big part of that increase comes in the form of new online energy applications and services built by Google and dozens of start-ups, including Silver Spring Networks, GridPoint, Grid Net, Tendril, and ten-year demand management veteran EnerNOC. Are all of them as forward-minded about security as Google? Time will tell.

We know utilities in other countries have come under cyber attack ... at least one incident induced significant outages. We also know that malicious code has found its way onto US utility computer systems. But there's a lot more we don't know, and there are many questions to consider while we're still in the formative stages of the Smart Grid build-out:
  • Will large US utilities become targets for big cyberattacks similar to those that just hit Google?
  • Will they have the defenses in place to protect customer data and maintain reliability as well as it appears Google did?
  • Will Google and other more mature cyber security victims, especially as they rely so heavily on enormous amounts of reliable, high-quality power, be willing to share their best practices with the utility community?
  • What obligations do utilities have for disclosing cyber attacks they endure, especially ones that cause tangible damage? And if they do disclose this info, to whom do they disclose it: FERC, NERC, NSA, each other, or the general public?
Despite repeated warnings from experts and the press (for example: here and here) since the Google breach headlines appeared, progress on disclosure from other affected organizations, on forensics into the actual mechanisms, and on informed recommendations has been slow. That must change. Utilities and their software and service providers should be pressing for information and assistance, because this kind of data and experience can educate and invigorate utility CIOs and CISOs, letting them err on the side of over-preparation when planning security on behalf of their companies and their customers. Nothing could more fundamentally weaken our nation and our competitiveness than an organized and successful attack on our power infrastructure, and these incidents present an uncommon opportunity to learn.

Photo Credit: Mike Baird @ Flickr