Monday, January 30, 2012

Full Disclosure from 2012 Distributech's Keynote Security Panel


It's fun to connect and catch up with energy sector security friends, and not always at security conferences. I think we all get a kick out of seeing each other and then dispersing back out into the world to promote the cause and fight our battles in all the different ways we do it.

In fact, it feels a little more special when we gather inside a larger conference context, which is without a doubt what you get at the mighty annual Distributech, held this year in sunny San Antonio, Texas.

So, enough chit chat. Let's dive into what was discussed on Thursday morning by these folks. Moderator Mike Ahmadi of GraniteKey expertly led a panel of experts on the topic of Security Standards, including:
  • Bobby Brown, Enernex 
  • Alan Rivaldo, Texas PUC 
  • Nate Kube, Wurldtech 
  • Darren Highfill, Man of Many Hats 
The guys covered several topics in depth, including security metrics, vulnerability handling in IT vs. OT, social engineering, and, perhaps most provocatively, the ethics and ramifications of security information disclosure. Below are a few highlights for each:

Metrics and Measurement
  • In the shadow of Basecamp (which we'll get to shortly), and trying to gauge industry progress on security or the lack thereof, Mike asked: "are products getting better?" The response, I think, surprised some of us. Nate, who has been testing grid products and systems since he was knee high, said "absolutely!"
  • Others chimed in that, slowly but surely, increased awareness has raised the bar for what's expected from vendors. Sometimes it's because utilities' RFPs demand it; other times it comes from the vendors themselves. It's certainly happening too slowly for many of us, but the consensus seemed to be that tangible improvement is happening out there.
  • Darren introduced the new DOE RMMM (in early development) and referenced other maturity models and frameworks; he and the panel seemed to agree that all of these, to a greater or lesser extent, help organizations baseline and roadmap their security functions and goals ... and who wouldn't want that?
  • Bobby Brown got some laughs (from me, anyway) when he likened the concept of security maturity standards for SG products to the carnival sign we all know that says "You must be this tall to ride this ride."
  • Nate praised an audience member's phrase: "at the speed of Metasploit". This set the stage for the later discussion on disclosure. (There's more on the Metasploit vulnerability and exploit development framework HERE if this is your first time hearing the term.)
  • Much to my delight, a lot was said about metrics and measurement in the early going, as we moved back and forth between the development and evolution of standards and guidelines (e.g., NERC CIPs, NISTIR 7628, IEC 62443-2-4, etc.) and demonstrable improvement in the security posture of utilities.
Vulnerabilities in IT vs. OT

This may be obvious to many folks, and I've heard it mentioned quite a bit myself, especially concerning meters. But the point was made that in the IT universe, one of the primary modes for dealing with newly surfaced vulnerabilities, as well as new types of threats, is rapid change. Rapid change of hardware (we all want the latest gadgets, laptops and servers) is facilitated and driven by customer expectations of a refresh on these items every few years or so.

And we see even more rapid change in IT software, as patches to some systems are generated once a month, once a week or pretty much any time. We not only tolerate this pattern, we've come to expect it as a natural part of using the latest and greatest (and safest) software.

That of course brought us back to the OT part of our world, and its intrinsically different economics, values and, certainly, hardware and software lifecycles. For many good reasons, the systems that support our operations centers, generators, and transmission and distribution functions, including both the hardware and the software, have simply not been built to accommodate frequent change.

And the culture which wraps around these systems, both the users and the suppliers, is still largely hard-wired to make decisions based on comparatively very lengthy spans of time elapsing between changes.

According to Darren, factors that play into the longer OT hardware and software version lifecycles include:
  • How a system is built
  • How systems around that system are built
  • How we use these systems
And a question arose: do the systems being designed today look like they'll be better able to accommodate faster change cycles? I don't think we arrived at an answer on that ... and that may mean the answer is "no."

Social Engineering

The panel got a question from an attendee on social engineering, that is, using plain old people skills (e.g., charm, friendliness, charisma, urgency, faux credentials, etc.) to gain physical access to secure areas, access control information, system configuration information, and just about anything else.

All agreed that typical utility workers' (stereotype to follow) inherent goodness and sense of trust and helpfulness make the energy sector more susceptible to this type of threat than, say, financial services on Wall Street, where (only slight exaggeration to follow) everyone is mean, greedy and suspicious of everyone else.

One of the panelists from a testing org said social engineering succeeds 100% of the time whenever they use it (ouch). The same person noted, though, that social engineering assessments are often one of the first services lined out by a utility when negotiating a contract for a comprehensive assessment.

Alan Rivaldo, the Texas PUC representative, after making it perfectly clear that his statements on the panel were not necessarily representative of his organization, said that Texas takes insider and social engineering threats very seriously.

Disclosure and Information Sharing

Someone dropped a bomb (of a question) near the end. The panel was asked what it thought about the recent public disclosure of PLC/SCADA vulnerabilities in the OT products of half a dozen vendors, including attack code for each crafted in Metasploit.

While it seemed that most panelists believed Dale Peterson of Digital Bond had acted with good intent (to speed up remediation of the vulnerabilities by their respective vendors), there was substantial disagreement on whether this approach was justified and whether it would produce the result Peterson said he sought.

One panelist contended that this action was necessary and valuable for "shining a light" on a broken process related to how DHS's ICS-CERT works with vendors to resolve known vulnerabilities. The point being, I think, that under the official process, many vulnerabilities go unremediated if the vendor provides a reason for leaving the vulnerability alone.

But another said that the Basecamp project researchers' unilateral release of vulnerability details and exploits did little except increase the level of risk to asset owners.

What got me was that, knowing the guys on the panel as well as I do, and knowing that they are all men of extremely high intelligence and good will who only want what's best for the community, I was really surprised that they disagreed so substantially on the issues the Basecamp disclosure episode surfaced.

Clearly this is complicated stuff: ethically, technically, culturally. But I think there's no doubt that our thinking is maturing in some respects, and that the industry community, both the users and the vendors, is responding. It will take a long time for Basecamp to fully play out. Hopefully we'll mainly agree, when it does, that it had a net-positive effect on the electric sector's security posture.