When I accepted the position as Director of the Control Systems Security Program (CSSP) in 2006, I had no idea what was coming. One of the challenges I did envision was finding a way to educate non-technical policy makers about ICS security. In other words, we needed an engineering approach to solve this problem, but we also needed to “sell” that approach to non-technical people, and Aurora provided such a vehicle. After briefing the DHS Secretary on the proposed test and getting the ‘green light,’ the DHS and INL crews went into high gear. For some, Aurora was just another test whose outcome would be determined during actual testing. For those of us who understood the basic physics involved (lessons taught in power engineering 101), we knew we were out to destroy a generator. Since that event, there are those who still deny the validity of what was accomplished on that cold day in Idaho, but the test finally provided empirical evidence that cyber attacks can destroy physical equipment, and it captured the event on video.
It was like a stroll down memory lane. I became aware of DHS’s release of the INL-Aurora (not to be confused with Google-Aurora) project-related information from several sources. I was surprised, to say the least. The release seemed consistent with the vulnerability disclosure policy of some individuals, but inconsistent with my understanding of DHS’s position. Was this a declaration that all is well and mitigations have been fully implemented, or was it an error, as some have suggested?
I read the FOIA request and I read the DHS FOIA response. Nowhere in these documents is there a reference to the INL-Aurora project. In other words, on its face it appears that the DHS office that processed the FOIA request neither appreciated nor understood the difference between Google-Aurora and INL-Aurora. Nonetheless, it is possible that the folks over at the NPPD, NCCIC, and ICS-CERT went through a deliberate process to review and release the INL-Aurora information. Some would say it’s about time, while others see it as a grave mistake that could threaten our critical infrastructure (even more than it is threatened already). I cannot say whether the release was purposeful. However, I remain optimistic that for any instance where this vulnerability may yet exist, this release can serve as an impetus to take a second or closer look.
Then I read through the 800+ pages of documentation that were supplied in response to the FOIA. Because I wrote many of those words, I was not surprised by what I found. In fact, I gained a renewed appreciation for what was accomplished. Set aside for the moment the vulnerability itself and you’ll see in these documents a massive effort to apply the public/private partnership model to a real problem. Many U.S. Government agencies were briefed, as well as public entities. DHS worked through the North American Electric Reliability Corporation (NERC) and the Nuclear Energy Institute (NEI) in an effort to reach potential targets of Aurora-type attacks. The Department of Defense (DoD) pushed a project through the Technical Support Working Group (TSWG) in partnership with a private company to develop the Rotating Equipment Isolation Device (REID) in what must have been record time. It was only a matter of months from inception to prototype to a commercially available product. In the world of government contracting this is something of an anomaly. The entire INL staff involved in the Aurora test, as well as the support contractors, performed near miracles during the winter in Idaho. The INL assessment of the project risks was frank, and it identified many potential problems that could derail the test. Regardless of all that could go wrong, INL kept Aurora on track and delivered on schedule. Unfortunately, in the immediate aftermath of Aurora and the unwanted (or at least premature) media attention, the people who worked so hard and competently to complete this monumental task had little time to bask in their accomplishments. The one thing that may not be so clear in the released documents is the basic engineering that went into assessing the vulnerability in the first place.
Yes, the ‘hack’ was fairly trivial if you had the necessary engineering background and access to substation equipment, but discovering the Aurora vulnerability took a lot of work by a dedicated team of talented engineers.
Fast forward to 2014. What have we learned about the protection of critical cyber-physical assets? Based on various open-source media reports in just the first half of 2014, we don’t seem to be learning how to defend at the same rate as others are learning to breach. In the Brookings paper that Ralph and I wrote more than a year ago, “Bound to Fail: Why Cyber Security Risk Cannot Simply Be ‘Managed’ Away,” we stressed that being well armored is as important as being well armed. Furthermore, we argued that risk-based approaches will never get the job done when it comes to national security. Sadly, we keep beating the old drum (risk-based approaches) while expecting to hear a different tune.
It’s about time that we learn the lesson that Aurora tried to teach us seven years ago. Obviously, I am not speaking to those involved in the project, but to those who completely missed the point. You can continue to canvass the masses, run another survey, or attend one more ICS cyber security conference, and you will continue to hear a range of opinions on Aurora. You can argue the test was fake, you can argue that the problem does not exist, you can even argue that mitigation efforts have been 100% successful. What has been glossed over countless times in this debate is the approach used to discover Aurora in the first place. Those who first postulated Aurora and then set about to fully assess the vulnerability knew this:
Effective defense against cyber-physical attacks is based on thorough system analysis and engineering principles, not on consensus.
Perry Pederson