The 9/11 Commission noted several failures that led to that fateful day, but emphasized that “The most important failure was one of imagination.” Thirteen years later, one could argue that this lesson has still not permeated every corner of our security apparatus; in the cyber domain, it has barely scratched the surface. As a country with some of the most automated (i.e., vulnerable) critical infrastructure in the world, we seem to collectively lack the imagination necessary to mount an effective defense. Put another way, an effective defense is driven by imagination, backed by solid system analysis. If all an asset owner can imagine is waiting for Government alerts or scouring vulnerability databases, their defenses are likely to fail when tested, if they are not already owned.

Those with malicious intent do not lack imagination. They will come through the supply chain, they will come through wireless connections, they will hijack a thumb drive or a non-profit organization’s web site, or they will discover a path through the heating, ventilation, and air conditioning (HVAC) system. If a single word were needed to describe those with malicious intent, it would be imaginative. And to be clear, this type of imagination does not require intuition, crystal balls, or witch doctor skills, but a systematic methodology, which we have referred to earlier as cyber-physical attack engineering.

The charge to simply be more imaginative is not very productive. You need to apply creative thinking, but within the appropriate context. In the case of critical infrastructure, that context is the industrial plant ecosystem: the hardware, software, people, and the physical process. One of the first things asset owners can do is build a complete and accurate system inventory, coupled with a deep understanding of the physical process being controlled. The public, and some political decision makers as well, would be shocked to learn about the blatant absence of appropriate system modeling for what are usually thought of as the country’s most critical systems. We can locate and book any ski resort in the Rockies within minutes from our smartphones, but we haven’t even thought to enumerate our critical infrastructure and understand the digital ecosystems that control it. This is a shame if only because these digital ecosystems are not overly complex and are relatively static. The foundational premise here is that you can’t defend what you can’t define. Once you know what you are dealing with, a multi-disciplinary team combining plant operations, plant process engineering, and cyber security expertise can step through an analysis to discover those plant-level vulnerabilities that will never be found in a Government alert.
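To make the “you can’t defend what you can’t define” premise a little more concrete, the sketch below shows one possible shape for a machine-readable inventory record. It is purely illustrative and not drawn from any particular plant, product, or standard; every field name and sample value is a hypothetical assumption.

# Minimal, hypothetical sketch of a plant asset inventory record in Python.
# Field names and sample values are illustrative assumptions, not a standard.
from dataclasses import dataclass
from typing import List

@dataclass
class AssetRecord:
    asset_id: str                  # unique equipment tag
    description: str               # what the device is and does
    vendor: str                    # supplier, for supply-chain visibility
    firmware_version: str          # basis for tracking known weaknesses
    network_interfaces: List[str]  # wired, wireless, and maintenance paths in
    controlled_process: str        # the physical function the device affects
    compromise_consequence: str    # what manipulating the device could cause

# Example: a hypothetical HVAC controller with no obvious nexus to safety,
# yet still a plausible path into the plant ecosystem.
hvac_plc = AssetRecord(
    asset_id="HVAC-PLC-01",
    description="PLC controlling switchgear room ventilation",
    vendor="ExampleVendor",
    firmware_version="2.3.1",
    network_interfaces=["plant Ethernet", "vendor maintenance modem"],
    controlled_process="Cooling air flow to electrical switchgear",
    compromise_consequence="Overheating and loss of electrical distribution",
)

print(hvac_plc.asset_id, "-", hvac_plc.compromise_consequence)

Even a simple structure like this, kept complete and current, gives the multi-disciplinary team described above something concrete to analyze.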

This brings me to an action recently initiated by the U.S. nuclear industry and noted in a Control Global Unfettered blog. The Petition for Rulemaking at first blush sounds reasonable, as it suggests focusing the efforts of nuclear power plant cyber defenders on those systems that have a nexus to radiological safety. Furthermore, the petition asserts that the time, resources, and cost needed to protect systems that do not have a nexus to radiological safety against cyber attack are unjustified. The potential for trouble lies in asking for a categorical exclusion of systems based on what looks like a generic risk assessment that is then applied to every operating nuclear power plant. It amounts to promoting the limited use of imagination as an industry best practice. One thing you can bet on is that the offensive team (those with malicious intent) will not make the same mistake. Unfortunately, severe trouble can be caused by digitally manipulating plant systems that have no obvious relation to reactor protection, a fact that we and others have identified and documented for nuclear plant operators, regulators, and the International Atomic Energy Agency (IAEA). A consequence-based analysis of digital compromise, pepped up with a good dose of creativity and process knowledge, usually ends up with attack paths that leave the asset owner with a “shoot, we didn’t consider this” response.

Ironically, a thorough consequence-based analysis like The Langner Group’s Critical Penetration Analysis (CPA) doesn’t have to be more expensive than an artificially restricted compliance exercise that limits itself to systems that are “critical” only at face value. Physical vulnerabilities and digital design vulnerabilities are few in number and typical for specific plant configurations. It is therefore inappropriate to approach every single plant as a completely unique universe, especially for plants that use the same control and safety system design and products. Sharing this type of information among asset owners can reap huge benefits: the results of a CPA at one plant can (and should) be shared with other plants, saving the industry millions and offering a real-world example of information sharing that reduces operating costs. Not only will significant dollars be saved, but the cyber security posture across the board will also improve markedly.

Perry Pederson