Two more nuclear facilities sign up for RIPE

Two more nuclear facilities are introducing the RIPE OT Security and Robustness Program to address cyber security in a sustainable and measurable manner, and to comply with tightened regulation at the same time.

Olkiluoto Nuclear Power Plant

The Olkiluoto nuclear power plant, operated by TVO, consists of three units. Units 1 and 2 produce 860 MW of electrical power each and have been operational since 1979 and 1982, respectively. Unit 3 is under construction and is scheduled to go on the grid in 2018, delivering an additional 1600 MW of electrical power.

Since the Loviisa nuclear power plant, operated by Fortum, is already covered under RIPE, the recent deal means that the entire nuclear power production of one country, Finland, is now protected against cyber threats by The Langner Group’s RIPE program.

Posiva Spent Fuel Storage Facility

The other facility now covered by RIPE is the Finnish spent fuel storage, operated by Posiva. We believe this development is particularly important because Finland is one of the first countries to extend cyber security regulation to the fuel cycle, thereby underlining its globally leading position in nuclear cyber security.




The equilibrium of cyber conflict: In memoriam John Nash (1928-2015)

This weekend, Nobel laureate John Nash died in a car accident. If there is one theory that could explain what we’re seeing in international cyber conflict, I believe it is his theory of non-cooperative games, especially the “Nash equilibrium”. The theory is the centerpiece of Nash’s roughly 30-page doctoral dissertation from 1950, simply titled “Non-Cooperative Games”.

The Nash equilibrium, in essence, describes a combination of strategies from which no adversary can unilaterally deviate without increasing its own losses, resulting in a stable state. Read more »
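As a toy illustration (our own sketch, not part of Nash’s paper or of this post), a two-player game can be scanned for such stable strategy pairs. The strategy names and payoff numbers below are purely hypothetical, chosen to mimic two adversaries deciding whether to escalate a cyber conflict:

```python
# Toy illustration: find Nash equilibria of a 2x2 non-cooperative game.
# Payoffs are (row player, column player); higher is better.
# All names and numbers are hypothetical.

payoffs = {
    ("restrain", "restrain"): (-1, -1),
    ("restrain", "escalate"): (-5,  0),
    ("escalate", "restrain"): ( 0, -5),
    ("escalate", "escalate"): (-3, -3),
}
strategies = ["restrain", "escalate"]

def is_nash(row, col):
    """True if neither player can improve by unilaterally switching."""
    r_payoff, c_payoff = payoffs[(row, col)]
    row_best = all(payoffs[(r, col)][0] <= r_payoff for r in strategies)
    col_best = all(payoffs[(row, c)][1] <= c_payoff for c in strategies)
    return row_best and col_best

equilibria = [(r, c) for r in strategies for c in strategies if is_nash(r, c)]
print(equilibria)  # the stable state both sides settle into
```

Note the prisoner’s-dilemma flavor: mutual restraint would leave both sides better off, but it is not stable, because either side gains by defecting unilaterally.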



Historical Document: Towards a Cyber Security Governance Framework for Industrial Control Systems

In 2013 Ralph wrote this brief ten point manifesto that became one of the foundational pieces of the RIPE OT Security and Robustness Program. Two years later, it still looks pretty accurate.


  1. By addressing the problem of critical infrastructure cyber insecurity with security concepts and appliances borrowed from IT, we have tried to cure the symptoms rather than the disease.
  2. We have been poking around in largely undocumented digital environments guided by fuzzy threat intelligence, and applied band-aids (a.k.a. security controls) as the remedy of choice. However, a threat-driven approach to critical infrastructure cyber security is the tail wagging the dog. Being reactive by default, it fails to address the prevalent problem of systems that are insecure by design rather than because of software defects that would just need to be “patched” or hidden behind a firewall.
  3. We have been focusing on determining appropriate target security levels for individual plants rather than on establishing the means to reliably maintain any given security level regardless of criticality or industry. We have taken cyber security capability for granted without ever bothering to understand its characteristics and requirements.
  4. The design, configuration, operation and maintenance of industrial control systems in any reasonably secure manner requires a governance process. In absence of such a governance process, the security or insecurity of ICS applications and environments will always be subject to non-controllable external forces such as new vulnerabilities, new contractors who violate policy, or new threats, resulting in a constant decay of cyber security.
  5. The governance process is not threat-driven. It is a proactive and continuous activity based on the understanding that a non-governed cyber environment is insecure by default. Today, non-governed cyber environments are the norm in ICS installations. The popular excuse is that environments have “grown organically” (which is actually not an excuse but simply a statement of fact). However, they will continue to “grow” until restricted by governance.
  6. The two major areas of the governance process are asset and configuration management (on the technology side), and workforce and supply chain management (on the people and procedures side).
  7. The foundation of the governance process is a verifiable cyber system and process model. Such a model can be created and maintained easily because system complexity is very low compared to IT environments, and most control system installations are extremely static. Creating a system and process model for an existing installation may require sweat, but it is anything but an intellectual or technological challenge.
  8. The governance process is identical for all industries. The basic activities of the governance process can be standardized in form of templates and can be audited in order to establish compliance.
  9. The task of setting appropriate target security levels can be isolated from the governance process as such. Setting target security levels may be based on the concept of risk, or may be based on alternative, policy-driven concepts.
  10. Based on a cyber security governance framework, templates can be extended and fine-tuned to measure and achieve sector-specific and application-specific performance targets. A framework of standardized templates and performance indicators also offers the opportunity for meaningful information sharing and benchmarking.
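The verifiable system model called for in point 7 can be sketched in a few lines. The following fragment is our own hypothetical illustration; the field names and audit logic are assumptions for the sake of the example, not RIPE’s actual templates:

```python
# Hypothetical sketch: a minimal, auditable system model for an ICS
# installation. Comparing the documented model against an observed
# inventory yields the deviations that a governance audit would record.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    vendor: str
    firmware: str
    network_zone: str
    approved_software: list = field(default_factory=list)

def audit(expected: dict, observed: dict) -> list:
    """Return human-readable deviations between model and reality."""
    findings = []
    for name, exp in expected.items():
        obs = observed.get(name)
        if obs is None:
            findings.append(f"{name}: documented but not found on the network")
            continue
        if obs.firmware != exp.firmware:
            findings.append(f"{name}: firmware drift {exp.firmware} -> {obs.firmware}")
    for name in observed.keys() - expected.keys():
        findings.append(f"{name}: present on the network but undocumented")
    return findings
```

Because control system installations are extremely static, as point 7 notes, such a model changes rarely, which is what makes this kind of check cheap to run continuously.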



Threat, Uncertainty and Doubt

The staff at the Christian Science Monitor (CSM) Passcode just published an article titled “Quest for Knowledge”, and one sentence caught my eye: “Instead, there’s a growing sentiment in the industry that knowing as much as possible about the attackers – and how they strike – is the key to good defense.”

I am not suggesting that the folks at the CSM got it wrong in the way they characterized the industry trend. I am suggesting that those who follow the threat’s siren song may end up with their hopes dashed against the rocks.

I am not dismissing the value of threat intelligence or suggesting it does not help to know more about your attacker. And, based on the sheer number of threat reports coming out, it is easy to see how one might reach the conclusion that threat intelligence is the latest bandwagon that must be jumped on. Taking a contrarian position I would say that knowing as much as possible about the threat is not the key. It may be nice to know or it may give a sense of being on top of the situation, but it never has been and it never will be the key to a good defense (at least for ICS environments). Read more »



The different levels of cyber security governance

0. Anything goes

No policies, no procedures, no checking. Typical for everyday contractor access in the majority of industrial facilities.

1. Passing the buck

Declaring others responsible for cyber security — end users, contractors etc. This is achieved by stressing “awareness” (assuming that the end user, if only being “aware” of cyber risk, would be in a position to take appropriate action). Typical example: Holding end users responsible for appropriately performing backups without ever giving them a procedure. Any provision that includes judgment on the end user’s part falls into this category.

2. Putting yourself in charge without a plan

Emphasizing that certain procedures and configurations are subject to permission by a central authority (such as the IT department, or physical security), but failing to provide any rules on how decisions are made — because there are none. This leads to ad-hoc decisions that cannot be questioned, consistently performed by others, or even audited. The major difference from level 1 is that it is no longer the end user who is held responsible; however, consistency is still missing.

3. Creating a fantasy world of wishful thinking

The organization has produced an impressive and consistent policy framework but never checks whether it has anything to do with reality, usually because it simply cannot be audited. For example, it is impossible to audit a policy like “backups must be performed in a timely manner”, because “timely” could mean anything between five minutes and five years.

4. The real deal

The organization uses a consistent policy framework that can be audited and is audited. For example, “timely” is specified as “every week”. Non-conformity is recorded and prompts action: not necessarily the decapitation of those who didn’t follow policy, but perhaps the re-phrasing of policies that turned out to be impractical.

5. Sustainable governance

The organization uses a consistent policy framework that can be audited and is audited, and audits as well as user feedback are largely automated. Security automation is the key challenge for sustainable governance. Is that a technical problem? Absolutely not. The next time you pay a parking meter, order a pizza, or hail a cab over the Internet using an app, think about the absurdity that people in real production environments, including contractors, are expected to handle far more important cyber security issues by referring to a folder full of boring policy printouts, or by trying to locate the respective documents in a labyrinth of files. Rather than talking so much about the Industrial Internet of Things, we suggest contemplating the Industrial Intranet of OT Governance, if only because without solid governance, the Industrial Internet is doomed from the start.
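To make the jump from level 3 to levels 4 and 5 concrete, here is a minimal sketch (our own illustration, not RIPE code) that turns the vague “backups must be timely” into a machine-checkable rule; the seven-day figure is an assumed example of how a policy makes “timely” specific:

```python
# Hypothetical sketch of an automated, auditable policy check:
# "timely backups" becomes "the latest backup is at most 7 days old".
# The 7-day threshold is an illustrative assumption.
from datetime import datetime, timedelta

MAX_BACKUP_AGE = timedelta(days=7)  # "timely", made concrete and auditable

def backup_is_timely(last_backup: datetime, now: datetime) -> bool:
    """Return True if the backup policy is met. A False result is a
    non-conformity that gets recorded and prompts action."""
    return now - last_backup <= MAX_BACKUP_AGE
```

The point is not the trivial arithmetic but the shift in character: once the rule is executable, the audit can run every day without human effort, which is what separates sustainable governance from a binder of printouts.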

Guess which level of governance we are implementing with the RIPE OT Security and Robustness Program.




Harmonizing ICS Security and Compliance

During the SANS ICS Security Summit 2015 last week in Orlando, Mike Assante moderated a panel titled “Harmonizing ICS Security and Compliance”. I shared the stage with Matt Davis from Ernst & Young and Josh Sandler from Duke Energy. Based on comments from my colleagues on the panel and questions from the audience, there was general agreement that security should transcend compliance. In other words, the goal of any organization should be a security regime that includes people, process, and technology in such a way that compliance is not the driver. Obviously, compliance does not equal security, as any practitioner will tell you. Furthermore, having been on the regulator’s side of the table, I can attest that regulations are intended to be the minimum of what must be done rather than the maximum. This mode of thinking puts the onus squarely on the asset owner, which is exactly where it should be. The regulator should be there to assist, enable, and at times validate, but ultimate responsibility remains with the asset owner.

Thanks to Mike for pulling this informative panel together and asking the key final question: So, what are asset owners to do? My answer was direct and to the point: call me. The Langner Group’s Robust Planning and Evaluation (RIPE) program can help asset owners implement just such a regime, one that integrates people, process, and technology into a sustainable and measurable security posture. RIPE is used to demonstrate compliance with the most stringent regulations in the nuclear industry on one end of the spectrum, and on the other end to raise ICS security posture step by step, adjusted to the available budget, for companies with no need to comply with regulation at all. It does well in these different scenarios and will also do well for your company.

Perry Pederson




Regulating Nuclear Cyber Security: The Core Issues

If there is any such thing as “critical infrastructure” where a cyber attack must be prevented by all means, it is certainly the international fleet of nuclear power plants and the associated facilities for the production, processing, and storage of nuclear material. Potential cyber attacks against these facilities raise concerns not about the confidentiality, integrity, and availability of information, but about public health and national security. While the majority of nuclear power plants still use analog safety systems that cannot be compromised by even the most sophisticated digital code, these analog systems are simply no longer available for purchase. Therefore, renewal projects for the instrumentation and control of nuclear plants, and certainly new reactors, use digital devices for even the most sensitive systems and processes. Critical risk or acceptable? Well, that’s what governments around the world need to figure out.

In the US, cyber security for nuclear power plants got its start as an industry best practice. Subsequent to the attacks of 9/11, many aspects of U.S. security were bolstered and nuclear power plants were considered among the most critical of critical infrastructure assets and therefore in need of additional security. Industry efforts were noteworthy and significant progress was made. However, the U.S. Nuclear Regulatory Commission (NRC) determined that industry efforts were insufficient and published a new cyber security rule in 2009 and cyber security guidance in 2010.

Read more »



The Langner Group at S4x15 Week in Miami

At S4x15 The Langner Group will contribute the following talks:

A Process-Based Approach to ICS Security: The RIPE Program in Real Life

Traditional strategies for ICS security have focused on specific technologies such as data diodes or whitelisting, or on high-level guidance for risk management; examples of the latter are the NIST Cyber Security Framework and ISA-99. The RIPE Program takes a different avenue and focuses on the practical how-to of cyber security governance on the plant floor. It can be thought of as a fast lane to NIST CSF implementation that comes with the added benefits of metrics and scalability.

In this presentation, Ralph Langner will provide a brief introduction to the concepts and instruments of RIPE. The bulk of the talk will be delivered by Tomas Nystrom, who is in charge of cyber security for the several hundred power plants owned and operated by Nordic energy giant Fortum. Tomas will share real-world experience from the process of introducing RIPE at an operational nuclear power plant, making the plant both secure against cyber attacks and compliant with national regulation.

Cyber-Physical Attack Engineering

“Visible through the various cyber-physical exploits [used in Stuxnet] is the silhouette of a methodology for attack engineering that can be taught in school and can ultimately be implemented in algorithms. (…) Attack engineering is about reliably taking over control in order to exploit physical vulnerabilities. The way to get there is completely different from IT.”

Such is written in To Kill a Centrifuge. In his talk, Ralph elaborates on the fundamentals of cyber-physical attack engineering as a discipline that must be understood and mastered in order to identify and protect against the worst attack scenarios that sophisticated attackers could pull off against high-value targets. At the same time it also helps to understand where defensive “best practices” are completely worthless.

The subject of the talk, which is also reflected and applied in The Langner Group’s Critical Penetration Analysis, calls for a distinct re-orientation of cyber-physical security. Ralph points out that a pure infosec methodology, spiced up with hacking wisdom, stops short of providing meaningful results for risk mitigation because it does not link vulnerability to the potential consequences of exploitation: where a hacker alleges that by “owning” a SCADA or ICS component he could “do anything”, reality is usually quite different. Nevertheless, deterministic routes to disaster may be inherent in a plant design yet understood by neither asset owners nor pen testers.

Cyber-physical attack engineering should be seen in line with like-minded efforts such as the groundbreaking work in nuclear security by Gary Johnson (formerly with IAEA) and recent presentations by Bryan Singer (Kenexis), who will pick up on the subject at S4x15.



Lack of Cyber Imagination

The 9/11 Commission noted several failures that led to that fateful day, but emphasized that “The most important failure was one of imagination.” Thirteen years on, one could argue that this lesson has still not permeated every corner of our security apparatus; in the cyber domain, it has not even scratched the surface. As a country with some of the most automated (i.e., vulnerable) critical infrastructure in the world, we seem to collectively lack the imagination necessary to mount an effective defense. In other words, an effective defense is driven by imagination, backed by solid system analysis. If all an asset owner can imagine is waiting for government alerts or scouring vulnerability databases, their defenses are likely to fail when tested, if they are not already owned. Read more »



Follow-up: Surviving on a Diet of Poisoned Fruit

The Langner Group attended a presentation and group discussion with Dr. Richard Danzig, former Secretary of the Navy, hosted by a leading think tank. Dr. Danzig’s presentation focused on the eight recommendations outlined in his recently published report “Surviving on a Diet of Poisoned Fruit: Reducing the National Security Risks of America’s Cyber Dependencies“. Therein, Dr. Danzig makes a compelling case that mastering this new thing called cyberspace is the way forward: “The United States will make its peace with the new technologies by understanding them and finding ways to limit their potentially pernicious and especially their potentially disastrous effects.” His paper has brought some sorely needed critical thinking and open debate to a pervasive problem that we as a nation must address. We at The Langner Group are aligned with the vision put forth by Dr. Danzig and sincerely hope that decision makers at all levels of the public and private sectors take the time to read and ponder this report.

