Threat, Uncertainty and Doubt

The staff at the Christian Science Monitor (CSM) Passcode just published an article titled: Quest for Knowledge and there was one sentence that caught my eye: “Instead, there’s a growing sentiment in the industry that knowing as much as possible about the attackers – and how they strike – is the key to good defense.”

I am not suggesting that the folks at the CSM got it wrong in how they characterized the industry trend. I am suggesting that those who follow the threat's siren song may end up with their hopes dashed against the rocks.

I am not dismissing the value of threat intelligence or suggesting it does not help to know more about your attacker. And, based on the sheer number of threat reports coming out, it is easy to see how one might conclude that threat intelligence is the latest bandwagon that must be jumped on. Taking a contrarian position, I would say that knowing as much as possible about the threat is not the key. It may be nice to know, or it may give a sense of being on top of the situation, but it never has been and never will be the key to a good defense (at least for ICS environments). Read more »



The different levels of cyber security governance

0. Anything goes

No policies, no procedures, no checking. Typical for everyday contractor access in the majority of industrial facilities.

1. Passing the buck

Declaring others responsible for cyber security: end users, contractors, etc. This is achieved by stressing "awareness" (assuming that the end user, if only made "aware" of cyber risk, would be in a position to take appropriate action). Typical example: holding end users responsible for appropriately performing backups without ever giving them a procedure. Any provision that relies on the end user's judgment falls into this category.

2. Putting yourself in charge without a plan

Emphasizing that certain procedures and configurations are subject to permission by a central authority (such as the IT department or physical security), but failing to provide any rules on how decisions are made, because there are none. This leads to ad-hoc decisions that cannot be questioned, consistently reproduced by others, or audited. The major difference from level 1 is that it is no longer the end user who is held responsible, but consistency is still missing.

3. Creating a fantasy world of wishful thinking

The organization has produced an impressive and consistent policy framework but never checks whether it has anything to do with reality, usually because it simply cannot be audited. For example, it is impossible to audit a policy like "backups must be performed in a timely manner", because "timely" could mean anything between five minutes and five years.

4. The real deal

The organization uses a consistent policy framework that can be audited and is audited. For example, "timely" is specified as "every week". Non-conformity is recorded and prompts action, not necessarily the decapitation of those who didn't follow policy, but perhaps the re-phrasing of policies that turned out to be impractical.
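The difference between an unauditable and an auditable policy can be made concrete in code. The following sketch is a hypothetical illustration (the function names, data layout, and seven-day threshold are assumptions for this example, not part of any actual RIPE artifact): once "timely" is pinned down as "every week", conformity can be checked by a machine rather than left to judgment, and non-conformities can be recorded automatically.

```python
import datetime

# Hypothetical policy: "backups must be performed every week".
# Because the interval is specified exactly, conformity can be
# checked automatically instead of being left to judgment.
POLICY_MAX_AGE = datetime.timedelta(days=7)

def audit_backups(last_backup_times, now=None):
    """Return a list of non-conformity records for systems whose
    most recent backup is older than the policy allows."""
    now = now or datetime.datetime.now()
    findings = []
    for system, last_backup in last_backup_times.items():
        age = now - last_backup
        if age > POLICY_MAX_AGE:
            findings.append({
                "system": system,
                "last_backup": last_backup.isoformat(),
                "days_overdue": age.days - POLICY_MAX_AGE.days,
            })
    return findings

# Example audit run with fabricated timestamps:
now = datetime.datetime(2015, 3, 1)
status = {
    "historian": datetime.datetime(2015, 2, 27),    # days old: conforms
    "eng-station": datetime.datetime(2015, 1, 15),  # weeks old: violation
}
for finding in audit_backups(status, now=now):
    print(finding["system"], "overdue by", finding["days_overdue"], "days")
```

An unauditable "in a timely manner" policy offers nothing for such a check to test against; that is the entire difference between level 3 and level 4.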

5. Sustainable governance

The organization uses a consistent policy framework that can be audited and is audited, and audits as well as user feedback are largely automated. Security automation is the key challenge for sustainable governance. Is that a technical problem? Absolutely not. The next time you pay your parking meter, order a pizza, or hail a cab over the Internet using an app, think about the absurdity that people in real production environments, including contractors, are expected to approach far more important cyber security issues by leafing through a folder full of boring policy printouts, or by trying to locate the respective documents in a labyrinth of files. Rather than talking so much about the Industrial Internet of Things, we suggest contemplating the Industrial Intranet of OT Governance, if only because without solid governance, the Industrial Internet is doomed from the beginning.

Guess which level of governance we are implementing with the RIPE OT Security and Robustness Program.




Harmonizing ICS Security and Compliance

During the SANS ICS Security Summit 2015 last week in Orlando, Mike Assante moderated a panel titled Harmonizing ICS Security and Compliance. I shared the stage with Matt Davis from Ernst & Young and Josh Sandler from Duke Energy. Based on comments from my colleagues on the panel and questions from the audience, there was general agreement that security should transcend compliance. In other words, the goal of any organization should be a security regime that integrates people, process, and technology in such a way that compliance is not the driver. Obviously, compliance does not equal security, as any practitioner will tell you. Furthermore, having been on the regulator's side of the table, I can attest that regulations are intended to be the minimum of what must be done rather than the maximum. This mode of thinking puts the onus squarely on the asset owner, which is exactly where it should be. The regulator should be there to assist, enable, and at times validate, but ultimate responsibility remains with the asset owner.

Thanks to Mike for pulling this informative panel together and asking the key final question: So, what are asset owners to do? My answer was direct and to the point: call me. The Langner Group's Robust ICS Planning and Evaluation (RIPE) program can help asset owners implement just such a regime, one that integrates people, process, and technology into a sustainable and measurable security posture. RIPE is used to demonstrate compliance with the most stringent regulations in the nuclear industry at one end of the spectrum, and to raise ICS security posture step by step, adjusting to available budget, for companies with no need to comply with regulation at the other end. It does well in these different scenarios and will also do well for your company.

Perry Pederson




Regulating Nuclear Cyber Security: The Core Issues

If there is anything such as "critical infrastructure" where a cyber attack must be prevented by all means, it is certainly the international fleet of nuclear power plants and the associated facilities for the production, processing, and storage of nuclear material. Potential cyber attacks against these facilities raise concerns not about the confidentiality, integrity, and availability of information, but about public health and national security. While the majority of nuclear power plants still use analog safety systems that cannot be compromised by even the most sophisticated digital code, these analog systems are simply no longer available for purchase. Therefore, renewal projects for the instrumentation and control of nuclear plants, and certainly new reactors, use digital devices for even the most sensitive systems and processes. Critical risk or acceptable? Well, that's what governments around the world need to figure out.

In the US, cyber security for nuclear power plants got its start as an industry best practice. Subsequent to the attacks of 9/11, many aspects of U.S. security were bolstered, and nuclear power plants were considered among the most critical of critical infrastructure assets and therefore in need of additional security. Industry efforts were noteworthy and significant progress was made. However, the U.S. Nuclear Regulatory Commission (NRC) determined that industry efforts were insufficient and published a new cyber security rule in 2009 and cyber security guidance in 2010.

Read more »



The Langner Group at S4x15 Week in Miami

At S4x15 The Langner Group will contribute the following talks:

A Process-Based Approach to ICS Security: The RIPE Program in Real Life

Traditional strategies toward ICS security have focused on specific technologies such as data diodes or whitelisting, or on high-level guidance for risk management; examples of the latter are the NIST Cyber Security Framework and ISA-99. The RIPE Program takes a different avenue and focuses on the practical how-to of cyber security governance on the plant floor. It can be thought of as a fast lane to NIST CSF implementation that comes with the added benefits of metrics and scalability.

In this presentation, Ralph Langner will provide a brief introduction to the concepts and instruments of RIPE. The bulk of the talk will be delivered by Tomas Nystrom, who is in charge of cyber security for the several hundred power plants owned and operated by Nordic energy giant Fortum. Tomas will provide real-world experience of the process of introducing RIPE to an operational nuclear power plant, making the plant both secure against cyber attacks and compliant with national regulation.

Cyber-Physical Attack Engineering

“Visible through the various cyber-physical exploits [used in Stuxnet] is the silhouette of a methodology for attack engineering that can be taught in school and can ultimately be implemented in algorithms. (..) Attack engineering is about reliably taking over control in order to exploit physical vulnerabilities. The way to get there is completely different from IT.”

Such is written in To Kill a Centrifuge. In his talk, Ralph elaborates on the fundamentals of cyber-physical attack engineering as a discipline that must be understood and mastered in order to identify and protect against the worst attack scenarios that sophisticated attackers could pull off against high-value targets. At the same time it also helps to understand where defensive “best practices” are completely worthless.

The subject of the talk, which is also reflected and applied in The Langner Group's Critical Penetration Analysis, calls for a distinct re-orientation of cyber-physical security. Ralph points out that a pure infosec methodology, spiced up with hacking wisdom, stops short of providing meaningful results for risk mitigation, as it does not link vulnerability with the potential consequences of exploitation: where a hacker alleges that by "owning" a SCADA or ICS component he could "do anything", reality is usually quite different. Nevertheless, deterministic routes to disaster may be inherent in a plant design yet understood by neither asset owners nor pen testers.

Cyber-physical attack engineering should be seen in line with like-minded efforts such as the groundbreaking work in nuclear security by Gary Johnson (formerly with IAEA) and recent presentations by Bryan Singer (Kenexis), who will pick up on the subject at S4x15.



Lack of Cyber Imagination

The 9/11 Commission noted several failures that led to that fateful day, but emphasized that "The most important failure was one of imagination." Thirteen years on, one could argue that this lesson has still not permeated every corner of our security apparatus; in the cyber domain, it has barely scratched the surface. As a country with some of the most automated (i.e., vulnerable) critical infrastructure in the world, we seem to collectively lack the imagination necessary to mount an effective defense. An effective defense is driven by imagination, backed by solid system analysis. If all an asset owner can imagine is waiting for government alerts or scouring vulnerability databases, their defenses are likely to fail when tested, if their systems are not already owned. Read more »



Follow-up: Surviving on a Diet of Poisoned Fruit

The Langner Group attended a presentation and group discussion with Dr. Richard Danzig, former Secretary of the Navy, hosted by a leading think tank. Dr. Danzig's presentation focused on the eight recommendations outlined in his recently published report "Surviving on a Diet of Poisoned Fruit: Reducing the National Security Risks of America's Cyber Dependencies". Therein, Dr. Danzig makes a compelling case that mastering this new thing called cyberspace is the way forward: "The United States will make its peace with the new technologies by understanding them and finding ways to limit their potentially pernicious and especially their potentially disastrous effects." His paper brings some sorely needed critical thinking and open debate to a pervasive problem that we as a nation must address. We at The Langner Group are aligned with the vision put forth by Dr. Danzig and sincerely hope that decision makers at all levels of the public and private sectors take the time to read and ponder this report.




Who’s Smarter, Hackers or Defenders?

I am sometimes befuddled at just how much press (negative and otherwise) hackers receive. Truth be told, perhaps my befuddlement contains just a twinge of jealousy (okay, maybe more than a twinge). Although hackers may not have attained the status of rock stars yet, I can imagine throngs of hacker groupies hanging around just outside the back door waiting to pounce on the clueless geeks as they emerge from an all-night hacking session.

Sure, I’ve done some white-hat hacking. I have had training through a university, SANS, and multiple visits to the Idaho National Laboratory. I’ve used many of the tools that are commonly available. What I have gathered in all this time is that hacking is easy. Let me explain. It’s easy in the sense that the hacker typically is not creating vulnerabilities (that can only be done by system designers and software developers), but taking advantage of vulnerabilities already present. The adoration heaped on hackers strikes me as akin to admiring a clever thief for finding the back door unlocked. Granted, the thief may have scoured hundreds of neighborhoods with an advanced algorithm searching for that one unlocked door, but that is hardly a remarkable, let alone admirable, feat. Read more »



Beyond AIC: Tom Clancy’s take on cyber-physical attacks

Too often, discussions of cyber-physical attack scenarios and how to prevent them focus on the idea that a cyber attacker could disrupt or freeze process control, thereby causing downtime. This thinking aligns with the common misconception that cyber-physical security is just another form of information security, the major difference being that the basic protective priorities of confidentiality, integrity, availability (CIA) merely need to be reordered to availability, integrity, confidentiality (AIC), and, bingo, we can secure process control using otherwise identical concepts, products, and procedures from infosec.

The misconception stems from framing the problem within the conceptual space of information security, ignoring the physical side of process control, which shouldn't surprise anyone when infosec people are invited to lead the discussion. However, cybernetics is not the same as IT, and the availability of digital components (in infosec terms) is not necessarily the highest priority of cyber-physical defense.

Interestingly, fiction writer Tom Clancy grasped this intuitively when writing his thriller “Threat Vector”, in which a Chinese state-owned hacker organization (the “Red Hacker Alliance”) cyber-attacks US critical infrastructure. In the following quote, the villain “Tong” is a Chinese super-hacker who may have been modeled on characters like Ugly Gorilla:

“During a public dispute between China’s state-owned petroleum organization and an American oil company over a pipeline contract in Brazil, Tong came before the leadership of the MSS [Ministry of State Security] and asked them, quite simply, if they would like his Red Hacker Alliance to destroy the oil company. He was asked by the ministers if he intended to destroy the American oil company’s dominance in the marketplace.

‘That is not what I mean. I mean, physically ruin them.’ – ‘Shut their computers down?’ (…) ‘Of course not. We need their computers. We have obtained command-level control of their pipelines and oil-drilling capacity. We have kinetic capabilities at their locations. We can cause actual real-world destruction.’”

(Tom Clancy, Threat Vector)

Malicious process control requires fully-functional control systems, making the digital disruption of SCADA and PLCs look like a foolish beginner’s mistake. A cyber-physical attack is not an attack against a control system, but an attack against the physical equipment or process that it controls. It is therefore a dangerous oversimplification to identify cyber-physical defense with ICS security or, even worse, SCADA security.

What we really have to be concerned about is digital process control security, which cannot be assured without understanding the physical process and equipment and their specific analog vulnerabilities, and which may even involve analog components such as last-line-of-defense analog safeguards for high-value targets. That’s the major reason why we include process and equipment engineering principles in Critical Penetration Analysis.



IT vs. ICS: An Attacker’s Perspective

There are extensive treatments of the similarities and differences between information technology (IT) systems and industrial control systems (ICS), but these differences are more than just academic concerns. Many IT hacks reported in the media seem to be opportunistic in the sense that the hackerverse exerts constant pressure on IT systems, searching for targets of opportunity or weak links in the defense. In contrast, deploying an effective cyber weapon for the sabotage of critical infrastructure is not simply a matter of finding the latest vulnerability in an OPC server or a hard-coded password in a PLC.

For the purpose of discussion, a cyber weapon is a software artifact designed to cause physical harm to objects, people, or the environment. Turning machines into weapons is not a new idea, and the notion has been made explicit by entities such as the Chinese PLA: “The new concept of weapons will cause ordinary people and military men alike to be greatly astonished at the fact that commonplace things that are close to them can also become weapons with which to engage in war.” Read more »
