Securing the ICS: Measure solution effectiveness, maturity

Many articles focused on cybersecurity stress how important it is to secure an industrial control system (ICS) and share ways to implement this security. This article assumes that advice has already been taken: the ICS has been secured and a security plan has been implemented.

Now what? How do people determine how well a security plan has been implemented? What is the difference between a system that meets a security rule in a minimalist fashion and one that performs in a more mature manner?

To help answer these questions, the J. M. Huber Corp. began the process of building a formal corporate ICS security program in 2016. It invited MAVERICK Technologies, a Rockwell Automation Company, to join the project in 2017. MAVERICK and Huber have been working together for the last four years to develop the Industrial Control System Maturity Assessment Program (ICSMAP). The ICSMAP is a custom program derived from principles contained in the ISA/IEC 62443 series of standards. Elements of this article are based on previous publications by Drew Franklin (J. M. Huber) and the author. ICSMAP will be discussed here as one way to evaluate the effectiveness and maturity of an ICS security program.

Evaluating ICS security effectiveness

To evaluate the effectiveness of an ICS security program, facilities must start with a specific set of security controls. These controls may be the result of a thorough, risk-based evaluation of the process and its vulnerabilities, such as the process defined in the ISA/IEC 62443 series, or they may simply be a list downloaded from the internet. Whatever the source, the desired controls must be documented. In most cases, there will be a primary set of top-level controls and a second set of sub-controls that goes into greater detail. Table 1 shows the ICSMAP controls and sub-controls around redundancy, backup and recovery.
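As a minimal sketch only, a documented control set can be thought of as top-level controls that each own a list of sub-controls. The "Redundancy, backup and recovery" control and the "Backup methodology" sub-control appear in the tables in this article; the other names below are hypothetical placeholders, not the actual ICSMAP catalog.

```python
# Illustrative only: a documented control catalog as top-level controls,
# each owning a list of more detailed sub-controls. Apart from the entries
# taken from Tables 1 and 2, the names are placeholders.
controls = {
    "Redundancy, backup and recovery": [
        "Backup methodology",
        "Backup storage and offsite handling",   # hypothetical
        "Recovery testing",                      # hypothetical
    ],
    "Access control": [                          # hypothetical
        "Account management",
        "Remote access",
    ],
}

for control, sub_controls in controls.items():
    print(control)
    for sub in sub_controls:
        print(f"  - {sub}")
```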

Table 1: Redundancy, backup and recovery. Courtesy: Maverick Technologies

Effectiveness is assessed through inspection, observation and inquiry. The assessment can be performed via a formal audit program, an informal self-assessment or something in between. Each control receives a score from one to five describing the extent to which the control was achieved. A scale with an odd number of values allows for a midpoint or neutral value. Having only three possible scores does not allow enough variability, and having more than five quickly reaches the point where the differences between scores become arbitrary. The ICSMAP also allows a score of zero, indicating the control is not applicable. The ICSMAP sub-objective for backup methodology is shown in Table 2, along with its test procedure and possible effectiveness scores.

Once an effectiveness review is complete, there will be a score identified for each control or sub-control. Where a larger control is broken into a set of sub-controls, effectiveness should be measured and reported at the sub-control level to provide the best possible visibility into the operation of that control. The set of control scores can easily be color coded to quickly identify areas of high and low effectiveness (see Table 3).
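As a simple sketch of that color coding, effectiveness scores can be bucketed into bands for a quick visual summary. The thresholds below are assumptions for illustration, not ICSMAP-defined bands.

```python
# Bucket 1-5 effectiveness scores (0 = not applicable) into colors for a
# quick visual summary. The thresholds are illustrative assumptions.
def effectiveness_color(score: int) -> str:
    if score == 0:
        return "gray (not applicable)"
    if score <= 2:
        return "red (low effectiveness)"
    if score == 3:
        return "yellow (moderate effectiveness)"
    return "green (high effectiveness)"

# Example sub-control scores; the second entry is hypothetical.
sub_control_scores = {"Backup methodology": 4, "Recovery testing": 2}
for name, score in sub_control_scores.items():
    print(f"{name}: {score} -> {effectiveness_color(score)}")
```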

Table 2: ICSMAP sub-objective for backup methodology. Courtesy: Maverick Technologies

Defining maturity

There are often many ways to implement a security control, and people need to be able to distinguish between the quality of various methods. Methods are not inherently good or bad, but variations between immature methods and mature methods can be identified.

An immature system may be labor-intensive, prone to error, difficult to analyze and dependent on people rather than processes. A mature system may be automatic, resilient and self-documenting, and may leverage technology and processes.

A low-maturity program for backing up process controllers might be completely manual. Based on a calendar schedule, a technician carries a laptop to each controller, backs up the programs to a local hard drive, backs up the hard drive to tape and carries tapes offsite in a briefcase. This procedure has multiple single points of failure, is prone to human error, may have no action log and may miss critical changes between scheduled backups.

Table 3: Color-coded identification scores. Courtesy: Maverick Technologies

A high-maturity program for the same system might be completely automatic. A program monitor could continuously evaluate the state of the process controllers and trigger automatic backups whenever a change is detected. It could pull backups over the network to secure shared storage and log all of its actions. The shared storage could be backed up offsite using validated commercial solutions. Each element of the system, from networks to servers to commercial services, could be built to be redundant and resilient.

Evaluating maturity

One portion of the maturity evaluation process is developing the judged maturity score (JMS) for each security control area. Because maturity assessment is more evaluative and less prescriptive than the effectiveness review, maturity scores are often developed at the control level rather than at the sub-control level. Each security control's maturity is evaluated in five areas, which are common to all controls:

  • Documentation
  • Efficiency
  • Resiliency
  • Monitoring
  • Use of available technology.

Each area requires a specific list of descriptive scoring criteria. To develop the overall JMS for a control, the five component scores are averaged into a single JMS, as sketched below. The ICSMAP score descriptions for the efficiency category are shown in Table 4.
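A minimal sketch of that calculation, assuming the five area scores have already been judged on the one-to-five scale (the values shown are illustrative, not real assessment results):

```python
# Judged maturity score (JMS) for one control: the average of the five
# component area scores. The example values are illustrative.
area_scores = {
    "Documentation": 3,
    "Efficiency": 2,
    "Resiliency": 4,
    "Monitoring": 3,
    "Use of available technology": 3,
}

jms = sum(area_scores.values()) / len(area_scores)
print(f"JMS = {jms:.1f}")  # 3.0 with the example values above
```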

Table 4: Score descriptions in the efficiency category. Courtesy: Maverick Technologies

At some time prior to evaluating the maturity of the ICS security controls, the enterprise should conduct an exercise to determine a set of required maturity scores (RMS) for the system. This exercise is a management activity, but industrial controls professionals may be called on to assist. Management should incorporate the risk profile of the process and the risk appetite of the business into the overall decision on how mature specific systems are required to be.

For example, a control may be judged to be critical due to the potential negative effects of failure. The business unit may assign that control an RMS of five because it needs to perform at the highest level possible with current technology. A separate control may be assigned an RMS of two because older, more manual and less mature systems are considered acceptable to the business unit.

A full maturity evaluation is possible once JMS and RMS values have been identified for each control objective and the differential between the two has been determined by subtracting the RMS from the JMS. This differential is referred to as the differential maturity score (DMS). A DMS of zero indicates the JMS, representing the in-place systems, exactly matches the RMS, representing the required state. A positive DMS indicates the in-place systems are more mature than what the business requires. A negative DMS indicates the in-place systems do not meet the requirements. Because JMS and RMS each range from one to five, the DMS ranges from negative four to positive four.
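Continuing the sketch, the DMS follows directly from the definition above (DMS = JMS − RMS); the interpretation strings paraphrase the article, and the example values are illustrative:

```python
# Differential maturity score: DMS = JMS - RMS. With JMS and RMS each on a
# one-to-five scale, DMS ranges from -4 to +4.
def dms(jms: float, rms: float) -> float:
    return jms - rms

def interpret(d: float) -> str:
    if d > 0:
        return "in-place systems are more mature than the business requires"
    if d < 0:
        return "in-place systems do not meet the required maturity"
    return "in-place systems exactly match the required maturity"

jms, rms = 3.0, 4.0  # illustrative values
d = dms(jms, rms)
print(f"DMS = {d:+.1f}: {interpret(d)}")  # DMS = -1.0: ...do not meet...
```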

Business intelligence

With the analyses complete, the effectiveness and maturity results can be combined to glean business intelligence from the base data. Effectiveness and maturity are complementary, but separate, metrics designed to measure how well ICS security systems meet business needs. Effectiveness describes how well the programs are followed and implemented. Maturity describes the sophistication and resilience of the programs in place on an ICS. Combining the two gives actionable information about how ICS programs are performing.

Take a particular control and plot the effectiveness score on the Y-axis of a graph and the DMS on the X-axis of the graph. By seeing which quadrant the result falls into, people can make general statements about the systems under consideration (see Figure 1).

Figure 1: To determine effectiveness and maturity, take a particular control and plot the effectiveness score on the Y-axis of a graph and the DMS on the X-axis of the graph. By seeing which quadrant the result falls into, people can quickly make some general statements about the systems under consideration. Courtesy: Maverick Technologies

Starting at the top left, we see high effectiveness with a negative DMS. This is where capable experts might be running manual systems: their logbooks are filed in pristine binders and contain meticulous notes, but there is room for investment in efficiency and automation. The top right is where a company may have over-invested in the latest and greatest; the installation is more mature than it needs to be and the business value of the investment is low. The bottom right quadrant is where there may be a false sense of security: advanced systems have been built, but they are not being used effectively. High-maturity systems may be ignored, underused or misunderstood because operators are not fully trained. The bottom left is the critical quadrant, and scores here suggest immediate intervention is warranted.
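A rough sketch of the Figure 1 quadrant logic follows. Treating three (the scale midpoint) as the boundary between low and high effectiveness, and a DMS of zero or greater as "mature enough," are assumptions for illustration; the article does not specify where the dividing lines sit.

```python
# Classify a control by the Figure 1 quadrants: effectiveness on the Y-axis,
# DMS on the X-axis. The thresholds below are illustrative assumptions.
def quadrant(effectiveness: float, dms: float) -> str:
    high_eff = effectiveness > 3
    mature_enough = dms >= 0
    if high_eff and not mature_enough:
        return "top left: well run, but room to invest in efficiency and automation"
    if high_eff and mature_enough:
        return "top right: possibly over-invested; more mature than required"
    if not high_eff and mature_enough:
        return "bottom right: capable systems underused; false sense of security"
    return "bottom left: critical; immediate intervention warranted"

print(quadrant(4.5, -2))  # top left
print(quadrant(2.0, -3))  # bottom left
```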

A formal, repeatable process

Building an ICS security program within an organization is a major undertaking, but the job is not complete once the program is built and operating. It needs to be evaluated periodically to ensure it is still performing as designed and remains synchronized with business needs. The evaluation process needs to be formalized and repeatable. Even the evaluation process itself should be subject to review and adjusted as business and technical circumstances change. By evaluating the ICS security program's effectiveness and maturity, organizations can have confidence that their security programs are performing as designed, risks are appropriately mitigated and value is gained from the security investment.
