Wednesday, August 31, 2016

CFATS Security Model

One of the many complaints about attempting to regulate cybersecurity in critical infrastructure is that any attempt to mandate security procedures will, because of the rapid pace of change in cybersecurity, result in outdated standards being applied as malware continues to evolve, making the regulations counterproductive. This has led many (myself included) to recommend that any cybersecurity regulatory scheme be focused on outcomes rather than specifying security measures. There is currently only one major security regulatory program based upon this concept, the Chemical Facility Anti-Terrorism Standards (CFATS) program. Thus an analysis of the successes and problems of the CFATS program would be important for anyone considering an extension of that regulatory scheme into the cybersecurity realm.

CFATS Background


The CFATS program was established by Congress as an add-on to the 2007 DHS spending bill (PL 109-295). Section 550 of that bill required DHS to establish “risk-based performance standards for security of chemical facilities and requiring vulnerability assessments and the development and implementation of site security plans for chemical facilities”.

A key provision of that section was that DHS was prohibited from disapproving “a site security plan submitted under this section based on the presence or absence of a particular security measure”. This provision resulted in DHS developing its Risk-Based Performance Standards (RBPS) guidance document. This document provided expected outcomes that DHS would use to evaluate the eighteen performance standards (outlined in 6 CFR 27.230) that would have to be addressed in a covered facility’s site security plan.

RBPS Metrics


The guidance document provided a brief overview of each of the performance standards, including a discussion of the factors that might have to be considered in selecting security measures and a brief outline of some of the types of security measures that could be employed. At the end of each performance standard discussion was a list of the metrics that the DHS Infrastructure Security Compliance Division (ISCD) would use to evaluate a site security plan’s compliance with that RBPS.

For example, the metrics for RBPS 8, Cybersecurity, included:

• Cybersecurity policies;
• Access control;
• Personnel security;
• Awareness and training;
• Disaster recovery and business continuity;
• System development and acquisition;
• Configuration management; and
• Audits.

Risk Assessment


Most of these metrics contained sub-metrics and a list of performance standards for each based upon the tier ranking of the facility. The tier ranking was the result of a risk assessment conducted by ISCD based upon data provided by the facility in a two-part risk assessment process. The first part was based upon the data provided in the facility’s Top Screen submission. All non-exempted facilities in the United States with chemical inventories containing one or more of a list of 300+ DHS chemicals of interest (COI) at or above the screening threshold quantity (STQ) set for that COI were required to submit an on-line Top Screen.
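
As a rough illustration of that applicability rule, here is a minimal sketch in Python that checks a facility’s inventory against the STQ for each COI it holds; the chemical names and threshold quantities are made-up placeholders, not values from the actual DHS list of COI.

# Toy sketch of the Top Screen applicability rule: a facility must file a
# Top Screen if it holds any chemical of interest (COI) at or above that
# chemical's screening threshold quantity (STQ). The COI names and STQs
# below are made-up placeholders, not values from the real COI list.
STQ_POUNDS = {
    "placeholder COI A": 500,
    "placeholder COI B": 10_000,
}

def top_screen_required(inventory_pounds: dict) -> bool:
    """Return True if any COI on hand meets or exceeds its STQ."""
    return any(
        inventory_pounds.get(coi, 0) >= stq
        for coi, stq in STQ_POUNDS.items()
    )

# 600 lb of "placeholder COI A" exceeds its 500 lb placeholder STQ -> must file.
print(top_screen_required({"placeholder COI A": 600}))  # True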

ISCD took the information provided in the initial 40,000+ Top Screens to determine which facilities appeared to present a high risk of terrorist attack. Those 7,000+ high-risk facilities were then directed to submit additional information via the on-line Security Vulnerability Assessment (SVA) tool. That information was then used to confirm the high-risk assessment and to further rank the risk of those facilities by placing them into one of four tiers, with Tier 1 being the highest-risk tier.
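
The two-stage winnowing described above can be sketched as a simple pipeline. Note that ISCD’s actual risk methodology is not public, so the scoring inputs and tier cut-offs below are purely hypothetical placeholders.

# Toy sketch of the two-part CFATS risk assessment flow: Top Screen data is
# used to flag apparently high-risk facilities, which are then directed to
# submit an SVA that confirms the assessment and places them in Tier 1
# (highest risk) through Tier 4. The scores and cut-offs are hypothetical;
# ISCD's actual risk methodology is not public.

def looks_high_risk(top_screen_score, cutoff=75.0):
    """Stage 1: preliminary high-risk call based on Top Screen data."""
    return top_screen_score >= cutoff

def assign_tier(sva_score):
    """Stage 2: place an SVA-confirmed high-risk facility into Tier 1-4.

    Returns None if the SVA does not confirm the preliminary assessment.
    """
    # Hypothetical cut-offs; Tier 1 is the highest-risk tier.
    for tier, floor in ((1, 95), (2, 90), (3, 85), (4, 75)):
        if sva_score >= floor:
            return tier
    return None

if looks_high_risk(88.0):        # flagged as apparently high-risk
    print(assign_tier(92.0))     # SVA confirms the assessment -> Tier 2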

Site Security Plan Negotiations


Once a facility receives its tier ranking notification from ISCD, it is then required to prepare and submit its site security plan (SSP) via another on-line tool. Since facilities have an expected inclination to minimize their spending on security (an expense with no expected financial return), and the guidance document is deliberately vague as to what security measures are required, it is unlikely (ISCD has published no statistics on this) that any facility submitted an initial SSP that met the RBPS metrics in all aspects according to ISCD evaluators.

During the early days of the program, ISCD took the stance that the Congressional prohibition against specifying security measures meant that it could not tell facilities how to modify their SSP to ensure that it met all of the metrics. The most it could do was tell them which metrics had not been met. As the program advanced and Chemical Security Inspectors (CSI) gained experience with facilities having their SSPs authorized and later approved, many of the CSI were able to tell facility security managers what measures had been approved by ISCD at other facilities in similar situations.

As a practical matter, this has meant that the process of getting an SSP approved has been a series of negotiations between the facility and DHS. The facility proposes an SSP and ISCD tells it where the plan is deficient. The facility then modifies the SSP and it is re-evaluated by ISCD. Some number of iterations of this process is required until the facility and ISCD come to an agreement on what security measures are necessary for that facility. Those security measures then become the enforceable CFATS requirements for that facility.

Manpower Intensive


While the on-line submission of the SSP allows for some automated analysis, a large number of man-hours is still needed to evaluate each submission. Additionally, ISCD has been adamant that its CSI would be maximally available to facilities during the SSP approval process to help guide them through it.

These manpower requirements were partially responsible for the lengthy delays that ISCD experienced during the early approval process. As the CSI became more experienced with the program, lower-risk facilities came up for evaluation, and ISCD instituted various management process improvements, the approval process sped up significantly. But even with these improvements, the SSP approval process remains time-intensive.

Lessons Learned


For anyone looking to the CFATS program as a model for creating a security-related regulatory program that is both enforceable and does not specify particular security measures in the regulations, there are some very specific lessons to be drawn from its history. The first and foremost is that a regulatory agency can negotiate facility-specific security measures as long as:

• A strong, well-written set of performance standards is used as the basis of negotiation;
• There is a commitment on the part of both the regulated community and the regulators to work together to ensure the security of the regulated community; and
• The regulatory agency, and their political overseers, are willing to allow for a reasonable time frame for the negotiation process to proceed.

The second lesson to be taken from the CFATS process is that, for this process to be successful, a relatively large and active inspection force is required to give the regulatory reviewers an accurate view of the on-the-ground situation at each regulated facility. The CFATS program has shown that multiple site visits by the inspection force may be needed to properly understand both the security plan and the security capabilities of a regulated facility.


Finally, there must be a determined effort to limit the covered community so that the inspection force can be kept to a size supportable within the budget of the regulatory agency. CFATS did this by restricting the universe of potentially covered facilities to those holding one of a limited list of chemicals that would drive the initial data submission. ISCD then further reduced that number by conducting a risk analysis to isolate just the highest-risk facilities. Finally, it provided a means whereby a facility could ‘opt out’ of the CFATS program by reducing or eliminating its inventories of COI. The initial universe of about 47,000 facilities was reduced to about 7,000 by risk analysis, and the opt-out process has since brought the number down to 2,984 covered facilities.
