Tuesday, July 25, 2017

House Passes HR 3364 – Cyber Sanctions

Today the House passed HR 3364, the Countering America’s Adversaries Through Sanctions Act by a nearly unanimous vote of 419 to 3. The bill includes provisions requiring the President to enforce various economic sanctions against anyone in Russia whom the President determines “knowingly engages in significant activities undermining cybersecurity against any person, including a democratic institution, or government on behalf of the Government of the Russian Federation” {§224(a)(1)(A)}.


I’ll be doing a more thorough review of this bill in the near future.

ICS-CERT Published an Alert, an Advisory and 4 Updates

Today the DHS ICS-CERT published a control system security alert for the CRASHOVERRIDE malware and a control system security advisory for products from NXP. The NXP advisory was previously published on the NCCIC Portal on June 1st, 2017. ICS-CERT also updated four previously issued control system advisories for products from Siemens (3) and GE.

CRASHOVERRIDE Alert


This alert briefly describes the CRASHOVERRIDE malware. The malware was previously identified by ESET (on June 12th), Dragos (on June 12th) and US-CERT (on June 12th), all of which ICS-CERT fully credits. All three reports provide much more information than does the ICS-CERT Alert. ICS-CERT has provided a different set of YARA rules for the detection of the malware than those previously published by Dragos; the ICS-CERT rules appear to target different portions of the malware.
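For those who want to put the published rules to work, either rule set can be run against suspect files with standard YARA tooling. A minimal sketch using the yara-python package (the rule file name and scan directory below are placeholders for illustration, not part of the alert):

```python
# Minimal sketch: scanning a directory tree with published YARA rules using
# the yara-python package. "crashoverride.yar" and the scan path are assumed
# local names, not files distributed with the alert.
import os
import yara

# Compile the rule file saved from the alert (or the Dragos report).
rules = yara.compile(filepath="crashoverride.yar")

def scan_directory(root):
    """Walk a directory tree and report any files matching the compiled rules."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                matches = rules.match(path)
            except yara.Error:
                continue  # unreadable file; skip it
            if matches:
                print(path, [m.rule for m in matches])

scan_directory(r"C:\suspect_samples")
```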

NXP Advisory


This advisory describes two vulnerabilities in the NXP i.MX Devices, used on logic boards. The vulnerabilities were reported by Quarkslab. These are hardware vulnerabilities that generally cannot be corrected by a software fix. ICS-CERT notes that the vulnerabilities “are only exploitable when the device is placed in security enabled mode”.

The two reported vulnerabilities are:

• Stack-based buffer overflow - CVE-2017-7936; and
• Improper certificate validation - CVE-2017-7932

ICS-CERT reports that a successful attack (by an uncharacterized attacker with uncharacterized access) could exploit the vulnerabilities to cause a denial of service or to load an unauthorized image on the device, affecting secure boot.

NOTE: These are not stand-alone devices; they are chip sets found on circuit boards in unnamed devices from unnamed suppliers. Hopefully one (or more) of those downstream suppliers will develop a successful mitigation for this problem on their devices. But, it has been almost two months since notification was made to those vendors….

S7-300 Update


This update provides new information on an advisory that was originally published on December 13th, 2016 and then updated on May 9th, 2017. The update provides a link to a firmware update for the S7-410 CPUs.

GE Update


This update provides new information on an advisory that was originally published on April 27th, 2017, and updated on May 18th, 2017. The new update identifies 8 legacy products that are affected by the vulnerability. It also provides links to previously identified firmware versions and newly mitigated products, including the newly identified legacy products. The firmware update for the URplus platform is still expected to be released this month.

PROFINET 1 update


This update provides new information on an advisory that was originally published on May 9th, 2017 and updated on June 15th, 2017, on June 20th, 2017, and again on July 6th, 2017. The update provides updated version information and mitigation information for the SINEMA Server: All versions < V14.


PROFINET 2 update


This update provides new information on an advisory that was originally published on May 9th, 2017 and updated on June 15th, 2017. The update provides new affected version information and mitigation links for the following products (a rough version-check sketch follows the list):

• SCALANCE XM400, XR500: All versions prior to V6.1;
• S7-400 PN/DP V6 Incl. F: All versions;
• S7-400-H V6: All versions prior to V6.0.7;
• S7-400 PN/DP V7 Incl. F: All versions;
• S7-410: All versions prior to V8.2;
• SINAMICS S110 w. PN: All versions prior to V4.4 SP3 HF5;
• SINAMICS S120 V4.7: All versions prior to V4.7 H27; and
• SINAMICS V90 w. PN: All versions prior to V1.1
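For asset owners trying to sort out where they stand against lists like this, the comparison is simply 'installed version vs. fixed-in version'. A minimal sketch (not Siemens tooling) with a made-up inventory; suffixes like 'SP3 HF5' do not compare as plain numbers and need manual review:

```python
# Minimal sketch: flag installed firmware that falls below the "all versions
# prior to ..." thresholds listed above. The inventory dict is a hypothetical
# example, not data from the advisory.
import re

def numeric_parts(version):
    """Return the numeric components of a version string, e.g. 'V6.0.7' -> (6, 0, 7)."""
    return tuple(int(n) for n in re.findall(r"\d+", version))

fixed_in = {
    "SCALANCE XM400": "V6.1",
    "S7-400-H V6": "V6.0.7",
    "S7-410": "V8.2",
}

installed = {"SCALANCE XM400": "V5.2", "S7-410": "V8.2"}  # hypothetical site inventory

for product, version in installed.items():
    if numeric_parts(version) < numeric_parts(fixed_in[product]):
        print(f"{product} {version} is affected; update to {fixed_in[product]} or later")
```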

Private vs Public Vendor Vulnerability Disclosures

Yesterday I had an interesting Twitversation with Michael Toecker (@mtoecker) about vulnerability disclosures for distributed control systems (DCS), a type of industrial control system frequently used in power generation facilities (and a number of chemical manufacturing facilities). Apparently one major DCS vendor (Emerson) does not publicly report their DCS vulnerabilities (via ICS-CERT, for example), but relies upon private disclosure to system owners.

The conversation started when Michael tweeted that “Ovation has ~51% of the generation DCS market”. I had never heard of Ovation (not terribly unusual), so I looked it up on the ICS-CERT vulnerabilities page and could not find any listings of any vulnerabilities. I asked Michael about that and he replied: “They have their own cert for Ovation Users.” The conversation went on from there and is well worth reading apart from this post.

That brings up the whole issue of vendors fixing and then communicating about security vulnerabilities in their software (which is different from the coordinated disclosure debate). I cannot remember discussing this particular issue in any detail before, so now seems like a good time.

Mitigation Decisions


Software vendors have been dealing with fixing program bugs forever. They have developed techniques for identifying problems (including outside ‘help’ from researchers), fixing them and then getting the fixes into the hands of consumers. Some are better at it than others.

For industrial control system owners, the fixing of software problems (lumping in firmware and even some hardware issues here) is a bit more problematic than with a standard desktop software issue. The system needs to be taken off-line for some amount of time, which requires a shutdown of production. The ‘update’ may cause unknown interactions with other control system components that interfere with production. And finally, the owner may not have anyone ‘on staff’ trained to deal with the above issues. So, the decision to apply a software fix is a cost-benefit analysis that frequently results in an ‘if it ain’t broke, don’t fix it’ response.

For security related issues the ‘cost-benefit analysis’ is even more difficult. The cost analysis remains the same, but the benefit side is much harder since it deals with risk analysis. The cost of a potential failure has to be modified by how likely the failure event is to happen. Where no failure history exists (no attacks here), that probability is really difficult to determine.
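The arithmetic behind that benefit estimate is trivial; it is the exploit probability that is nearly impossible to pin down. A minimal sketch with entirely made-up numbers:

```python
# A minimal sketch of the patch/defer cost-benefit arithmetic described above.
# All dollar figures and probabilities are invented assumptions for illustration.
def expected_annual_loss(exploit_probability, consequence_cost):
    """Annualized loss expectancy: likelihood of a successful exploit times its cost."""
    return exploit_probability * consequence_cost

patch_cost = 250_000          # production downtime plus engineering labor for the outage
consequence_cost = 4_000_000  # cleanup, lost production and equipment damage from an exploit

for p in (0.01, 0.05, 0.25):  # candidate estimates of annual exploit probability
    benefit = expected_annual_loss(p, consequence_cost)
    verdict = "patch" if benefit > patch_cost else "defer"
    print(f"p={p:.2f}: expected loss ${benefit:,.0f} vs patch cost ${patch_cost:,.0f} -> {verdict}")
```

The point of the sketch is that the verdict flips entirely on the probability estimate, which is exactly the number the asset owner cannot get without good vulnerability information from the vendor.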

That is especially true if there are no in-house cybersecurity experts to help make the decision. This is where the system owner has to rely on the information provided by the vendor (and/or system integrator) in describing the vulnerability that is being fixed by the most recent update (patch, new version, etc). A detailed description of what could go wrong, what an attacker would need to successfully exploit the vulnerability and other potential mitigation measures that could reduce the risk will greatly assist the asset owner/operator in making a realistic risk analysis.

Vulnerability Reports


In a near perfect world (no vulnerabilities in a ‘perfect world’) a software engineer from the vendor would call up the control system engineer at the user site and have a detailed discussion of the discovered vulnerability, the fix applied in the latest update, the potential interactions with other systems in use and the probability that an attacker could/would use that vulnerability upon that particular user. That is not going to happen for a whole host of obvious and not so obvious reasons.

In a less perfect world, the conversation would be replaced by a detailed written report from the vendor describing the vulnerability in great detail, how it would affect operations, and its interactions with all of the other devices and software with which the product could be expected to interact. It would also include a discussion of the threat environment in which the product exists, with a report on the history of known/suspected exploits and the potential for exploits in a variety of customer environments.

Again, not going to happen. Too much time and expertise would be required to develop such reports, and they would end up disclosing too much proprietary information. And, probably more importantly, they would never actually be read by the owner/operator.

In the real world, what happens is that a brief report (one to two pages) is prepared describing the vulnerability, who it might affect and the potential consequences of a successful exploit. To make the preparation and subsequent analysis of the report easier, a set of standard descriptions is developed and used in a standardized report format. Not as much information is provided, but what is provided is more accessible and more likely to be used.
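In practice that brief, standardized report boils down to a handful of fields. A minimal sketch of such a record (field names loosely patterned on public ICS-CERT advisories; this is not an official schema):

```python
# A minimal sketch of a standardized vulnerability report record. All values
# in the example instance are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class VulnerabilityReport:
    advisory_id: str          # vendor or CERT tracking number
    product: str
    affected_versions: str    # e.g. "All versions prior to V2.3"
    cve_ids: List[str]
    weakness: str             # e.g. "Stack-based buffer overflow"
    cvss_v3_score: float      # standardized severity metric
    attack_vector: str        # local / adjacent / network
    impact: str               # what a successful exploit could do
    mitigations: List[str] = field(default_factory=list)

report = VulnerabilityReport(
    advisory_id="EXAMPLE-2017-001",
    product="Example HMI Server",
    affected_versions="All versions prior to V2.3",
    cve_ids=["CVE-2017-XXXXX"],          # placeholder identifier
    weakness="Improper input validation",
    cvss_v3_score=7.5,
    attack_vector="network",
    impact="Denial of service of the HMI",
    mitigations=["Update to V2.3", "Restrict network access to the HMI server"],
)
```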

Vulnerability Communication


Now, not all vendors have the staff necessary for the development, publication and dissemination of these reports. Instead, they will rely on various computer emergency response teams (CERTs) to provide the communications. A vendor engineer will communicate with a CERT engineer to provide the necessary information and the CERT will write the vulnerability report. Frequently, but certainly not always, the individual who discovered the vulnerability will be involved in providing information to the CERT.

The decision then has to be made as to how the vulnerability report will get into the hands of the owner/operator. Where the vendor/CERT has contact information on all the owner/operators of the affected equipment the report can be directly communicated to them. Where the vendor/CERT does not have that contact information then the only way to get the information to the owner/operator is via public communication of the report.

Public disclosure has a couple of problems associated with it. First it is a public admission by the vendor that a mistake was made in the development of the product; something that the sales department does not generally want to tell potential customers. Second, it substantially increases the number of people that know about the vulnerability, thereby increasing the risk of potential attempts at exploiting the vulnerability.

Typically, the former problem is dealt with by the vendor/CERT first distributing the vulnerability reports privately to those customers with whom they are in contact (generally larger customers), allowing a reasonable amount of time for those customers to remediate their systems, and then making a public disclosure to the remainder of the customer base.

Oh, and that first problem? Sales is told to suck it up. After all, the other vendors in the market place (especially the big ones) are publicly disclosing their vulnerabilities, so it is not really an issue.

Public Disclosure Benefits


So, are there benefits to public disclosure that might suggest that it is a good alternative even when customer contact information is available? Actually, there are a number. First, and personally most important, non-customers get a chance to look at the disclosure reports and provide their two cents worth in the risk evaluation process. Gadflies, like yours truly, get a chance to provide some outside quality control on the vulnerability disclosure process to ensure that owner/operators have as much information as practical about the vulnerabilities.

Second, outsiders to the communication process have some access to the vulnerability information. This includes folks like investors, corporate management and yes, regulatory agencies. These are the folks that have a vested interest in ensuring that the proximate decision makers at the owner/operator are making reasonable decisions in their cost-benefit and risk analysis calculations. If they do not know about the existence of the vulnerabilities, they have no way of asking questions about the implementation of those processes with respect to those vulnerabilities.

And, last but not least, researchers in the field get a chance to see what types of vulnerabilities other researchers are finding (and ethically disclosing) and how vendors are dealing with those vulnerabilities. This provides some incentive for ethical (coordinated, or whatever current term you want to use) disclosure, and it provides for a robust research community with a source of fresh ideas about what types of vulnerabilities they should be searching for.


Needless to say, I am a fan of public disclosure.

Bills Introduced – 07-24-17

Yesterday with both the House and Senate back in session there were 37 bills introduced. Of those, four may be of specific interest to readers of this blog:

HR 3358 Making appropriations for the Departments of Labor, Health and Human Services, and Education, and related agencies for the fiscal year ending September 30, 2018, and for other purposes. Rep. Cole, Tom [R-OK-4]

HR 3359 To amend the Homeland Security Act of 2002 to authorize the Cybersecurity and Infrastructure Security Agency of the Department of Homeland Security, and for other purposes. Rep. McCaul, Michael T. [R-TX-10]

HR 3362 Making appropriations for the Department of State, foreign operations, and related programs for the fiscal year ending September 30, 2018, and for other purposes. Rep. Rogers, Harold [R-KY-5]

HR 3364 To provide congressional review and to counter aggression by the Governments of Iran, the Russian Federation, and North Korea, and for other purposes. Rep. Royce, Edward R. [R-CA-39]

As usual I will be watching the two spending bills for cybersecurity measures (probably none in the case of these two bills, but you never know).

HR 3359 is the homeland security bill that I talked about briefly in my blog yesterday. The official text is not yet available, but I may review it later today based upon the committee draft that will be considered tomorrow by the House Homeland Security Committee.


HR 3364 may be of interest if the bill addresses cyber related aggression or cyber response to aggression by these three countries.

Monday, July 24, 2017

HR 3180 Fails in House – FY 2018 Intel Authorization

Today the House failed to pass HR 3180, the FY 2018 Intel authorization bill. It failed on a vote of 241 to 163 with a 2/3 vote (290 Ayes) being required for passage. It was a nearly party-line vote with 10 Republicans voting Nay and 30 Democrats voting Aye. The House Intelligence Committee will have to go back to the drawing board and try to craft a bill that can garner more support from the Democrats if the leadership continues to rely on passing the bill under suspension of the rules.

The other point to remember, however, is that if the bill could not garner the 2/3 vote required under the suspension of the rules process, then it likely would not be able to make it to the floor of the Senate due to the cloture vote requirement (3/5ths vote).


There were no cybersecurity provisions in this bill that would have been of specific interest to readers of this blog.

Committee Hearings – Week of 7-23-17

This week, with both the House and Senate in session, we start to see action on spending bills in the Senate while House spending bills start to move to the floor of the House. Additionally, there are two cybersecurity hearings scheduled this week: one on insurance and the other a markup hearing.

Spending Bills (Senate Appropriations Committee)


DOD Spending Bill


The House Rules Committee will hold a hearing to formulate the rule for HR 3219 tonight. What was the DOD FY 2018 spending bill is now the Make America Secure Appropriations Act, 2018; a mash-up of four spending bills {HR 3219 (DOD), HR 3162 (Legislative Branch), HR 2998 (Military Construction/VA), and HR 3266 (Energy and Water Development)}.

None of those bills currently have any provisions of specific interest here. The amendment process could certainly change that.

Proposed amendments are supposed to be submitted by later this morning. There were 28 amendments already submitted by 8:00 am EDT. There is only one cyber related amendment (cyber scholarship spending) currently on the list, but that will probably change.

The bill is currently scheduled to come to the floor later this week (Wednesday?).

Cybersecurity Insurance


On Wednesday the House Small Business Committee will be holding a hearing on “Protecting Small Businesses from Cyber Attacks: The Cybersecurity Insurance Option”. The witness list includes:

• Robert Luft, SureFire Innovations;
• Erica Davis, Zurich Insurance;
• Eric Cernak, Munich Re US;
• Daimon Geopfert, Security and Privacy Consulting, Risk Advisory Services

I will be very surprised if control system security issues are even mentioned in passing, but I am certainly open to surprises.

Cybersecurity Markup


The House Homeland Security Committee will be holding a mark-up hearing on Wednesday. Two of the bills may be of specific interest to readers of this blog. The first is HR 3202, Cyber Vulnerability Disclosure Reporting Act, the bill I reviewed yesterday. I certainly hope the Committee adds provisions requiring public posting of the unclassified report.

The second is a new (not yet introduced) bill by Chairman McCaul (R,TX) that would establish the Cybersecurity and Infrastructure Security Agency to replace the current National Protection and Programs Directorate. A committee print of the bill is available and a quick review of the provisions shows that it still relies on the IT-centric definition of ‘cybersecurity risk’ found in 6 USC 148(a). I would really like to see this bill change that definition to one based on the ‘information system’ definition found in 6 USC 1501(9). More on this bill later.

On the Floor of the House


In addition to HR 3219 mentioned above there are two other bills of potential interest currently on the schedule for consideration on the floor of the House. The first is HR 3180, the Intelligence Authorization Act for Fiscal Year 2018. While there are some cyber related provisions in the unclassified portion of the bill, none are of specific interest to readers of this blog. The bill will be considered today under the suspension of the rules, so no amendments will be possible.


The second is an as of yet unintroduced “Russia, Iran, and North Korea Sanctions Act”. It will be considered tomorrow, so it will be introduced today. A very quick review of the committee draft of the bill does show mention of cybersecurity related sanctions. I’ll review those in more detail later. Interestingly, this bill is also being considered under the suspension of the rules provisions, indicating that the leadership thinks this bill will receive sufficient bipartisan support to meet the 2/3 majority vote required for passage.

Sunday, July 23, 2017

HR 3202 Introduced – Cybersecurity Reporting

Earlier this month Rep. Jackson-Lee (D,TX) introduced HR 3202, the Cyber Vulnerability Disclosure Reporting Act. The bill would require a report to Congress on procedures that DHS has developed in regards to vulnerability disclosures.

Section 2 of the bill requires DHS (within 240 days of passage of the bill) to submit a report to Congress that describes “the policies and procedures developed for coordinating cyber vulnerability disclosures, in accordance with section 227(m) of the Homeland Security Act of 2002 (6 U.S.C. 148(m) [Link Added; Note: it is §148(l) at this link, an amendment changing that paragraph to (m) has not yet been published])” {§2(a)}.

Moving Forward


Jackson-Lee is an influential member of the House Homeland Security Committee, the committee to which the bill was assigned for consideration. It is very likely that she has enough influence to have this bill considered in Committee. There is nothing in the bill that would draw the ire of any organization. Since it just requires a very legitimate report to Congress it is likely that this bill would have enough bipartisan support to allow it to be considered under the suspension of the rules procedures in the House. If it were to be considered in the Senate, it would likely be considered under their unanimous consent procedure.

Commentary


Since the bill specifies that the main report will be unclassified (with a potential classified annex) I would have liked to have seen the bill include a provision for DHS to post a copy of the unclassified version of the report to the NCCIC web site. That would allow these policies and procedures to become public knowledge, as they should be. Without that sort of provision we may never see this report; it certainly will not show up on a congressional web site.


Saturday, July 22, 2017

Trump Administration Updates Unified Agenda – DHS

This week the Trump Administration’s Office of Information and Regulatory Affairs (OIRA) published an Update to the Unified Agenda. This provides a look at the results of the review of on-going regulatory actions previously addressed by the Obama Administration and new regulatory initiatives started by the new administration. The last Obama update of the Unified Agenda (Fall 2016 Unified Agenda) took place in November, 2016.

Trump’s OIRA described the current Unified Agenda this way:

“The Agenda represents ongoing progress toward the goals of more effective and less burdensome regulation and includes the following developments:
“Agencies withdrew 469 actions proposed in the Fall 2016 Agenda;
“Agencies reconsidered 391 active actions by reclassifying them as long-term (282) and inactive (109), allowing for further careful review;
“Economically significant regulations fell to 58, or about 50 percent less than Fall 2016;
“For the first time, agencies will post and make public their list of "inactive" rules-providing notice to the public of regulations still being reviewed or considered.”

DHS Active Rulemaking


As usual, I have gone through the list of active DHS rulemaking activities and come up with a list that may be of specific interest to readers of this blog. Table 1 lists those rulemaking activities.

OS – Proposed Rule – Chemical Facility Anti-Terrorism Standards (CFATS)
USCG – Proposed Rule – Revision to Transportation Worker Identification Credential (TWIC) Requirements for Mariners
TSA – Proposed Rule – Surface Transportation Vulnerability Assessments and Security Plans
Table 1: Items on Current Unified Agenda

This is down from eight that were on the Fall 2016 Agenda. One (1601-AA56) action has been completed with the final rule being published last December. Four items (1601-AA76, 1625-AB94, 1652-AA55, and 1652-AA69) have been moved to the long-range portion of the Agenda (see below).

The pages for each of the rulemakings have been substantially changed in this update. This version does not include a regulatory history (listing of when various stages of the rulemaking process have been completed including a link to the Federal Register for each publication noted). The update also does not provide an expected date for the publication of the next stage in the rulemaking process. In the past those have proven to be grossly inadequate guesses, so there is really not much lost by not including that information.

Long-Term Actions


The long-term action section of the Unified Agenda contains the listing of on-going rulemaking efforts that the Administration does not expect to see reach the next publication stage for at least 12 months. The long-term action section for DHS is quite lengthy. The list includes the rulemakings shown in Table 2 that may be of specific interest to readers of this blog.


OS – Ammonium Nitrate Security Program
OS – Homeland Security Acquisition Regulation: Safeguarding of Controlled Unclassified Sensitive Information (HSAR Case 2015-001)
OS – Updates to Protected Critical Infrastructure Information
USCG – Amendments to Chemical Testing Requirements
USCG – 2013 Liquid Chemical Categorization Updates
USCG – Maritime Security--Vessel Personnel Security Training
TSA – Protection of Sensitive Security Information
TSA – Security Training for Surface Transportation Employees
TSA – Vetting of Certain Surface Transportation Employees
Table 2: Long-Term Actions for DHS

This list is longer than the one found in the Fall 2016 Unified Agenda. I have already noted the items that were moved here from the active agenda. Additionally, the Trump Administration added a new rulemaking (1625-AC36) that has been placed on the long-term action list. Finally, OIRA removed a rulemaking (1625-AB21) that had actually been completed (final rule published) well prior to the publication of the Fall 2016 Unified Agenda. The Obama OIRA apparently kept it on the list because the effective date was not until 2018.

Inactive Items


It is interesting to see the Trump Administration introduce the concept of the ‘Inactive Items’ list; rulemakings that have dropped off the Unified Agenda, but are still in the working files of the agency involved and action could possibly be expected at some future date. This list is also odd in that it is a .PDF document rather than an HTML table.

There are four rulemakings on the DHS portion of the list that may be of specific interest to readers of this blog. I have included in the list below a link to the last time that the rulemaking showed up in the Unified Agenda. It is very clear that the administration officials took their mandate to identify such latent rulemakings very seriously.

• 1625-AA12 – USCG – Marine Transportation--Related Facility Response Plans for Hazardous Substances (Fall 2013);
• 1625-AA13 – USCG – Tank Vessel Response Plans for Hazardous Substances (Fall 2013);
• 1652-AA16 – TSA – Transportation of Explosives from Canada to the United States Via Commercial Motor Vehicle and Railroad Carrier (Fall 2011); and
• 1652-AA50 – TSA – Drivers Licensed by Canada or Mexico Transporting Hazardous Materials to and Within the United States (Fall 2015)

Commentary


While Trump vociferously campaigned on a stand against new regulations, this publication of the Unified Agenda update makes it clear that we can still expect to see regulatory actions being taken by this administration. In fact, with respect to those types of regulations that would be of specific interest here, there has been no indication of a reduction in the number of regulatory actions being undertaken.


It is not entirely clear at this point that the one new rulemaking added to the Unified Agenda Long-Term Agenda in this update (1625-AC36) is really a new regulatory action initiated by the Trump Administration. This has been an on-going issue since the 2010 amendments to the Standards of Training, Certification and Watchkeeping (STCW) Convention and Code, but this is the first time that it has been officially noted in the Unified Agenda.

NIST Cybersecurity Workforce RFI Comments – 07-22-17

This is the first in a series of blog posts looking at the comments that NIST has received on their request for information (RFI) on cyber workforce development. The comments are posted to the NIST National Initiative for Cybersecurity Education (NICE) web site. Comments posted this week came from:


 One commenter specifically responded to questions posed by NIST in their RFI. The others were long form explications of viewpoints about specific issues. One was a copy of an article published on CIODive.com addressing some different non-traditional cybersecurity-training activities that have been tried. Another suggested that we need to start looking at specialization training for cybersecurity personnel rather than generalist training. And the last one addressed the need for rapid changes in cybersecurity training programs to reflect changes in the environment.


The comments from Eric Baechle provided specific responses to the NIST questions. His views paint a very bleak picture of how cybersecurity specialists are utilized at one unnamed agency (presumably a government agency, but that is not exactly clear). Not unexpectedly, they describe an agency management that understands neither the complexities of the cybersecurity problems being addressed by the specialized workforce nor the work actually being done by their cybersecurity team. While this is not directly a workforce development issue (other than that apparently no effort is being made in this organization to continue developing the skills of the team), it does help to explain why there may be retention issues and employee burnout affecting cybersecurity operations.

HR 3198 Introduced – FAA R&D

Last week Rep. Knight (R,CA) introduced HR 3198, the FAA Leadership in Groundbreaking High-Tech Research and Development (FLIGHT R&D) Act. The bill sets forth the research and development agenda for the Federal Aviation Administration. It includes provisions for cybersecurity research, including:

§31. Cyber Testbed.
§32. Cabin communications, entertainment, and information technology systems cybersecurity vulnerabilities.
§33. Cybersecurity threat modeling.
§34. National Institute of Standards and Technology cybersecurity standards.
§35. Cybersecurity research coordination.
§36. Cybersecurity research and development program.

Most of these provisions address cybersecurity for the FAA flight control system and general FAA IT systems. Two sections (§32 and §36) deal more directly with aircraft cybersecurity.

Cabin Cybersecurity


Section 32 requires the FAA to “evaluate and determine the research and development needs associated with cybersecurity vulnerabilities of cabin communications, entertainment, and information technology systems on civil passenger aircraft” {§32(a)}. The evaluation will address:

• Technical risks and vulnerabilities;
• Potential impacts on the national airspace and public safety; and
• Identification of deficiencies in cabin-based cybersecurity.

Within 9 months of passage of this bill the FAA would be required to report back to Congress on the results of the evaluation and “provide recommendations to improve research and development on cabin-based cybersecurity vulnerabilities” {§32(b)(2)}.

Future Cybersecurity Program


Section 36 directs the FAA to “establish a research and development program to improve the cybersecurity of civil aircraft and the national airspace system” {§36(a)}. There is no specific guidance as to what that plan should include beyond mandating that a study of the topic be conducted by the National Academies. A report to Congress is required in 18 months.

Moving Forward


Knight and his two co-sponsors {Rep. Smith (R,TX) and Rep. Babin (R,TX)} are members of the House Science, Space, and Technology Committee, one of the two committees to which this bill was assigned for consideration. Babin is also a member of the House Transportation and Infrastructure Committee, the other committee. This means that both committees could actually consider this bill. With Chairman Smith as a cosponsor, it will almost certainly be considered in the Science, Space and Technology Committee.

There are no monies authorized to be spent by this bill and there are no provisions (mainly due to the lack of specificity in the requirements) that would draw the specific ire of anyone, so there should be no organized opposition to the bill. I suspect that it will be recommended for adoption by the Science, Space, and Technology Committee and, if it makes it to the floor of the House for consideration (probably under the suspension of the rules procedures), it will pass with substantial bipartisan support.

Commentary



It is strange that the cybersecurity of avionics control systems is never mentioned in this bill. The provisions of §32 and §36 are clearly intended to address the issue, but they never directly say that. I suspect that this is done so as not to raise the specific objection from aircraft vendors (and their avionics system suppliers) that no one has ever demonstrated a vulnerability of those control systems. The weasel wording allows those concerned to ignore the specific provisions and thus not oppose the entire bill. This is politics.

Friday, July 21, 2017

HR 3191 Introduced – Russia Cybersecurity

Last week Rep. Boyle (D,PA) introduced HR 3191, the No Cyber Cooperation with Russia Act. The bill would disallow the expenditure of any federal funds for any joint US – Russian cybersecurity initiative. This is a response to the announcement by President Trump after he returned from the G20 Summit that he and Putin had discussed forming a joint cyber-security unit to protect against election hacking.

Section 2 of the bill says simply:

“No Federal funds may be used to establish, support, or otherwise promote, directly or indirectly, the formation of[,] or any United States participation in[,] a joint cybersecurity initiative involving the Government of Russia or any entity operating under the direction of the Government of Russia.”

There are no qualifying definitions or explanations.

Moving Forward


Boyle is a rather junior member of the House Foreign Affairs Committee to which this bill was assigned for consideration. Three of his 13 Democratic cosponsors are also members of that Committee. In normal circumstances, this could provide for the possibility of the bill being considered in Committee. In this case, party membership probably trumps committee membership, so there is very little possibility of this bill being considered in Committee.

Commentary


Even assuming that this is not a completely knee-jerk reaction to a “policy” announcement by Trump (as we frequently saw from Republicans during the Obama Administration) and that there are legitimate reasons to object to the specific policy proposal, the blunt wording of this proposal contains the seeds of many potential unintended consequences.

For example, if Interpol formed a task-force to take down criminal gangs operating botnets, and that unit included police from Russia (where at least some of these botnet operations are headquartered) then this bill would prohibit US participation in the effort. I highly doubt that that is what the crafters intended.


I suspect, however, that this bill (and the two others, HR 3259 and S 1544, that have not yet been printed by the GPO) was written to provide Democrats the opportunity to proclaim that they have introduced legislation opposing Trump’s inopportune proposal. Even if the bill were to somehow be considered and approved by the House and Senate, it would certainly be vetoed by the President, if the unit had been a serious policy proposal in the first place (and that is an open question since the unit was proposed in a TWEET®).

HR 2997 Introduced – FY 2018 FAA Reauthorization

Last month Rep. Shuster (R,PA) introduced HR 2997, the 21st Century Aviation Innovation, Reform, and Reauthorization (21st Century AIRR) Act. This is the House version of the FY 2018 FAA authorization bill. The Senate version is S 1405. There is one cybersecurity provision in the bill and a number of drone provisions.

Cybersecurity


Section 601 of the bill addresses the FAA’s strategic cybersecurity plan. It would require an update of the existing plan required under §2111 of PL 114-190 (130 Stat 626). It would specifically require that plan to be modified to include the establishment of the American Air Navigation Services Corporation, the vehicle for the privatization of air traffic control. The obligatory report to Congress is included.

UAS Provisions


Section 432 of the bill modifies and codifies a number of current UAS provisions of US law by adding a new chapter (Chapter 455) to 49 USC. One of particular interest here is the Model Aircraft exception established in §336 of the FAA Modernization and Reform Act of 2012 (PL 112-95, 126 Stat 77). That would be addressed in a new §45509, Operation of small unmanned aircraft. While in many ways similar to the new §44808 proposed in the Senate bill, there are some significant differences. Those differences include:

• Failure to include limitations to line-of-sight operations;
• Adds a 55-lb aircraft weight limit {§45509(a)(3)}; and
• Adds restriction on flying over amusement parks {§45509(a)(5)}.

Both bills include an obligatory reference to ‘within the programming of a community-based organization’. This bill actually provides a definition of ‘community-based organization’ and a requirement for the FAA to establish guidelines for “recognizing community-based organizations” {§45509(e)}.

Moving Forward


On June 27th the House Transportation and Infrastructure Committee held a mark-up hearing for HR 2997. A number of amendments were made (none of particular interest here) and the bill was ordered reported favorably by a nearly party-line vote (one Republican voted Nay). That report has not yet been published.

This bill will move forward to be considered by the full House at some point. Based upon the vote in Committee, this bill is not likely to be considered under the suspension of the rules process since that requires a 2/3 vote to pass the bill. This means that there will be some sort of amendment process adopted by the House Rules Committee.

Once the House and Senate pass both of their versions of the bill, a conference committee will work out the differences and a combined version will be voted upon in both houses. If recent history is any kind of guideline, the final bill will be approved in late November or early December.

Commentary


Both the House and Senate bills move to more narrowly cast the ‘model aircraft’ exemption to small UAS operation. It is becoming increasingly clear that there never was any intention to exempt the general public from FAA UAS rules, only the relatively small group of individuals that belong to model aircraft clubs and societies. This would appear to open up a whole nest of problems for the FAA in moving forward with UAS regulations as the universe of potentially covered entities for the FAA regulations expands dramatically.

One way to avoid this general public regulation issue would be for manufacturers of small UAS destined for the consumer market to establish company sponsored UAS clubs with membership instructions included in every consumer UAS sold in the United States. Formal club rules with on-line meetings, training sessions and organized fly-ins would probably allow for recognition by the FAA. Especially since the Agency has no desire to get into consumer regulation enforcement.


I do have to admit that I was more than a little surprised and disappointed to see this bill add the amusement park restriction to the model aircraft section of the bill while continuing to ignore the potentially much more dangerous issue of the operation of UAS over critical infrastructure facilities such as chemical plants or electric grid infrastructure facilities. Critical infrastructure owners need to begin complaining vociferously about this issue.

Bills Introduced – 07-20-17

With both the House and Senate in session, there were 72 bills introduced yesterday. Of those, three may be of specific interest to readers of this blog:

S 1603 An original bill making appropriations for Agriculture, Rural Development, Food and Drug Administration, and Related Agencies programs for the fiscal year ending September 30, 2018, and for other purposes. Sen. Hoeven, John [R-ND]

S 1609 An original bill making appropriations for energy and water development and related agencies for the fiscal year ending September 30, 2018, and for other purposes.  Sen. Alexander, Lamar [R-TN]

S Con Res 22 A concurrent resolution expressing the sense of Congress on the use of the Intergovernmental Personnel Act Mobility Program and the Department of Defense Information Technology Exchange Program to obtain personnel with cyber skills and abilities for the Department of Defense. Sen. Rounds, Mike [R-SD]

The two spending bills will be watched for cybersecurity measures.


Another ‘sense of congress’ resolution on cybersecurity; I’m not sure what is going on here, but this will also be watched for definitions and wording.

Thursday, July 20, 2017

House Passes HR 2825 – DHS Authorization

Today the House passed HR 2825, the Department of Homeland Security (DHS) Authorization Act of 2017, by a substantially bipartisan vote of 386 to 41. The bill was considered under the suspension of the rules process that limited debate and did not allow any amendments to be offered. The bill easily met the 2/3 vote standard for passage under these rules.

A DHS authorization bill has yet to be introduced in the Senate during this Congress. It would be very unusual for the Senate to take up this bill without first considering an in-house version.

The bill does include provisions addressing:

• Cybersecurity,
• Maritime security, and
• Surface transportation security

There has not been a DHS authorization bill sent to the President since the Department was originally created in 2002.

ICS-CERT Publishes an Advisory and an Update

Today the DHS ICS-CERT published a control system security advisory and an update to a previously published advisory, both for products from Schneider Electric.

Schneider Advisory


This advisory describes multiple vulnerabilities in the Schneider PowerSCADA Anywhere and Citect Anywhere products. The vulnerabilities were apparently self-reported by Schneider. Schneider has developed new versions that mitigate the vulnerabilities.

The reported vulnerabilities are:

• Cross-site request forgery - CVE-2017-7969;
• Information exposure - CVE-2017-7970;
• Improper validation of certificate expiration - CVE-2017-7971; and
• Improper neutralization of expression/command delimiter - CVE-2017-7972

ICS-CERT reports that a relatively low skilled attacker could remotely exploit the vulnerabilities to perform actions on behalf of a legitimate user, perform network reconnaissance, or gain access to resources beyond those intended with normal operation of the product.

Schneider Update



This update provides new information on an advisory that was originally published on April 13th, 2017. The update provides information on a firmware update and a software update that are needed to mitigate the vulnerability.

Chemical Plants and Ransomware

There has been an interesting and on-going discussion on TWITTER® related to how chemical plants may be affected by ransomware like WannaCry. It was the result of the publication of two DHS-OCIA FOUO documents about WannaCry (here and here). They were published by PublicIntelligence.

The on-going TWITTER discussion was really based upon one entry in a chart in the second document described above; (U) Table 1—Ransomware Targeting and Susceptibility by Sector. The entry for the Chemical Sector contained the statement: “Chemical plants have manual overrides in place to ensure the safe containment of chemical processes in case cyber defenses fail. In some cases, it may be possible to run the chemical plant independently of cyber controls, otherwise the plant will most likely shut down.”

Most of the discussion has been on where the supporting data for that statement comes from (short answer: no one knows) and how accurate the statement is. I cannot provide any information on the first, but a reasonable answer to the second will take more than 140 characters to explain.

Chemical Plant Automation


There is a great deal of variety in the level and sophistication of automation in chemical manufacturing processes. I have worked in a plant where there was absolutely no automation. Sensors were either analog or digital with no connections beyond a power supply. All operations were directly controlled by the operator manually operating various valves and power switches. Plants like this are unusual in this day and age. They are small plants typically running experimental processes on a shoestring budget. They are going to be essentially unaffected by ransomware except on the business process side of the house.

The most sophisticated facilities (and I have seen some of these, but never worked in one) have almost completely automated their chemical manufacturing processes. The extensive and complicated control system requires only limited operator oversight; it takes a wide mix of sensor data (temperature, pressure, flow rates and valve states, for example) and processes that data (via a complex process control algorithm) into commands to various operating devices (transfer valves, heating, cooling and vacuum controls, for example) that control the manufacturing process. Operator actions are fairly limited to starting or stopping the process, making small manual additions of chemicals to the process and watching for process upset conditions.
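At its core, each of those automated loops is doing something like the following. A toy sketch of a single proportional-integral temperature loop; the process model, setpoint and tuning constants are invented purely to make the example self-contained and do not represent any real DCS:

```python
# A toy sketch of one control loop's scan cycle: read a measurement, compute a
# controller output, command an actuator. All numbers are made up.
def pi_control_step(setpoint, measurement, integral, kp=4.0, ki=0.2, dt=1.0):
    """One proportional-integral step: returns (cooling valve % open, updated integral)."""
    error = measurement - setpoint           # too hot -> positive error -> more cooling
    integral += error * dt
    output = kp * error + ki * integral
    return max(0.0, min(100.0, output)), integral

temp_c = 95.0          # simulated reactor temperature
setpoint_c = 85.0
integral = 0.0
for scan in range(20):                       # a real controller runs this loop continuously
    valve_pct, integral = pi_control_step(setpoint_c, temp_c, integral)
    # Toy process response: reaction heat warms the batch, cooling water removes heat.
    temp_c += 0.8 - 0.05 * valve_pct
    print(f"scan {scan:2d}: temp {temp_c:5.1f} C, cooling valve {valve_pct:5.1f}% open")
```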

Most specialty chemical manufacturing facilities (batch processes) have a level of automation somewhere between these two extremes. An operator typically watches sensor data on a human machine interface (HMI) display and operates controls via the same HMI in response to a written set of instructions, training and experience. There may be some manual valve movements made by the operator or his assistants, but most valves are remotely operated via electrical or pneumatic actuators.

Safety systems are in use (hopefully) in all plants regardless of the level of automation. They may be simple mechanical devices such as pressure relief valves or rupture disks. They could be process alarms that require operators to take manual corrective actions. They could be simple interlocks where a specific sensor output generates a direct command to operate a specific valve. Or they could be complex algorithmic responses to a variety of sensor readings resulting in a number of automatic operational changes to the process. These automated safety systems can reside in a stand-alone computer system with dedicated sensors and valves that are not in any way connected to the main process control system (the safest arrangement), or various parts (or all) of the safety system could reside on the same computer system running the chemical manufacturing process.
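The 'simple interlock' case is worth a concrete illustration. A minimal sketch with an assumed trip point and vent valve, not a representation of any particular safety system:

```python
# A minimal sketch of a one-sensor, one-valve interlock: the trip logic runs
# independently of the main control algorithm. Trip point and names are
# illustrative assumptions.
HIGH_PRESSURE_TRIP_PSIG = 110.0   # trip setpoint, set below the vessel's relief valve rating

def interlock_scan(pressure_psig, vent_valve_open):
    """Return the commanded vent valve state for one scan of the interlock logic."""
    if pressure_psig >= HIGH_PRESSURE_TRIP_PSIG:
        return True                # trip: force the vent valve open
    return vent_valve_open         # otherwise leave the valve where the operator/DCS put it

# Example scans: normal operation, then an over-pressure excursion.
print(interlock_scan(95.0, False))   # False - no action
print(interlock_scan(118.0, False))  # True  - interlock opens the vent valve
```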

In a perfect world, what determines the level of sophistication (and thus cost) of the safety system is the potential outcome of the process upset that it controls. The more serious the potential consequence of the process upset (again, in a perfect world), the more complex and involved the safety system becomes. Where there are potentially catastrophic, off-site consequences one would expect to see sophisticated stand-alone safety systems in place to prevent those catastrophic results.

Ransomware Effects


For purposes of this discussion I am going to assume that the ransomware has affected all networked control system computers and that any stand-alone safety systems remain operational; those would include sophisticated stand-alone systems, mechanical devices and most electro-mechanical interlocks (those not controlled through a PLC).

For the least automated systems the effects would be mainly cosmetic; operators would still be controlling the process, but it would be more physical control, with the operator going out and manually operating controls instead of using the HMI. This assumes that there are still sensor readouts that do not go through the HMI, which would require either analog gauges or 4-20 mA transmitters wired to old-style displays.
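For reference, a 4-20 mA signal maps linearly to the instrument's calibrated range, which is what those old-style displays (or an operator with a multimeter) rely on. A minimal sketch, assuming a 0-150 psig transmitter:

```python
# A minimal sketch of 4-20 mA signal scaling. The 0-150 psig range is an
# assumed calibration, not a specific instrument.
def scale_4_20ma(current_ma, eu_min=0.0, eu_max=150.0):
    """Convert a 4-20 mA loop current to engineering units (linear scaling)."""
    if current_ma < 3.8:                       # well below 4 mA usually means a broken loop
        raise ValueError("signal below live zero - check wiring/transmitter")
    return eu_min + (current_ma - 4.0) / 16.0 * (eu_max - eu_min)

print(scale_4_20ma(4.0))    # 0.0   psig (bottom of range)
print(scale_4_20ma(12.0))   # 75.0  psig (mid-scale)
print(scale_4_20ma(20.0))   # 150.0 psig (top of range)
```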

Double displays with their associated wiring are a pain to maintain and frequently are considered a wasteful duplication of resources. The absence of analog gauges or non-computer sensor-output displays would mean that the operator would have no view of the key process control variables, and thus, no control of the process.

The consequences of going to full operator manual control of processes would be immense. I made the transition from full manual to semi-automated process control. We were able to add more sensors to better understand the process variables, and those new sensors were in locations that were not readily accessible to the operator. Just those additional sensors decreased process times (and thus process costs) significantly, as well as reducing product variability and off-spec products. We also significantly reduced the number of operators necessary to run the multiple processes that typically operate at specialty chemical plants. Some plants would be able to operate at significantly reduced capacity, but the increased product variability could have downstream quality effects on customer operations.

For fully automated chemical facilities (typically found in continuous process facilities like refineries) an instantaneous change to manual operation would not be possible. The lack of analog gauges and local sensor readouts and the relatively inaccessible manual controls would make it physically impossible for operators to coordinate the operation of the connected portions of the process in real time.

Safety Effects


Again, properly designed and implemented safety systems would be expected to stop any catastrophic consequences of sudden loss of control in chemical manufacturing systems. There were a number of very important qualifiers in that previous sentence. The major problem with designing safety systems is that it is very difficult to completely understand catastrophic failure modes in a manufacturing environment.

Typically, one has to use lab scale data to understand the physical parameters of those failure modes (NO ONE wants to do FULL SCALE testing of such failure modes) and then apply various models to try to scale up those test results to be able to plan for preventive actions to stop or mitigate the failures. No matter how sophisticated the modeling efforts, they are, in the end, based upon educated guesses as to how the system will behave. Then systems are designed to try to best control those failure modes. And, it is not generally acceptable to really test those systems to see how they actually work in practice (in the emergency environment).

The OCIA Statement


The OCIA statement that started this discussion is almost certainly not based upon any survey of the chemical industry. It is a reasonable brief attempt by outsiders with a non-chemical manufacturing background to categorize the potential consequences of a non-chemical emergency event on generic chemical manufacturing.

If I were to attempt to reword this statement from a chemical manufacturing process point of view, it would read something like this:

“Chemical manufacturing facilities should have safety systems in place to contain catastrophic consequences in the event of loss of control. The efficacy of those systems and their operation in an instantaneous loss of computer control situation would have to be evaluated on a case-by-case basis. Continued commercial production without replacing/fixing affected computer based process controls could be possible in some unknown number of facilities. It would be difficult to accurately predict which facilities could continue commercially viable production.”


Bills Introduced – 07-19-17

Yesterday, with both the House and Senate in session, there were 54 bills introduced. Of these only one may be of specific interest to readers of this blog:

H Res 459 Expressing the sense of the House of Representatives that the United States should support the development of programs that better prepare students for careers in cybersecurity by actively promoting ethical hacking skills. Rep. Garrett, Thomas A., Jr. [R-VA-5]


Generally speaking, ‘sense of Congress’ resolutions are fairly meaningless political statements with no practical effect. I will be watching this one, however, to see how it is worded and what definitions, if any, it uses. I do not expect that it will actually see consideration in committee or on the floor of the House.

Wednesday, July 19, 2017

Bills Introduced – 07-18-17

Yesterday with both the House and Senate in session there were 27 bills introduced. One of those may be of specific interest to readers of this blog:

HR 3282 To amend title 49, United States Code, with respect to electronic logging devices, and for other purposes. Rep. Babin, Brian [R-TX-36]


I will only be providing additional coverage of this bill if it includes specific language addressing chemical transportation or if it contains cybersecurity provisions.

Tuesday, July 18, 2017

ICS-CERT Publishes Advisory and Updates Another

Today the DHS ICS-CERT published a new control system security advisory for products from Rockwell. They also updated another control system security advisory for products from Siemens. The Rockwell advisory was originally published in the NCCIC Portal on May 18, 2017.

Rockwell Advisory


This advisory describes an improper input validation vulnerability in the Rockwell MicroLogix 1100 Controllers. The vulnerability was reported by Mark Gondree of Sonoma State University, Francisco Tacliad and Thuy Nguyen of the Naval Postgraduate School. Rockwell has a newer firmware version that mitigates the vulnerability. There is no indication that any of the researchers have been provided an opportunity to verify the efficacy of the fix.

ICS-CERT does not provide any information on the skill level or type of access required to exploit this vulnerability. They just note that a successful exploit could lead to a denial of service condition.

Siemens Update


This update provides additional information on an advisory that was originally published on July 6th, 2017. The new information includes updated version information for:

• Firmware variant Modbus TCP: All versions prior to V1.10.01;
• Firmware variant DNP3 TCP: All versions prior to V1.03;
• SIPROTEC 7SJ66: All versions prior to V4.23;
• SIPROTEC 7SJ686: All versions prior to V4.86;
• SIPROTEC 7UT686: All versions prior to V4.01; and
• SIPROTEC 7SD686: All versions prior to V4.04

The only change seen in the security reporting from Siemens was affected version information and the update link for DNP3 TCP. The other updated version information was provided in the ‘Mitigation’ section of the earlier ICS-CERT version of the advisory, but not in the ‘Affected Products’ section.

Commentary


I have not done an actual tally to confirm this, but it seems to me that we see a much higher percentage of Rockwell product advisories making it to the NCCIC (or the old US-CERT) secure portal before being publicly disclosed than we do for Siemens products. Since it is not clear how this decision is made for limited disclosure, it would be unfair to say something untoward was happening; but, it does seem odd.


If the decisions are made based upon company requests for the delay, then this is a marketing call by the respective companies with no foul noted. If the decision is being made just by ICS-CERT, then the community probably deserves some process explication.

HR 3101 Introduced – Port Cybersecurity

Last month Rep. Torres (D,CA) introduced HR 3101, the Strengthening Cybersecurity Information Sharing and Coordination in Our Ports Act of 2017. The bill establishes a number of modest cybersecurity requirements for (and in support of) port operations.

Federal Requirements


Section 2 of the bill establishes federal requirements for cybersecurity risk assessments, information sharing and coordination. First, it requires DHS to conduct (and subsequently evaluate) a risk assessment for maritime cybersecurity based upon the NIST Cybersecurity Framework. Next, it requires DHS to ensure that at least one maritime information sharing and analysis center (ISAC) participates in the National Cybersecurity and Communications Integration Center.

Paragraph (4) requires DHS to establish “guidelines for voluntary reporting of maritime-related cybersecurity risks and incidents (as such terms are defined in section 227 of the Homeland Security Act of 2002 (6 U.S.C. 148)) to the Center [NCCIC]”. The next paragraph then requires DHS “to report [on] and make recommendations to the Secretary on enhancing the sharing of information related to cybersecurity risks and incidents between relevant Federal agencies and State, local, and tribal governments”.

Local Requirements


Section 3 of the bill establishes local cybersecurity requirements. First, it requires each Maritime Security Advisory Committee “to facilitate the sharing of cybersecurity risks and incidents to address port-specific cybersecurity risks, which may include the establishment of a working group of members of Area Maritime Security Advisory Committees to address port-specific cybersecurity vulnerabilities” {§2(1)}. Next, it requires all new vessel or facility security plans (under 46 USC 70103) to “include a mitigation plan to prevent, manage, and respond to cybersecurity risks” {§2(2)}.

Specifically, §4 amends two separate provisions of 46 USC {§70102(b)(1)(C) – facility and vessel assessments – and §70103(c)(3)(C) – vessel and facility security plans} by adding the word “cybersecurity” after “physical security”. It would also add a requirement for vessel and facility security plans to address the “prevention, management, and response to cybersecurity risks” {new §70103(c)(3)(C)(v)}.

Moving Forward


While Torres is not a member of either committee to which the bill has been assigned for consideration, two of her cosponsors are {Rep. Correa (D,CA) – Homeland Security; and Rep. Wilson (D,FL) – Transportation and Infrastructure}. This means that there is at least a chance that either or both of these committees could consider HR 3101.

I do not see anything in the bill that would engender any significant opposition. If the bill were to be considered on the floor of the House it is likely that it would pass, probably under the suspension of the rules provision.

Commentary


Once again, the provisions of this bill rely on the 6 USC 148(a)(1) definition of ‘cybersecurity risk’, a definition that is limited to information systems and does not include control systems. This would mean that the requirements of this bill would not apply to operation of any of the many critical control systems found on vessels or in maritime facilities.


I would again like to point to a solution to this definitional problem in port cybersecurity legislation that I proposed in an earlier blog post. It would still use the existing, IT-centric, definition of ‘information system’, but would add a new definition for ‘control system’ and then combine both terms in the definition of ‘cybersecurity risk’.
 