CyberEye

By Patrick Marshall


NIST's how-to for prioritizing risk

Some of the hardest parts of a security professional’s job are identifying which elements in an enterprise infrastructure pose the greatest risk and keeping that infrastructure secure going forward. The underlying constraint in these considerations is how to do this with a less-than-infinite budget.

In many organizations, and certainly for most of government, that comes down to keeping systems up and running when at least some part of that infrastructure depends on legacy systems. Agencies can’t replace all of the aging machines and applications, so where should they invest scarce dollars to boost security, while at the same time making sure they don’t introduce problems that prevent that infrastructure from functioning properly?

That’s what the National Institute of Standards and Technology’s most recent guidance on risk assessment aims to address. Unlike other cybersecurity guidance NIST has published, however, this document includes a step-by-step process that agencies can use to identify the most critical parts of an infrastructure so they can better choose what to upgrade and where to spend their (usually scarce) dollars.

NIST itself said the new guidance builds on previous publications, such as SP 800-53 Rev. 4, SP 800-160 and SP 800-161, all of which also emphasized picking out critical parts of an infrastructure, but didn’t say how to do that.

The relevant publication, the NIST Cybersecurity Framework -- an answer to President Barack Obama’s 2013 Executive Order 13636 on “Improving Critical Infrastructure Cybersecurity” -- includes a detailed mechanism that organizations can use to better understand how to manage security risks.

The framework has become a standard document for both public- and private-sector organizations in establishing their approach to cybersecurity. In May, the Trump White House issued an executive order on strengthening federal cybersecurity that effectively made use of the NIST framework government policy.

The new NIST guide describes what it calls a “high level criticality analysis process model,” which steps users through the various components needed to arrive at a detailed analysis of the criticality levels for all of the programs, systems, subsystems, components and subcomponents in a particular enterprise.

This kind of approach will give agencies more certainty in what they buy, and it won’t upset the business logic that supports an agency and its mission. After all, even though cybersecurity has certainly risen in the list of agency priorities, the main question most IT managers ask security product vendors is how any new tool will affect the normal running of current networks and systems.

The authors of NIST's new guidance believe their approach could eliminate the debate over return on investment of security solutions versus the long-term resilience of systems. That’s something to be hoped for, but it may be a while before agency bosses shunt aside the well-established ROI for something that’s still so nebulous -- for now, anyway -- as resilience.

The new NIST publication does hint at the need for more actionable outcomes from all of the guidance -- from NIST and others -- that’s been published over the last few years. The House, for example, recently tried to attach measurable metrics to the NIST Framework through the NIST Cybersecurity Framework, Assessment and Auditing Act of 2017, which was introduced in February.

It would be a real advance if that effort produced actual metrics that could be used because it’s been notoriously hard to do that with any kind of specific security guidance. Each organization has very different needs when it comes to the application of security, so getting a general set of metrics to measure effectiveness may not be possible.

Still, the current draft of the NIST criticality guidance, which is open for comment until Aug. 18, gets halfway there. It at least promises to give users a better idea of what they have and how best to insert new security solutions and systems. That should make for a more certain and more effective acquisition process. And, who knows, it might eventually take its place alongside the NIST Cybersecurity Framework as a solid basis for government cybersecurity efforts.

Posted by Brian Robinson on Jul 24, 2017 at 10:33 AM
