Common IT security framework for government gets a step closer
- By William Jackson
- Sep 21, 2012
The National Institute of Standards and Technology has released revised guidelines for risk assessment, outlining updated steps for establishing risk-based security in federal information systems.
Risk assessment is the process of identifying, estimating and prioritizing risks to an organization’s operations and assets so that they can be effectively addressed.
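The identify-estimate-prioritize cycle can be pictured with a minimal qualitative scoring sketch. This is an illustration only, not the methodology SP 800-30 prescribes; the level names, weights and sample risks are invented for the example, though the likelihood-times-impact heuristic is a common one.

```python
from dataclasses import dataclass

# Hypothetical qualitative scales; the names and weights are illustrative.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

@dataclass
class Risk:
    name: str
    likelihood: str  # "low" | "moderate" | "high"
    impact: str      # "low" | "moderate" | "high"

    def score(self) -> int:
        # Simple likelihood x impact scoring, a common prioritization heuristic.
        return LEVELS[self.likelihood] * LEVELS[self.impact]

def prioritize(risks: list[Risk]) -> list[Risk]:
    # Highest-scoring risks first, so they are addressed first.
    return sorted(risks, key=lambda r: r.score(), reverse=True)

# Invented sample risks for the sketch.
risks = [
    Risk("unpatched web server", "high", "high"),
    Risk("lost backup tape", "low", "moderate"),
    Risk("phishing of admin staff", "moderate", "high"),
]

for r in prioritize(risks):
    print(f"{r.name}: score {r.score()}")
```

Real assessments weigh far more factors (threat sources, vulnerabilities, existing controls), but the ordering step at the end is the "prioritizing" the definition refers to.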
Special Publication 800-30 Rev. 1, Guide for Conducting Risk Assessments, is the last of five documents initially planned by an interagency task force to help harmonize IT security requirements across civilian agencies, the military and the intelligence communities. The significance of the effort is enormous, said Ron Ross, a NIST fellow in the Computer Security Division.
“For the first time in over four decades we are moving toward a common information security framework for all government,” Ross said. “It’s going to take a while to get all of the documents operationalized, but the transition is well under way.”
Under the Federal Information Security Management Act, NIST is responsible for developing guidelines, standards and specifications for IT security, but the FISMA requirements do not apply to the .mil domain or to national security IT systems. That split has resulted in separate but overlapping security programs for the different sectors of government. Civilian, military and intelligence agencies have been cooperating for four years under the Joint Task Force Transformation Initiative to align their information security policies.
An interagency working group was formed under the task force in April 2009 by NIST, the Defense Department and the Office of the Director of National Intelligence with the goal of producing a unified information security framework. NIST has the lead in the group and publishes the guidance in its 800 series of special publications.
In addition to the revision of the risk assessment guidelines, the task force has produced:
• SP 800-37 Rev. 1: Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach.
• SP 800-53 Rev. 3: Recommended Security Controls for Federal Information Systems and Organizations, in multiple volumes.
• SP 800-53A Rev. 1: Guide for Assessing the Security Controls in Federal Information Systems and Organizations.
• SP 800-39: Managing Information Security Risk: Organization, Mission and Information System View.
The task force’s work is not finished, Ross said. A new revision of SP 800-53 is in the works and the task force is working with the Committee on National Security Systems (CNSS) to propose additional documents to be included in its body of work.
Ross called last year’s SP 800-39 on managing risk the flagship of the task force’s efforts. It brings a standardized approach, supported by a deep catalog of security controls, to the challenge of securing IT systems. Risk management is not about achieving absolute security, however; it means controlling and mitigating risk where possible and practical, and accepting an appropriate level of risk for each system.
The new revision of SP 800-30 answers the question, “when do I stop applying controls?” Ross said.
The question of what security controls to apply and when to stop adding them becomes more important as systems become more complex and the threats more varied. This increased complexity is reflected in Revision 4 of SP 800-53, Recommended Security Controls, now in draft form, which increases the number of available security controls from 600 to more than 800.
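The stopping question can be pictured as a simple loop that adds controls until residual risk falls to an accepted level. This is a hypothetical sketch, not a procedure from SP 800-30; the control names and risk-reduction numbers are invented, and real selection would also weigh cost and mission impact.

```python
# Hypothetical illustration of "when do I stop applying controls?":
# apply the strongest remaining control until residual risk is acceptable.
# Control names and risk-reduction estimates are invented for the example.

controls = {
    "multifactor authentication": 30,  # estimated risk-reduction points
    "disk encryption": 20,
    "network segmentation": 25,
    "guard dogs": 2,
}

def apply_controls(initial_risk: int, accepted_risk: int) -> list[str]:
    residual = initial_risk
    applied = []
    # Try the biggest reductions first.
    for name, reduction in sorted(controls.items(), key=lambda kv: -kv[1]):
        if residual <= accepted_risk:
            break  # risk is already acceptable: stop adding controls
        applied.append(name)
        residual = max(0, residual - reduction)
    return applied

print(apply_controls(initial_risk=100, accepted_risk=40))
```

The `break` is the answer to Ross’s question in miniature: once the organization’s accepted risk level is reached, further controls add cost without a mission justification.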
“We now have to be more nuanced in how we select and implement these controls,” Ross said. The new publication will provide that nuance with more specialized controls.
Ross said the need for the new version of SP 800-53 is urgent and it is expected to be released in late November.
He called it part of the opening of a second front in IT security, one focused on building security into IT products and systems from the beginning rather than addressing it only after threats emerge. He identified three critical steps toward opening this new front:
• Simplifying infrastructure, which has become more complex with the growing availability of commodity products being incorporated into systems. This is beginning to be addressed in government’s adoption of cloud computing and implementation of enterprise architecture, which hold the promise of standardizing and consolidating systems.
• Specialization of controls, giving managers more choices, and more effective ones, for implementing security that meets the needs of their organizations.
• Integrating security into organizational processes.
“The security folks need to learn to speak the language of program managers and mission owners,” Ross said. This would allow the tradeoffs needed to balance cost, mission objectives and security to be made in design and acquisition phases rather than after the fact.
A new special publication on system security engineering is in the works, expected to be published in 2013, which Ross said could be included among the joint task force’s documents if the CNSS approves.
Among other candidates for inclusion are new revisions of SP 800-18, Guide for Developing Security Plans for Federal Information Systems, last updated in 2006, and SP 800-137, Information Security Continuous Monitoring, published last year.
William Jackson is a Maryland-based freelance writer.