NIST talks simulation and cloud roadmap at forum

Agency wants model to predict behavior

The National Institute of Standards and Technology is working on a simulation model to understand and predict behavior in cloud computing systems, Dawn Leaf, the agency’s senior executive of cloud computing, told attendees at a NIST forum on Nov. 4.

The cloud computing simulation model project, also known as Koala, focuses on the behavior of infrastructure-as-a-service cloud systems. The objectives are to compare the behavior of proposed resource-allocation algorithms for IaaS clouds, and to discover and characterize complex behaviors that may emerge in those clouds.
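To illustrate the kind of comparison such a simulator enables, here is a minimal, purely hypothetical sketch — not NIST's actual Koala code, whose internals were not described at the forum — that runs a stream of virtual-machine requests through two placement algorithms and counts rejected requests. The host count, capacity units, and policy names are all illustrative assumptions.

```python
import random

# Hypothetical sketch (not NIST's Koala implementation): compare two IaaS
# VM-placement algorithms -- first-fit vs. least-loaded -- on one request stream.

HOSTS = 10          # assumed number of physical hosts in the simulated cloud
CAPACITY = 16       # assumed CPU units per host

def first_fit(loads, demand):
    """Place the VM on the first host with enough spare capacity."""
    for i, load in enumerate(loads):
        if load + demand <= CAPACITY:
            return i
    return None  # no host can take it; request is rejected

def least_loaded(loads, demand):
    """Place the VM on the least-loaded host that can accept it."""
    candidates = [i for i, load in enumerate(loads) if load + demand <= CAPACITY]
    return min(candidates, key=lambda i: loads[i]) if candidates else None

def simulate(policy, requests):
    """Run a request stream through a placement policy; return rejections."""
    loads = [0] * HOSTS
    rejected = 0
    for demand in requests:
        host = policy(loads, demand)
        if host is None:
            rejected += 1
        else:
            loads[host] += demand
    return rejected

random.seed(42)
requests = [random.randint(1, 8) for _ in range(60)]
print("first-fit rejections:   ", simulate(first_fit, requests))
print("least-loaded rejections:", simulate(least_loaded, requests))
```

A production simulator would add arrival and departure times, heterogeneous hosts, and many more metrics, but the core idea — same workload, interchangeable placement policies, comparable outcomes — is the same.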

NIST officials expect to share the initial findings of the project in early 2011, Leaf said during a presentation at NIST’s Cloud Computing Forum and Workshop II held in Gaithersburg, Md., Nov. 4-5.

The simulation project is an example of work NIST and agencies such as the General Services Administration have been doing since May, when NIST held its first Cloud Computing Summit.

Related coverage:

NIST to preview collaborative standards portal at cloud forum

Guidelines would speed certification of cloud products, services

Thinking of a private cloud? Government gets an expanding choice

Other work has included the release of a draft special publication giving security guidelines for virtualization; the release of security controls for the Federal Risk and Authorization Management Program (FedRAMP); and development of a portal designed to foster collaborative development of cloud computing standards, known as the Standards Acceleration to Jumpstart the Adoption of Cloud Computing (SAJACC) portal.

NIST is now looking forward to developing a strategic roadmap for cloud computing with the help of federal and industry stakeholders, Leaf said.

The first step is to define target government cloud computing business use cases, Leaf said. These cases would be different from business implementation cases such as the 30-plus cases across federal, state and local governments published on the Federal CIO Council's website in May.

“The goal here is to identify opportunities for deploying clouds that we have not yet implemented,” she said. The aim is to identify the interoperability, portability and security requirements needed to go forward.

These business use cases are also different from those connected with SAJACC. The 24 SAJACC use cases – announced at the forum – focus on how consumers get data into cloud service providers’ environments.

The target business use cases are more operational. A hypothetical example could be determining what requirements are needed to implement a community cloud for export licensing enforcement that supports the Commerce, Defense, Homeland Security and State departments.

Leaf used this example because in the federal government “we tend to polarize between public and private cloud.” If data is already available on the web, agencies are comfortable putting it in a public cloud. If there are security requirements the approach is to put the data in a private cloud. However, there are many types of clouds between these two such as community or hybrid clouds that need to be explored, Leaf said.

The next step is to define a neutral cloud computing reference architecture and taxonomy. A hardware manufacturer’s reference architecture would be more focused on infrastructure while a data management provider would tend to focus on data management issues.

So it is not clear what a cloud computing reference architecture would look like at this point. The goal is to open the dialogue, Leaf said. What is clear is that the model should not prescribe a particular implementation. Plus, it has to be flexible enough to allow cloud services to be mapped to an overall model so business use cases can be discussed.

The third part of NIST’s strategy is generating a cloud computing roadmap. By mapping business mission requirements against a cloud reference model, NIST hopes to identify the gaps that need to be filled with regard to standards. “We can figure out what is missing in terms of standards,” Leaf said.

Leaf emphasized that ownership of the cloud computing roadmap is community-based, involving collaboration between the government IT community and industry, with industry delivering its expertise in the form of a reference model, ontology and technology.

Federal CIO Vivek Kundra also emphasized the need for government and industry partnerships to achieve the goals for cloud computing.

Kundra reflected on the momentum building for the cloud. For example, GSA recently awarded 11 contracts to vendors that will provide infrastructure as a service including storage, virtualization and web hosting. The cities of Los Angeles and New York and the state of Wyoming are moving services to the cloud. IBM and Microsoft have announced government clouds.

Government and industry are on a one-way street headed toward the cloud, he said. “We want to make sure as we think about policy and security it is not done so in an abstract, closed ecosystem,” Kundra said. The process has to be done in an open, participatory fashion “so we can be beneficiaries of everyone’s thinking,” he noted.

CIOs have to make sure they have the right security controls in place as they move agency resources to the cloud. That is why the government launched FedRAMP and this week released a set of proposed controls and models to certify cloud solutions governmentwide, Kundra said. He urged those from the public and private sector attending the forum to look at those controls and give the government feedback.

“We want to make sure from an economic perspective as cloud vendors sell into the government they are not doing so in a fragmented manner where they are negotiating with every bureau and agency,” Kundra said.

It is vital that the government and private sector get standards right from the beginning because once they are hardwired it will be difficult to change, he said.

Kundra noted that he attended the World Economic Forum on Cloud Computing on Nov. 3, where interoperability and portability were not the only issues discussed. Attendees were concerned about the future of data sovereignty as data moves across multiple boundaries, not just at the state and local levels but between nation states.

Governments need to think about governance models to address that issue, Kundra said.

Throughout the first day of the forum held at NIST’s headquarters, panel discussions focused on standards, reference architectures, the global community and other cloud issues. Today, government and industry cloud computing practitioners will roll up their sleeves and exchange ideas during breakout sessions held at the Holiday Inn Gaithersburg.

About the Author

Rutrell Yasin is a freelance technology writer for GCN.
