Portal to aid in development of standards is coming soon
NIST is still working on cloud computing standards
- By Rutrell Yasin
- Jul 22, 2010
A portal to facilitate collaborative development of standards to support cloud computing requirements is on schedule to be ready by the end of the year, said Dawn Leaf, senior executive for cloud computing at the National Institute of Standards and Technology.
However, she told an audience at a Brookings Institution event in Washington, D.C., that because of the amount of work and collaboration required, NIST cannot promise a completion date for the standards themselves.
The Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC) is the strategy and process behind the portal. The standards NIST needs to facilitate will deal with interoperability, portability and security issues related to cloud computing, she said in her presentation, delivered July 21.
The goal of SAJACC -- introduced in May at the NIST Cloud Computing Summit -- is to address the issue of supporting the implementation of complex technology during the period when standards are needed but not yet developed, and to speed up the development of the standards, she said.
Leaf joined a panel discussion with government officials from the General Services Administration and the Federal Trade Commission at the event titled “Moving to the Cloud: How the Public Sector Can Leverage the Power of Cloud Computing,” held at the Brookings Institution headquarters in Washington, D.C. Darrell West, vice president and director of Governance Studies and director of the Center for Technology Innovation at Brookings, moderated the panel. West also released a report, “Steps to Improve Cloud Computing in the Public Sector,” at the event.
Leaf made it clear that NIST does not develop standards but helps facilitate the process within industry and academia.
“It is impossible to identify at the top level all the standards that are required. Standards are only effective if they address real-life problems,” Leaf said. “That is the whole point of the scenario of SAJACC.”
She estimated that standards can only address about 20 percent of the top interoperability, portability and security issues. "Maybe we put 80 percent effort in and address 20 percent of the areas and get tremendous payback from that,” she said.
Another factor that slows the standards process is that experts from different areas, such as telecommunications or privacy and security, each have their own focus, she said.
“That’s why we can’t give a specific time frame,” Leaf said. NIST has identified a set of 25 requirements, based on operational scenarios, that will address the greatest portion of interoperability, portability and security issues. NIST will make those available to stakeholders in the SAJACC portal when it launches at the end of the year.
Meanwhile, David McClure, associate administrator of GSA’s Office of Citizen Services and Innovative Technologies, said agencies are starting to put public-facing data into the cloud already. Data.gov and Recovery.gov are examples.
Moving to the cloud does not mean that agencies relinquish control of their data to third parties or service providers, he said, referring to a pervasive fear among many agency leaders about cloud computing.
Agencies still have to conduct privacy assessments and categorize their data, determining whether information is sensitive, proprietary or related to national security, before posting anything to a publicly accessible cloud, McClure said.
“The Open Government Directive is not about relinquishing control of agency information and turning over data to a third party that determines access and use,” McClure added.
Rutrell Yasin is a freelance technology writer for GCN.