Do agencies need a FedRAMP for cloud service agreements?
- By Rutrell Yasin
- Aug 09, 2013
Every federal agency handles service level agreements differently based on each organization’s unique requirements. But is there a way for the government to offer a standard approach to SLAs for cloud computing that can be applied across agencies similar to what is being done with the Federal Risk and Authorization Management Program?
A FedRAMP-style approach for SLAs would help both industry and agencies maximize the cost efficiencies and benefits of cloud computing and commodity IT, according to Keith Trippie, executive director for the Homeland Security Department’s Enterprise System Development Office.
DHS established SLAs for new data centers that were being designed in the mid-2000s, but those were not at a granular enough level to address the various private cloud services the agency wanted to implement, Trippie explained July 24 during a GCN Tech Essentials webinar on cloud migration. DHS officials added a set of SLAs and targets – based on best practices – that they wanted for each of the particular sets of services.
However, Trippie said he could guarantee that if DHS officials took those SLAs to the Justice Department or Census Bureau or any other agency, folks there would do SLAs differently. Maybe, he said, it’s time to raise the standards and offer a FedRAMP approach to get the same level of standardization for SLAs.
“Can we get to a level of service level agreements that are beyond best practices that are actually implemented across multiple federal contracts as opposed to all the one-offs?” Trippie asked. The one-offs make it difficult for industry to fully meet what the federal government is looking for, and they keep federal agencies from realizing cloud’s cost efficiencies.
FedRAMP provides a standard approach for security assessment, authorization and continuous monitoring of cloud products and services. The program uses a “do once, use many times” framework that is expected to reduce the cost, time and staff required to conduct redundant agency security assessments of cloud solutions.
The National Institute of Standards and Technology describes a service level agreement as a document stating the technical performance promises made by the cloud provider, how disputes are to be discovered and handled, and any remedies for performance failures.
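NIST’s three-part description above lends itself to a structured representation. The following is a minimal sketch, assuming only the elements NIST names; the class and field names are illustrative, not part of any NIST schema.

```python
# Illustrative only: an SLA captured as structured data with the three
# elements NIST describes -- performance promises, dispute handling,
# and remedies. All names here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ServiceLevelAgreement:
    provider: str
    performance_promises: dict        # e.g. {"availability": "99%"}
    dispute_process: str              # how failures are discovered and handled
    remedies: list = field(default_factory=list)  # e.g. service credits

sla = ServiceLevelAgreement(
    provider="ExampleCloud",
    performance_promises={"availability": "99%"},
    dispute_process="monthly review of provider-reported uptime logs",
    remedies=["10% service credit per missed month"],
)
print(sla.provider)  # ExampleCloud
```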
Improved SLA management is something that the General Services Administration is monitoring as part of its cloud-brokerage effort, Mark Day, acting deputy assistant commissioner for the Office of Integrated Technology Services within GSA’s Federal Acquisition Service, wrote in an email. GSA is exploring how best to empower agency SLA management. Part of that effort involves determining whether any common SLA standards might provide value across agencies.
“Some have suggested a FedRAMP-like model might help, which is an interesting idea,” Day said. However, “in the FedRAMP model, the effort had an advantage of a discrete set of standards that NIST published,” he said. “That level of standards does not currently exist for SLAs.” Still, GSA is “interested in anything which will benefit our customers and help them achieve greater cost savings and efficiencies in successful transition to the cloud environment," he said.
Trippie said the folks at DHS are keeping their eyes on some of the efforts NIST is leading. More than a year and a half ago, NIST released the U.S. Cloud Computing Technology Roadmap, Volume 1, which acknowledged the need for industry and the federal government to develop and adopt consistent technical specifications to enable the creation and practical evaluation of SLAs between customers and cloud providers.
More recently, NIST submitted a whitepaper on SLAs to the International Organization for Standardization’s SC38 group, which was accepted as a work item by team members, said Robert Bohn, NIST Cloud Computing Program Manager. The whitepaper addresses the requirements that should be considered and put into cloud SLAs. The guidance from the document would give an organization about 80 percent of the requirements needed to write an SLA. The document then would leave room for the agency to put in the type of metrics and measurements it would require; for instance, the requirement that the cloud provider must have 99 percent system availability.
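An availability target like the 99 percent figure mentioned above translates directly into a downtime budget an agency can check against. The sketch below shows that arithmetic; the function names are illustrative, not drawn from any NIST or agency tool.

```python
# Hypothetical illustration: converting an SLA availability target
# into a concrete monthly downtime budget, then checking a provider's
# observed downtime against it.

def allowed_downtime_minutes(target: float, period_minutes: int = 30 * 24 * 60) -> float:
    """Minutes of downtime permitted per period at a given availability target."""
    return period_minutes * (1.0 - target)

def meets_sla(observed_downtime_minutes: float, target: float) -> bool:
    """True if observed downtime stays within the target's downtime budget."""
    return observed_downtime_minutes <= allowed_downtime_minutes(target)

# At 99 percent availability, a 30-day month allows about 432 minutes
# (roughly 7.2 hours) of downtime.
print(round(allowed_downtime_minutes(0.99), 1))  # 432.0
print(meets_sla(300, 0.99))                      # True
print(meets_sla(500, 0.99))                      # False
```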
GSA was able to massage security controls NIST had published and put them into FedRAMP, Bohn noted. However, there has been “zero discussion between GSA and NIST so far for putting together a FedRAMP-type SLA.” Still, work on international standards could lead to that, he said.
“I see FedRAMP as a basis for service level agreements for security,” said Kevin Jackson, vice president and general manager of cloud services with NJVC, a provider of cloud broker services. It is common practice that cloud users own their data, but if they want to change from one cloud service provider to another, how do they get it into a format they can use?
Another area is virtualized infrastructure. How do you take a virtual machine that is on a government cloud and get an equivalent type of machine on a commercial cloud? The federal government should set up a standard and an SLA providing a consistent description of a virtual machine as well as emerging technology such as software-defined networking, Jackson noted. Then agencies can have a consistent virtual data center no matter what cloud service provider they are using, he said.
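The consistent virtual machine description Jackson calls for might look something like the sketch below: one government-defined profile that each cloud provider maps to an equivalent offering. Every name and size here is hypothetical, not an actual standard.

```python
# Illustrative sketch of a standardized VM description: a single
# government-defined profile, with each provider publishing the
# instance type that satisfies it. All identifiers are invented.

STANDARD_PROFILE = {
    "name": "gov-standard-medium",
    "vcpus": 4,
    "memory_gb": 16,
    "storage_gb": 200,
}

# Provider-specific instance types claimed to meet the standard profile.
PROVIDER_MAPPINGS = {
    "agency_private_cloud": "vm-medium-4x16",
    "commercial_cloud_a": "type-m4",
}

def equivalent_instance(provider: str) -> str:
    """Look up the provider-specific instance that meets the standard profile."""
    return PROVIDER_MAPPINGS[provider]

print(equivalent_instance("commercial_cloud_a"))  # type-m4
```

With a mapping like this, an agency could move a workload between the two providers and know the target machine matches the same standard profile.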
NIST could do this, though it actually should be a business process handed down from the federal government, such as from the federal CIO and the Office of Management and Budget, Jackson said. GSA also did a fairly good job of establishing minimum-level SLAs for multiple infrastructure-as-a-service providers when it awarded a contract authorizing selected vendors to perform IaaS cloud services for agencies. This also could be a model used across the federal government, he said.
Rutrell Yasin is a freelance technology writer for GCN.