Identity, data management crucial to cloud success
IT efficiencies hinge on how well the cloud can scale to large sets of users, records and data
Identity and key management need to be addressed if cloud computing is to achieve the cost savings and IT operational efficiencies promised by proponents of the computing model, according to industry and government representatives speaking at a recent cloud conference.
Another issue revolves around data and how organizations can get it into the cloud and ported across different cloud computing environments, speakers told attendees at the National Institute of Standards and Technology’s Cloud Computing Summit held on May 20 at the Commerce Department in Washington, D.C.
The summit served as a call to action for industry and the government to work collaboratively on standards for cloud computing interoperability, portability and security. The government defines cloud computing as an on-demand model for network access, allowing users to tap into a shared pool of configurable computing resources.
“There are some technical issues related to security that are often underestimated,” said Tim Mather, founding member of the Cloud Security Alliance, during a morning panel with experts from industry.
“We have problems scaling certain technologies related to security” – identity management and key management come to mind, he said. “The cloud will exacerbate those problems.”
Identity management deals with identifying individuals authorized to access an information system and controlling the access to the resources in that system by placing restrictions on the established identities. Key management focuses on the generation, exchange, storage, safeguarding, use, vetting and replacement of cryptographic keys, which are used to change plain text into encrypted data for higher levels of security.
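The key lifecycle described here – generating a key, using it to turn plaintext into ciphertext, and safeguarding it so the data can be recovered – can be illustrated with a toy symmetric cipher. This is a sketch only, not production cryptography; real systems use vetted algorithms such as AES through a maintained library, and the difficulty Mather cites lies in managing millions of such keys at cloud scale, not in the cipher itself.

```python
import secrets

def generate_key(length: int) -> bytes:
    """Key generation: a fresh random key as long as the message
    (one-time-pad style, for illustration only)."""
    return secrets.token_bytes(length)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR turns plaintext into ciphertext; applying the same key
    again recovers the plaintext (symmetric encryption)."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"sensitive record"
key = generate_key(len(message))        # generation
ciphertext = xor_cipher(message, key)   # use: encrypt
recovered = xor_cipher(ciphertext, key) # use: decrypt with the stored key
assert recovered == message
```

Losing or leaking the key – the storage, safeguarding and replacement steps above – is what key management guards against, and it is exactly the part that gets harder as the number of keys grows.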
“Identity is so broken,” Mather said, adding that, given NIST’s expertise in computer security and encryption, the agency could play a role in guiding standards efforts for these areas.
Those issues “are technical and down in the weeds,” he said. “But if the cloud is really going to scale and if we are going to get to inter-clouds, interoperability and portability, those two problems have to be solved.”
John Shea, director of Enterprise Services and Integration for the Defense Department's Office of the Chief Information Officer, acknowledged the scalability problem when dealing with a large number of identities.
DOD’s biggest challenge is the unknown, he said. “We found scaling is a huge question mark because it is not just about the cloud, but communities of interest and backbone clouds and how they play together,” he said during an afternoon session.
The question is: “Can we really deal with identity management on a large scale?” he asked. DOD has to deal with 47 million identities made up of employees, contractors, dependents and retirees – and that’s a huge number, he said.
On the cloud computing front, the CIO’s office is working on a proof-of-concept for moving desktop systems for 3 million users to the cloud. Sustaining desktops for that many users costs roughly $8,000 to $10,000 a seat per year, he said. Three million users times $8,000 adds up to a lot of money that can be repurposed for other uses, he said.
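The scale of the savings Shea hints at is easy to work out from the figures he gives. A quick calculation, using the low end of his per-seat range:

```python
users = 3_000_000            # desktop users in the DOD proof-of-concept
cost_per_seat = 8_000        # low end of the $8,000-$10,000 annual range
annual_cost = users * cost_per_seat
print(f"${annual_cost:,} per year")  # $24,000,000,000 per year
```

At the $10,000 high end the figure rises to $30 billion a year, so even a modest percentage reduction from moving desktops to the cloud would free up substantial funds.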
Seventy-five percent of the user’s data is associated with Microsoft Office applications and Adobe Portable Document Format files, so that is the focus area, Shea said. However, to move those applications to the cloud, DOD needs a strategy. “And we don’t have a strategy yet,” he said.
But Shea’s team isn’t sitting idly by. “We are always looking for applications that we can demonstrate as [candidates for] movement to the cloud.”
Data and the management of that data are another challenge that will have to be addressed, Jim Blakely, director of data center virtualization and cloud computing for Intel, said during the industry panel.
“Data overall is a huge challenge,” Blakely said. “How do I get it in, how long will it take me? If I’m going to use [the cloud] for burst capacity [an increase in traffic], there is a potential for bottlenecks,” he said.
“Once I got the data in, how do I get it out?” he asked.
The Census Bureau dealt with potential bottlenecks for its 2010.census.gov web site by turning to the cloud.
Brian McGrath, Census' chief information officer, said the agency didn’t have the means to evaluate how traffic would impact the site in advance. Plus, the IT staff was concerned about the cost and time it would take to put up a site, he said during an afternoon government implementation panel.
Census was also concerned with downtime, especially that caused by denial-of-service attacks. Additionally, Census didn’t want to have a lot of hardware around for years after the 2010 Census was over.
So, Census contracted to use Akamai’s content delivery network, which is hosting the site in the cloud, he said. The bureau also uses Everbridge for mass notification to alert and send information to the nearly 1 million people temporarily hired to help with the Census.
Census was able to take advantage of the certification and accreditation processes the agency shares with other federal partners to ensure the vendors met Census’ security and compliance requirements, he said.
Census has also constructed a private cloud, based on a virtualized environment, for the exchange of more sensitive information, McGrath said.
As agencies move to the cloud, McGrath advised that they first stop speaking “geekspeak” to the business managers and focus on the services they plan to deliver. He noted that his organization is a fee-for-service operation, so the IT directorate performed pilots showing cost-savings associated with a virtualized environment.
The decennial census was an opportunity to prove that IT could deliver services quickly, he said.