One of the criticisms usually leveled against cloud computing is that, with many cloud services, the actual location where they store your data is unknown. Google, for instance, does not divulge the location of the servers that handle Google Docs. For government agencies that need to keep track of the location of their data for policy and regulatory reasons, this is a major deal-breaker.
But should it be? Knowing where the data is located, and that proper protective measures are in place there, is certainly instrumental in safeguarding the data. But location may not be the correct way to think about these concerns, said Lew Tucker, who is the chief technology officer for cloud computing initiatives at Sun Microsystems. He brought up this point June 1 in a cloud computing panel at the CommunityOne conference.
The question of "where the bits reside, of what geography or national boundary these bits exist within," is somewhat moot, given that "we are totally connected by networks," he said.
In fact, access, rather than location, may be the better way of thinking about things.
"It really is who has access to these bits that is the really critical question, not the locale where they reside in," Tucker said. "But right now we are governed by rules about the locale of the disk drive."
It's a good distinction. When you think about the location of a particular document, or anything else, what you are really thinking about is a series of bits residing on some physical medium, such as a hard drive or tape drive, which itself is probably located in a network-connected data center.
But no one who is actually inside the data center can view the data with any more ease than anyone else on the network. In fact, if the data resides on a server without a monitor, everyone accesses the data in exactly the same way: through a terminal at some other location. Sure, a wrongdoer could sneak inside the data center and steal the server holding sensitive data. But even that breach can be described just as well in terms of who had access to the data center as in terms of the location of the data itself.
Posted on Jun 02, 2009 at 9:39 AM | 0 comments
Many pundits and federal government observers are already raising the question, "Will a government cyber czar improve national security?"
The question, which arose from a series of new and significant cybersecurity policy moves announced by President Barack Obama in a White House speech May 29, is fair, but ill-informed.
First, let's drop the czar part. This position has in fact been defined as a coordinator, with no operational responsibility or authority to make policy unilaterally. So let's keep the role and expectations in check.
Second, what's different and more important about this announcement, compared to past cyber space initiatives, is the fact that the president himself put a big pile of political chips on the table in support of making America's digital infrastructure more secure. That makes a huge difference, regardless of what title you give his lead coordinator.
Third, the recommendations in his May 29 cyberspace policy announcement are well-grounded in work prepared last year by the Center for Strategic and International Studies' Commission on Cybersecurity, which had gained broad public- and private-sector support. So there is strong consensus and significant momentum behind most of the policies the president outlined last Friday. It is true, as Obama made clear, that much work remains to be done to pull together a coherent national strategy, and it will take time. But with Obama saying he's now watching, many disparate efforts are likely to get fresh, rigorous and more coordinated attention.
Of course, the questions of who will get the job--and will he or she have the political skills to reconcile such a broad portfolio of competing cybersecurity challenges--remain vital concerns.
But President Obama's public declarations that he would select this individual personally, that this person would sit on both the National Security Council and the National Economic Council, and that he or she would have regular access to the president all point toward the notion that the cybersecurity coordinator can be effective, if not the final authority. And in all likelihood, the national cybersecurity strategy due to be delivered to the president will prove more balanced and pragmatic than one that would emerge under the traditional notion of a czar. For as we all know, and most czars discover, the pretense of power typically comes with too little authority to get substantive things done. In this case, the person to watch is Obama, not his czar.
Posted on May 31, 2009 at 9:39 AM | 0 comments
Later this month, the General Services Administration (GSA) is expected to unveil how it moved one of its most popular public-facing services, the USA.gov government search site, to a cloud computing-based infrastructure.
The agency appears to be ahead of the curve. The White House touted the benefits of cloud computing in documents supporting the 2010 budget. The idea is that by outsourcing computing and software support to organizations that can do it more cost-effectively (even in-house ones, such as the Defense Information Systems Agency), the government could cut IT costs.
So GSA's announcement would be a sign of the way forward, yes? Except, the service GSA is using may not actually be, strictly speaking, cloud computing, at least by the definitions of cloud computing now being formulated by the National Institute of Standards and Technology.
We first heard of GSA's plan to move USA.gov, and its affiliated Spanish-language site GobiernoUSA.gov, to the "cloud" in February. Beyond mentioning that IT infrastructure provider Terremark would supply the cloud on which the sites would rest, GSA provided scant technical and pricing details.
In subsequent conversations with GCN, Martha Dorris, acting associate administrator for the Office of Citizen Services and Communications at GSA, said the agency expected the move to reduce Web management costs by up to 80 percent compared with the current provider. The Terremark contract will also cover Webcontent.gov and the data.gov initiative, she said.
More recently, we heard that GSA switched the sites over to the Terremark facility earlier this month and plans to unveil them within a few weeks. "The move is progressing remarkably smooth, with no major surprises or problems, just the process of learning new systems," said Thomas Freebairn, GSA's acting director of USA.gov technologies, in a statement sent to us by GSA.
Admittedly, cloud computing is itself a not-very-well-defined term, more marketing-speak than anything else. It could apply to any sort of outsourced computational capability. So, naturally, application service providers (ASPs) and vendors of Software as a Service (SaaS) or utility computing have been quick to rebrand their wares as cloud offerings. And so they should, if it helps customers get a better handle on the benefits.
But the definition of cloud computing is slowly firming up. Last week, NIST released a draft definition of cloud computing, authored by Peter Mell and Tim Grance of the agency's Information Technology Laboratory.
The NIST draft pointed to a few of the key attributes that separate cloud computing from other types of offerings. Two of the key ones are "rapid elasticity" and "pay per use."
"Rapid elasticity" means "Capabilities can be rapidly and elastically provisioned to quickly scale up and rapidly released to quickly scale down," the NIST draft states. Pay per use means "Capabilities are charged using a metered, fee-for-service, or advertising based billing model to promote optimization of resource use."(The last one refers to services such as Facebook or Flickr that generate revenue from advertising).
Does Terremark's offering fit this model? The answer, as the Magic Eight-Ball states, is cloudy.
Not too long ago, we spoke with Robert Thompson, sales director within Terremark's federal group, and Steve Hill, engineering director for the federal group. While they declined to talk about the GSA work specifically, they did describe the company's pricing model.
The offering, called Enterprise Cloud (E-Cloud), does not exactly fit the profile of cloud computing, though it is definitely a step beyond the hosting services that many of the company's competitors offer. With most hosting services, you contract out space on a server, using the operating system provided.
With Terremark, you supply a VMware-based image of your complete operating environment, including an operating system (either your own or one provided by the company). The company then runs this virtual instance on its own servers, under VMware ESX hypervisors. So we presume what GSA is doing is moving the entire array of USA.gov sites, along with the supporting content management system, into a VMware instance that will run on Terremark's servers.
Like traditional hosting services, Terremark bills on a monthly basis. On the GSA schedule, E-Cloud comes in a number of pricing tiers. One configuration offers the equivalent processing power of a single dual-core-processor server (5 GHz), along with 10 gigabytes of RAM and 100 gigabytes of storage, for about $2,000 a month. Additionally, bandwidth to and from the Internet is offered at about $47.50 per megabyte of dedicated bandwidth.
"It is not based on a server mentality, but on a pool-of-resources mentality. The customers can subdivide that resource for any number of servers," Hill said. Usage is calculated by taking five-minute samples from the statistics provided by the VMWare management software. (Terremark, in conjunction with Computer Sciences Corp. also offers a specific cloud service, called Trusted E-Cloud, which offers additional government-focused security and managed services).
With Terremark's plan, users estimate how much processing power they need and use the estimates to pick the most appropriate plan. If their usage goes over these limits, customers pay an overage but, like cell phone users, can switch to a larger plan in the following month. Since most Terremark clusters are not operating at 100 percent capacity at any given time, chances are the additional muscle can be brought to bear within a few minutes to handle any overage, Hill pointed out.
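The cell-phone-plan analogy can be sketched as a tier-plus-overage calculation. Only the 5 GHz/$2,000 tier comes from the GSA schedule figures above; the larger tiers and the overage rate are hypothetical placeholders:

```python
# Tier-plus-overage billing, like a cell-phone plan: pick the smallest
# plan covering your estimate, pay a penalty rate past the cap.
# Only the (5.0, 2000.0) tier is from the article; the rest are assumed.
PLANS = [  # (cap in GHz, monthly fee in dollars)
    (5.0, 2000.0),   # GSA-schedule tier quoted above
    (10.0, 3800.0),  # assumed larger tier
    (20.0, 7200.0),  # assumed larger tier
]
OVERAGE_PER_GHZ = 500.0  # assumed overage rate, dollars per GHz over cap

def pick_plan(estimated_ghz: float):
    """Smallest plan whose cap covers the estimated need."""
    for cap, fee in PLANS:
        if estimated_ghz <= cap:
            return cap, fee
    return PLANS[-1]

def monthly_cost(plan, actual_ghz: float) -> float:
    cap, fee = plan
    overage = max(0.0, actual_ghz - cap)
    return fee + overage * OVERAGE_PER_GHZ

plan = pick_plan(4.0)           # estimating 4 GHz -> the 5 GHz/$2,000 tier
print(monthly_cost(plan, 4.0))  # within the cap: 2000.0
print(monthly_cost(plan, 6.0))  # 1 GHz over: 2000 + 500 = 2500.0
```

The incentive this creates is to over-estimate slightly, which is precisely the behavior the next paragraph's no-discount point penalizes.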
No discounts are offered for using less than the full capacity, though. The customer pays the same whether 4.99 GHz or 1.0 GHz is actually used. That is strikingly different from cloud services from Google and Amazon, both of which charge only for the CPU, storage and bandwidth actually used.
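The difference between the two billing philosophies is easy to see side by side. A sketch comparing a flat tier against pure pay-per-use; the $2,000 flat fee is the tier quoted above, while the per-GHz rate is an assumed figure, not Amazon's or Google's actual pricing:

```python
# Flat tier vs. pure pay-per-use. FLAT_FEE is the $2,000/month tier quoted
# above; PER_GHZ_MONTH is an assumed illustrative rate.
FLAT_FEE = 2000.0
PER_GHZ_MONTH = 450.0

def flat_cost(used_ghz: float) -> float:
    return FLAT_FEE  # same bill regardless of actual use

def per_use_cost(used_ghz: float) -> float:
    return used_ghz * PER_GHZ_MONTH

for used in (1.0, 4.99):
    print(used, flat_cost(used), per_use_cost(used))
# At 1.0 GHz, pay-per-use is far cheaper (450 vs. 2000); near full
# capacity (4.99 GHz), the flat tier comes out ahead.
```

The crossover point depends entirely on the assumed rates, but the shape of the comparison is what separates E-Cloud from the Amazon and Google models.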
So, is what GSA is using actually cloud computing? Or is it a slightly different sort of hosting model, an admittedly innovative one based on virtualization?
On the one hand, it does not have the truly elastic pricing that Amazon does, where you literally can buy 47 cents worth of computing if you need to. But it does allow the user to scale up and down as traffic waxes and wanes, admittedly on a month-to-month basis.
As federal agencies move into this exciting new world of outsourcing, they may have to answer such thorny questions (GSA itself certainly seems to be grappling with the issue). Or, better yet, maybe they won't worry so much about getting into compliance with the latest buzzword.
GCN editor Wyatt Kash contributed to this article.
Posted on May 18, 2009 at 9:39 AM | 1 comment