Tarak Modi

Commentary | Another View

8 myths of cloud computing

The benefits are real, but you shouldn't believe all the hype

Cloud computing has grown from a promising business concept into one of the fastest-growing segments of the information technology industry. Tough economic conditions and a constant pressure to accomplish more with less are prime catalysts to the realization that tapping into the cloud can allow fast access to best-of-breed business applications, computing resources, storage and other infrastructure at a much lower cost.

To a great extent these needs are also directly responsible for making cloud computing one of the most overhyped phenomena to have hit the IT industry in a long time. Yes, the cloud business model definitely has many advantages. Unfortunately, the hype has made vendors – of all sizes – so desperate for a piece of the cloud action that they have been willingly blurring distinctions and adapting definitions to suit their own ends. As Taylor Rickard, chief technology officer of G&B Solutions, so eloquently puts it, “Ask 25 people what cloud computing means and you are likely to get 30 different definitions.”

With so much misinformation out there, is it any wonder that there are so many myths associated with clouds? Without further ado, let’s dispel eight of the more common myths surrounding cloud computing today.

Myth #1: Cloud computing is just a marketing label for a new and improved ASP.

An application service provider (ASP) is a business model that became popular in the late 1990s by offering software services to customers, using computer networks and the Internet as the mechanism to deliver and manage the service. Software offered using an ASP model is also sometimes called on-demand software or software as a service (SaaS). But isn’t this exactly what a cloud does as well?

It’s easy to fall into that trap if you look at the better-known clouds, such as Google Apps and Amazon EC2. Cloud definitions – and there are many – can be deceiving as well. Just look at the definition from the National Institute of Standards and Technology, which defines a cloud as “a pay-per-use model for enabling convenient, on-demand network access to a shared pool of configurable and reliable computing resources that can be rapidly provisioned and released with minimal consumer management effort or service provider interaction.” Once again, doesn’t this sound just like an ASP?

A careful examination shows that there are indeed key differences between an ASP and a cloud. Unfortunately, those differences have been muddied by vendors who have hijacked the term cloud computing to appear hip and interesting. It thus falls to the customer to understand the two key distinctions between an ASP and a cloud.

First, an ASP/SaaS limits the level of configuration allowed in its services. It’s more of a “one-size-fits-all” model: the client must generally accept the application “as provided,” since ASPs can only afford customized solutions for the largest of their clients. Clouds, on the other hand, are much more configurable.

Second, in an ASP, clients can request an environment that is completely isolated – even physically – from the rest of the tenants. Not so in the cloud model, which typically uses virtualization technology to distribute workloads across massively scalable data centers running hundreds of thousands of CPUs that can be rapidly allocated or scaled back according to a customer's needs. This means tenants, who might even be competitors in the same market space, might not only share the same physical space but might at times be sharing the same physical server and memory!

Myth #2: A cloud is a just a fancy name for a virtualized data center.

Granted, virtualization, which is the ability to create multiple logical servers (or virtual machines) within a single physical server or across multiple physical servers, appears to be one of the cornerstones of cloud computing.

That does not mean that just because you have VMs, you have a cloud. A cloud takes virtualization to the next level by mandating the ability to grow or shrink capacity as needed, providing pay-as-you-go pricing (metering), and letting users provision new servers and storage themselves as needed. To further prove the point that virtualization and clouds are not the same, consider that Google Apps, a poster child for cloud computing, uses very little hardware virtualization. In fact, much of Google's cloud is built on the opposite paradigm: taking a large set of low-cost commodity systems and tying them together into one large supercomputer.

Rather than virtualization, the real key to cloud infrastructure is resource abstraction to the point that it just doesn't matter whether you have virtualization.
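The two properties that elevate a pool of servers into a cloud – elasticity and metering – can be sketched in a few lines of Python. This is a toy model for illustration only; the class name, hourly rate and demand profile are all invented, not taken from any real vendor.

```python
from dataclasses import dataclass

@dataclass
class MeteredPool:
    """Toy model of an elastic, metered resource pool (illustrative only)."""
    rate_per_server_hour: float  # hypothetical price, not real vendor pricing
    servers: int = 0
    billed: float = 0.0

    def scale_to(self, demand_servers: int) -> None:
        # Elasticity: capacity grows or shrinks to track demand.
        self.servers = demand_servers

    def run_hour(self) -> None:
        # Metering: you pay only for what is provisioned this hour.
        self.billed += self.servers * self.rate_per_server_hour

pool = MeteredPool(rate_per_server_hour=0.10)
for demand in [2, 10, 10, 3]:   # a spiky hourly demand profile
    pool.scale_to(demand)
    pool.run_hour()

print(round(pool.billed, 2))  # 25 server-hours at $0.10 → 2.5
```

A plain virtualized data center gives you the `servers` field but not the other two behaviors: capacity is fixed in advance and billed whether used or not.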

Myth #3: Technology is the only hurdle in implementing a cloud.

Much of the discussion around adopting a cloud is narrowly focused on technology-related challenges such as securing the network, ensuring data privacy, addressing multi-tenancy concerns, and so on. As critical as these challenges are, they can be solved with good old-fashioned engineering.

There are, however, broader challenges to cloud adoption beyond technology, which if not met head-on have the potential to seriously undermine cloud computing. Consider, for example, both the perceived and real loss of control of resources – data, storage, applications, servers, and personnel – in a cloud environment, which in large corporations and government agencies could be enough to stall meaningful cloud adoption for years.

Overcoming such non-technical challenges requires an honest assessment of the core business and organizational culture, meticulous planning, and solid organizational change management to prepare the organization for its transition to the cloud. Key areas to address include ensuring that business processes, applications and data are well-defined and loosely coupled; that a well-defined enterprise architecture exists; and that appropriate governance processes are in place.

Myth #4: Moving to a cloud means not worrying about C&A.

Just the mention of the term certification and accreditation (C&A) is enough to send chills down one’s spine. Anyone who has been through a C&A knows that it is worse than a documentation and paperwork nightmare because it doesn’t end when you wake up in the morning. Adding to the stress is the fact that the stakes for passing C&A are high, especially when the C&A activity is mandated by regulations such as the Federal Information Security Management Act.

Many articles cite easier C&A for systems residing in the cloud as one of its benefits. According to Peter Mell, who leads NIST’s cloud computing team, there are indeed some security advantages to the cloud: by shifting public data to an external cloud, you reduce the exposure of internal sensitive data, and cloud homogeneity makes security auditing and testing simpler.

The cloud also enables automated security management and provides redundancy and disaster recovery advantages. On the other hand, Mell also has identified new security challenges that revolve around trusting a vendor’s security model; the customer’s inability to respond to audit findings; obtaining support for investigations; indirect administrator accountability; proprietary implementations that can’t be examined; and of course, the biggie – loss of physical control.

Always remember the golden rule that “shifting the work does not shift the accountability.” That means that you are still liable to provide evidence satisfying the C&A requirements regardless of whether you are using a cloud.

Myth #5: All existing applications will seamlessly migrate to the cloud.

Whatever the marketing brochures from cloud vendors might have you believe, migrating your data center – applications and all – is not just a matter of flipping a switch. The fact is that porting legacy applications to the cloud will be a complicated task, since past application development rarely focused on open-source, SaaS-style infrastructures.

Workarounds and patches might help to some extent, but the longer-term focus will have to be on modernizing applications – think service-oriented architectures, for example – to support the new service-based paradigm.

Furthermore, it is important to realize that not all of your applications may be appropriate for the cloud. For example, applications relying on clustered servers aren't always a good fit for the cloud environment. Such applications typically require identical configuration of each server and large dedicated bandwidth among servers, which can't always be guaranteed by the cloud.

Myth #6: The cloud commoditizes IT – so all I need is my credit card.

A major marketing point for cloud vendors is that a business user "can just go in and buy a development server in minutes" that's as good as the one it would take their IT department days or weeks to provision.

For example, you might have come across the claim that you could leverage Amazon S3 (storage) and Amazon EC2 (computing power) to create your very own data center on the Internet. With this powerful combination you would have terabytes of space and several hundred computers working for you!

Unfortunately, in the real world, such rapid and dynamic procurement only works in the simplest of cases or in specialized applications such as prototyping. Ensuring that the newly procured environment can be integrated into the wider corporate environment requires more diligence (hence cost) to guarantee compliance (think C&A) with corporate IT standards and/or federal regulations, such as FISMA. Also, you need to be wary if you have special performance or scalability needs for your enterprise applications, as it’s quite likely that many infrastructure-as-a-service (IaaS) players just won’t be able to meet them.
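The gap between "bought in minutes" and "usable in the enterprise" can be made concrete with a small sketch. Everything here is hypothetical – the control names and the gating function are invented stand-ins for whatever corporate IT standards or FISMA-style controls actually apply – but the shape of the problem is the same: self-service provisioning is only the first step.

```python
# Hypothetical compliance gate: a freshly bought server is not deployable
# until required corporate controls are in place. Control names are invented.
REQUIRED_CONTROLS = {"encryption_at_rest", "audit_logging", "patch_baseline"}

def can_deploy(controls_in_place: set) -> bool:
    """True only once every required control is satisfied."""
    return REQUIRED_CONTROLS <= controls_in_place

quick_buy = {"audit_logging"}  # the "credit card" server, minutes old
vetted = {"encryption_at_rest", "audit_logging", "patch_baseline"}

print(can_deploy(quick_buy))  # False: fast to buy, not yet compliant
print(can_deploy(vetted))     # True: after the (slow, costly) diligence
```

The point of the sketch is that the expensive part – closing the gap between `quick_buy` and `vetted` – is exactly the work the credit-card pitch glosses over.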

Myth #7: ROI with a cloud is guaranteed.

A positive return on investment and reduced operating costs are often the catalysts for virtualization projects in data centers. Since cloud computing is often equated with virtualization – and overlaps with it in the benefits it delivers – the logical inference is that it, too, would have a positive ROI. Or so it would seem.

The fact is that the verdict on cloud computing ROI is still not in. Yes, it's often cheaper, but not always. As SOA consultant David Linthicum says, “Like anything in the world of IT, it depends. It depends upon your current investment in IT infrastructure, which can't be recovered. It depends upon the types of applications you're looking to deploy in the clouds. And it depends upon the cost of risk within your particular business. Cloud computing has the potential to bring a great deal of value through efficiencies and cost reduction, but you have to run the business models for your specific enterprise and problem domain.”

A recent study by McKinsey & Co. supports this claim, saying “customers are only likely to save money when running specific platforms, such as Linux, in the cloud.” For an entire data center, the report states, you're better off staying in-house. The study also claims that “hosted infrastructure services such as EC2 are not cost-effective for large enterprises.”

Licensing models are another factor that could change the equation, as not all software/application vendors have cloud-friendly licensing yet. In a nutshell: you must carefully analyze your specific requirements, weigh your options and calculate the ROI for your specific case before jumping on the cloud bandwagon.
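Running the numbers for your own case can start as a back-of-envelope break-even calculation. The figures below are purely illustrative – invented for this sketch, not drawn from any vendor's price list or the McKinsey study – but the structure (upfront in-house spend amortized against a difference in monthly cost) is the comparison Linthicum describes.

```python
def breakeven_months(inhouse_upfront, inhouse_monthly, cloud_monthly):
    """Months until cumulative cloud spend exceeds in-house TCO.

    Returns None if cloud monthly cost never exceeds in-house monthly
    cost, i.e., the cloud stays cheaper indefinitely.
    All inputs are illustrative dollar figures, not real pricing.
    """
    if cloud_monthly <= inhouse_monthly:
        return None
    return inhouse_upfront / (cloud_monthly - inhouse_monthly)

# Illustrative: $120k of hardware plus $4k/month ops in-house,
# versus $9k/month of metered cloud usage.
print(breakeven_months(120_000, 4_000, 9_000))  # 24.0 months
```

If your workload runs well past the break-even point at steady utilization, in-house may win; if it is spiky or short-lived, the cloud side of the ledger improves – which is exactly why the verdict is "it depends."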

Myth #8: Everything in the future has to be a cloud.

You might have heard something to the effect that “if you’re not in the cloud in three years, you won’t be relevant.” This reminds me of when client-server computing first came out in the 1990s and technology pundits started proclaiming the death of the mainframe – which, by the way, is still alive and kicking. The truth is that the cloud is likely to act more as a complement than a replacement to in-house systems, and the in-house IT department or data center is far from dead.

For one reason or another, not all systems will end up in the cloud. Reasons for exclusion from the cloud might vary from regulatory requirements around data privacy and security to specialized application configurations that require dedicated hardware and organizational/culture issues.

Let’s be completely clear: Cloud computing is in no way just hype. It is real, the business benefits are real and the underlying technology is real. Nor is cloud computing unique in attracting myths, as is evident from past rising stars such as client-server computing, N-tier architecture and SOA.

Each of these stars has, however, evolved over the years with the myths being dispelled and its definition and purpose becoming more focused. Such will be the case with cloud computing as well. Until that happens, though, it is up to the practitioners to dispel the myths and the buyers to become educated as to what is real and what is not.

About the Author

Tarak Modi is principal architect at G&B Solutions. He leads the Cloud Computing and Security C&A practices within G&B as part of the CTO Office.

