
What's next for data centers: Virtualization and SDN take over

Someone working in data centers 30 years ago wouldn't recognize them today, and likely the next 30 years will see just as many changes. But what will those changes be?

Doug Bourgeois, chief cloud executive for VMware's U.S. Public Sector division and former director of the Interior Department's National Business Center, sees an overall shift to software-defined data centers, greater use of big data and a growing focus on end-user computing.

"The software-defined data center is one where all infrastructure has been virtualized, and the management of that infrastructure is completely controlled by software that is driven by policies," Bourgeois said. "What this emerging trend is essentially doing is taking what has happened over the last 15 years with server virtualization and bringing that to the network and storage levels. Once these three levels of the infrastructure have been virtualized, the data center becomes orders of magnitude more agile than ever before."

Other experts in the field agree. "I’m convinced that virtualization will continue to offer pivotal advances within the data center," said Joe Brown, president of Accelera Solutions. "This is evident in the recent popularity of software-defined networking, which is essentially virtualization of the network layer beyond the traditional host virtualization. As we continue to abstract highly complex systems like networks into simplified, user-friendly and console-driven services, data centers will become even easier to manage and operate.” 

The impact will extend beyond data centers to the “end user experience in areas including mobility, BYOD and social [media],” said Anthony Robbins, vice president of federal sales for Brocade.

“As virtualization takes over the entire IT infrastructure, ‘as a service’ will really take off, enabling all IT as a service, in particular cloud, whether public, private or hybrid," Robbins said. "Tomorrow’s government worries nothing about IT, and everything about service level agreements. This will be true both in the case of providing citizen services and in support of the warfighter. The entire acquisition system in place today won’t exist 20 years from now."

In Bourgeois' vision, applications can be easily migrated between virtual data centers and even across physical environments without having to be reconfigured. Cloned environments for development, testing and reconstitution can be provisioned in minutes or even seconds. Moreover, the policy-driven automation capabilities of the software-defined data center will allow the fault tolerance and high availability features to be carried out automatically and across physical data centers, thereby lowering the cost and increasing the ability to meet or exceed availability requirements.

"Perhaps most important, the software-defined data center will finally allow the IT organization to shift resources away from operations and towards helping the mission respond to changing business needs," Bourgeois said.

Bourgeois said big data would also continue to affect future data centers, but in slightly different and better ways than it does today. The accumulation of data has been accelerated by the use of sensors and other data-gathering devices, and the IT industry only recently has started addressing the implications of all that data, he said.

"Advancements such as in-memory data processing are beginning to transform how applications and middleware technologies are developed and deployed,” Bourgeois said. “Over time, data analytical systems will begin to take on features that allow them to recognize events and trends in data and carry out downstream activities in an automated fashion as a result."

The users themselves, and the devices they use, will transform the data center and the way data is used. "Applications will be accessed by any mobile device and using any native operating system,” he said, which “paves the way for the enterprise app store to serve up applications to users that may be using smart phones, tablets or laptops regardless of the network they are using.”

With a software-defined data center, “the enterprise IT organization can even present a different set of applications to the same user depending upon which device they are using to access the app store and which network that they are accessing," he said.

All of this will be combined in what Daniel Kent calls the “intercloud.”

"The term cloud clearly embeds many technologies and business models,” said Kent, director of federal solutions and CTO of the US Federal division of Cisco. “And technologies used to build out clouds are numerous. So, many streams of innovation and development will continue to add impact to data centers over the coming years."

Cloud computing is still in its nascent stage, with a range of cloud types being built. “Over the coming years we will start to see the world of many clouds emerge,” he said. “This is where agencies will leverage dozens or even hundreds of various types of clouds that are providing specific resources to meet specific needs of the agencies."

Kent's vision will require intercloud connectors: open systems that give agencies consistent network operations, security, visibility and control over resources hosted in another party's cloud. One example is an agency leveraging compute power in Amazon's cloud to meet a program's dynamic needs, something agencies such as NASA are already experimenting with today.

Kent agrees that software-defined networking will be transformative. "SDN is just starting to be adopted in academia and research labs and for multitenant cloud architectures," where applications can dynamically use network resources based on their needs, he said. At the same time, a highly scalable network fabric is coming into play.

“Put these technologies together and the notion of a highly programmable and scalable network fabric that can dynamically meet the needs of the applications in real time is what you will see in the future," he said.

Finally, Kent expects the future to be shaped by the increasingly popular "Internet of things." Those devices are already out there, and data centers will have to deal with them one way or another. "We expect to have 50 billion things connecting to networks by 2020, up from 10 billion now," he said.

"As more and more devices, sensors and objects connect to the Internet and each other, it will create an enormous amount of network connections. More interestingly, these connected objects will collectively generate vast amounts of unstructured data,” he said. “These billions of connected devices will provide a continuous stream of real-time data over the network. These large data sets will be transported, collected, aggregated and analyzed in the data centers and will create new paradigms for data center networks, computing and storage."

About the Author

John Breeden II is a freelance technology writer for GCN.

