Snapshot: Virtualization

Software-defined platforms define future of virtualization

As government has pushed the mantra of “more bang for the buck,” virtualization has become an accepted way of doing IT. Server virtualization is transforming the data center environment, and storage and network virtualization are following close behind. With physical infrastructure now so abstracted, the traditional approach of throwing more hardware into the mix to solve problems is being turned on its head.

Software-managed IT environments are now seen as a large part of the future. Software-defined networking (SDN) is an emerging practice, and software-defined storage (SDS) is quickly gaining ground. Building on these innovations, software-defined data centers are just over the horizon.

Inevitably, that has led to thoughts about software-defined anything (SDx). As dependence on physical hardware is reduced, so the thinking goes, software can manage entire environments, vastly increasing the flexibility and agility with which agencies use IT resources.

What once took days, weeks or months to set up and configure with physical IT can be deployed in hours, minutes or, in some cases, seconds in the virtualized world of SDx. It’s also much easier to match those resources to actual requirements, doing away with the costly overcapacity that is often built into physical environments to accommodate expected future demand.
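To make the contrast concrete, here is a minimal, purely illustrative Python sketch of the declarative style behind SDx provisioning. The resource classes and the reconcile routine are hypothetical, not any vendor’s API; the point is simply that desired capacity is declared in software and reconciled automatically rather than racked and cabled by hand.

    from dataclasses import dataclass

    # Hypothetical illustration: these classes do not correspond to a real
    # vendor API. Desired capacity is declared as data and reconciled by
    # software, instead of being ordered, racked and cabled as hardware.

    @dataclass(frozen=True)
    class VirtualMachine:
        name: str
        vcpus: int
        memory_gb: int

    @dataclass(frozen=True)
    class VirtualNetwork:
        name: str
        cidr: str

    # Desired state: sized to today's workload, easy to change tomorrow.
    desired = [
        VirtualNetwork("app-net", "10.10.0.0/24"),
        VirtualMachine("web-01", vcpus=2, memory_gb=4),
        VirtualMachine("web-02", vcpus=2, memory_gb=4),
        VirtualMachine("db-01", vcpus=4, memory_gb=16),
    ]

    def reconcile(desired_resources, current_inventory):
        """Provision whatever is declared but not yet running."""
        for resource in desired_resources:
            if resource not in current_inventory:
                # A real controller would call the virtualization platform
                # here; this sketch just records the action.
                print(f"provisioning {resource}")
                current_inventory.add(resource)

    current = set()              # nothing deployed yet
    reconcile(desired, current)  # seconds of software, not weeks of hardware

Because the environment is described as data, resizing it means editing the declaration and reconciling again, which is why the overcapacity cushion of physical builds becomes unnecessary.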

SDx is certainly more concept than reality right now, but the idea is quickly gaining ground. In 2014, market researcher Gartner listed SDx as one of the top 10 technologies to watch and include in strategic planning. Others on the list include the Internet of Things, mobile, smart machines and various cloud-based infrastructures.

Likewise, the Institute of Electrical and Electronics Engineers (IEEE) Computer Society said interoperability issues and standards for SDx would be a top priority for 2015. Various standards groups such as the Open Networking Foundation, the Internet Engineering Task Force and the International Telecommunication Union are already working on the appropriate specs.

Government agencies are dipping their toes into specific software-defined technologies such as networking and storage. The Defense Information Systems Agency (DISA), for example, has set up a software-defined network working group. It included money in its FY 2016 budget request to launch pilot programs to see how Defense Department networks can use SDN. Other funds would be used to develop a Technology Environment that will evaluate and characterize new technologies, including SDx.

Researchers at the Idaho National Laboratory (INL) have already gone further. They’ve developed a proof of concept to see how to apply SDx to the laboratory’s business environment. It emulated the use and security of INL business systems accessed by a large number of virtual machines, with software providing control intelligence that would otherwise be embedded in hardware.
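The article does not publish the laboratory’s implementation, but the general pattern the prototype describes, access rules expressed in software and evaluated for a large fleet of virtual machines rather than embedded in network hardware, can be sketched roughly as follows. Every name below is a hypothetical illustration, not INL code.

    import random

    # Hypothetical sketch only. Access rules live in one software-defined
    # policy table and are applied uniformly to many emulated VMs, which is
    # what makes the controls repeatable and auditable.

    ACCESS_POLICY = {
        # (vm_role, target_system) -> allowed?
        ("finance", "erp"): True,
        ("finance", "hr-records"): False,
        ("hr", "hr-records"): True,
        ("hr", "erp"): False,
    }

    def make_fleet(size):
        """Emulate a large number of VMs with assigned business roles."""
        roles = ["finance", "hr"]
        return [{"name": f"vm-{i:04d}", "role": random.choice(roles)}
                for i in range(size)]

    def is_allowed(vm, target):
        """Software control point: one place to change, test and audit."""
        return ACCESS_POLICY.get((vm["role"], target), False)

    fleet = make_fleet(500)
    denied = sum(1 for vm in fleet if not is_allowed(vm, "erp"))
    print(f"{denied} of {len(fleet)} emulated VMs are blocked from the ERP system")

In a sketch like this, changing who may reach a business system is a one-line edit to the policy table rather than a reconfiguration of firewalls or switches, which is the kind of consistency the INL researchers were testing for.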

In a recent issue of Government Computer News, Wayne Simpson, the INL innovation architect, and research scientist Tammie Borders described how the prototype solution they developed showed SDx “can be used to improve security, repeatability of process and consistency in results.” They concluded that by adopting SDx approaches, organizations could reduce employee workload, improve security controls and optimize existing IT investments.

“As the dependence on hardware for the intelligence to implement access and security controls diminishes, organizations must overcome traditional thinking and drive changes in regulatory restrictions,” according to Simpson and Borders. “As these challenges are addressed, SDx will become more widely adopted and will change how information is accessed and consumed worldwide.”