Software-defined computing enables context-aware IT
- By David Ramel
- May 21, 2014
As systems and applications evolve, and as organizational data and mobile workers move outside the data center, context awareness becomes more important, according to Anil Karmel, former deputy CTO at the National Nuclear Security Administration.
"As soon as that data leaves your data center, you have to redefine that network perimeter and the security controls that are in it. So as your workloads move, your security moves with it," he said.
Karmel, the founder and CEO of cloud security company C2 Labs, spoke about software-defined networking in a recent webinar titled "Evolution or Revolution? The Software-Defined Shift in Federal IT," along with Steven Hagler of Hewlett-Packard Co. The event was produced by MeriTalk and sponsored by UNICOM Government IT and HP.
Part of the solution entails building intelligence into applications rather than into the endpoints or devices that consume the data, Karmel said. Intelligence that makes applications more aware of the type of data they're handling will enable context-aware decisions about where to store that data and where to apply security.
To achieve this, it's paramount to design apps with users in mind and with security considerations built-in at the beginning, rather than have them "bolted on" later, Karmel said.
"Understanding the user, understanding what data they're trying to access, where they're trying to access that information, with security that's baked in at the beginning is absolutely key," he said.
Along with security, software-defined computing offers many other benefits, including straightforward energy savings, according to Hagler, who was on hand to discuss the HP Moonshot server, which the company bills as the first software-defined server and which has been in service for about a year.
Issues such as power consumption become more problematic for companies tied to doing things the old way in a new age of mobility, social engagement, big data, cloud computing and more, he said.
Focusing on workload-specific infrastructure rather than general-purpose systems, and moving functionality from clients to servers, also yields savings in equipment acquisition costs, lower power consumption and a smaller infrastructure footprint, he said.
As agencies start to look at software-defined environments, Hagler advised that they think about the problems they're trying to solve and not try to solve them all with the same technology, as many more tools are now available.
A major issue faced by customers is how to go from a general-purpose to a workload-specific approach, and they should challenge their vendors to help in that transition. "Think a little bit more granular about what we're trying to solve," Hagler concluded.
As for the shift to SDN, Karmel called it inevitable. "The space is definitely emerging and evolving, but it is the place that we will all end up at," he said. "SDN is that new paradigm; it is that new paradigm shift."
A longer version of this article appeared on Virtualization Review, a sister site to GCN.
David Ramel is the editor of Visual Studio Magazine.