WAN Optimization Gets a Closer Look
It used to be that if agencies had a problem with network capacity, they would simply throw more bandwidth at it or tweak the quality of service to speed up the delivery of services and data. Given the current surge in capacity demand caused by teleworking, data center consolidation, cloud, video and other trends, such quick fixes are no longer as viable.
That’s putting renewed focus on the wide-area network (WAN), as these concerns grow to cover the distributed enterprise. WAN optimization is not new to government, but it is drawing increased interest.
While 23 percent of federal, state and local government IT professionals surveyed in a recent 1105 Government Information Group report said their organizations had no interest in WAN optimization, another 23 percent had already deployed the technology, and 16 percent were planning deployments. The remainder had it under consideration.
Government networks are by far some of the largest and most geographically diverse networks in the world, with some remote locations offering very limited infrastructure, said Tim Braly, federal distinguished architect and senior systems engineer at networking company Brocade. Simply adding bandwidth is not the answer in those environments, which adds urgency to the need for WAN optimization.
“Delays caused by distance combined with limited bandwidth can severely impact certain applications like voice and time-sensitive data, as well as overall user experience and productivity,” he said. “If user experience and lost productivity are left out of the equation when determining the business case, then it’s all too easy to overlook the benefits of implementing WAN optimization.”
WAN optimization has already provided results for some federal agencies. The Department of the Interior included it as part of its 2011 IT Transformation Initiative, for example, and projects cost savings of as much as $100 million a year from 2016 through 2020. The Defense Intelligence Agency used the technology in 2010 to give users in some 50 locations worldwide access to multimedia applications. Other agencies have used it to extend such things as high-capacity satellite links to far-flung users.
In some ways, though, government still has a ways to go to catch up to the use of WAN optimization in the wider world.
“In 2012, two-thirds of the organizations we talked to were using WAN optimization for data center to branch office communications,” said John Burke, an analyst with Nemertes Research. “That use tilted more towards the larger organizations, and there we found that the technology was broadly understood and deployed.”
WAN optimization can be accomplished using a variety of techniques, involving both hardware and software. Data compression and data deduplication, for example, both focus on reducing the amount of redundant data that has to travel across the network. Compression shrinks files by squeezing out redundant data before they are transmitted, whereas deduplication avoids resending data the far side already has by transmitting references, such as a link to a file on a server, rather than the actual data.
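The two techniques can be illustrated with a minimal sketch. This is not any vendor's implementation, just Python's standard zlib and hashlib modules standing in for a WAN appliance: compression shrinks the payload itself, while the deduplication side keeps a table of chunks it has already seen so that a repeat transfer crosses the link as a short hash reference instead of the full data.

```python
import hashlib
import zlib

def compress(payload: bytes) -> bytes:
    """Compression: squeeze redundancy out of the payload itself."""
    return zlib.compress(payload)

# Deduplication: remember chunks already sent across the WAN, so repeats
# travel as short hash references instead of the data itself.
seen_chunks: dict[str, bytes] = {}

def dedupe(chunk: bytes):
    digest = hashlib.sha256(chunk).hexdigest()
    if digest in seen_chunks:
        return digest, None           # already sent: reference only
    seen_chunks[digest] = chunk
    return digest, chunk              # first sight: send data plus reference

data = b"quarterly report " * 100     # highly redundant sample payload
packed = compress(data)
assert zlib.decompress(packed) == data and len(packed) < len(data)

ref1, body1 = dedupe(data)
ref2, body2 = dedupe(data)            # second transfer of the same chunk
assert body1 is not None and body2 is None  # only the hash travels twice
```

Real appliances deduplicate at the byte or block level rather than whole files, but the principle is the same: the second transfer costs a few dozen bytes instead of the full payload.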
Other techniques include locally caching frequently used data at remote sites, traffic shaping to prioritize traffic flow by user or application, quality-of-service settings that favor critical traffic, and error correction to minimize data loss.
Optimization does not come free, however. It requires an investment of time, resources and money to deploy, and it's not for every government organization.
“If they have legacy network systems that they have done due diligence on and feel are operating as efficiently as needed, then they may not need to make those changes,” said Shawn McCarthy, research director for IDC Government Insights. “They should only do it as the business analysis tells them the time has come.”
That analysis should also include systems that often have fallen below the radar, such as storage, which can both be affected by, and have a big effect on, network performance.
Those systems will be creating a significant amount of WAN traffic, Burke said, particularly if an agency's storage infrastructure reaches outside of the data center and the contents of a storage array are widely replicated, or if various storage appliances are being synced in a "hot-hot" environment.
“Typically you want to get faster response and performance on the network from other things than storage replication,” he said. “If your storage infrastructure can’t intelligently compress its own content and minimize its footprint in the network, then optimization definitely has a role to play there.”