Making big data more efficient and accessible
- By Bill Lemons
- Jun 08, 2016
The battle to manage big data while making information useful to agency personnel is one that federal network administrators fight every day. But there are a few strategies that can help them manage the challenge.
The problem is not that there is too much data; rather, it’s difficult for administrators to deliver that data effectively. As the amount of data continues to grow, delivery is understandably hampered, especially as users expect data to be available quickly, whenever and wherever it’s needed.
Unfortunately, this expectation fails to take into consideration that managing the sheer volume of data can be a challenging and time-consuming process itself. Administrators must continually look for ways to process and present their agencies’ data in unique, usable and reliable ways. That takes effort and resources.
Further, information is not all linear or easily consumable. While data about water consumption at Fort Bragg or the traffic flow in and out of Fort Meade can be considered fairly manageable, much of the new data agencies receive is dynamic. That dynamism, along with the growing volume, will continue to push network administrators to find innovative ways to move data across networks that were never built to handle this much traffic.
One way to manage this kind of scale is through applications that deliver tailored information that caters to each individual’s interests. Administrators can provide agency employees with the big data equivalent of Uber or Waze -- apps that are easily downloadable and serve specific needs. While this removes the IT middleman to some extent, it accomplishes the overall mission of providing end users with easy access to actionable information, delivered in bite-sized chunks.
However, even that doesn’t change the fact that agencies will need networks equipped to handle the delivery of those applications and the data they present. This is a challenge that is just as big as the data itself.
To handle the big data dilemma, agencies must implement networks that are highly agile, flexible and scalable. Those networks must become the critical infrastructure that allows data to be pushed out on demand, at any time, to any place, especially as military intelligence operations become more decentralized.
Software-defined networking (SDN) is the ideal conduit for this type of service because it creates a network that is elastic, resilient and built for delivering applications and data on demand. Software-defined networks can serve as the cornerstone for modern Defense Department networks, which are no longer necessarily static, physical installations that exist in a single place, but are built on agility and fluidity.
SDN gives organizations a foundation to more easily handle growing data demands. Administrators can’t accurately predict the volumes of data their networks will experience in five or 10 years, but they know it’s going to grow. That growth demands software-based platforms that are more malleable than traditional hardware-based infrastructures. SDN certainly fits that bill.
SDN also makes accurate data analysis a possibility by allowing administrators to gauge performance across all network layers. They can monitor data consumption and traffic as well as identify and isolate potential problems and areas of concern. This results in improved data delivery.
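The kind of monitoring described above can be sketched in miniature: a controller exports per-port traffic counters, and a simple check isolates the hot spots that deserve attention. The port names, counter format and 80 percent utilization threshold below are illustrative assumptions for the sketch, not any particular controller's API.

```python
# Illustrative sketch: flag congested ports from per-port throughput
# counters exported by an SDN controller. The counter shape and the
# 80% threshold are assumptions for this example only.

def flag_congested_ports(counters, capacity_bps, threshold=0.8):
    """Return (port, utilization) pairs for ports whose measured
    throughput exceeds the given fraction of link capacity,
    busiest first."""
    flagged = []
    for port, bits_per_sec in counters.items():
        utilization = bits_per_sec / capacity_bps
        if utilization >= threshold:
            flagged.append((port, round(utilization, 2)))
    return sorted(flagged, key=lambda p: p[1], reverse=True)

# Example: sampled throughput on three hypothetical 1 Gb/s links
sample = {
    "ge-0/0/1": 950_000_000,
    "ge-0/0/2": 120_000_000,
    "ge-0/0/3": 860_000_000,
}
print(flag_congested_ports(sample, capacity_bps=1_000_000_000))
```

In a real deployment the counters would come from the controller's telemetry interface rather than a hard-coded dictionary, but the isolate-and-rank step is the same idea.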
Just as it’s important to remain flexible in terms of scale, it’s also critical for agencies to invest in open technologies rather than proprietary tools. Open standards allow for a high degree of interoperability and are often more cost-effective than their proprietary equivalents. Investing in open standards now can improve the agility of an agency’s network infrastructure and prevent vendor lock-in down the road. It’s another key consideration that will help agencies adapt to shifting data requirements.
Together, all of these initiatives can help government networks become more agile, intelligent, self-healing and poised for growth. While they give administrators the ability to control and collate data, they also incorporate automation that allows the network itself to modulate data dissemination and automatically adjust as necessary. This takes much of the onus off of administrators while enabling smarter decision-making and better data delivery to end users.
These efforts may not help network administrators win the war against big data, but they’ll certainly go a long way toward helping them meet the challenge. That, in itself, can be considered a pretty big win.
Bill Lemons is director of Federal Systems Engineering at Juniper Networks.