Air Force team provides flight tests for military apps
Test Wing analyzes command-and-control applications before they take the field
- By William Jackson
- Mar 19, 2009
The Air Force, Army, Navy and Marine Corps all rely on command-and-control (C2) systems for situational awareness and communications with their units in the field and with one another. And they all rely on the Air Force’s 46th Test Wing to ensure that their applications operate properly in the field.
The wing, based at Eglin Air Force Base, Fla., is a technology development test team that evaluates air-delivered weapons, navigation and guidance systems, and C2 systems.
“Our job is to keep projects on time and on budget,” said Lee Paggeot, the group’s C2 performance lead.
Paggeot’s team members watch the behavior of complex networks and systems as they stress-test them with real-world loads. They watch how a network behaves and how applications respond as they communicate with one another and with servers. The goal is to spot problems, find the causes and, if possible, identify fixes before new systems are deployed in the field or fall behind in development.
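The workflow Paggeot describes can be approximated in miniature: drive many concurrent clients against a server and record per-request latency, then look for outliers. A minimal sketch in Python; the throwaway local test server and the client and request counts are illustrative assumptions, not the wing's actual setup:

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import urlopen


class EchoHandler(BaseHTTPRequestHandler):
    """Trivial stand-in for the application under test."""

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request console logging
        pass


def measure(url: str) -> float:
    """Issue one request and return its latency in seconds."""
    start = time.perf_counter()
    with urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start


def load_test(url: str, clients: int, requests: int) -> list:
    """Fire `requests` requests from a pool of `clients` concurrent workers."""
    with ThreadPoolExecutor(max_workers=clients) as pool:
        return list(pool.map(measure, [url] * requests))


if __name__ == "__main__":
    # Start a disposable server on a random free port.
    server = ThreadingHTTPServer(("127.0.0.1", 0), EchoHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/"

    latencies = load_test(url, clients=20, requests=200)
    server.shutdown()
    print(f"{len(latencies)} requests, max latency {max(latencies) * 1000:.1f} ms")
```

In a real test event the "server" is the fielded C2 application and the load comes from hundreds of operators, but the principle is the same: generate realistic concurrency and watch where response times break down.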
“To do that, there is no substitute for going to an event where 200 users are crashing and burning,” he said.
Doing this work efficiently is critical to the wing’s survival. “We’re a reimbursed organization,” Paggeot said, meaning that it is paid by customers for the services it provides rather than relying on budget appropriations.
To provide its services, the wing has standardized on a handful of commercial products to help test, observe and analyze systems. The essential tools are the Observer analyzer from Network Instruments — to get a high-level view of performance with the ability to drill down to the packets — and the ServerVantage database and application-monitoring tools from Compuware, for looking into server problems.
“The power of Observer is that it’s a real-time tool, built from the ground up to use in a wide-area network,” Paggeot said. “I am watching my application behavior and characterizing and analyzing it” at the application layer, reassembling processes at the packet level to understand what is happening.
This kind of attention is relatively new for C2 systems.
“In the old days, C2 systems were not considered weapons systems,” Paggeot said. But as military systems became more integrated and started using TCP networks to communicate, the integration of new applications into C2 systems became more critical. That was apparent during the first Gulf War. “The generals said, ‘We have to start building command-and-control systems like weapons systems.’”
The importance of properly analyzing application performance was reinforced as the 46th Test Wing began testing C2 systems in a more structured way. An event it was evaluating failed and blew up — figuratively speaking — Paggeot said. “We got burned. We got smart and said, ‘This is not the way to do business.’”
The wing adopted the Observer analyzer when it saw another service use the tool at a test event. Team members were impressed with its ability to look at performance from a high level and then provide details. It also is inexpensive and tends to perform reliably, Paggeot said.
The integration of these two levels of observation and analysis makes Observer powerful, said Charles Thompson, senior product manager at Network Instruments.
“When you’re trying to troubleshoot a problem, it’s like trying to find a needle in a haystack,” Thompson said. “It’s a process of elimination,” and testers need to be able to eliminate possible causes as quickly as possible to find the real one.
Paggeot agreed with that assessment. “In performance, it’s as important to be able to say what it’s not as what it is,” he said.
Observer’s high-level reporting gives a statistical view of large, complex networks, making problems easy to spot. The team then uses Retrospective Network Analysis to drill from that statistical information down to the packet level. Packets are time-stamped and can be reassembled to analyze what happened and where a problem occurred.
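The retrospective idea is to keep time-stamped packet records and reconstruct the conversation after the fact. A generic sketch of the concept; the record format here is invented for illustration and is not Observer's actual storage format:

```python
import time
from dataclasses import dataclass, field


@dataclass(order=True)
class PacketRecord:
    """One captured packet: arrival time, TCP-style sequence number, payload."""
    timestamp: float
    seq: int
    payload: bytes = field(compare=False)


def reassemble(records, start, end):
    """Rebuild the application-layer byte stream for a time window.

    Packets are filtered to [start, end], then ordered by sequence
    number so out-of-order arrivals are put back in place.
    """
    window = [r for r in records if start <= r.timestamp <= end]
    window.sort(key=lambda r: r.seq)
    return b"".join(r.payload for r in window)


# Simulated capture: packets arrive out of order on the wire.
now = time.time()
capture = [
    PacketRecord(now + 0.002, seq=5, payload=b"world"),
    PacketRecord(now + 0.001, seq=0, payload=b"hello"),
    PacketRecord(now + 0.003, seq=10, payload=b"!"),
]

print(reassemble(capture, now, now + 1))  # b'helloworld!'
```

The time window is what makes the analysis "retrospective": an analyst can scope the reconstruction to the minutes around a reported failure instead of replaying the entire capture.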
This means retaining large amounts of data for analysis, Thompson said. “Most customers want to retain about 48 hours of data,” he said. “That allows them to cover the weekend.”
The amount of storage required depends on the size of the network. “They are all different,” Thompson said. “But the increments are getting larger.” Typical installations used to be in the 4 to 8 terabyte range, but now they have crept up to 12 to 16 terabytes.
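Those figures are consistent with back-of-the-envelope math: retained bytes are just the average capture rate multiplied by the retention window. A quick check, where the link rates below are illustrative assumptions rather than figures from the article:

```python
def retention_tb(avg_rate_gbps: float, hours: float) -> float:
    """Terabytes needed to retain `hours` of capture at a sustained rate.

    bytes = rate (bits/s) / 8 * seconds; 1 TB = 1e12 bytes.
    """
    bytes_per_second = avg_rate_gbps * 1e9 / 8
    return bytes_per_second * hours * 3600 / 1e12


for gbps in (0.25, 0.5, 0.75):
    print(f"{gbps:.2f} Gbps for 48 h -> {retention_tb(gbps, 48):.1f} TB")
```

A sustained average of roughly 0.6 to 0.75 Gbps fills 12 to 16 terabytes over 48 hours, which lines up with the installation sizes Thompson describes.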
Compuware’s ServerVantage also is a powerful tool for observing top-level processes, Paggeot said. “There are systems in use now that wouldn’t be there if we did not have Observer and ServerVantage.”
Other tools in the wing’s test suite are Compuware’s ApplicationVantage, a tool for pinpointing application performance problems; ACE Analyst, a performance management tool from OPNET Technologies; and Compuware’s QALoad and Hewlett-Packard’s QuickTest Professional for load simulation.
The wing’s evaluations are now in demand among its military customers, and it keeps about 300 research and development projects on track.
“We’ve got work galore,” Paggeot said, and the wing is well-funded now.
William Jackson is a freelance writer and the author of the CyberEye blog.