Weapons projects misfire on software

Cost overruns constrict already tight budgets, GAO says

R&D Over Budget: Seven of eight critical technologies for the Joint Strike Fighter are not yet mature.

Courtesy of the U.S. Department of Defense

Every year the Government Accountability Office issues a report that gives a brief summary of the status of major weapons acquisition programs. And every year the reports say that many, if not most, of those acquisition programs are experiencing cost overruns and schedule delays in their software development segments.

The problem is huge. In fiscal 2006, the Defense Department will spend as much as $12 billion on reworking software, 30 percent of its estimated budget of $40 billion for research, development, testing and evaluation. By comparison, Motorola, like other large commercial companies, spends only a small percentage of its budget on rework.

Nor can the significance of the problem be overlooked. In its summary for 2006, Assessments of Selected Major Weapon Programs (GCN.com/605), GAO pointed out that, in the past five years, "DOD has doubled its planned investments in new weapons systems from $700 [billion] to $1.4 trillion. This huge increase has not been accompanied by more stability, better outcomes or more buying power for the acquisition dollar."

The huge difference between military and private-sector efforts, according to Carol Mebane and Cheryl Andrew of GAO's weapons acquisition audits practice, exists because corporations use a structured, replicable approach to software development that emphasizes requirements planning upfront.

A few years ago, the two auditors spent months studying how commercial best practices could be applied to DOD projects to control both cost factors and schedule delays. They spoke to an audience of software and systems engineers at the Software and Systems Technology Conference in May, revisiting the conclusions of their 2004 report, Stronger Management Practices Are Needed to Improve DOD's Software-intensive Weapon Acquisitions (GCN.com/606).

The importance of improving software development efficiency can't be overstated, Mebane said. When DOD developed the F-4 fighter in the 1960s, less than 10 percent of its functionality was based on software; in today's development of the F/A-22, it's more than 80 percent.

The Joint Strike Fighter program, one of those included in this year's report, has seen R&D overruns totaling 30 percent. Despite that, when it's time for DOD to make a production decision on the JSF, "the program will have released about 35 percent of the software needed for the system," GAO found.

Additionally, seven of the eight "critical technologies" identified by the watchdog agency are not yet mature; indeed, they "are not expected to be until after the design review" phase is over.

"Based on our discussions with individual [companies], three factors determine" the success of a software development program, Andrew said. "A manageable environment, disciplined processes and metrics, metrics, metrics."

Creating a manageable environment means breaking software projects into smaller pieces, each generally on a six-month schedule.

"In DOD, a project can be two years, three years, even four years long. It makes it hard for a program manager to get his arms around a project, [or to] get a handle on costs," Andrew said.

Both software and hardware programs, whether in government or industry, generally follow a specific four-phase process, she said: requirements, design, coding and testing.

In the companies they examined, the GAO team found that 90 percent to 95 percent of requirements for a software program were set in the first phase, and leading companies are willing to spend 20 percent to 30 percent of their resources on getting the requirements established.

Also, projects in commercial companies undergo frequent reviews with management, and software teams often conduct reviews weekly to identify where problems could arise. At DOD, on the other hand, major management reviews of software projects usually happen only once a year, or even two years apart.

"We were shocked at that," Andrew said. But when GAO "recommended that program offices should get involved more often instead of waiting for major reviews, there was resistance. ... The program offices didn't have access to [software development status information], and didn't look for it."

GAO found that industry metrics fell into seven categories: requirements, cost, schedule, quality, size, tests and defects.

"Defects are a big, big metric," Andrew said. "Motorola tracked both errors and defects." An error, she said, was a problem caught in the requirements phase, while a problem caught in later stages of development was considered a defect.

Motorola even treats the sheer number of errors and defects it finds as a metric in itself, she said.

"Motorola knows how many errors and defects it is likely to find. Finding too many or too few is also an error," she said, and the company re-examines its processes to see if something has been missed.
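The practice Andrew describes, classifying an issue as an error or a defect by the phase in which it was found, then checking the total count against an expected band, can be sketched roughly as below. This is an illustrative sketch only, not Motorola's actual tooling; the phase names and the expected range are hypothetical:

```python
# Illustrative sketch (not Motorola's actual system) of phase-based
# issue classification and an expected-count check.

PHASES = ["requirements", "design", "coding", "testing"]

def classify(phase_found):
    # Per the article: a problem caught in the requirements phase is an
    # "error"; one caught in any later stage is a "defect."
    if phase_found not in PHASES:
        raise ValueError(f"unknown phase: {phase_found}")
    return "error" if phase_found == "requirements" else "defect"

def audit(issues, expected_range):
    # Count errors and defects, then flag a process review when the
    # total falls outside a (hypothetical) band derived from history.
    # Finding too many OR too few is itself treated as a warning sign.
    counts = {"error": 0, "defect": 0}
    for phase in issues:
        counts[classify(phase)] += 1
    low, high = expected_range
    total = counts["error"] + counts["defect"]
    counts["review_needed"] = not (low <= total <= high)
    return counts

# Six issues found against an expected band of 10-20: too few were
# found, so the process itself gets flagged for re-examination.
result = audit(
    ["requirements", "requirements", "design", "coding", "testing", "testing"],
    expected_range=(10, 20),
)
```

The point of the band check is the counterintuitive part of Andrew's remark: an unusually low find count is not treated as good news but as a sign the process may be missing problems.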

As part of the DOD audit, GAO examined five weapons programs, two of them involving existing systems and redevelopment of software, three of them new systems.

The first two did a relatively good job of staying close to time and cost estimates. But the three new programs saw cost and schedule increases of more than 100 percent.
One of them, the Comanche helicopter program, was ultimately abandoned by the Army.

Revolution canceled

"With the Comanche, DOD was seeking to make revolutionary changes in the way helicopters were built," Mebane said, "but there was not a lot of analysis into allocating requirements. This weapons system was cancelled ... because the Army decided they could no longer afford to pour resources into it."

Based on the 2004 report, Mebane said, the Air Force adopted the processes in its software improvement plans, and DOD amended its 5000 series acquisition policy to include more emphasis on systems engineering and evolutionary development.
But more improvement is needed.

"Every year we do assessments on weapons systems. This year there are 52 of them" in the summary, she said. Almost 35 percent of them are using immature technologies.
"This is kind of a hand-raise for, 'You're going to have problems later on,'" she said.

DOD is not alone in wrestling with these problems. GAO is trying to find ways to measure the performance of software development programs within the overall weapon acquisitions process.

"We [assess] 50 or 60 major weapons systems, each one confined to two pages. We go through technological, design and manufacturing risks on the programs," said Mike Sullivan, director of GAO's acquisition sourcing and management team. "The software metric is something we've thought a lot about getting into. ... We're trying to figure out a way to depict [it]."
