Is it possible to build software free of defects?

CHANTILLY, Va.—Software quality experts are a demanding bunch. They
see the nation’s infrastructure being put at risk by poor software quality—and
they are not even talking about year 2000 vulnerabilities.

But even the experts at the International Software Assurance Certification Conference
here last week could not reach a consensus on the root cause of poor quality.

“We’re building systems out of [software] components that are built in a
slipshod manner,” said Gary McGraw, vice president of business development for
Reliable Software Technologies Corp. of Sterling, Va.

“Venture capital rather than technology is driving software development in this
country,” said Marcus Ranum, chief executive officer of Network Flight Recorder Inc.,
an Internet security company in Woodbine, Md.

“Organizational neglect of software processes exceeds the poor workmanship of
individual programmers as a source of errors,” said Don O’Neill, founder of the
National Software Quality Experiment.

O’Neill launched his quality project nine years ago after a three-year residency
at the Software Engineering Institute in Pittsburgh and 27 years as a software engineering
manager at IBM Corp.’s Federal Systems Division. His experiment focuses attention on
software product quality and reveals patterns of neglect in the nation’s software organizations.

He said his results show “no systematic movement” toward fulfilling the
Defense Department’s 1991 Software Technology Strategy goal, which was to reduce
software problem rates by a factor of 10 by the year 2000.

“That quality goal is not being met, and it won’t be by a lot,” he said.
O’Neill reached his conclusions after analyzing the project’s national database,
which is populated by software quality measures from dozens of government, military and
industry applications.

The sampled measures include accounting, personnel and administrative applications;
administrative and management decision support; artillery fire control systems; electronic
warfare systems; Federal Aviation Administration communications; State Department embassy
support; electronic commerce; medical information systems; Joint Chiefs of Staff support;
and telecommunications software.

The samples show a defect rate of 2.4 per 1,000 lines of code, which is not good,
O’Neill said. Every software organization should treat the finding as “a wakeup
call,” he said.

With 788,459 lines of source code available to study, O’Neill reported 11,375
defects. Of those defects, 14 percent were serious enough to affect code execution; another
85 percent were minor. He said code inspectors found a major defect every 76 minutes.

According to O’Neill, organizations are responsible for the poor processes that
produce the most frequently occurring software errors. He traced more than 40 percent of
errors back to poorly documented requirements, specifications, and design, code and test
procedures resulting in “software products that are not under intellectual
control.” Without traceable documentation, O’Neill said, software products
“are not well-connected to the requirements that inspired their creation.”
Organizations that want to reduce their software defects, he said, should improve their
lifecycle documentation and be rigorous about following programming language standards.

Another quality expert said the language standards are part of the problem. Languages
themselves “are too complex,” said Jeffrey Voas, vice president of Reliable
Software Technologies, who cited the 800-page manual for C++.

Network Flight Recorder’s Ranum said the nation is operating atop a pyramid of
beta software.

Companies, he said, are being “pushed into public offerings, which means
you’ve got to come out with a new version of crud every four months or your investors
start to get concerned.”

“How do we produce such a tremendous flow of software and have any of it be
good?” Ranum asked. “The answer I think is pretty clear.”
