Is it possible to build software free of defects?
- By Florence Olsen
- Mar 08, 1999
CHANTILLY, Va. - Software quality experts are a demanding bunch. They
see the nation's infrastructure being put at risk by poor software quality - and
they are not even talking about year 2000 vulnerabilities.
But even the experts at the International Software Assurance Certification Conference
here last week could not reach a consensus on the root cause of poor quality.
"We're building systems out of [software] components that are built in a
slipshod manner," said Gary McGraw, vice president of business development for
Reliable Software Technologies Corp. of Sterling, Va.
"Venture capital rather than technology is driving software development in this
country," said Marcus Ranum, chief executive officer of Network Flight Recorder Inc.,
an Internet security company in Woodbine, Md.
Organizational neglect of software processes exceeds the poor workmanship of
individual programmers as a source of errors, said Don O'Neill, founder of the
National Software Quality Experiment.
O'Neill launched his quality project nine years ago after a three-year residency
at the Software Engineering Institute in Pittsburgh and 27 years as a software engineering
manager at IBM Corp.'s Federal Systems Division. His experiment focuses attention on
software product quality and reveals patterns of neglect in the nation's software infrastructure.
He said his results show no systematic movement toward fulfilling the
Defense Department's 1991 Software Technology Strategy goal, which was to reduce
software problem rates by a factor of 10 by the year 2000.
"That quality goal is not being met, and it won't be by a lot," he said.
O'Neill reached his conclusions after analyzing the project's national database,
which is populated by software quality measures from dozens of government, military and
industry organizations.
The sampled measures include accounting, personnel and administrative applications;
administrative and management decision support; artillery fire control systems; electronic
warfare systems; Federal Aviation Administration communications; State Department embassy
support; electronic commerce; medical information systems; Joint Chiefs of Staff support;
and telecommunications software.
"The samples show a defect rate of 2.4 per 1,000 lines of code, which is not good,"
O'Neill said. Every software organization should treat the finding as a "wake-up
call," he said.
With 788,459 lines of source code available to study, O'Neill reported 11,375
defects. Of the defects, 14 percent were bad enough to affect code execution and another
85 percent were minor. He said code inspectors found a major defect every 76 minutes.
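The reported figures can be checked with a little arithmetic. A short sketch (the interpretation that the 2.4-per-1,000-lines rate refers to major defects only is an assumption, not stated in the article):

```python
# Defect-density arithmetic from the figures reported in the article.
# The totals themselves come from the story; treating the 2.4-per-KLOC
# rate as a major-defect rate is an assumption for illustration.

total_loc = 788_459        # lines of source code studied
total_defects = 11_375     # total defects reported
major_share = 0.14         # 14 percent affected code execution

# Raw density: defects per 1,000 lines of code (KLOC)
defects_per_kloc = total_defects / total_loc * 1000
print(f"all defects per KLOC:   {defects_per_kloc:.1f}")   # ~14.4

# Density of major (execution-affecting) defects alone
major_per_kloc = total_defects * major_share / total_loc * 1000
print(f"major defects per KLOC: {major_per_kloc:.1f}")     # ~2.0
```

Note that the raw totals work out to roughly 14.4 defects per 1,000 lines, while major defects alone come to about 2.0 per 1,000 - close to the 2.4 figure O'Neill cites, which suggests the quoted rate counts only the serious defects.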
According to O'Neill, organizations are responsible for the poor processes that
produce the most frequently occurring software errors. He traced more than 40 percent of
errors back to poorly documented requirements, specifications, and design, code and test
procedures, resulting in software products that are not under "intellectual
control." Without traceable documentation, O'Neill said, software products
are not well-connected to the requirements that inspired their creation.
Organizations that want to reduce their software defects, he said, should improve their
lifecycle documentation and be rigorous about following programming language standards.
Another quality expert said the language standards are part of the problem. "Languages
themselves are too complex," said Jeffrey Voas, vice president of Reliable
Software Technologies, who cited the 800-page manual for C++.
Network Flight Recorder's Ranum said the nation is operating atop a pyramid of crud.
Companies, he said, are being pushed into public offerings, which means
"you've got to come out with a new version of crud every four months or your investors
start to get concerned."
"How do we produce such a tremendous flow of software and have any of it be
good?" Ranum asked. "The answer, I think, is pretty clear."