All for one, but not one for all

NSA finds automated vulnerability testing requires the whole toolbox

Vasko Miokovic

Organizations trying to automate the process of testing software for vulnerabilities have no choice but to deploy a multitude of tools, according to a study conducted by the National Security Agency.

NSA tested five software vulnerability assessment tools on eight different applications, said Kris Britton, technical director at NSA's Center for Assured Software, speaking at the DHS-DOD Software Assurance Forum in Fairfax, Va., March 8.

The key result: 84 percent of the vulnerabilities found were identified by only one of the five tools. In other words, dropping any single tool would have meant missing flaws that none of the others caught.

'No tool stands out as an uber-tool,' Britton concluded. 'Each has its strengths and weaknesses.' The reason for this phenomenon, he theorized, is that the tools were originally developed for narrow testing purposes.
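To see why that distribution argues for a toolbox rather than a single product, consider how the overlap in findings might be measured. The sketch below is illustrative only: the tool names, files and weakness labels are hypothetical, and each finding is keyed so that the same flaw reported by two tools counts once.

```python
# Minimal sketch of measuring how many findings are unique to a single tool.
# Tool names and findings are hypothetical; each finding is keyed by
# (file, line, weakness class) so a flaw reported by two tools counts once.
from collections import Counter

findings_by_tool = {
    "tool_a": {("auth.c", 42, "buffer-overflow"), ("parse.c", 10, "format-string")},
    "tool_b": {("auth.c", 42, "buffer-overflow"), ("sql.c", 77, "sql-injection")},
    "tool_c": {("web.py", 5, "cross-site-scripting")},
}

# Count how many tools reported each distinct finding.
reports = Counter(f for findings in findings_by_tool.values() for f in findings)

unique = sum(1 for n in reports.values() if n == 1)
total = len(reports)
print(f"{unique} of {total} findings ({unique / total:.0%}) came from only one tool")
```

If most findings come from only one tool, as in NSA's study, removing any tool from the mix removes coverage that no other tool replaces.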

As a result, organizations need to be 'looking at various testing approaches,' including static code analysis (SCA) and penetration testing, said Ryan Berg, chief scientist at Ounce Labs, a software security firm based in Waltham, Mass.

But the point-solution tools Berg referred to 'do not provide the comprehensive evaluation required for a significant portion of the software industry,' said Djenana Campara, CEO of KDM Analytics, a software consultancy in Wilmington, Del.

'The tooling industry has not kept pace with software system evolution,' Campara said. 'Software organizations have recognized the need to evolve their systems beyond homogeneous and monolithic solutions.'

An analysis undertaken on open-source software by QinetiQ for the United Kingdom's Ministry of Defense found that 'SCA tools have been either extremely slow to analyze software, or inaccurate,' according to Colin O'Halloran, director of QinetiQ's systems assurance group.

The quest for comprehensive, automated analysis of system vulnerabilities suffers from the fact that 'point tools cannot share information and interoperate,' said Mike Kass, software assurance project leader at the National Institute of Standards and Technology.

The core of that difficulty, according to Kass, is the lack of 'agreement on what a software weakness is.'

'Tools use their own taxonomies and definitions of weaknesses,' he said, 'and this creates mixed results among tools. You can get different results from different tools on the same code, or the same results but with different descriptions' of the weaknesses identified.

Mitre Corp. has made some progress on developing a common language for software vulnerabilities, with its initial list of Common Vulnerabilities and Exposures (CVE) and, more recently, the Common Weakness Enumeration (CWE).

'CVE is a database of vulnerability definitions and descriptions,' Berg said. 'CWE is an effort at coming up with a common taxonomy for describing what a particular vulnerability is.'

CVE includes a list of 20,000 vulnerabilities; CWE includes 600 categories of weaknesses.

The point of CWE is to 'enable more effective discussion, description, selection, and use of software security tools,' Kass said. More than 50 vendors are participating in the effort.
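As an illustration of what a common taxonomy buys, the sketch below maps vendor-specific finding labels onto CWE identifiers so that results from different tools can be compared. The tool names and the label-to-CWE mappings are hypothetical; only the CWE identifiers themselves (CWE-89 for SQL injection, CWE-79 for cross-site scripting) are real entries in the enumeration.

```python
# Hypothetical sketch: normalize vendor-specific weakness labels to CWE IDs
# so findings from different tools can be compared. The tool names and the
# label-to-CWE mapping are illustrative, not taken from any actual product.

TOOL_LABEL_TO_CWE = {
    ("tool_a", "Possible SQL injection"): "CWE-89",
    ("tool_b", "Tainted data reaches SQL query"): "CWE-89",
    ("tool_a", "Cross-site scripting"): "CWE-79",
    ("tool_b", "Unvalidated output sent to browser"): "CWE-79",
}

def normalize(tool: str, label: str) -> str:
    """Map a tool-specific finding label to a CWE identifier, if one is known."""
    return TOOL_LABEL_TO_CWE.get((tool, label), "CWE-unmapped")

# Two tools describing the same weakness differently converge on one CWE entry.
print(normalize("tool_a", "Possible SQL injection"))          # CWE-89
print(normalize("tool_b", "Tainted data reaches SQL query"))  # CWE-89
```

With findings expressed in CWE terms, a purchaser can at least ask which categories each tool claims to cover, which is exactly the comparison Kass describes as difficult today.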

Still, he noted, 'There is little overlap among tools regarding what they claim to catch in the CWE.

'This creates questions for purchasers of tools regarding the tool's purported effectiveness and usefulness.'

Evaluations should focus on giving vendors an understanding of 'what part of the market their tool is best targeted at,' and customers the ability 'to select the most appropriate tool to use,' said QinetiQ's O'Halloran.

So, for now, organizations testing software should rely on a 'tool-box approach,' said NSA's Britton, who still hopes that more comprehensive tools will be developed as that industry matures.

On the other hand, 'writing code well is far more important than using back-end tools,' said Paul Black, a computer scientist at NIST.
