How to read signs of safe software

Microsoft, industry group wrestle with metrics for software assurance and how to set priorities

Take it slow: NSA's Tony Sager advises assessing how critical the software affected by a flaw is before rushing to fix it.

Rick Steele

The idea that you can manage only what you measure applies to software assurance. But the development of metrics for that field is still in its infancy, according to speakers at the recent DHS-DOD Software Assurance Forum in Fairfax, Va.

The Defense Department defines software assurance as 'the level of confidence that software functions as intended and is free of vulnerabilities,' said Kristen Baldwin, deputy director for software engineering and systems assurance in the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics. It stands to reason, then, that determining a 'level of confidence' requires generally accepted assurance measurements.

'There are a thousand good things to do in software security,' said Tony Sager, chief of the vulnerability analysis and operations group at the National Security Agency. 'But if you can afford to do only 10 of them, you need some sense of measurement. You need to stop doing good things and start doing the best of things.'

Developing assurance metrics would 'help decision-makers quantify security risk exposures,' said Nadya Bartol, an associate at Booz Allen Hamilton Inc. and a member of an industry software assurance working group that is compiling a 'short list' of effective assurance measurements. The group expects to publish a final draft of its measurement guidance in September.

Microsoft Corp., meanwhile, has developed metrics through which it seeks to reduce the level of vulnerabilities in succeeding versions of its software. The company has spent the past five years developing internal measurements for the assurance of the software it ships, according to Steven Lipner, the company's senior director of security engineering strategy.

"You need to stop doing good things and start doing the best of things."
Tony Sager, NSA


'The metrics are oriented toward improving future product versions,' Lipner said. 'Ideally, we wanted to have an assessment done before we ship the darn product.'

A Microsoft security team developed two software assurance metrics, he said. The first is known as Relative Attack Surface Quotient, or RASQ. Software with a high RASQ exposes more avenues of attack and is therefore likely to fare poorly when attacked.

'RASQ measures things like default configurations, open ports, permissions, services running and the number of ActiveX controls available by default,' Lipner explained.

'Based on experience, we put together an aggregated metric that combines twenty individual pieces into a single numerical score. This allows us to compare one version of a product to its predecessor to see if we can reduce the RASQ from version to version.'
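Lipner did not spell out the formula, but the idea of folding individual attack-surface items into one comparable score can be illustrated with a small sketch. The component names and weights below are hypothetical assumptions, not Microsoft's actual RASQ calculation; only the notion of a weighted aggregate compared from version to version comes from his description.

```python
# Hypothetical sketch of a RASQ-style attack-surface score.
# Component names and weights are illustrative assumptions,
# not Microsoft's actual formula.

# Relative weights reflect how exposed each item leaves the system.
WEIGHTS = {
    "open_ports": 1.0,
    "services_running_by_default": 0.8,
    "activex_controls_enabled": 0.9,
    "weak_default_permissions": 1.2,
    "guest_account_enabled": 0.7,
}

def rasq_score(profile: dict) -> float:
    """Combine individual attack-surface counts into a single numerical score."""
    return sum(WEIGHTS[item] * count for item, count in profile.items())

# Compare a new version against its predecessor.
old_version = {"open_ports": 12, "services_running_by_default": 30,
               "activex_controls_enabled": 20, "weak_default_permissions": 5,
               "guest_account_enabled": 1}
new_version = {"open_ports": 6, "services_running_by_default": 18,
               "activex_controls_enabled": 9, "weak_default_permissions": 2,
               "guest_account_enabled": 0}

old_score, new_score = rasq_score(old_version), rasq_score(new_version)
print(f"Predecessor RASQ: {old_score:.1f}, new version RASQ: {new_score:.1f}")
print("Attack surface reduced" if new_score < old_score else "Attack surface grew")
```

The point of a single score is not precision but trend: if the number drops from one release to the next, the default installation presents fewer targets.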

The second metric, informally known as the 'vulnerability coverage method,' assumes the existence of an 'outside community of researchers providing a stream of vulnerability reports on new versions of Microsoft products,' Lipner said. This external research community is a 'euphemism for vulnerability finders that either report or exploit' vulnerabilities.

A Microsoft team analyzes each reported vulnerability and determines whether it has been removed from the product version under development and, if not, whether it ought to be, based on the risk it presents.
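Again as a sketch only: the article does not describe Microsoft's internal tooling, so the data model, report identifiers and risk threshold below are assumptions. The sketch shows the shape of such a coverage check, tallying which externally reported vulnerabilities are fixed in the version under development and flagging high-risk leftovers.

```python
# Hypothetical sketch of a "vulnerability coverage" check.
# Fields, identifiers and the risk threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Report:
    report_id: str               # externally reported vulnerability
    risk: int                    # assessed risk, e.g. 1 (low) to 10 (critical)
    fixed_in_next_version: bool  # removed from the version under development?

def coverage(reports, risk_threshold=7):
    """Return the fraction of reports addressed and any risky leftovers."""
    fixed = [r for r in reports if r.fixed_in_next_version]
    must_fix = [r for r in reports
                if not r.fixed_in_next_version and r.risk >= risk_threshold]
    return len(fixed) / len(reports), must_fix

reports = [
    Report("EXT-001", risk=9, fixed_in_next_version=True),
    Report("EXT-002", risk=4, fixed_in_next_version=False),
    Report("EXT-003", risk=8, fixed_in_next_version=False),
]
ratio, leftovers = coverage(reports)
print(f"Coverage: {ratio:.0%}")
print("Still need fixes:", [r.report_id for r in leftovers])
```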

By looking at both of these measurements, Microsoft assesses whether a product in development is safe enough and stable enough to be shipped. This methodology was used in the development of Vista, Microsoft Office 2007 and Microsoft SQL Server 2005, Lipner said.

'We try to reduce vulnerability levels in new versions before we ship them,' he said. 'These measures provide a quantifiable basis for keeping score.'

Realistically, Microsoft's aim is not necessarily to eliminate all vulnerabilities but to continually improve assurance levels, he said.

NSA's Sager pointed out, 'Not all flaws need to be dealt with right away. It depends on the extent to which the software supports critical activities.'
