NASA puts its security in order
David Nelson says NASA improved its systems security by taking first things first: identifying the top problems and working to eliminate them.
Vulnerabilities list helps in endless battle to protect systems
'If [prioritization of security problems] is not done, nothing else will work.'
SANS Institute's Alan Paller
When the security staff of NASA's chief information office studied network intrusions in 1999, 'We noticed we were getting compromised by the same old vulnerabilities over and over,' said David Nelson, deputy CIO for IT security. 'We decided to prioritize and clean up the most serious problems.'
Today, most of the top 50 vulnerabilities they identified in 1999 have been eliminated or mitigated across the space agency's 80,000 computers. But every quarter a new batch of vulnerabilities shows up.
Picking the low-hanging fruit has paid off, however. Although the number of attacks against NASA systems has gone up, the number of successful penetrations is sharply lower, Nelson said.
The success of NASA's Vulnerability Reduction Program inspired Alan Paller, research director of the SANS Institute of Bethesda, Md., to preach the gospel of prioritizing the most commonly exploited vulnerabilities.
In 2000, based on NASA's work, SANS and the FBI drew up a separate top 10 list. They expanded it to 20 in 2001 and updated it again recently.
'Our top 10 was an attempt to replicate for other people what NASA had accomplished,' Paller said. 'I wanted to use their top 50, but that would have given people a road map to attack NASA.'
Prioritizing and fixing the most common problems is not a cure-all but a necessary first step, Paller said.
'If it is not done, nothing else will work,' he said.
The NASA list grew out of a consensus. Ames Research Center at Moffett Field, Calif., the principal center for IT security, first put out a list for comment by other NASA centers. In the fall of 1999, NASA agreed on the 50 worst threats.
The CIO office bought standard network scanning tools, trained field staff at each center and required them to report results of their quarterly scans to the CIO's office.
An initial baseline scan found an average of one targeted vulnerability on each of the agency's 80,000 computers. The CIO office set a goal of reducing that to one in every four computers by the end of fiscal 2000.
'It got to be kind of fun,' Nelson said. 'A lot of positive competition grew up' between centers. 'By the end of 2000 we were below the 0.25 rate we had set ourselves.'
When the rate dropped to 0.16 vulnerabilities per computer, Nelson's office announced a new goal, 'and we made it,' he said. At the end of fiscal 2001, the rate was down to 0.0068 vulnerability per computer.
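The metric NASA tracked reduces to a simple ratio: targeted vulnerabilities found divided by computers scanned. A minimal sketch of that calculation, using figures from the article for illustration (the function name is an invention, not NASA's tooling):

```python
def vulnerability_rate(findings: int, computers: int) -> float:
    """Targeted vulnerabilities found per computer scanned."""
    return findings / computers

# Baseline: an average of one targeted vulnerability on each of
# the agency's 80,000 computers.
baseline = vulnerability_rate(80_000, 80_000)   # 1.0 per computer

goal = 0.25  # the end-of-fiscal-2000 target

# A scan turning up 12,800 findings across the same fleet would
# correspond to the 0.16 rate the article mentions, beating the goal.
later = vulnerability_rate(12_800, 80_000)
print(later, later <= goal)
```

Tracking a rate rather than a raw count is what let the goal stay meaningful as the list itself changed size from quarter to quarter.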
The list was then updated and pared to 38 vulnerabilities, and the target was reset at 0.25 for the new list. Again, NASA did better, reaching 0.097 vulnerability per computer by year's end.
Dynamic fractions
Now the list gets quarterly updates with a constant target rate of 0.25 for the current list.
'A quarter or a third get changed each quarter, so we're more dynamic in our approach,' Nelson said.
The reduction in systems intrusions also reflects other security initiatives, including more training and a better security architecture. But Nelson said the sharp drop proves the program's success.
When a serious vulnerability was reported in Apache Web server freeware, Nelson's office ordered an emergency scan and fix, which was completed throughout NASA in a matter of days.
To succeed, the Vulnerability Reduction Program needed cooperation from all the centers' IT staffs. Nelson's office had the authority to order scans and remediation, but he preferred a carrot-and-stick approach.
'The centers knew we had a stick, but we knew that without their buy-in, the hard work they would have to do would be grudging,' he said.
Headquarters could offer no funding for the program. Centers had to pay for the labor from existing budgets, so it was necessary to win the participants' good will.
One or two people are now working nearly full time to scan and report vulnerabilities at each of the 10 centers. Total cost is $2 million to $3 million per year, or about $30 per computer.
A standard scanning and reporting platform was necessary to establish metrics and let centers compare performance against each other.
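Once every center reports on the same platform, per-center results collapse into comparable rates. A hypothetical sketch of how quarterly reports might be rolled up and ranked (the center names, counts, and structures here are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class ScanReport:
    center: str
    findings: int    # targeted vulnerabilities found this quarter
    computers: int   # machines scanned

def rank_centers(reports):
    """Sort centers by vulnerabilities per computer, best first."""
    return sorted(reports, key=lambda r: r.findings / r.computers)

reports = [
    ScanReport("Center A", 900, 6_000),   # 0.150 per computer
    ScanReport("Center B", 300, 4_000),   # 0.075 per computer
]
for r in rank_centers(reports):
    print(f"{r.center}: {r.findings / r.computers:.3f} per computer")
```

A common metric like this is what made the "positive competition" between centers possible: without identical scanning and reporting, the rates would not have been comparable.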
For security reasons, Nelson declined to name the products used.
Because labor is the principal cost, ease of management mattered more than the price of the software, he said. NASA wanted to centrally manage a number of scanning engines. Other selection factors were scanning accuracy and rapid addition of signatures for new vulnerabilities to the scanning engine.
Despite the program's success, IT staff members hate the fact that, for every vulnerability they correct, another crops up.
'The string of vulnerabilities never ends,' Nelson said. 'It's frustrating to me that software is so poorly done. We would rather vendors sell us reliable, secure software than expand it with all the features of the week.'
NASA has not yet held up a procurement because of software security weaknesses, but Nelson did not rule it out.
'The costs are escalating,' he said.