What good are DoD's supercomputers?

Study finds high-performance software is lacking

Even as the Defense Department funds development plans for petascale computers, it is finding that it may have little commercial software to run on such machines once they are finished.

A study funded by the Defense Advanced Research Projects Agency says most high-performance computing software vendors have no immediate plans to upgrade their software to run on systems with larger numbers of processors. As a result, agencies with large HPC systems may not be able to rely on commercial software.

'The business model for HPC-specific application software has all but evaporated in the last decade,' said Earl Joseph, research vice president for International Data Corp. of Framingham, Mass., which conducted the research for DARPA. Joseph and IDC research director Addison Snell presented the findings at the High Performance Computing Users conference, held in Washington by the Council on Competitiveness.

The researchers surveyed 54 independent software vendors, or ISVs, which together oversee 110 commercial applications favored by HPC system managers. They found that HPC sales accounted for only 5 percent of the surveyed companies' revenue. As a result, such vendors have little impetus to upgrade their software to work on even larger systems, since HPC customers represent only a small portion of the potential market.

'These are for-profit companies. They are not just in the business of philanthropy,' Snell said. Among the vendors surveyed were BakBone Software Inc. of San Diego, which produces the NetVault backup software; MySQL AB of Sweden; and Altair Grid Technologies LLC of Troy, Mich., which makes the OpenPBS batch-processing software.

According to the survey, only 82 percent of the software packages frequently used in HPC environments today can make use of all the processors in a relatively small 32-processor machine. Fewer than half would scale to systems with more than 100 processors. Moreover, few ISVs plan to upgrade.

This finding is troubling given that most government HPC systems being built today use a thousand or more processors. Earlier this month, the Defense Department's Aeronautical Systems Center Major Shared Resource Center purchased a 1,024-node HP Cluster Platform 4000 system, capable of executing 10 trillion floating-point operations per second, from Hewlett-Packard Co. of Palo Alto, Calif. DARPA's High-Productivity Computing Systems program is also funding research into computers that can run in the petaflop range, or 1,000 TFLOPS or more.

To make full use of such systems, software must be written so that it can execute across all, or at least many, of a machine's processors at once. Upgrading these programs, some of which are decades old, could be as simple as rewriting parts of the code to handle tasks in parallel, or it could mean rewriting the program from scratch.
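To make the scale of that work concrete, here is a minimal sketch, not drawn from the study, of what such restructuring looks like in practice. It uses MPI, a message-passing library common on HPC clusters; the summation workload and the cyclic loop split are illustrative stand-ins for real application code.

    /* Illustrative only: each processor computes part of a sum,
       then the partial results are combined on processor 0. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this processor's ID */
        MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total processor count */

        /* Split the loop iterations cyclically across all processors. */
        const long N = 1000000;
        double local = 0.0;
        for (long i = rank; i < N; i += size)
            local += 1.0 / (i + 1);

        /* Gather every processor's partial sum onto processor 0. */
        double total = 0.0;
        MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0,
                   MPI_COMM_WORLD);

        if (rank == 0)
            printf("sum = %f\n", total);

        MPI_Finalize();
        return 0;
    }

Even in this toy case, the serial loop had to be reorganized around an explicit communication step. In a legacy application with decades of accumulated serial code, that restructuring can touch nearly every module, which is the cost the surveyed vendors are weighing against a 5 percent slice of revenue.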

But few companies would reconfigure their applications to run on more processors, even if given funding to do so, IDC found.

'We just have too much to do. We would need more time in the day to address the needs of HPC users,' one vendor told IDC.

John Towns, the director of the scientific computing division for the National Center for Supercomputing Applications, agreed with the study's findings. NCSA offers supercomputing time to U.S. commercial companies that do not have in-house HPC resources.

NCSA has, on multiple occasions, offered to broker deals between its clients and ISVs. Under these proposals, the customer or NCSA itself would cover the costs and provide the expertise needed to upgrade an ISV's software to work on NCSA's own HPC systems. Only two companies in the past 15 years have taken NCSA up on the offer.

'We have been able to answer every objection that an ISV has when going down this road. There is a customer willing to put money up front for licenses, and there are resources and expertise available through the center. Everything is there, but the ISV still turns it down. I fundamentally don't understand why this is the case,' Towns said.

Government agencies that use supercomputers today tend to rely on applications built in-house to do their work. Sandia National Laboratories, for instance, has spent about $800 million over the past 10 years developing applications for its computers, William Camp, the lab's director for computation, computers, information and mathematics, said during one conference panel.

To read the survey, available from the Council on Competitiveness Web site, go to www.gcn.com and enter 468 in the GCN.com/search box.
