Will storage hold back high-performance computing?

By Patricia Daukantas

GCN Staff

The massive data sets from the government's high-performance computing programs will cause a storage crisis in coming years if storage transfer bandwidth doesn't increase, a NASA researcher warned this month.

The bottleneck is in transferring data from old to new storage media, said Milton Halem, chief information officer at NASA's Goddard Space Flight Center in Greenbelt, Md. He spoke at the recent e-Gov 2000 conference in Washington.

Halem's interest in long-term data storage stems from his research in Earth science modeling, which produces extremely large data sets requiring terabyte-capacity storage.

Earth scientists must develop models at the highest possible resolution, Halem said, because low-resolution models fail to show important details. For example, he said, a 1998 one-day regional weather forecast run at 200- to 250-kilometer resolution indicated little atmospheric activity. But successively higher-resolution simulations revealed more disturbances, until a 25- to 50-kilometer model showed a hurricane forming.

Growing needs

'Obviously, [increased detail] is where we want to go,' Halem said, but every gain in model resolution ratchets up the computing requirements by a factor of eight. 'As our computing capacity grows, our data storage requirements grow.'
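
As a back-of-the-envelope sketch of where that factor of eight comes from (the assumption that halving the grid spacing doubles the point count along each horizontal axis and also halves the model time step is mine; Halem did not spell out the dimensions):

```python
# Rough arithmetic behind the factor-of-eight cost of each resolution gain.
# Assumption: halving the grid spacing doubles the points along each of the
# two horizontal axes and forces the time step to be halved as well.

def relative_cost(refinement: int) -> int:
    """Cost relative to the coarse model when grid spacing shrinks by `refinement`."""
    grid_points = refinement ** 2   # refinement x more points in x and in y
    time_steps = refinement         # smaller time step -> proportionally more steps
    return grid_points * time_steps

print(relative_cost(2))   # 8   -- one halving of the grid spacing
print(relative_cost(8))   # 512 -- roughly 200 km down to 25 km (three halvings)
```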

Long-range seasonal and climate predictions generally require averaging tens or hundreds of runs of the same simulation, a process known as ensemble forecasting. The repetitions further burden storage space.
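
In code terms, ensemble forecasting amounts to averaging many perturbed runs of the same model while keeping every member on disk; the NumPy sketch below only illustrates that idea with made-up sizes, not NASA's actual pipeline:

```python
import numpy as np

# Illustrative ensemble: 100 runs of the same model, each producing a field
# over (here) 250,000 grid cells; the sizes are stand-ins, not NASA figures.
rng = np.random.default_rng(0)
n_members, n_cells = 100, 250_000

members = [rng.normal(size=n_cells) for _ in range(n_members)]  # fake model output
ensemble_mean = np.mean(members, axis=0)        # the forecast actually used

# The storage burden: the archive keeps all members, not just the mean.
per_member_mb = members[0].nbytes / 1e6
print(f"one member: {per_member_mb:.0f} MB, "
      f"full ensemble: {n_members * per_member_mb / 1e3:.1f} GB")
```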

Climate researchers also must have access to years' worth of weather data for reliable predictions.

Storage media density has been growing at about the rate described by Intel Corp. founder Gordon Moore's famous rule for microprocessor complexity: it doubles roughly every 18 months. But the bandwidth for reading and writing data to media is doubling only every nine years, Halem said.

'We have a bandwidth inconsistency between our ability to read and write data and our ability to output data,' he said.
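
Compounding only the two doubling periods quoted above shows how quickly the gap opens (a hedged back-of-the-envelope calculation, not a figure from the talk):

```python
# Compare media density (doubling every ~18 months) with read/write bandwidth
# (doubling every ~9 years), using only the rates quoted above.

def growth(years: float, doubling_period: float) -> float:
    return 2 ** (years / doubling_period)

for years in (5, 10, 15):
    density = growth(years, 1.5)    # how much more a cartridge or platter holds
    bandwidth = growth(years, 9.0)  # how much faster it can be read or written
    print(f"{years:2d} years: density x{density:7.1f}, bandwidth x{bandwidth:4.1f}, "
          f"time to read a full medium up x{density / bandwidth:5.1f}")
```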

Halem said the life expectancy of the typical magnetic tape or disk is 10 to 20 years. Some experts believe magnetic storage could hit its physical limits in as little as five years.

At the extreme, the disparity between the growth rate of storage and data transfer bandwidth would mean that tapes and disks could not survive long enough to transfer all their data, Halem said.

'It's going to take us more than 20 years to copy all the data we have on tapes onto a new medium. That's what I call the looming storage crisis,' Halem said.
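
Put together with the 10-to-20-year media lifetime, the arithmetic behind that claim looks roughly like this (the archive size, drive count, and per-drive transfer rate below are illustrative assumptions, not Halem's numbers):

```python
# Illustrative "looming storage crisis" arithmetic: how long does it take to
# migrate an archive, and does the old media outlive the copy?
SECONDS_PER_YEAR = 365 * 24 * 3600

archive_petabytes = 10.0         # assumed archive size
drives = 2                       # assumed tape drives copying in parallel
mb_per_second_per_drive = 5      # assumed sustained transfer rate per drive
media_lifetime_years = (10, 20)  # lifespan range Halem cited

archive_mb = archive_petabytes * 1e9
copy_years = archive_mb / (drives * mb_per_second_per_drive * SECONDS_PER_YEAR)

print(f"copy time: {copy_years:.1f} years "
      f"(media expected to last {media_lifetime_years[0]}-{media_lifetime_years[1]} years)")
# If the archive grows faster than transfer rates, the copy time eventually
# exceeds the media lifetime -- the crisis Halem describes.
```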

NASA funds information systems, not specifically storage systems, he said. The National Science Foundation and the Defense Advanced Research Projects Agency have started a digital library program that addresses archival data preservation, but the effort is not large.

'Unfortunately, [storage is] falling through the cracks,' he said.

Officials of the Accelerated Strategic Computing Initiative are working with the National Security Agency to develop terabyte storage devices and media, said Paul Messina, director of the Energy Department's Office of Advanced Simulation and Computing. ASCI researchers, however, are more focused on improving software for finding specific data sets within massive libraries than on preserving the data for 100 years, Messina said.
