NASA scientist makes dire supercomputing prediction

Richard B. Rood says the United States should work on optimizing climate software for parallel machines.

By Patricia Daukantas

GCN Staff

The U.S. government is falling behind other countries in large-scale weather prediction and climate modeling because it overemphasizes the need to achieve megaspeeds across multiple processors, a NASA scientist has charged.

Richard B. Rood, a senior scientist at NASA's Goddard Space Flight Center Data Assimilation Office in Greenbelt, Md., said agencies' current software cannot calculate climate models well on the massively parallel commodity supercomputers that are becoming the mainstream platform.

The government's high-performance computing programs focus too much on achieving trillions of floating-point operations per second, thereby marginalizing Earth science computing, Rood said.


'If there's one thing that characterizes an Earth science computing problem, it's complexity,' Rood said. 'And if there's one thing that makes a parallel computing environment difficult to use, it's complexity.'

Rood and like-minded researchers in the Energy Department, National Science Foundation, and National Oceanic and Atmospheric Administration are calling for an integrated strategy for Earth science modeling and data use.

Earth science modeling taxes current parallel supercomputers because it assimilates observations from remote-sensing satellites and ground weather stations into models as they run.

In the last few years, scientists have found that incorporating meteorological observations into the running models can highlight information that is difficult to observe directly, Rood said.

But the inherently sequential process slows computation on the massively parallel machines, whose commodity processors have been edging out traditional vector supercomputers at government facilities.
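A toy sketch in Python (not any agency's assimilation code; the model, nudging weight and numbers are invented for illustration) shows the dependency: a forecast step advances the model state, an analysis step blends it with fresh observations, and neither can begin before the other finishes.

```python
import numpy as np

# Toy assimilation cycle: forecast, then analysis, in strict alternation.
# Each step must wait for the previous one, which is the sequential
# dependency described above. All names and numbers are hypothetical.

def forecast_step(state, dt=1.0):
    """Advance a toy 'model' one step (here: simple exponential decay)."""
    return state * np.exp(-0.1 * dt)

def analysis_step(forecast, observations, weight=0.3):
    """Blend the forecast with observations (a crude nudging update)."""
    return (1.0 - weight) * forecast + weight * observations

rng = np.random.default_rng(0)
state = np.full(100, 10.0)                  # model state on 100 grid points

for cycle in range(5):                      # each cycle depends on the last
    state = forecast_step(state)
    obs = state + rng.normal(0.0, 0.5, size=state.shape)   # synthetic obs
    state = analysis_step(state, obs)
    print(f"cycle {cycle}: mean state = {state.mean():.3f}")
```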

Keeping up

During the last five years, agency buying decisions and market trends have forced researchers to convert to the massively parallel architecture, said Clifford A. Jacobs, the National Science Foundation section chief responsible for overseeing the National Center for Atmospheric Research (NCAR) in Boulder, Colo.

Also, solving the vector equations of fluid dynamics in a climate model requires more interprocessor communication than ordinary transaction processing because the atmosphere and the ocean interact, Rood said.
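A rough illustration of why: in a grid-based fluid solver, each processor owns a patch of the globe, but updating any point requires its neighbors' values, so boundary ("halo") data must be exchanged every time step. The two-subdomain toy below fakes that exchange with plain arrays; a real climate code would do it over MPI among hundreds of processors.

```python
import numpy as np

# Two "processors," each owning half of a 1-D grid. An explicit diffusion
# step at any point needs neighboring values, so edge (halo) values must be
# exchanged before every step. Purely illustrative, not a real climate code.

nx = 8
left = np.arange(nx, dtype=float)            # subdomain owned by "rank 0"
right = np.arange(nx, 2 * nx, dtype=float)   # subdomain owned by "rank 1"

def diffuse(a, halo_left, halo_right, k=0.1):
    """One explicit diffusion step using neighbor values at the edges."""
    padded = np.concatenate(([halo_left], a, [halo_right]))
    return a + k * (padded[:-2] - 2.0 * a + padded[2:])

for step in range(3):
    # the "communication": each rank receives its neighbor's edge value
    from_right, from_left = right[0], left[-1]
    left = diffuse(left, halo_left=left[0], halo_right=from_right)
    right = diffuse(right, halo_left=from_left, halo_right=right[-1])

print(left)
print(right)
```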

Rood suggested there may even be a fundamental reason why climate models involving data assimilation do not scale very well to thousands of processors.

'From a theoretical point of view, if 1 percent of your code remains sequential, you're not going to scale beyond 100 processors,' he said.

As David Evans, assistant administrator of NOAA's Office of Oceanic and Atmospheric Research, explained, the maximum useful number of processors is the reciprocal of the model's inherently sequential fraction.
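That limit is Amdahl's law in its textbook form. A few lines of Python make the arithmetic concrete: a 1 percent sequential fraction caps the achievable speedup near 100, no matter how many processors are added.

```python
# Amdahl's law: if a fraction s of the work is inherently sequential, the
# speedup on p processors is
#     speedup(p) = 1 / (s + (1 - s) / p)
# which can never exceed 1/s, however many processors are added.

def amdahl_speedup(sequential_fraction: float, processors: int) -> float:
    s = sequential_fraction
    return 1.0 / (s + (1.0 - s) / processors)

for p in (10, 100, 1_000, 10_000):
    print(f"{p:>6} processors -> speedup {amdahl_speedup(0.01, p):.1f}")
# With 1 percent sequential code the speedup is about 9.2, 50.3, 91.0 and
# 99.0 at these processor counts, creeping toward but never reaching 100.
```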

Despite the apparent limitation, climate researchers are not necessarily barred from ever working with thousand-processor supercomputers, Evans said. They just have to find new approaches, such as ensemble forecasting, which might run 10 simultaneous models on 100 processors each and average the results.
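The sketch below illustrates the idea with a toy forecast: the ensemble members never communicate, so ten runs that each scale comfortably to 100 processors can occupy 1,000 processors in aggregate. The member model and perturbation sizes are invented for illustration.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

# Ten independent "forecasts," each started from a slightly perturbed state,
# run with no communication between members and are averaged at the end.
# The member model is a toy stand-in, not a real forecast code.

def run_member(seed: int) -> np.ndarray:
    rng = np.random.default_rng(seed)
    state = 10.0 + rng.normal(0.0, 0.1, size=50)     # perturbed initial state
    for _ in range(100):                             # pretend time stepping
        state = 0.99 * state + rng.normal(0.0, 0.01, size=50)
    return state

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:              # members run in parallel
        members = list(pool.map(run_member, range(10)))
    ensemble_mean = np.mean(members, axis=0)         # average the results
    print(f"ensemble-mean value at first grid point: {ensemble_mean[0]:.3f}")
```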

One NSF-funded group, the Center for the Analysis and Prediction of Storms (CAPS) at the University of Oklahoma, has succeeded with small-scale weather-prediction models on massively parallel computers, said Richard S. Hirsh, deputy director of NSF's Division of Advanced Computational Infrastructure and Research.

CAPS director Kelvin K. Droegemeier tailored his weather-prediction algorithms for a 1,300-processor parallel supercomputer, Hirsh said. But global climate prediction with enough detail to support policy decisions is a 'different kettle of fish,' he said.

The 18-nation European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading, England, has the world's best numerical weather prediction capabilities, Rood said. The ECMWF's 116-processor Fujitsu VPP700 vector supercomputer produces higher-resolution models and handles more data than U.S. counterparts, he said.

Japanese-built vector supercomputers have fewer processors and are easier to use in climate modeling, Rood said. But the Commerce Department's 1997 antidumping case against NEC Corp. and Fujitsu Ltd. has discouraged U.S. research organizations from buying the vector systems.

Too late

Climate modeling problems would not be wiped out simply by buying Japanese-built supercomputers, either. Rood said the government would have to spend three to five years developing application software and middleware to make highly distributed-memory computers useful to climate researchers.

'I personally don't think it's a matter of transporting existing codes,' NOAA's Evans said. 'My suspicion is we're going to need new ways of solving the equations.'

Software engineers and climate scientists will have to work together writing new code from scratch, he predicted.

Last year, the Climate Research Committee of the National Research Council reported that 'the United States lags behind other countries in its ability to model long-term climate change.'

The panel of 19 scientists from government and academia said it is inappropriate for the United States to rely heavily on foreign centers to provide high-end modeling capabilities because U.S. scientists do not always have access to output from other countries' models. Also, other nations' priorities could affect the simulations and the U.S. economic policy based on them, the committee said.

The National Academy Press put the full text of the NRC report, Capacity of U.S. Climate Modeling to Support Climate Change Assessment Activities, on the Web at www.nap.edu/books/0309063752/html/index.html.

A member of the NRC committee, Maurice L. Blackmon, agreed with Rood that the United States should get to work optimizing its climate software for parallel machines.

Blackmon, director of NCAR's Climate and Global Dynamics Division, said the nation is doing well in small- and medium-range simulations on desktop workstations or supercomputers such as the SGI Cray J90.

ECMWF is experimenting with global models with 30-kilometer resolution, whereas U.S. researchers typically run simulations at 300-km resolution, Blackmon said.

Predictions get better as the resolution sharpens.
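A back-of-the-envelope calculation, ignoring vertical levels and algorithmic details, shows why sharper resolution is so costly: refining the horizontal grid by a factor of 10 multiplies the number of grid columns by about 100, and the shorter time step needed for numerical stability adds roughly another factor of 10.

```python
# Rule of thumb: refining the horizontal grid by a factor r multiplies the
# number of grid columns by r**2, and the stable time step shrinks by
# roughly r, so the cost of simulating the same period grows roughly as r**3.

coarse_km, fine_km = 300, 30
r = coarse_km / fine_km              # refinement factor: 10x
more_columns = r ** 2                # about 100x more grid columns
more_timesteps = r                   # about 10x more time steps
print(f"rough cost ratio: {more_columns * more_timesteps:.0f}x")  # ~1000x
```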


A number of scientists have started working on a common model infrastructure that would make it easier for researchers to share chunks of code, Blackmon said.
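As a purely hypothetical illustration of what such an infrastructure could look like, the sketch below defines a standard initialize/run/finalize interface that any group's component might implement, so a simple coupler can drive whatever mix of components a researcher plugs in. It is not the framework those scientists were actually designing.

```python
from abc import ABC, abstractmethod

# A shared component interface: if every group's model component implements
# it, a simple coupler can drive any mix of components. Everything here is
# invented for illustration.

class ModelComponent(ABC):
    @abstractmethod
    def initialize(self) -> None: ...

    @abstractmethod
    def run(self, hours: int) -> None: ...

    @abstractmethod
    def finalize(self) -> None: ...

class ToyAtmosphere(ModelComponent):
    def initialize(self) -> None:
        print("atmosphere: reading initial conditions")

    def run(self, hours: int) -> None:
        print(f"atmosphere: integrating {hours} forecast hours")

    def finalize(self) -> None:
        print("atmosphere: writing history files")

def couple(components, hours):
    """Drive any set of components through one coupled segment."""
    for c in components:
        c.initialize()
    for c in components:
        c.run(hours)
    for c in components:
        c.finalize()

couple([ToyAtmosphere()], hours=6)
```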

'Even with interagency squabbles, agencies usually follow the community's lead,' Evans said.

As part of its Scientific Simulation Initiative, Energy had proposed a massive climate-prediction project to scale up research in both hardware and software. But Congress axed the entire simulation initiative from Energy's fiscal 2000 appropriations bill [GCN, Oct. 11, Page 1].

Earth scientists also believe there are not enough computer scientists trained in modeling to support four or five independent, comprehensive development activities, Rood said.

Brain drain

The academic software world is losing talent to burgeoning electronic commerce and Web activities, Rood said, noting that about 10 people left his NASA office for nonresearch jobs in the past year.

'The science community is competing with the commercial sector and, to be honest, we don't have the monetary resources to do that,' he said.





How climate-modeling centers stack up


  • The 16-processor Cray C90 supercomputer at the U.S. National Center for Atmospheric Research can process a model with 310-km horizontal resolution and 18 vertical levels in two and a half days. Sustained speed: 5 GFLOPS.

  • The 32-processor NEC SX-4 vector supercomputer at the Australian Bureau of Meteorology Research Centre can run a model with 310-km horizontal resolution and 17 vertical levels in one and a half days. Sustained speed: 20 to 25 GFLOPS.

  • The 116-processor Fujitsu VPP vector computer at the European Centre for Medium-Range Weather Forecasts can run a model with 200-km horizontal resolution and 31 vertical levels in 14 days. Sustained speed: 75 GFLOPS.

  • The U.S. Accelerated Climate Prediction Initiative (not funded in fiscal 2000) called for development, by 2003, of a model with 30-km horizontal resolution and 18 vertical levels that would complete a run in eight hours. Sustained speed: 40,000 GFLOPS.


