Task force pushes forward on supercomputing coordination

An interagency task force is well on its way to building a five-year roadmap for federal spending on high-end computers.

About 60 employees from 10 agencies, including the Office of Management and Budget, are participating in the High-End Computing Revitalization Task Force, said John Grosh, associate director for advanced computing in the Office of the Deputy Undersecretary of Defense for Science and Technology.

Results of a June workshop with academic and industry officials were submitted to OMB and the Office of Science and Technology Policy in September, and Grosh said he expects that 'some sort of sanitized report' will be released publicly within two months.

Agencies have demonstrated a growing gap between high-end computational resources and the demands that users are making on the machines, Grosh said. The task force has been studying that capacity issue, as well as underlying technology and system procurement issues.

A National Oceanic and Atmospheric Administration official announced the creation of the interagency task force last spring, as reported in an April 7 GCN story. It was one of several federal responses to the advent of Japan's Earth Simulator supercomputer, which performs much faster and more efficiently than any U.S.-built system.

Grosh was one of several speakers at today's panel discussion in Washington sponsored by IBM Corp.'s Center for the Business of Government.

Any proposed high-end computing policy must succeed within tight fiscal constraints, said Juan D. Rogers, associate professor of public policy at Georgia Institute of Technology. He co-authored a recent report, Advancing High-End Computing: Linking to National Goals, for the IBM center.

'The importance of high-end computing for contemporary science and technology cannot be exaggerated,' Rogers said. It's not just a tool to achieve specific results, but a part of the process of knowledge creation, he said.

Today's high-end computing is tomorrow's mainstream computing, but it's a moving target for policy, Rogers said. Uncoordinated demands may create a fragmented computing environment, and the fragmented computing environment in turn may lead to a fragmented intellectual community, he said.

Leadership in high-performance computing should be measured in sustained performance on real-world scientific problems, not raw performance on simple benchmarks, Rogers said.

Dona Crawford, associate director for computation at the Energy Department's Lawrence Livermore National Laboratory, said that the department's nuclear weapons stewardship program, which relies heavily on simulations, drives advancements in unclassified areas of computing as well.

Most people know of Energy's Advanced Simulation and Computing program, which simulates the aging of nuclear weapons, through its big supercomputers, but only 12 to 16 percent of the program's budget goes to the big machines, Crawford said. The bulk of the funding is spent on personnel and software development.

Crawford said she doesn't believe that the United States has lost its leadership in supercomputing. 'Being number one on the Top 500 list is a silly way to measure leadership,' she said. U.S. companies built roughly 90 percent of the systems on the semiannual Top 500 ranking.

The next Top 500 list will be released Sunday in conjunction with a supercomputing conference in Phoenix.
