Feds look at the big computer picture

Secretary Spencer Abraham says Energy's ultrascale project will help the United States regain the lead in high-end computing.

Federal R&D policy-makers are trying to revitalize big iron amid concerns that a Japanese supercomputer has held the title of world's fastest by a wide margin for a year and a half, despite U.S. domination of the top 500 systems.

In one interagency effort, a task force is building a five-year road map for federal spending on high-end computers.

About 60 employees from 10 agencies, including the Office of Management and Budget, are participating in the High-End Computing Revitalization Task Force, said John Grosh, associate director for advanced computing in the Office of the Deputy Undersecretary of Defense for Science and Technology.

Speaking at a November meeting sponsored by IBM Corp.'s Center for the Business of Government, Grosh said he expects 'some sort of sanitized report' will be released publicly within a few months.

There is a growing gap between high-end computational resources and the demands users are making on them, Grosh said. The task force has been studying the capacity problem as well as underlying technology and procurement issues.

The interagency task force was one of several federal responses to Japan's Earth Simulator supercomputer, which performs faster and more efficiently than any U.S. system. Built by NEC Corp. with 5,120 vector processors, the Earth Simulator clocked nearly 36 trillion floating-point operations per second on a standard benchmark.

In contrast, the world's second-fastest computer, the classified ASCI Q system at Los Alamos National Laboratory in New Mexico, performed 13.9 TFLOPS on the benchmark.
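For scale, a back-of-the-envelope comparison of the two machines' reported figures is below, sketched in Python. The numbers come from the article; the per-processor breakdown assumes all 5,120 Earth Simulator vector processors contributed to the benchmark run.

```python
# Rough comparison of the reported benchmark figures.
# Assumption: the "nearly 36 TFLOPS" Earth Simulator result
# reflects all 5,120 vector processors working on the run.

earth_simulator_tflops = 36.0   # "nearly 36 trillion operations per second"
asci_q_tflops = 13.9            # second-fastest system, at Los Alamos
es_processors = 5_120

speed_gap = earth_simulator_tflops / asci_q_tflops
per_cpu_gflops = earth_simulator_tflops * 1_000 / es_processors

print(f"Earth Simulator outpaces ASCI Q by roughly {speed_gap:.1f}x on the benchmark")
print(f"That works out to about {per_cpu_gflops:.0f} GFLOPS per vector processor")
```

Run as written, the sketch puts the Japanese machine at roughly two and a half times the benchmark speed of the fastest U.S. system.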

The federal government owns or funds eight of the top 10 supercomputers on the most recent list, at www.Top500.org, and 14 of the top 20.

Among them are systems at five Energy Department laboratories, the Naval Oceanographic Office, and the National Oceanic and Atmospheric Administration's Forecast Systems Laboratory.

Of the 14 government computers in the top 20, Hewlett-Packard Co. built three and IBM four. Linux Networx Inc. of Bluffdale, Utah, and Cray Inc. of Seattle each built two.

A home-grown cluster of Apple G5 computers at Virginia Polytechnic Institute and State University in Blacksburg made its debut in the No. 3 slot, at 10.3 TFLOPS, and a Dell Inc. Linux cluster at the National Center for Supercomputing Applications in Illinois held the fourth spot with 9.8 TFLOPS.

High-Performance Technologies Inc. of Reston, Va., built the NOAA lab's 3.16-TFLOPS Linux cluster.

Energy's large-scale plans

In one of the largest single-agency blueprints for scaled-up systems, a new, unclassified supercomputing proposal ranks second out of 28 research programs that Energy has prioritized for the next two decades.

The UltraScale Scientific Computing Capability, proposed by Energy Secretary Spencer Abraham, aims to regain the U.S. lead in high-performance computing. The initiative would boost the unclassified computing capability open to researchers by a factor of 100, Abraham said.

Many of the supercomputers and clusters on the semiannual Top 500 list are souped-up versions of systems designed for transaction processing. Under the UltraScale program, however, Energy would partner with U.S. hardware and software vendors to tune systems for complex scientific calculations.

In announcing Energy's 20-year research priorities, Abraham said, 'We believe this list of 28 facilities outlines to an important extent the future of science in America, and indeed the world.'

Abraham did not specify funding but said Energy would provide details to Congress and the White House for future spending.

Two other IT-related projects tied for seventh place on Energy's list of 28 priorities. Abraham called for a substantial enhancement to the high-speed Energy Sciences Network and an upgrade to the National Energy Research Scientific Computing Center in Berkeley, Calif.

Any proposed federal policy on high-performance computing will face tight fiscal constraints, said Juan D. Rogers, associate professor of public policy at Georgia Institute of Technology and co-author of a recent high-end computing report for the IBM center.

'The importance of high-end computing for contemporary science and technology cannot be exaggerated,' Rogers said.

Today's high-end computing is tomorrow's mainstream computing, but it is a moving target for policy because the demands placed on it are uncoordinated, he said. Leadership should be measured by sustained performance on real-world scientific problems, not by raw scores on simple benchmarks, he said.

Dona Crawford, associate director for computation at Energy's Lawrence Livermore National Laboratory in California, said the Advanced Simulation and Computing nuclear weapons stewardship program, which relies heavily on simulation, drives advances in unclassified areas as well.

The simulations of nuclear aging take only 12 percent to 16 percent of the program's budget, Crawford said. Most is spent on personnel and software development.

Crawford said she doesn't believe the United States has lost its leadership in supercomputing. 'Being No. 1 on the Top 500 list is a silly way to measure leadership,' she said.
