NASA supercomputer outpaces the pack

The latest salvo in the war to build the most powerful supercomputer has come from NASA: The Ames Research Center in California today claimed its new system, Columbia, surpasses other recent contenders for the title of world's fastest computer.

The Moffett Field, Calif., center clocked its computer achieving a sustained rate of 42 trillion floating-point operations per second.

The supercomputer, provided by SGI of Mountain View, Calif., consists of 20 interconnected SGI Altix systems of 512 processors each, for a total of 10,240 Intel Itanium 2 processors and 20TB of memory. The system will use a 440TB SGI InfiniteStorage storage area network.

NASA announced the purchase of Columbia in July [See GCN story].

The announcement follows a declaration last month from IBM Corp. that it had built the world's fastest supercomputer, a system the company plans to deliver to Lawrence Livermore National Laboratory next year [See GCN story]. IBM claimed that its system achieved a sustained performance of 36.01 TFLOPS in the IBM laboratory.

IBM's number edged out the performance of the Earth Simulator in Yokohama, Japan, thought to be the fastest computer of the last few years. That unit, built by NEC Corp., can run at 35.86 TFLOPS, according to the simulator manager's submission to the most recent Top500.org list of the world's 500 fastest supercomputers.

The processing ratings for all these computers were obtained using the Linpack benchmark, which measures a computer's performance in floating-point operations per second.
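The benchmark's rating reduces to simple arithmetic: Linpack solves a dense n-by-n system of linear equations, whose conventional operation count is (2/3)n^3 + 2n^2, and the reported rate is that count divided by the wall-clock time. A minimal sketch in Python, where the problem size and timing are hypothetical figures for illustration, not NASA's actual run:

```python
def linpack_flops(n: int) -> float:
    """Conventional operation count for solving a dense n-by-n linear system."""
    return (2.0 / 3.0) * n ** 3 + 2.0 * n ** 2

def sustained_tflops(n: int, seconds: float) -> float:
    """Sustained rate in trillions of floating-point operations per second."""
    return linpack_flops(n) / seconds / 1e12

# Hypothetical example: a million-equation system solved in about 4.4 hours
# of wall-clock time works out to roughly the 42 TFLOPS NASA reported.
rate = sustained_tflops(1_000_000, 15_900)
```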

A floating-point number represents a value with a significand and a variable exponent. Floating-point arithmetic can therefore span a far wider range of magnitudes than fixed-point arithmetic, in which the point stays in one position -- extra range and precision that come in handy for the big simulation jobs NASA performs.
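The contrast can be sketched in a few lines of Python; the 32-fractional-bit fixed-point format below is an illustrative assumption, not any format NASA uses:

```python
# A double-precision float stores a significand plus a variable exponent,
# so the same 64 bits cover enormous and tiny magnitudes alike.
big, small = 1.0e300, 1.0e-300
product = big * small  # close to 1.0; the exponents largely cancel

# A fixed-point format keeps the point in one place: with 32 fractional
# bits, the smallest representable step is 2**-32.
FRACTIONAL_BITS = 32
SCALE = 2 ** FRACTIONAL_BITS

def to_fixed(x: float) -> int:
    """Quantize x onto the fixed-point grid; values below 2**-32 vanish."""
    return int(x * SCALE)

# 1.0e-300 is far below the fixed-point resolution and truncates to zero,
# while floating point represents it without trouble.
underflow = to_fixed(1.0e-300)  # 0
```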

Walt Brooks, head of advanced supercomputing at Ames, said the space agency did not set out to design a system that would top the Top500 list. It worked with SGI to design a system to handle the large computational problems straining the agency's current capacity, such as hurricane forecasting, supernova simulation and next-generation aircraft design.

"We really don't care about Linpack that much. We'll measure it because everyone wants us to measure it, but we're much more interested in reliability, usability, productivity and what really happens with the system," Brooks said.

About the Author

Joab Jackson is the senior technology editor for Government Computer News.

