
A new way to measure the best supercomputers

The TOP500 list of the fastest supercomputers in the world has been a staple of computing for over 20 years. Universities, governments and even private companies compete to build the fastest supercomputers and earn a place on the list. How long a supercomputer can stay on the list is faithfully reported by technology media outlets around the world.

To rate one supercomputer against another, the TOP500 list uses the Linpack benchmark, which was created by Jack Dongarra, a professor at the University of Tennessee. The benchmark has worked well, but some say it no longer accurately represents true supercomputing power. The scientists working on supercomputer development at the Energy Department's Oak Ridge National Laboratory said much the same thing when I interviewed them earlier this year about the United States’ supercomputer development plans.

Now, Dongarra shares that sentiment, too. In a post on the University of Tennessee blog, he explains why Linpack no longer works as well as it once did.

"We have reached a point where designing a system for good Linpack performance can actually lead to design choices that are wrong for the real application mix, or add unnecessary components or complexity to the system," Dongarra said, according to the blog.

“The Linpack benchmark is an incredibly successful metric for the high-performance computing community,” Dongarra said. “Yet the relevance of the Linpack as a proxy for real application performance has become very low, creating a need for an alternative.”

The alternative is a new benchmark he and Sandia National Laboratories’ Michael Heroux are developing called the High Performance Conjugate Gradient. Dongarra explained that the new benchmark measures how well supercomputers drive applications across their increasingly diverse mix of CPU and GPU chips, rather than simply taking a raw-power snapshot.
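The benchmark takes its name from the conjugate gradient method, an iterative algorithm for solving large sparse linear systems whose memory-access and communication patterns resemble real scientific applications far more than Linpack's dense matrix math. The actual HPCG benchmark is a parallel reference implementation maintained by its authors; the toy sketch below just illustrates the core conjugate gradient iteration on a small symmetric positive-definite system, with all names and the example matrix being my own illustration rather than anything from the benchmark itself.

```python
def mat_vec(A, v):
    """Multiply matrix A (list of rows) by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def dot(u, v):
    """Inner product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A."""
    n = len(b)
    x = [0.0] * n                      # initial guess: zero vector
    r = [bi - ai for bi, ai in zip(b, mat_vec(A, x))]  # residual
    p = r[:]                           # initial search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = mat_vec(A, p)
        alpha = rs_old / dot(p, Ap)    # step length along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new ** 0.5 < tol:        # converged when residual is tiny
            break
        # new direction: residual plus a correction along the old direction
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Tiny symmetric positive-definite system for illustration
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

In HPCG, the equivalent iteration runs on enormous sparse matrices distributed across thousands of nodes, so performance is dominated by memory bandwidth and inter-node communication, which is precisely what a raw floating-point-rate snapshot misses.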

Data Center Knowledge reports the new HPCG won't replace Linpack, but will instead be used at the same time. The two together will determine how supercomputers place on the new TOP500 lists.

Given how competitive the agencies, governments and companies whose computers make the list seem to be, it’s a sure bet that using two benchmarks will spark controversy and give people ammunition to argue that their systems should be ranked higher.

But having benchmarked everyday, non-supercomputers for the past 15 years in the GCN Lab, I can say that benchmark technology has certainly changed. Where we used to use a standard benchmark that just looked at raw performance, we now make use of the Passmark Performance Benchmarks, which take a much more well-rounded look at how a system is doing, component by component, while also considering how those components are linked.

It makes sense that supercomputer benchmarks would also need some updating as the computing landscape changes. On another front, a group of supercomputing experts has started the Graph 500, which measures how well the machines handle big data, as well as the Green 500, which measures their energy efficiency.

With HPCG added to the test for overall performance, it will be interesting to see whether, and by how much, the current rankings change once it is factored in alongside Linpack.

Posted by John Breeden II on Jul 29, 2013 at 12:07 PM
