Is the US set to lose the supercomputer race?
Full disclosure: I was not around during most of the space race. My experience with NASA’s Apollo program comes mostly from the Tom Hanks movie, “Apollo 13,” which, as sources go, isn’t so bad. But the recent struggle to build the fastest supercomputer in the world, with nations trying to outdo one another, seems like a similar race.
Currently, the United States is in the supercomputing lead with Oak Ridge National Laboratory’s Titan, which is capable of sustained computing of 20 petaflops, or 20 thousand trillion calculations per second. No sooner had the United States taken the lead than an angry China announced it would recapture the top spot with Tianhe-2, a system it’s hoping to complete by 2015.
China’s Tianhe-1A, which two years ago was ranked the fastest supercomputer, now sits in eighth place on the TOP500 list. Lawrence Livermore National Laboratory’s Sequoia, which hit 16.32 petaflops, is in second place, followed by Japan’s K computer and Argonne National Laboratory’s Mira, giving the United States three of the top four spots.
The magic of these new computers is the use of graphics processing units (GPUs) in conjunction with traditional processors, with Nvidia being the primary manufacturer in the supercomputer space. Someone figured out that GPUs, which make all the amazing 3D games possible these days, could be used to process more mundane calculations and even drive simulations, which are one of these computers’ biggest uses.
The next big target is exascale computing, which, if you are keeping score, is 1,000 petaflops. China thinks it can achieve exascale computing by 2018 at the latest. The United States was pretty sure it could beat anything China could make, but now it looks like a different set of numbers might make America sit this battle out. It’s been reported that the United States simply doesn’t have the money to build another supercomputer that can hold on to the No. 1 spot. Given that an exascale computer could cost a few billion dollars, the effort will probably have to wait until the economy gets a bit better, and that means no American exascale supercomputer before 2020, or perhaps 2022.
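To put those scales in perspective, here is a quick back-of-the-envelope sketch in Python using only the figures cited above (Titan’s 20 petaflops and the 1,000-petaflop definition of exascale); the 10^21-operation workload is a made-up example for illustration:

```python
# Back-of-the-envelope FLOPS arithmetic with the figures from the article.
PETA = 10**15  # 1 petaflop = one thousand trillion operations per second
EXA = 10**18   # 1 exaflop = 1,000 petaflops

titan_flops = 20 * PETA        # Titan: ~20 petaflops sustained (as cited)
exascale_flops = 1000 * PETA   # exascale target

# An exascale machine would be 50x faster than Titan's cited figure.
speedup = exascale_flops / titan_flops
print(f"Exascale vs. Titan: {speedup:.0f}x")

# A hypothetical simulation requiring 10^21 floating-point operations:
work = 10**21
print(f"Titan:    {work / titan_flops / 3600:.1f} hours")
print(f"Exascale: {work / exascale_flops / 3600:.2f} hours")
```

Run it and the same job that ties up Titan for most of a working day finishes on an exascale machine in under 20 minutes, which is the whole argument for chasing the next milestone.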
I used to cover supercomputers years ago, back when reaching one trillion calculations per second was a big deal, so I’ve seen what they can do. Everything from crash-test modeling to build safer vehicles to simulating a nuclear blast so we don’t have to set off a real one can be achieved with a fast enough supercomputer. Lately they have turned their brainpower to matters such as climate change, which can be just as deadly as a bomb blast when superstorms and hurricanes become commonplace.
Then again, the realist in me also wonders if those billions of dollars could be better spent somewhere else. As I suspect the space race was, the supercomputer struggle seems to be about 50 percent real science and 50 percent bragging rights. Sure, the science the computers would allow is amazing, but so is letting the world know that the United States is better than China, or Japan, or any country in Europe. We like to be winners, and working with the second or third fastest computer in the world is pretty darn un-American.
So I don’t know. I’m about 50/50 on this one, too. Why don’t I throw it open to comments and see what you think? Should the United States try to find the money to build the next Titan, or should we be satisfied with what we’ve got and let China take the top spot in the supercomputer arena?
Posted by John Breeden II on Nov 27, 2012 at 9:39 AM