Perlmutter supercomputer (NERSC)

Next-gen supercomputer lays foundation for exascale

The newly unveiled Perlmutter next-generation supercomputer is expected to take high-performance computing capabilities at the Department of Energy to new levels.

Housed at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, the HPC machine was named in honor of the lab’s astrophysicist Saul Perlmutter, who shared the 2011 Nobel Prize in Physics. It will be used for unclassified research into advanced computing, artificial intelligence and data science, as well as climate and environmental studies, semiconductor and microelectronics research and quantum information science.

Perlmutter will be among the fastest supercomputers in the world for scientific simulation, data analysis and AI applications, NERSC officials said in the May 27 announcement. Its heterogeneous architecture is based on the HPE Cray “Shasta” platform and will provide four times the computational power of Cori, NERSC's current flagship system.

Perlmutter is being delivered in two phases. The first, unwrapped May 27, features 1,536 GPU-accelerated nodes, each containing four NVIDIA NVLink-connected A100 Tensor Core GPUs and one 3rd Gen AMD EPYC processor. Phase 1 also includes a 35-petabyte Lustre filesystem that will move data at a rate of more than 5 terabytes/sec. When Phase 2 arrives later this year, Perlmutter will gain 3,072 CPU-only nodes, each with two 3rd Gen AMD EPYC processors and 512 GB of memory per node.
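A quick back-of-envelope tally of the phase figures above (a sketch; only the node counts and per-node specs come from the announcement, the totals are simple arithmetic):

```python
# Phase 1: GPU-accelerated partition
phase1_nodes = 1536          # GPU nodes delivered in Phase 1
gpus_per_node = 4            # NVIDIA A100 Tensor Core GPUs per node
total_gpus = phase1_nodes * gpus_per_node

# Phase 2: CPU-only partition
phase2_nodes = 3072          # CPU-only nodes arriving in Phase 2
mem_per_node_gb = 512        # memory per CPU node, in GB
phase2_mem_tb = phase2_nodes * mem_per_node_gb / 1024  # total, in TB

print(f"Phase 1 A100 GPUs: {total_gpus}")        # 6144
print(f"Phase 2 aggregate memory: {phase2_mem_tb:.0f} TB")  # 1536 TB
```

That works out to 6,144 A100 GPUs in Phase 1 and roughly 1.5 petabytes of aggregate memory across the Phase 2 CPU nodes.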

“Perlmutter will provide considerably more computing power than our current supercomputer, Cori, and will introduce several key technologies that will be used in exascale systems in the coming years,” NERSC Director Sudip Dosanjh said. “It will enable a larger range of applications than previous NERSC systems and is the first NERSC supercomputer designed from the very beginning to meet the needs of both simulation and data analysis."

“This is a very exciting time to be combining the power of supercomputer facilities with science, and that is partly because science has developed the ability to collect very large amounts of data and bring them all to bear at one time,” said Perlmutter, who was on hand for the launch. “This new supercomputer is exactly what we need to handle these datasets. As a result, we are expecting to find new discoveries in cosmology, microbiology, genetics, climate change, material sciences, and pretty much any other field you can think of."

About the Author

Connect with the GCN staff on Twitter @GCNtech.
