High-performance systems branch out

New software releases aim to expand the range of applications and storage for supercomputing

High-performance computing vendors are increasing performance and developing standards-based tools to enable a broader range of users to tackle challenging workloads. LSI, Microsoft, Red Hat and SGI last week each unveiled system enhancements or initiatives geared toward making applications in areas such as digital content, science, defense and intelligence, and energy run more efficiently.

Microsoft released the first public beta version of Microsoft Windows HPC Server 2008, a server operating system and set of tools designed for high-performance computing.

The company also established the Parallel Computing Initiative, a program that creates a set of common development tools across multicore desktops and clusters.

Windows HPC Server 2008, a successor to Windows Compute Cluster Server 2003, is based on the Windows Server 2008 operating system. With Windows Compute Cluster Server, Microsoft addressed desktop supercomputing and workgroup clusters, but users want the ability to work with larger clusters, said Jeff Wierer, senior product manager for Windows HPC Server 2008.

The beta is now available for download at microsoft.com/hpc. The final version will be generally available in the second half of 2008.

The Parallel Computing Initiative is designed to simplify and enable parallel computing for a broad set of applications in multicore and cluster environments.

The initiative will complement standards-based tools such as the Message Passing Interface (MPI) and OpenMP, as well as native parallel debugger support in Visual Studio 2008. New technologies include the Parallel Extensions to the .NET Framework, which will enable developers to create and improve parallel applications, Wierer said.

Meanwhile, LSI announced the first deployment of a storage system based on its new XBB-2 architecture.

A beta system with more than one petabyte of storage was installed at the National Center for Computational Sciences at the Oak Ridge National Laboratory (ORNL).

NCCS selected the XBB-2 storage system for evaluation as the storage platform for the world's first petascale-class supercomputer, citing its demonstrated bandwidth of 6.4 gigabytes/sec and its data integrity capabilities, LSI said.

'As Oak Ridge prepares to deploy a petaflop computer system in 2009, it is critical that we provide a storage system that is up to the task,' said Shane Canon, technology group leader at NCCS.

Meanwhile, Red Hat and Platform Computing, a provider of high-performance computing infrastructure software, have joined forces to offer the Red Hat HPC Solution.

The product fully integrates Platform's Open Cluster Stack with Red Hat Enterprise Linux. The integrated offering gives users a range of tools to deploy and manage an HPC cluster in a variety of environments.

And SGI officials unveiled the company's visual supercomputing strategy, which integrates visualization with the compute and data management stages of high-performance computing workflows.

About the Author

Rutrell Yasin is a freelance technology writer for GCN.

