Supercomputer speeds analysis of 'alternative futures' for water management

As water becomes a scarcer and more precious commodity, effective use and conservation require that researchers test different water management strategies. It's a complex problem involving conservation, groundwater use, seawater desalination and water reuse, as well as uncertainty about future climate change and development patterns.

Managing the flow of the Colorado River, which winds almost 1,500 miles through seven states and some of the most arid land in the country, has been a big job for the past century. Like many other U.S. waterways, the Colorado River is under increasing pressure from a growing throng of consumers. Lab officials said the river provides water for 30 million people.

To speed the data analysis needed to manage the allocation of Colorado River water to millions of stakeholders, Lawrence Livermore National Laboratory has teamed with nonprofit research organization Rand Corp. to use the lab's High Performance Computing Innovation Center (HPCIC) to perform water allocation simulations in minutes instead of weeks.

By using the lab's supercomputer, researchers said they ran 12,000 "alternative futures" for the river -- plans for allocating its water among the states, cities and American Indian tribal communities that depend on it.

Traditional computers would have taken six weeks to complete the study, but officials said the supercomputer crunched the data in 45 minutes.
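The speedup comes largely from the fact that each "alternative future" is an independent simulation, so thousands of them can run concurrently across a supercomputer's cores rather than one after another. The article does not describe the lab's actual code; the sketch below only illustrates the general pattern with a toy, hypothetical allocation model (the demand and supply formulas are invented for illustration).

```python
from multiprocessing import Pool

def simulate_allocation(scenario_id):
    """Toy stand-in for one 'alternative future' run: score a
    hypothetical allocation plan. Real basin models are far more
    complex, but each scenario is still independent of the others."""
    demand = 1000 + (scenario_id % 50) * 10   # hypothetical demand, acre-feet
    supply = 1200 - (scenario_id % 30) * 5    # hypothetical supply, acre-feet
    return scenario_id, supply - demand        # surplus (negative = shortfall)

if __name__ == "__main__":
    # Evaluate 12,000 independent futures in parallel; because no run
    # needs data from another, the work scales across many cores.
    with Pool() as pool:
        results = pool.map(simulate_allocation, range(12_000))
    shortfalls = [sid for sid, surplus in results if surplus < 0]
    print(f"{len(shortfalls)} of {len(results)} futures show a shortfall")
```

Serially, total runtime is the sum of all scenario runtimes; spread over N cores it approaches that sum divided by N, which is the kind of scaling that turns weeks into minutes.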

They added that the approach has promise for other data-centric policy issues.

"These same methods and resources can improve the decision-making process for a broad spectrum of national and corporate challenges," HPCIC Director Fred Streitz said. "The types of resource allocation and policy questions that can be answered with high-fidelity simulation show up regularly [in energy, agriculture and transportation]. There are many, many other areas where complex, slow-running models inform analysts, who, in turn, inform policymakers."

Adding high-performance computing to the mix compresses "the time frame between asking questions and getting answers," he added.

The analytical capabilities build on previous work done by Rand's researchers in 2012 and 2014 on the Colorado River Basin, including a joint workshop that used high-performance computer analytics to fuel the collaborative "deliberation with analysis" method. That process brings together stakeholders and experts to assess complex problems and find alternative solutions using scientific methods, including data analysis.

This article was first posted on FCW, a sister site to GCN.

About the Author

Mark Rockwell is a senior staff writer at FCW, whose beat focuses on acquisition, the Department of Homeland Security and the Department of Energy.

Before joining FCW, Rockwell was Washington correspondent for Government Security News, where he covered all aspects of homeland security from IT to detection dogs and border security. Over the last 25 years in Washington as a reporter, editor and correspondent, he has covered an increasingly wide array of high-tech issues for publications like Communications Week, Internet Week, Fiber Optics News, tele.com magazine and Wireless Week.

Rockwell received a Jesse H. Neal Award for his work covering telecommunications issues, and is a graduate of James Madison University.

Contact him at mrockwell@fcw.com or follow him on Twitter at @MRockwell4.

