How predicting floods is like catching a baseball
- By Patrick Marshall
- Jun 14, 2019
It’s not just New Orleans and Houston that worry about flooding, especially as the effects of climate change grow more visible. Every coastal community has to be concerned.
Researchers have developed plenty of software models that can predict when and where flooding will impact a community. The snag? Those models take so long to run that the damage is done before the results are in. “You can't run a traditional flood model in time when you see a storm coming,” said Kai Parker, a coastal engineer and Fulbright scholar currently studying at the Universidad Técnica Federico Santa Maria in Chile.
During his PhD program at Oregon State University, Parker worked with collaborators to develop a program for rapidly predicting the impacts of coastal flooding.
Traditional methods of modeling floods involve reproducing the various physical processes that are involved -- tides, storm surges and underlying geography -- and running them through differential equations to model the interaction of those processes and to project impacts, Parker said.
Those physical flood models, he told GCN, “solve really complicated differential equations, but it takes a really long time.”
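The expense Parker describes comes from stepping differential equations forward in time across a spatial grid. As a rough illustration (a toy 1-D diffusion equation standing in for the far more complex coupled equations real flood models solve; none of the names or numbers here come from Parker's work), a minimal finite-difference time-stepping loop looks like this:

```python
import numpy as np

# Toy stand-in for a physics-based model: step dh/dt = k * d2h/dx2
# forward in time on a 1-D grid. Real flood models solve coupled
# shallow-water equations over millions of grid cells, which is why
# they take so long to run.
nx, nt = 200, 5000          # grid points, time steps
dx, dt, k = 1.0, 0.1, 1.0   # grid spacing, time step, diffusivity
h = np.zeros(nx)
h[nx // 2] = 10.0           # initial "surge" in mid-channel

for _ in range(nt):
    # explicit finite-difference update of the interior points
    h[1:-1] += k * dt / dx**2 * (h[2:] - 2 * h[1:-1] + h[:-2])

print(f"peak water level after stepping: {h.max():.3f}")
```

The cost scales with the number of grid points times the number of time steps, so high-resolution runs over long storms quickly become impractical to launch when a storm is already approaching.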
Imagine the human brain trying to compute all the individual factors involved in catching a baseball, Parker suggested. If we tried to solve the problem using differential equations, he said, “we can't really do it fast enough to catch the ball in time.”
Instead of doing the math, the brain relies on previous experience. Anyone who has ever played catch has built up a brain-based dataset. “Previously I saw the ball going this way, and I kind of assume I know where it's going to go from my experience,” Parker said. “Your brain interpolates between what it's seen in the past, and you guess where the ball is going to go.”
That’s essentially what Parker’s program, developed with support from the National Oceanic and Atmospheric Administration, does. Rather than trying to reproduce the complex interactions of all the environmental factors involved in a flood event, the program runs full physics-based analyses for short periods at carefully chosen locations, then uses a statistical model to interpolate between those points of detailed information -- approximating what the full model would produce if it were run completely.
In short, while the traditional methods aimed at reproducing flood events in detail, Parker said his method is one of “emulation.”
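The emulation idea -- run the expensive model at a handful of chosen points, then interpolate statistically between them -- can be sketched in a few lines. This is a hypothetical illustration, not Parker's actual code: the `expensive_flood_model` function and its storm parameters (surge height, tide phase) are placeholders for the real physics model, and the statistical interpolator here is a simple radial-basis-function fit from SciPy.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical stand-in for the expensive physics model: maps storm
# parameters (surge height, tide phase) to a peak water level.
def expensive_flood_model(params):
    surge, tide = params
    return 0.8 * surge + 0.5 * np.sin(tide) + 0.1 * surge * np.cos(tide)

# "Carefully chosen" training scenarios: a modest number of full runs.
rng = np.random.default_rng(0)
train_params = rng.uniform([0.0, 0.0], [5.0, 2 * np.pi], size=(50, 2))
train_levels = np.array([expensive_flood_model(p) for p in train_params])

# Fit a statistical interpolator (the "emulator") to those runs.
emulator = RBFInterpolator(train_params, train_levels)

# A new storm scenario: the emulator answers nearly instantly, with no
# need to re-run the full physics model.
new_storm = np.array([[3.2, 1.1]])
predicted = emulator(new_storm)[0]
truth = expensive_flood_model(new_storm[0])
print(f"emulator: {predicted:.3f}  full model: {truth:.3f}")
```

The up-front cost of the training runs corresponds to the month Parker's team spent building the Grays Harbor model; afterward, each new scenario is just an interpolation query, which is why results arrive in seconds.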
According to Parker, it took his team about a month to build a flood model for Grays Harbor, Wash. But once the model was built, he said, results for a specific event are available in seconds. “We were able to produce 100 different versions over 100 years of flooding in a day, which would take a lifetime to do if we were doing it the old way,” he said.
The emulator performed well reproducing extreme water levels from recent flooding events in Grays Harbor, he said.
Now that he knows the emulator works, “we can hopefully produce this in new locations relatively quickly,” Parker said.
And by delivering rapid results, Parker's emulator can not only give planners useful information for infrastructure development; as storms approach, it can also deliver details to local authorities in time to inform decisions about evacuations and road closures.
“It's also really good for climate-change assessment,” Parker said. “We're always looking for more funding and opportunities to expand.”
Patrick Marshall is a freelance technology writer for GCN.