Dousing wildfires with big data
If there’s one thing San Diego Supercomputer Center (SDSC) researchers know, it’s how to handle big data. A second thing they are getting increasingly familiar with – thanks to California’s ongoing drought – is the threat of wildfires.
So it probably seemed like a no-brainer for the scientists at the University of California, San Diego, to apply their big data skills to battling the danger of wildfires.
“In the San Diego area we have a long history of using sensor data and satellite information for situational surveillance,” said Ilkay Altintas, director of the Scientific Workflow Automation Technologies Lab at the SDSC. “Then the idea came up that we could feed this data into models that would give more accurate predictions of the rate of spread and direction of wildfires.”
As principal investigator of the WIFIRE project, Altintas is working with a multidisciplinary group of researchers drawn primarily from SDSC and the University of Maryland to develop data models that can transform sensor and satellite data into predictions of the behavior of wildfires.
According to Altintas, the WIFIRE project has a head start because of the High Performance Wireless Research and Education Network (HPWREN), also funded by the National Science Foundation. HPWREN, which was developed and managed by SDSC, is a high-bandwidth wireless data network that covers San Diego, Riverside and Imperial counties in areas that are typically not well-served by other data networking technologies. The network includes backbone nodes, many of them on mountain tops, to transmit data in the back country.
HPWREN collects data from a network of sensors, including video cameras and more than 100 weather stations that provide real-time data about wind, temperature and humidity. In addition, Altintas said, WIFIRE is integrating NASA’s MODIS satellite data, which includes near-real-time information about water vapor, ground cover and other potentially relevant conditions.
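Combining the two feeds means pairing a ground station's real-time readings with the satellite record for the same area. A minimal sketch of that fusion step, with entirely illustrative field names and values (the article does not describe WIFIRE's actual data schema):

```python
# Hypothetical sketch: fuse a ground weather-station reading with a
# near-real-time satellite record for the same grid cell into one
# observation a fire model could consume. All keys are assumptions.

def fuse_observations(station_reading: dict, satellite_record: dict) -> dict:
    """Merge ground and satellite data for one grid cell."""
    return {
        "cell": station_reading["cell"],
        "wind_ms": station_reading["wind_ms"],          # from weather station
        "temp_c": station_reading["temp_c"],
        "humidity_pct": station_reading["humidity_pct"],
        "water_vapor": satellite_record["water_vapor"],  # from satellite
        "ground_cover": satellite_record["ground_cover"],
    }

station = {"cell": "sd-042", "wind_ms": 5.2, "temp_c": 31.0, "humidity_pct": 12.0}
satellite = {"cell": "sd-042", "water_vapor": 0.8, "ground_cover": "chaparral"}
obs = fuse_observations(station, satellite)
print(obs["cell"], obs["wind_ms"], obs["ground_cover"])
```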
“We’ve honed our methods based on historical fires,” Altintas said. “Our approach works by detecting potential parameters and constantly adjusting them based on data. This is what we call ‘dynamic data-driven modeling.’”
The WIFIRE system constantly monitors dozens of ground and atmospheric conditions and updates its models every 15 minutes. The team is currently about halfway through the first year of the three-year project.
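The idea behind dynamic data-driven modeling can be illustrated with a toy example: a spread-rate parameter starts from a historical estimate, then gets nudged toward whatever the latest observations imply on each update cycle. The linear spread model, the learning rate and all the numbers below are illustrative assumptions, not the WIFIRE team's actual formulation:

```python
# Toy sketch of dynamic data-driven modeling: re-estimate a fire-spread
# parameter from incoming sensor data on each update cycle.

def predict_spread(wind_speed_ms: float, coeff: float) -> float:
    """Predicted rate of spread (m/min) as a simple linear function of wind."""
    return coeff * wind_speed_ms

def update_coeff(coeff: float, wind_speed_ms: float,
                 observed_spread: float, learning_rate: float = 0.3) -> float:
    """Nudge the coefficient toward the value implied by the latest
    observation (exponential smoothing of the estimate)."""
    if wind_speed_ms <= 0:
        return coeff  # a calm reading carries no information about the coefficient
    implied = observed_spread / wind_speed_ms
    return (1 - learning_rate) * coeff + learning_rate * implied

# Simulated 15-minute cycles: each tuple is (wind m/s, observed spread m/min).
readings = [(4.0, 10.0), (6.0, 16.0), (5.0, 12.0)]
coeff = 2.0  # initial guess, e.g. calibrated on historical fires
for wind, observed in readings:
    print(f"predicted {predict_spread(wind, coeff):.1f} m/min, "
          f"observed {observed:.1f} m/min")
    coeff = update_coeff(coeff, wind, observed)
```

The point of the sketch is only the feedback loop: predictions are made with the current parameters, and the parameters are then corrected against what the sensors actually report.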
In addition to working to improve the data models it has developed, the team is building a variety of prototypes for making the data and data models accessible to others. The first, and most dramatic, product is the “cave,” a wrap-around system of monitors for displaying WIFIRE data.
“We can see San Diego County in 3D with monitors that show temperature and wind speed information in visual form within the cave,” Altintas said. “We overlay data on the images. The trained eye can see the wind speed information, and realize (that) there’s a need to be more careful.”
The team is also producing catalogs of the data it collects and converting the data into broadly used geospatial formats for use by others.
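One widely used geospatial interchange format is GeoJSON, where each record becomes a "Feature" with a geometry and a set of properties. A sketch of writing a single sensor observation that way (the coordinates, station name and property names are made up for the example; the article does not say which formats WIFIRE targets):

```python
import json

# Illustrative sketch: serialize one observation as a GeoJSON Feature.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-116.8, 32.9]},  # lon, lat
    "properties": {"station": "hpwren-example", "wind_ms": 5.2, "temp_c": 31.0},
}
print(json.dumps(feature))
```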
Finally, the team is building open-source, Kepler-based workflows to make its models more programmable for other researchers. The Kepler Project is another NSF-funded effort, dedicated to supporting the development and use of free, open-source scientific workflow applications.
“We’re trying to create a modeling community around this,” Altintas said.
Posted by Patrick Marshall on Aug 19, 2014 at 11:35 AM