Predicting earthquake damage with big data analytics

When a major disaster strikes, first responders routinely report that reliable information is what they need most.  What is the extent of the damage?  What critical infrastructure needs attention?  And, most important of all, where are casualties most likely to be?

Ahmad Wani experienced the problem firsthand in 2005 when a magnitude 7.6 earthquake struck near his family’s home in Pakistan, killing more than 70,000 people. Although Wani was not harmed, he watched as responders struggled to identify areas most in need of rescue efforts.

When Wani arrived at the Stanford Graduate School of Business in 2013, earthquakes were still on his mind.  Using what he learned in machine learning and earthquake engineering classes, he launched a project to see whether algorithms could accurately predict the location and extent of earthquake damage.  The results were encouraging enough to attract the interest of investors, and Wani, along with two partners, founded One Concern.

The company’s web platform -- Seismic Concern -- collects a wide array of public and private data on the physical environment of a subscribing city or county, including geography, soil conditions and bodies of water, as well as details about physical structures, such as their age and how they were affected by previous earthquakes.  After the company’s algorithms are applied to the data, the resulting map shows which structures are likely to be damaged.
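The article does not describe One Concern's actual model, but the basic idea -- environmental and structural features feeding an algorithm that ranks blocks by damage likelihood -- can be sketched roughly like this (the feature names and weights below are invented for illustration):

```python
# Toy sketch: score each city block's damage likelihood from simple risk
# features.  Features and weights are illustrative, not One Concern's model.

def damage_score(block):
    """Return a 0-1 damage likelihood from hand-picked risk factors."""
    score = 0.0
    score += 0.4 if block["soil"] == "soft" else 0.1       # soft soil amplifies shaking
    score += 0.3 if block["built_before"] < 1980 else 0.1  # older building codes
    score += 0.2 if block["prior_damage"] else 0.0         # damaged in past quakes
    return min(score, 1.0)

blocks = [
    {"id": "B1", "soil": "soft", "built_before": 1965, "prior_damage": True},
    {"id": "B2", "soil": "rock", "built_before": 1998, "prior_damage": False},
]

# Rank blocks so responders see the likeliest hot spots first.
ranked = sorted(blocks, key=damage_score, reverse=True)
print([b["id"] for b in ranked])  # ['B1', 'B2']
```

In practice a system like this would learn its weights from historical damage records rather than hard-coding them, but the output is the same kind of ranked map the article describes.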

Seismic Concern also integrates time- and location-sensitive population data to help emergency managers focus on areas where casualties are likely to be high.  If an earthquake strikes during school hours, for example, the algorithm prioritizes school buildings.
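The school-hours example suggests a simple mechanism: weight each site's expected occupancy by the hour of the event.  A rough sketch, with occupancy profiles and sites invented for illustration:

```python
# Toy sketch: weight casualty risk by where people are likely to be at a
# given hour.  Occupancy profiles and example sites are invented.

OCCUPANCY = {
    # building type -> list of (start_hour, end_hour, fraction of capacity present)
    "school": [(8, 15, 0.9)],                      # mostly full during school hours
    "residential": [(0, 7, 0.9), (19, 24, 0.9)],   # mostly full overnight
}

def occupancy_at(building_type, hour):
    """Estimated fraction of a building's capacity present at `hour`."""
    for start, end, frac in OCCUPANCY.get(building_type, []):
        if start <= hour < end:
            return frac
    return 0.2  # baseline occupancy outside peak windows

def priority(site, hour):
    """Expected people at risk: capacity x occupancy x damage likelihood."""
    return site["capacity"] * occupancy_at(site["type"], hour) * site["damage_prob"]

sites = [
    {"name": "Lincoln Elementary", "type": "school", "capacity": 600, "damage_prob": 0.5},
    {"name": "Elm Apartments", "type": "residential", "capacity": 500, "damage_prob": 0.4},
]

# At 10 a.m. the school tops the list; at 10 p.m. the apartments do.
for hour in (10, 22):
    ranked = sorted(sites, key=lambda s: priority(s, hour), reverse=True)
    print(hour, [s["name"] for s in ranked])
```

The same quake at a different hour thus reorders the hot-spot list, which is the behavior the article attributes to Seismic Concern.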

In the event of an earthquake, Wani said, the program delivers a report on hot spots immediately.  But that is only the beginning: Seismic Concern has been designed to integrate sensor data, when available, as well as social media, such as tweets.

“In the initial critical moments after an earthquake, we are 80 to 85 percent accurate,” he said.  “But then we leverage live data.” According to Wani, the live data includes “perception data” from responders, reports from building inspectors and social media.

“If there is even a moderate 6.5 magnitude earthquake, you expect around 500,000 tweets,” Wani said.  Because humans cannot possibly read every tweet, Seismic Concern uses analytics to process the Twitter stream.  “We basically squeeze out all of the relevant possible information we can from the tweet and enrich our map using it,” Wani said. 
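One Concern has not published how it processes the Twitter stream, but the "squeeze out relevant information" step might look, in miniature, like filtering tweets for damage terms and tallying them by location (keywords, neighborhoods and tweets below are all invented):

```python
# Toy sketch: filter a tweet stream for damage reports and tally them by
# neighborhood so they can enrich the damage map.  All terms are invented.

from collections import Counter

DAMAGE_TERMS = {"collapsed", "trapped", "fire", "rubble", "cracked"}
NEIGHBORHOODS = {"mission", "soma", "sunset"}

def extract_report(tweet):
    """Return (neighborhood, damage_terms) if the tweet looks relevant, else None."""
    words = set(tweet.lower().replace(",", "").split())
    hits = words & DAMAGE_TERMS
    places = words & NEIGHBORHOODS
    if hits and places:
        return places.pop(), hits
    return None

stream = [
    "Building collapsed near the Mission, people trapped",
    "Great burrito in the mission today",      # irrelevant, filtered out
    "Fire reported in SoMa after the quake",
]

reports = Counter()
for tweet in stream:
    hit = extract_report(tweet)
    if hit:
        reports[hit[0]] += 1

print(reports)  # damage mentions per neighborhood
```

A production system would use real NLP and geotags rather than keyword sets, but the end product is the same: a per-area signal that can update the damage map live.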

One Concern is also developing two other modules to assist emergency planners.  The first, a critical infrastructure module, takes analysis down to the building level rather than the block level.  Analyzing buildings officials are especially interested in -- hospitals, schools, fire stations or other critical infrastructure -- requires building-specific details, such as blueprints.

The company has also developed a training module that emergency planners can use to simulate events and responses.

According to Wani, One Concern has been beta testing its software with emergency operations centers in several cities and counties in California, primarily in the San Francisco Bay Area. 

Posted by Patrick Marshall on May 10, 2016 at 1:10 PM


