Predicting earthquake risks and effects around the world

International, open-source modeling effort looks to develop a global view

An international consortium is developing an open-source earthquake model that will help planners map high-risk zones and take preventive measures. 

When it is complete, the software will tap into a variety of databases, letting users create models and maps that range from national or regional assessments of vulnerable areas to street-level studies of individual buildings near fault lines. Such a tool could help at-risk cities and regions undertake repairs and new construction before succumbing to a Haiti-type disaster.

The Global Earthquake Model is a public/private partnership supported by the United Nations, the World Bank, the Organization for Economic Co-operation and Development, individual nations and private firms. GEM grew out of efforts to provide something more dynamic than a mere map of earthquake risk zones, said Josh McKenty, the project's IT manager.

Headquartered in Pavia, Italy, the GEM consortium began work in 2009. When complete, GEM will provide a scalable software tool that lets users calculate earthquake risks worldwide, compare those risks across regions and borders, estimate socio-economic impact, and run cost-benefit analyses of mitigating actions. It is designed to communicate earthquake risks accurately and transparently so that organizations, institutions and individuals can make decisions about risk mitigation, and it will be available to a wide variety of users, from communities to nations.


International groups are using the open-source architecture to create the various tools that will make up the software package, McKenty said. The consortium has open calls for proposals out for 10 projects focused on various aspects of GEM. Much of this work is being conducted at a supercomputer cluster linked to GEM’s modeling facility in Zurich.

McKenty said that GEM is not necessarily trying to advance the state of the art, because earthquake modeling technology is already very good in some parts of the world. Rather, the goal is to extend that state of the art to places where earthquake modeling is scarce or unsophisticated.

GEM is an open-source and open-data project seeking to create a tool that can access regional data down to the individual household level. McKenty said that when the site is complete, anyone interested in building or remodeling a house will be able to visit the GEM website and call up the earthquake requirements for their region.

Risk modeling is an essential feature of GEM. McKenty said that international groups such as the World Bank need decision-support tools for risk analysis. He noted that risk analysis has become highly refined only in the last several years; GEM seeks to move risk modeling to a higher level. “When you do risk analysis instead of basic hazard analysis, what you end up with is a platform to make intelligent investment decisions,” he said.

This capability will allow regional and municipal governments to conduct their own analyses, such as determining how many schools in a region require upgrades to meet seismic standards, and which most need work within existing budget levels. “These kinds of decision-support tools aren’t available in most places in the world, and even where they are available, they’re still a manual process. So we’re looking at addressing socio-economic impact as part of the modeling language,” he said.
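The kind of budget-constrained analysis McKenty describes can be sketched in a few lines. The following is a hypothetical illustration only; the field names, data and greedy selection strategy are invented here and are not taken from GEM's actual modeling language or tools:

```python
def prioritize_upgrades(schools, budget):
    """Greedy selection: fund the upgrades with the highest
    estimated risk reduction per dollar until the budget runs out."""
    ranked = sorted(schools,
                    key=lambda s: s["risk_reduction"] / s["cost"],
                    reverse=True)
    funded, remaining = [], budget
    for school in ranked:
        if school["cost"] <= remaining:
            funded.append(school["name"])
            remaining -= school["cost"]
    return funded

# Invented example data: retrofit cost and estimated risk reduction per school.
schools = [
    {"name": "North Elementary", "cost": 400_000, "risk_reduction": 0.8},
    {"name": "Central High",     "cost": 900_000, "risk_reduction": 1.2},
    {"name": "East Middle",      "cost": 300_000, "risk_reduction": 0.5},
]
print(prioritize_upgrades(schools, budget=1_000_000))
# → ['North Elementary', 'East Middle']
```

A real decision-support tool would weigh many more factors (occupancy, soil conditions, proximity to faults), but the core idea of ranking interventions by benefit per cost within a fixed budget is the same.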

McKenty, who is currently on sabbatical from his job as chief architect of cloud computing for NASA's Nebula program, said there are 10 regional GEM programs, most of which will run on their own hardware. Although these groups can access the modeling facility in Zurich, many will want to run their own systems for reasons such as latency, he said, and some because they may not want to share their data.

One of the challenges is federating these systems so that regions that do not allow their databases to be fully replicated will still permit calculations and queries to be run against them. As an example, he cited the border region between India and Pakistan, which is crossed by fault lines. Each country holds half of the data for these faults, but neither shares it with the other. “Nobody has a complete picture of that region of the world for seismic activity,” he said.

McKenty said it is unlikely that both nations will let their seismic data be shared through GEM. But it may be possible to persuade each to run its own data infrastructure, letting GEM run a federated model that draws on the data sets on each side of the border without exposing the underlying data to the other side. This would allow computer-based earthquake risk products more accurate than any available today, he said.
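The federated pattern described above, where each side answers queries locally and only computed results cross the border, can be sketched as follows. This is a toy illustration with invented function names and data, not GEM's actual architecture:

```python
def local_max_magnitude(region_db, fault_id):
    """Runs inside one region's own infrastructure;
    raw records never leave that region."""
    readings = [r["magnitude"] for r in region_db if r["fault"] == fault_id]
    return max(readings) if readings else None

def federated_max_magnitude(regions, fault_id):
    """Combine per-region answers into a cross-border result
    without ever replicating either side's database."""
    results = [local_max_magnitude(db, fault_id) for db in regions.values()]
    results = [r for r in results if r is not None]
    return max(results) if results else None

# Invented sample data for a fault crossing the border.
india_db = [{"fault": "F1", "magnitude": 6.1}, {"fault": "F2", "magnitude": 5.4}]
pakistan_db = [{"fault": "F1", "magnitude": 6.8}]
print(federated_max_magnitude({"india": india_db, "pakistan": pakistan_db}, "F1"))
# → 6.8
```

The design choice is that only the aggregate (here, a maximum) is shared; each query runs against data that stays under its owner's control, which is exactly what makes the India-Pakistan case tractable.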

Individual modelers can take the open GEM system and run it on their own desktop or laptop computers, either connecting online to the open GEM architecture or running it against their own data sets to refine their own models.

Work on GEM began 18 months ago, and the first public code release is due Jan. 1, 2011. By early next year, McKenty hopes GEM will begin offering services in its modeling facilities, with socio-economic impact tools to follow in March. He is optimistic about the program’s goals because of its ability to pull in contributors from other projects and fold their work into the platform.

Observing that GEM is the most complex piece of scientific software he has worked on, McKenty said there is probably no comparable program trying to build a platform at this scale. The closest analogue is Google Maps, but he said even that falls short: Google offers only a geospatial imaging component, whereas GEM collects a variety of other data and runs it through complex levels of computation to produce layered maps of specific areas.

“It’s a tremendous piece of software to build,” he said.
