Fault finding

USGS geologist Kathleen Haller oversees the new Quaternary Fault and Fold Database, the first to collect in one Web-accessible place the data scientists use to predict earthquake risk.

Courtesy of Geological Survey

The Quaternary database shows California's infamous San Andreas fault at upper left. Clicking on a text hyperlink brings up the fault's history with detailed geological and seismological data.

USGS tracks earthquakes back 1.6 million years

A new Web-accessible database built by the Geological Survey will help scientists and building engineers predict earthquake risks more accurately, saving time, money and, possibly, lives.

Seismologists and geologists are excited about the 100M database of earthquake-related faults in the continental United States. Some call it a one-stop treasure trove of scientifically rich and geospatially integrated data.

But the chief benefit might not materialize for years. Ultimately it will help structural engineers, state disaster planners and regulators make buildings and public infrastructure more resistant to earthquake damage.

The Geological Survey's Quaternary Fault and Fold Database, unveiled in June at qfaults.cr.usgs.gov, locates and describes areas in the continental United States where the earth has fractured and slipped, or folded over like a rug, creating the potential for earthquakes. It covers the last 1.6 million years, which geologists call the Quaternary Period.

Proponents say the database, in the works for a decade, is unique.

'This is the first one that actually shows the data that underpins our inference that these are active faults,' said database manager Kathleen Haller, a geologist at USGS' Earthquake Hazards Program office in Golden, Colo.

By typing in a few parameters, users can get answers to such questions as which faults have slipped more than a millimeter during a certain period and are at greater risk of causing earthquakes. Or users can find, say, all the faults in a region being considered for a housing development.
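A query of this kind amounts to filtering fault records on attributes such as slip rate and location. The sketch below is purely illustrative; the field names (`slip_rate_mm_yr`, `region`) and sample values are assumptions for demonstration, not the actual schema or contents of the Quaternary Fault and Fold Database.

```python
# Hypothetical records standing in for database rows; field names and
# values are illustrative, not taken from the actual USGS database.
faults = [
    {"name": "San Andreas (central)", "slip_rate_mm_yr": 34.0, "region": "CA"},
    {"name": "Wasatch (Salt Lake City)", "slip_rate_mm_yr": 1.2, "region": "UT"},
    {"name": "Meers", "slip_rate_mm_yr": 0.1, "region": "OK"},
]

def faster_than(records, threshold_mm_yr):
    """Return faults whose average slip rate exceeds the threshold."""
    return [f for f in records if f["slip_rate_mm_yr"] > threshold_mm_yr]

# Which faults have slipped faster than 1 mm per year?
active = faster_than(faults, 1.0)
print([f["name"] for f in active])
```

The same filter could be combined with a geographic bounding-box test to answer the housing-development question in the article.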

The data conforms to the Federal Geographic Data Committee's geospatial metadata standards. It currently resides in FileMaker database management software from FileMaker Inc. of Santa Clara, Calif., and is served to the public via an Apple Macintosh G4 server running Mac OS X. Haller said the data will be ported to an Oracle database in the next few months.

'We provide locations of the faults at fairly good detail,' she said. 'We provide many details of the nature of the fault,' including an especially telling data point called the slip rate: a calculation of how fast earth is moving along a fault.

Faults 'tend to build up a certain amount of stress over time, and then they tend to release most of that stress,' Haller said. When they do, an earthquake may result.
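The slip rate Haller describes is, in its simplest form, total measured offset divided by the elapsed time over which it accumulated. A minimal sketch of that arithmetic, with assumed example numbers:

```python
def slip_rate_mm_per_yr(offset_m, years):
    """Average slip rate: total measured offset divided by elapsed time,
    converted from meters to millimeters per year."""
    return offset_m * 1000.0 / years

# Illustrative figures only: 5 m of offset accumulated over 10,000 years
rate = slip_rate_mm_per_yr(5.0, 10_000)
print(rate)  # 0.5 mm/yr
```

Real slip-rate estimates are more involved, since geologists must date the offset deposits and bracket the uncertainty, but the underlying ratio is the same.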

'The West Coast is incredibly active in using this sort of material to help clients build' safer buildings that are not overly expensive, she said.

Engineers at the federal Bureau of Reclamation consult the database when deciding where to locate dams and other critical public infrastructure. They save labor in collecting and displaying data and free up time for risk analysis, said Susan Olig, a senior project geologist specializing in seismic hazards at engineering firm URS Corp. of San Francisco.

She is a database contributor and a frequent user. She nominated it for an award from the Western States Seismic Policy Council. 'It's technically very reliable and easy to access,' she said.

Olig and URS colleagues often perform seismic analysis of existing data for Bureau of Reclamation dam projects.

Another user is USGS' National Seismic Mapping Project, which is updating maps that show locations of earthquake hazards just as flood maps indicate flood zones. Such maps are heavily used by civil engineers and others involved in construction.

Scientists gather the data with a wide range of methods. Although the fault descriptions include historical records of nearby earthquakes, human habitation and written histories don't go back far enough to provide a large enough sample of the stresses, which build over hundreds of years.

So geologists and seismologists must surmise past activity from telltale signs in the terrain itself. Flyovers, plus existing aerial maps and satellite images, help them characterize surface faults. Even better detail comes from trenching: digging trenches across faults with a backhoe and examining the soil and rock for signs of slip, such as offset sediments.

Mapping and trenching are the two most common data gathering techniques, Olig said. With trenching, 'we can identify specific prehistoric earthquake events.'

Much more difficult to investigate are so-called underground blind-thrust faults such as the one that caused the 1994 Northridge, Calif., earthquake, which cost 57 lives and more than $20 billion. These faults require special technology such as ground-penetrating radar and geophones, which measure energy released by dynamite charges.

Seeing through the trees

A more recent laser imaging technology called lidar, for light detection and ranging, lets aerial photographers 'see' through trees obscuring surface faults.

A single database should make it easier for scientists in the two main disciplines, geology and seismology, to coordinate their efforts, which Haller said does not always happen.

The database's builders still need to automate remote data entry. At the moment, people on the existing list of about 50 compilers and contributors send fault information to Haller's office, where it is entered by hand in the correct geospatial format.

The office is still working to complete all the important data for California and to add Idaho. Haller said she hopes faults in Alaska and Hawaii will be included within two or three years. A rigorous, numeric earthquake chronology is also on the drawing board, as are 3-D interactive models that show how seismologic features interact.

Olig said she would like to see less hazardous, so-called class C and D faults on the maps along with more details on each.

One issue that remains unresolved is how rigorous the database's peer-review standards should be. The strictest standards would admit only fully reviewed, previously published data, said William Bryant, senior geologist at the California Geological Survey in Sacramento, but that would prohibit timely posting of fresh, high-interest data such as trenching at an important fault.

Flat funding

'At best, 50 percent of the salary load is covered by USGS,' Haller said.

Olig said the level funding has meant a net loss from inflation in the past 20 years. 'There are so many faults out there that need to be looked at,' she said.

Bryant is currently working to send Haller more accurate digital traces of the faults in some quadrants of California. He called the maps 'not that good' around the site of the 2003 San Simeon earthquake, for example. And that's not even new data, added Bryant, who in 1998 began compiling existing literature to add to his state's database. Federal funding 'is not really designed to have researchers do additional trenching of a fault,' he said.

The USGS database is so new that officials can't cite instances where it has saved lives, but they fully expect it to fine-tune risk analysis.

'We can't really predict earthquakes, but we can provide probabilities,' Olig said. She expects the database to slash the typical four-year wait for updating local building codes with new risk analyses.

Another long-term improvement, Bryant said, is that the new database will help localize California building codes, which tend to paint earthquake risk with too broad a brush. 'You can certainly cut down the cost,' Bryant said. 'You don't want to force someone in, say, Sacramento to meet the same design requirements as in San Francisco.'

But he hesitates to oversell the short-term benefits of greater precision in hazard analysis.
'What if we say there's going to be an earthquake next week?' he asked. 'Does everyone have to go outside? We might put a little more money into fixing a building to better withstand an earthquake, rather than replace the building.'

David Essex is a freelance technology writer based in Antrim, N.H.

