DHS Special Report | FEMA maps out a better response
Geospatial and flood-mapping operations are helping to streamline emergency response
- By David Essex
- Jun 17, 2006
BREAKTHROUGH TECHNOLOGY: This image of New Orleans, created using LIDAR technology by DHS with the assistance of the National Geospatial-Intelligence Agency shortly after Katrina, allowed FEMA to assess the extent of flooding damage in the city.
Photo Courtesy of DHS
While withstanding a storm of criticism for its response to recent large-scale natural disasters, the Federal Emergency Management Agency has continued to make substantial investments in its geospatial and flood-mapping operations.
The strategy has provided both a ray of sunshine for the beleaguered agency and a public relations coup, as Web maps and data have proved to be almost ideal mechanisms to reach out to disaster victims en masse and respond to their requests. In the long term, geospatial technology should help FEMA become more effective in its core missions of disaster response and hazard mitigation, officials said.
Though some experts say the two terms have become almost synonymous, geospatial technology is generally regarded as going beyond geographic information systems, the now-familiar marriage of electronic maps and data, by adding more sophisticated analysis. FEMA's geospatial analysts work side by side with first responders, either in the field or remotely, helping them make critical decisions and direct resources where they are needed most.
'They're supposed to provide sort of a backroom geospatial support function,' said John Wilson, a geography professor at the University of Southern California and president of the University Consortium of Geographic Information Science.
Wilson said geospatial teams are among the first to set up shop in a disaster area.
'You have to ramp up from ... nothing to everything very quickly,' he said, citing the example of a local college that sent personnel, maps and data into New York City to replace geospatial support functions lost when the local emergency-management office was destroyed on Sept. 11, 2001.
Understandably sensitive to fair and unfair criticism, and the resulting finger-pointing, after 'heckuva job,' FEMA's geospatial team seems quick to prescribe the limits of its responsibilities.
'FEMA is responsible for the flood maps,' said Frank Oporto, an IT specialist in the agency's geospatial solution section. 'Everything else we do is in consultation with multiple data sources.'
The two halves of FEMA's geospatial operation have sometimes collaborated with other agencies to great effect. Paul Rooney, a FEMA mapping technology specialist on the flood-map side who serves as a sort of liaison to the geospatial section, recalled one such collaboration during Hurricane Katrina.
'I knew that the [Agriculture Department] 2004 National Agricultural Imagery Program (NAIP) was the most recent detailed statewide imagery available for the affected areas of Mississippi,' Rooney said. 'Working with the USDA Aerial Photography Field Office, we were able to get all this imagery available quickly through the USDA Geospatial Data Gateway for disaster responders.'
FEMA's other major geospatial undertaking is a massive, five-year effort, begun in November 2004, to update the flood maps provided for communities that are members of the National Flood Insurance Program.
'The existing FEMA flood hazard maps were developed in the 1970s and 1980s, at a time when cartographic processes introduced substantial errors,' said Alan Lulloff, director of research and development at the Association of State Floodplain Managers of Madison, Wis.
This Flood Map Modernization program (Map Mod) has not been immune to criticism. 'When flood map modernization started, there were concerns that in some parts of the country, FEMA was digitizing these existing poor-quality maps without checking that they matched the ground surface,' Lulloff said.
The consensus among ASFPM members, he said, is that funding is inadequate, that it shortchanges ongoing map maintenance, and that the five-year schedule should be extended to 10 years.
FEMA also does not have a way to incorporate updates funded by local governments, Lulloff said. But he called FEMA's new standard for defining the flood plain boundaries a good foundation for improvement.
Rooney added that FEMA has recently instituted stronger incentives for federal agencies and other partners to drop an old mapping standard and conform to NAVD88, the more accurate North American Vertical Datum of 1988, which supports LIDAR (Light Detection and Ranging) and Global Positioning System surveys.
William Burgess, Washington liaison at the National States Geographic Information Council of Bel Air, Md., said FEMA will provide a key piece of the emerging National Spatial Data Infrastructure if it can deliver accurate elevation data and road centerlines from orthographic imagery (two-dimensional views of three-dimensional objects captured in aerial and satellite photographs) in a standardized format nationwide.
'The elevation model is critical,' Burgess said. 'The way it's done today is dramatically different than it was 10 years ago. You're still going to have to have on-the-ground surveys.'
One promising technology for improving accuracy is LIDAR, a sort of laser-based radar.
'It's a breakthrough technology that's causing the cost of collecting that data to drop not quite an order of magnitude,' Lulloff said.
Rooney cited an example from Katrina. 'Map Mod, working cooperatively with the state of Mississippi and [the National Oceanic and Atmospheric Administration] had access to new, precise ground elevation data, acquired by LIDAR technology, in all the Mississippi coastal counties,' he said. 'This facilitated many disaster-response activities, including production of advisory flood maps.'
Technology aside, getting accurate flood maps could prove to be more a political problem than a technical one, as the federal system sometimes works against FEMA's ability to coordinate, assist or direct local governments, the natural repositories and collectors of local data. FEMA has its work cut out for it.
In a July 2005 study of the flood-map modernization program, the Government Accountability Office said FEMA had yet to formulate a plan for establishing partnerships with communities that have limited existing resources or mapping capabilities.
'It is usually the local and regional agencies that have the most current, up-to-date, accurate and precise data,' said Marc Berryman, GIS manager at the Greater Harris County 911 Emergency Network in Houston, and a member of the National Emergency Number Association of Arlington, Va., which promulgates standards for a nationwide 911 system.
'The federal government should help develop funding, tools and data-sharing mechanisms that coordinate and utilize the local government data to meet the needs of the federal government,' he said.
Berryman knows something about local geospatial efforts. He said Houston-area agencies have done a good job of standardizing data formats and base maps, and of funding their own mapping, spurred in part by the wake-up call of Hurricane Rita, a near-miss.
'If FEMA came in, we could give them all sorts of information,' Berryman said.
Lulloff agreed that FEMA needs to work better with local partners.
'In some instances, FEMA is utilizing the funding under flood-map modernization to supplement state and local government flood-plain-management programs and having those existing programs produce the modernized flood hazard maps,' he said. 'In other cases, FEMA is missing this opportunity and developing the maps with less-than-optimal state and local involvement.'
Rooney said FEMA has made substantial progress in this area and is working closely with stakeholders.
ASFPM members also have complained that FEMA forces them to upload maps to a central repository that runs on a slow server and is burdened by a cumbersome error-checking system, Lulloff said.
The penchant of FEMA's parent agency, the Homeland Security Department, for classifying material has also made it harder for FEMA to share data with geospatial partners.
'This administration has been steadily shifting resources out of the public side of government to the classified side,' said Ed Wells, president-elect of the Urban and Regional Information Systems Association of Park Ridge, Ill.
Classification and security
To illustrate the problem, Wells told the story of a URISA board member in Orleans Parish, La., who handed over all her available data to FEMA geospatial staff members after Katrina. When she asked for FEMA data in exchange, she was told much of it wasn't readily available.
'I'm not advocating we abandon classification and security, not at all,' Wells said. 'It's out of balance. If [information] is classified or for official use only, or password-protected, all the effort they put into compiling it is wasted.'
FEMA's ongoing effort to upgrade geospatial operations at its 10 field offices could also improve local outreach. William Henriques, chief of the geospatial solution section and Oporto's boss, estimated the agency has hired 50 geospatial specialists in the past year.
The ultimate goal is to provide standardized, layered maps and data that are easily shared among stakeholders.
'We're working very closely to try to standardize these base layers and make sure everyone involved in disaster response is using these same base layers. Within the next 12 months, we'll be well on our way,' Henriques said.
The recent emphasis on intergovernmental relations does not mean technical challenges have disappeared. Henriques is especially concerned about upgrading bandwidth among FEMA offices, along with storage capacity, as geospatial information grows exponentially from gigabytes to terabytes of imagery and data. On the existing network, FEMA must often break its geospatial 'products' into pieces. 'The files are so large that we can't even serve them up,' Henriques said.
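As a rough illustration of the workaround Henriques describes, a product too large to serve whole can be broken into fixed-size pieces for transfer and reassembled at the destination. A minimal Python sketch; the 64 MB chunk size and `.partNNN` naming scheme are assumptions for illustration, not FEMA practice:

```python
CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB per piece (illustrative)

def split_file(path, chunk_size=CHUNK_SIZE):
    """Write the file at `path` out as numbered .partNNN pieces.

    Returns the list of piece filenames, in order, so the receiver
    can reassemble the original by concatenating them.
    """
    pieces = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break  # end of file
            name = f"{path}.part{index:03d}"
            with open(name, "wb") as dst:
                dst.write(data)
            pieces.append(name)
            index += 1
    return pieces
```

Reassembly is the mirror image: concatenate the pieces in index order into one file. The real bottleneck the article describes, of course, is the network, not the splitting.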
Both Lulloff and FEMA staff say Katrina provided a stellar example of how the agency's two primary geospatial responsibilities can come together to powerful effect. After structures in more than two feet of water were designated automatic targets for demolition, FEMA correlated high-water marks with LIDAR data to determine flood depths in neighborhoods, an 'imagery-derived assessment,' in FEMA lingo.
'They were then able to use this information to determine which neighborhoods did not need house-by-house damage inspections,' Lulloff said. 'This likely saved FEMA hundreds of thousands of dollars in site-inspection costs.'
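The imagery-derived assessment described above reduces to a simple calculation: subtract LIDAR-measured ground elevation from the observed water-surface elevation, and flag anything deeper than two feet. A minimal Python sketch with illustrative addresses and a constant water surface; the actual workflow interpolates a water surface from many surveyed high-water marks rather than using a single value:

```python
DEPTH_THRESHOLD_FT = 2.0  # demolition threshold cited in the article

def flood_depth(water_surface_ft, ground_elev_ft):
    """Depth of standing water at a point, in feet (0.0 if dry)."""
    return max(0.0, water_surface_ft - ground_elev_ft)

def flag_structures(structures, water_surface_ft):
    """Return addresses whose flood depth exceeds the threshold.

    structures: list of (address, lidar_ground_elev_ft) tuples.
    water_surface_ft: water-surface elevation from surveyed high-water
    marks, assumed constant across this small neighborhood.
    """
    return [addr for addr, elev in structures
            if flood_depth(water_surface_ft, elev) > DEPTH_THRESHOLD_FT]

# Illustrative data: water surface at 4.0 ft, three ground elevations
homes = [("101 Oak St", 3.0), ("202 Elm St", 1.5), ("303 Pine St", 0.5)]
print(flag_structures(homes, 4.0))  # ['202 Elm St', '303 Pine St']
```

Applied neighborhood by neighborhood, a screen like this is what let FEMA skip house-by-house inspections where every structure cleared, or failed, the threshold.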
David Essex is a freelance technology writer based in Antrim, N.H.