Where GIS gets lost

Agencies' use of powerful geospatial apps often hits a dead end when it comes to working together

Sidebar: Toronto police struggle with GIS' success

Geospatial applications are as hot as a California wildfire. A sampling of geographic information system projects recently launched by federal agencies includes:
  • Military personnel in Iraq using GIS software to track the movements of warfighters carrying Global Positioning System devices and send them critical information about conditions in the area.
  • The National Oceanic and Atmospheric Administration transmitting real-time data on oceanographic and meteorological conditions via a Web interface to help vessels navigate the waters of 14 ports nationwide.
  • The National Geospatial-Intelligence Agency (NGA) helping the Coast Guard create a 3-D map of the ocean bottom in the Arctic in an effort to chart new channels.
  • An NGA support team assisting in a multiagency effort in October to use GIS for delivery of critical information to fight a rash of wildfires in Southern California.

These projects are just the tip of the iceberg.

In an April GCN survey, two-thirds of respondents said their agencies already use GIS applications, and most said they expected to use more geospatial data in the next five years. Surprisingly, 67 percent of respondents also reported that their organizations were already using location-based services for tracking vehicles or other assets.

Although hundreds of geospatial applications are blossoming at the federal, state and local levels, they have been developed in large part independently, without common standards.

As a result, an application developed by one organization often can't digest and work with data collected by another.

'It's incredibly frustrating,' said Dan Ponte, a geologist at the U.S. Geological Survey. 'Even within our own organization, being able to get information easily from one data system into another is not a simple task. We haven't really cracked that nut.'

Part of the problem is that geospatial applications are complex and combine many types of data. Most provide tools for importing and exporting a variety of formats for maps and data contained in databases, and for displaying maps in hundreds of different projections.

But the data collected by agencies and departments might have been geocoded using criteria such as ZIP codes, census tracts or latitude/longitude coordinates.

Bill Holler, GIS administrator of Wethersfield, Conn., said he feels many of the same frustrations. His team has developed a system of location maps for the town's police and fire vehicles, but other data collected by the city, such as resident information that might be helpful in an evacuation, is in other databases and hasn't been geocoded in a way that allows integration into his location maps. 'It will take a concerted effort on the part of several different departments with respect to how they prepare, generate and store data, and get it into whatever format is considered desirable,' Holler said. 'That is a whole different can of worms.'

And if Wethersfield suffered a catastrophe that prompted a federal emergency services response, would the feds be able to access the data Holler has collected? For example, it might be useful for those responders to know the location of every fire hydrant in town. But that information was geocoded using street addresses, so any application used by federal responders would need to have the same underlying data.
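The kind of mismatch Holler describes can be sketched in a few lines. In this hypothetical example (every record, address and coordinate is invented), hydrant data keyed by street address is unusable on a coordinate-based map until each address is resolved, and any record the geocoder can't resolve simply drops out:

```python
# Hypothetical illustration: two datasets geocoded differently cannot be
# combined until one is translated into the other's frame of reference.

# Hydrant inventory keyed by street address, as in Wethersfield's data.
hydrants = [
    {"id": "H-101", "address": "12 Main St"},
    {"id": "H-102", "address": "45 Church St"},
    {"id": "H-103", "address": "7 Elm Ct"},
]

# A responder's map layer expects latitude/longitude coordinates. This
# lookup table stands in for a geocoding service; note it has no entry
# for "7 Elm Ct", so that hydrant silently vanishes from the map.
address_to_coords = {
    "12 Main St": (41.714, -72.652),
    "45 Church St": (41.710, -72.659),
}

def to_map_layer(records, geocoder):
    """Return only the records that can actually be placed on the map."""
    layer = []
    for rec in records:
        coords = geocoder.get(rec["address"])
        if coords is not None:
            layer.append({"id": rec["id"], "lat": coords[0], "lon": coords[1]})
    return layer

layer = to_map_layer(hydrants, address_to_coords)
```

The point of the sketch is the silent data loss: nothing fails, the map simply shows two hydrants instead of three, which is exactly the kind of gap a federal responder would have no way to notice.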

A call for integration

Federal efforts to coordinate geospatial information date to at least April 1994, when President Clinton signed an executive order calling for development of a National Spatial Data Infrastructure. According to the order, NSDI would 'support public- and private-sector applications of geospatial data in such areas as transportation, community development, agriculture, emergency response, environmental management and information technology.' The order also said NSDI was necessary to avoid wasteful duplication of effort and promote effective, economical management of resources by federal, state and local governments.

Deborah Mitchell, chief of the North American Homeland Security Division at NGA, said the 2001 terrorist attacks gave new impetus to efforts to coordinate geospatial efforts.

'After 9/11, a group called the Homeland Infrastructure Foundation-Level Data Working Group was established,' Mitchell said. 'This was a group of federal, state and local government organizations that were all looking at geospatial issues as they relate to homeland security. It was quite clear that we needed to get a database that would support the community at large.'

In early 2002, NGA and USGS formed a team to identify critical infrastructure sectors, and by July 2005, those efforts had produced Homeland Security Infrastructure Program Gold, Mitchell said.

HSIP Gold is a unified geospatial data inventory assembled by NGA in partnership with the Defense and Homeland Security departments and USGS.

'This was the first unified homeland infrastructure geospatial data that was disseminated to the federal community,' Mitchell said. 'It's really quite good data. It supports us in our modeling, our simulation, our vulnerability assessments. We use it for visualization of critical infrastructure. It's the primary geospatial database to satisfy the Homeland Security Presidential Directive 7 to geospatially map our critical infrastructure and key resources.'

However, HSIP Gold ran into interoperability problems, Mitchell said. It offers a geospatial data infrastructure for federal users but doesn't reach the state and local government communities.

Because of licensing restrictions, those sectors have only limited access to the data unless there is a declared emergency.

HSIP Gold also has other limitations. It delivers numerous databases in their original form, with no single data model. As a result, users must often query HSIP Gold multiple times to access the data they need. NGA reportedly has hired an outside contractor to assist in harmonizing the databases so that multiple queries are not necessary. That harmonization is also expected to lead toward standardizing information as it is brought into the data model.
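The harmonization work described above can be pictured with a small sketch. All of the schemas, field names and records below are invented for illustration; the idea is simply that each source database keeps its own shape, and a per-source mapping translates everything into one common data model so a single query covers the whole inventory:

```python
# Hypothetical sketch of schema harmonization: records from separate
# source databases, each with its own field names, are mapped into one
# shared model so users no longer need one query per source.

# Two source inventories with incompatible schemas (all records invented).
power_plants = [{"PLANT_NM": "Riverside Station", "LAT_DD": 34.10, "LON_DD": -117.30}]
hospitals = [{"FacilityName": "County General", "Y": 34.05, "X": -117.29}]

# Per-source field mappings into the common model.
MAPPINGS = {
    "power": {"name": "PLANT_NM", "lat": "LAT_DD", "lon": "LON_DD"},
    "hospital": {"name": "FacilityName", "lat": "Y", "lon": "X"},
}

def harmonize(records, sector, mappings):
    """Translate one source's records into the shared schema."""
    m = mappings[sector]
    return [
        {"sector": sector, "name": r[m["name"]], "lat": r[m["lat"]], "lon": r[m["lon"]]}
        for r in records
    ]

# One unified inventory, queryable in a single pass.
unified = harmonize(power_plants, "power", MAPPINGS) + harmonize(hospitals, "hospital", MAPPINGS)
```

In a real system the mappings would also standardize units, datums and code lists, which is where most of the contractor's effort would go; the field renaming shown here is only the visible surface of that work.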

The next step is dubbed HSIP Freedom, a project Mitchell said is just getting under way to provide license-free geospatial data. 'This time, we're looking to support the federal, state and local levels because it is quite obvious that they need the same data that the federal organizations do,' Mitchell said. HSIP Freedom is a multiyear effort expected to assemble a dataset equivalent to HSIP Gold by 2011 or 2012, she said.

Another major federal effort to standardize geospatial work is iCAV, the Integrated Common Analytical Viewer.

Developed by DHS, iCAV is designed to use technology-neutral Web services to deliver an application that can incorporate and integrate virtually any data that can be geocoded. Officials say federal, state and local government users will be able to access iCAV data via the Web.

Independent efforts

Despite the mandate to coordinate efforts at the federal, state and local levels through NSDI, most geospatial applications don't receive significant federal encouragement toward integration or compatibility. That's especially true of state and local efforts even though many of those applications are being developed with the help of federal grants.

Many of the most promising state and local efforts still do not have enough resources and capabilities to expand and integrate even with nearby entities.

The San Bernardino County, Calif., Sheriff's Department has, for example, developed a mobile mapping unit that has been useful in search-and-rescue operations and planning for evacuation in the event of a disaster. John Amrhein, emergency services coordinator at the department, said the unit was launched 1 1/2 years ago with grant money from DHS.

It is housed in a 34-foot mobile home refitted with workstations, communications devices, a high-resolution plotter and other equipment. Search teams, including dogs and helicopters, carry GPS tracking equipment so the unit's ESRI software can accurately track them and record the ground they cover during search-and-rescue operations.

The unit is already expanding to accommodate other departments in the county, Amrhein said. 'We are currently loading all of the major county data into our command post so that if we ever have a big earthquake, we would be able to do emergency mapping right there in our command post, our mobile mapping unit.'

Paper maps

But Amrhein said he is aware of the mapping unit's limitations. For example, it can only deliver maps to the field on paper. Field units don't have devices for receiving digital maps, and even if they did, the mobile mapping unit doesn't have the technology to deliver them.

'Unfortunately, we couldn't afford the pipeline both ways,' Amrhein said. 'We have fast [speeds] coming down but very slow going up. I'd like to see a better satellite system with more up-and-down pipeline so we can get it out to the field. Hopefully, with future grants, we will be able to upgrade that.'

Amrhein said he would also like to be able to track search teams using real-time GPS instead of downloading location data when they return.

Although geospatial application developers still face many technological and financial challenges, some experts say that the time is ripe to press ahead with implementation.

Sam Bacharach, executive director at the Open Geospatial Consortium (OGC), said progress in setting standards for geospatial application developers has diminished the risks of creating new geospatial applications.

'If everybody had started to do this in, let's say, January 2002, they would now be going back replacing stuff because the technology has changed,' Bacharach said. 'But the standards are now stable. The standards are now robust enough. The hardware is fast enough.'

If an organization uses an OGC-compliant application, it can be sure its data can be integrated with other applications, he said.
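The practical payoff of OGC compliance is that any client can ask any compliant server for a map using the same request. As a sketch, here is how a standard Web Map Service 1.1.1 GetMap request is assembled; the server URL and layer name are placeholders, but the parameter names come from the OGC WMS specification:

```python
from urllib.parse import urlencode

# Parameters defined by the OGC Web Map Service 1.1.1 specification.
# Any WMS-compliant server understands this request, regardless of vendor.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "hydrants",                  # placeholder layer name
    "STYLES": "",                          # server defaults
    "SRS": "EPSG:4326",                    # WGS 84 latitude/longitude
    "BBOX": "-72.67,41.70,-72.64,41.72",   # min lon, min lat, max lon, max lat
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

# Placeholder endpoint; a real client would issue an HTTP GET to this URL
# and receive a rendered map image back.
url = "https://example.gov/wms?" + urlencode(params)
```

Because the request carries no vendor-specific fields, swapping one compliant server for another means changing only the base URL, which is precisely the lock-in escape Bacharach describes.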

However, just because most software supports OGC interfaces doesn't mean that a consultant or vendor will use them. 'They know [their own interfaces] better, and it's to their competitive advantage to keep you tied up with their proprietary interfaces,' Bacharach said.

So he said he advises implementers to insist that any application be customized to use standard interfaces.
Sidebar: Toronto police struggle with GIS' success

The Toronto Police Service was an early adopter of geographic information system technology, using MapInfo Professional software in the early 1990s to plot crime locations for preventative analysis.

Now, 23 Toronto police analysts use GIS to analyze crime, said Det. Constable Manny San Pedro, the service's crime analysis training and development coordinator.

The analysts generate weekly crime reports that display the geographic arrangement of incidents based on data gathered from arrest reports, 911 calls and other sources. 'It shows where things have happened during the past week and compares that to where things happened in the previous week,' San Pedro said. 'They [also] try to identify patterns.'

The reports are posted to electronic bulletin boards for employees in the field, and divisional analysts can call up those boards on mobile workstations in patrol cars.

As helpful as that effort has been, San Pedro said, the data is not as easy to access in the field as he'd prefer. And officers can't run queries against the data from the field. 'We're working toward delivering our MapXtreme [Web] application, which will allow frontline officers to see what has happened during the previous shift over the past week or two,' he said.

'And we're looking to give the frontline officers the ability to [query directly] and drill down further.'

Analysts also have limited ability to bring in data from different sources. 'What we are working toward is some type of metadata layer that allows us to...tap into the various different data sources regardless of where they are and be able to display that spatially or in a tabular format,' San Pedro said.

The challenges can be seen as a measure of success. 'We are creating a critical mass,' he said. 'We are starting to change the attitudes of people within the organization. We are starting to get greater demands of our analytical products. More and more units are starting to warm to the idea that an analyst can produce some products that they can base their tactical and strategic decisions on.'

