GCN Interview | Jack Dangermond

The next step for agency GIS: shared services

ESRI's Dangermond touts moving beyond file sharing to a network of geospatial functions

Data.gov, and much of the data delivered through it, is to a large extent an outgrowth of work that has evolved in the geographic information systems community. And the roots of GIS software tools, used around the world to share geospatial information, lie in the vision and work of Jack Dangermond, founder and president of ESRI, based in Redlands, Calif.

GCN’s editor-in-chief, Wyatt Kash, caught up with Dangermond recently to talk about how government agencies are capitalizing on GIS — and Dangermond’s notion of turning geospatial databases into services.

GCN: The Homeland Security Department is looking to build on GIS-based programs such as Virtual Alabama, which uses Google Enterprise, and the Virginia Interoperability Picture for Emergency Response system, which is built on the ESRI platform. How would you characterize the differences between those two platforms — and how do they reflect the state of GIS tools?

DANGERMOND: The Google environment in Alabama is focused on visualization. ESRI’s technology is actually a complete GIS system, including visualization, mapping, data management and a rich library of spatial analysis functions. This system works in a distributed environment and uses a variety of open standards to interconnect and integrate different types of services — such as 3-D services, mapping, data editing and spatial analysis. Those services can be mashed up with other services and made available through a variety of rich Internet applications for the Web, mobile and geobrowsers.

Both the ESRI and Google systems provide situation awareness. Virtual Alabama copies data from different state and local agencies into a central server for visualization in the Google environment. In contrast, the VIPER system uses a distributed Web service architecture that dynamically integrates a network of real-time authoritative source services. Distributed servers integrate various maps as mashup applications, not just visualization. While some of these services are simply mapping and visualization, many others integrate analytic services and support sophisticated applications.

This distributed architecture has many advantages, including the fact that government data is served dynamically by each agency, and as changes are made in the authoritative data sources, they are immediately available to users.

Using the recent L.A. fires, for example, one could start with a base map from [the U.S. Geological Survey] and then overlay a service of fire boundaries from the state, parcel data from the county, [and] census data from the Census Bureau and manipulate it to discover the people and property that will be affected when a fire burns in a particular direction. This type of real-time application is not just visualization; it is reaching back and doing analytics on dynamically changing datasets managed by the agency responsible for the data.
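As a rough illustration of the overlay analysis Dangermond describes, the sketch below intersects a fire perimeter (as it might come from a state fire-boundary service) with parcel locations (as they might come from a county parcel service) to find affected parcels. All names and coordinates here are invented for illustration; a real application would fetch these layers from the agencies' live services.

```javascript
// Ray-casting point-in-polygon test: counts how many polygon edges a
// horizontal ray from the point crosses; an odd count means "inside".
function pointInPolygon(point, polygon) {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const [xi, yi] = polygon[i];
    const [xj, yj] = polygon[j];
    const crosses =
      (yi > point[1]) !== (yj > point[1]) &&
      point[0] < ((xj - xi) * (point[1] - yi)) / (yj - yi) + xi;
    if (crosses) inside = !inside;
  }
  return inside;
}

// The "overlay": which parcels fall inside the fire perimeter?
function impactedParcels(parcels, firePerimeter) {
  return parcels.filter((p) => pointInPolygon(p.location, firePerimeter));
}

// Illustrative data only — not real coordinates or parcel IDs.
const firePerimeter = [[0, 0], [10, 0], [10, 10], [0, 10]];
const parcels = [
  { id: "A-100", location: [5, 5] },  // inside the perimeter
  { id: "A-101", location: [15, 5] }, // outside
];
console.log(impactedParcels(parcels, firePerimeter).map((p) => p.id)); // ["A-100"]
```

In the distributed architecture described above, the two inputs would arrive from different authoritative servers and the analysis could itself be exposed as a service rather than run in the client.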

It’s worth noting, however, that ESRI has worked closely with Google to integrate our tools, so our servers can be easily discovered and provide services into the visualization environment that Google provides.

You’ve championed "serverizing" — or serving out geospatial data and applications in real time. How well is government serving up data that way?

While it is still in its early days, things are changing rapidly. Several states have already made available complete parcel base map services, along with lots of other data.

Internet mapping started around 1996 with the introduction of MapQuest. About that same time, simple database-driven map servers were also introduced, which let people serve dynamic maps that originated in a GIS. Many of these maps were automatically generated from databases and, as a result, were always current with the database and were often created around specific applications, [such as] zoning notification, economic development, planning and environmental maps, etc. A few years ago, the consumer mapping sites introduced [Representational State Transfer] services, APIs and mashups. This has opened people’s eyes to the potential of the Internet as a computing platform for geospatial applications.

The most recent advance has been the development of full GIS server platforms. This technology goes far beyond mapping and visualization and includes the business logic and tools for data management, editing, spatial analysis and high-quality cartography on the Web platform. It also supports the ability to create rich and easy-to-use GIS applications in the Web and mobile environments.

The result of these advancements is that many new and wonderful GIS applications are emerging on the Web. Under the hood of these applications, authors can choose to make available or expose the generic geoservices, such as maps and analysis, that are used to create these applications. Users have actually implemented thousands of these servers around the world. A lot of them are being used in government agencies — behind the firewall for enterprise applications. This federated pattern promises to provide a whole new framework for geospatial applications within federal government [and within] state and local governments. As this trend continues, we will see a whole new platform for developing lightweight Web apps.

This new federated Web architecture is not without issues, particularly for small agencies. The issues include guaranteeing the reliability and availability of services and sharing the costs of distributing them. There are financial issues, too, when the agency that creates and disseminates a service is impacted by other agencies wishing to use that service as part of their applications — effectively using the originating agency’s computational services. Fortunately, falling infrastructure prices, along with cloud computing, will help overcome these issues.

How is the Obama administration’s push to make more government data publicly available affecting the GIS landscape?

I think our new federal [Chief Information Officer] Vivek Kundra has it right. The government will need to set up a cloud environment for hosting services, where different agencies can put their geospatial services. I am not talking about outsourcing but about creating a shared network of generic services that can be shared across the government. I don’t know exactly how this will work, but I do know that if geographic framework data — base maps, soils, land use, imagery, census data and geologic data — were served openly for all levels of government, it would eliminate a lot of costs and open up the opportunity for the emergence of the next-generation Web applications that will cut across government. Some of those apps would be built by government people, some by the private sector, and others by [nongovernment organizations] and citizens who might in turn share them among each other and back to the government. That is a very exciting idea.

My colleagues and I are developing a GIS in the cloud. It’s called ArcGIS Online. It is built using our GIS Server platform and includes many free content services as well as a data-sharing capability that allows our users to share their datasets, or layers. Users can also register their geoservices for easy discovery. These capabilities are designed to serve the common interests of the GIS community and to demonstrate how our users can implement their own cloud GIS with their agency. We also plan to introduce a hosting service on ArcGIS Online that will give users additional options to extend their systems.

Do you believe Data.gov will accelerate efforts to expand geospatial services?

Perhaps. Certainly their promotion of easily available government data will make more people aware. GIS has been successful in part because people shared their data. The early sharing of geospatial data on the Web involved FTP “clearinghouses.” About four years ago, the federal government created Geodata.gov, an integrated portal that brought together all the separate clearinghouses and mandated that all agencies enter, at minimum, metadata about their information. It has about 100,000 entries and has most recently been integrated with Data.gov.

The next big step will be to move these datasets into map and other geospatial services, letting anyone integrate these services with JavaScript or other Internet clients. I believe this will result in the development of a geospatial applet environment that easily connects and leverages these services into valuable applications.
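A minimal sketch of how a JavaScript client might integrate such a service: the helper below builds a query URL in the style of an ArcGIS-like REST endpoint. The host, layer and field names are hypothetical, invented for illustration; a real client would point at an agency's published service.

```javascript
// Build a query URL against a hypothetical ArcGIS-style map service layer.
// serviceUrl, layerId and params would come from the agency's service catalog.
function buildQueryUrl(serviceUrl, layerId, params) {
  const query = new URLSearchParams({ f: "json", ...params });
  return `${serviceUrl}/${layerId}/query?${query.toString()}`;
}

const url = buildQueryUrl(
  "https://gis.example.gov/arcgis/rest/services/Parcels/MapServer", // hypothetical endpoint
  0,
  { where: "ZONING = 'R1'", outFields: "PARCEL_ID,OWNER" }
);
// A browser or Node client could then fetch(url) and mash the returned
// features up with results from other agencies' services.
```

The point of the pattern is that the client holds no data at all; it composes live services from multiple authoritative sources at request time.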

The architectural answer for an integrated geospatial framework is not to put all the data into one big database. It will involve creating a network of distributed geospatial services that can be dynamically integrated using open standards and free APIs that can visualize, query and support advanced applications on the Web.

We have seen, in the last 30 years, GIS slowly grow in the federal government — actually not so slowly in the last 10 years. The benefits of it are breathtaking in terms of saving money, communicating more effectively, collaborating, and making better decisions.

Nevertheless, we have not achieved a widespread geospatial framework that can be used to better collaborate across agencies and between government and citizens, or between citizens and citizens, or between business and government. Geography is a wonderful integrator that can bring data from many different missions and allow orchestration and synergy to occur. My dream is that people will take their data and move beyond the Data.gov FTP data sharing — which is a great idea — to the vision of services sharing, where data is turned into generic map services that can be easily integrated across the Web. I like to call this Web GIS.

