Is your data disaster-ready?
- By Sara Friedman
- Jan 19, 2018
When it comes to disaster response, having access to reliable information about local shelters and hospitals gives officials and responders crucial knowledge that can mean the difference between life and death. But what happens when government agencies haven't prepared to share these key details?
Maksim Pecherskiy, San Diego’s first chief data officer and a 2014 Code for America fellow, was visiting Puerto Rico in September 2017 when Hurricane Irma hit the island. He was asked to make a map of shelter locations, hospitals and flood zones, but he ran into problems collecting the information.
To map the shelters, Pecherskiy had to combine information from a .csv file on 450 shelters with a school locations dataset, cleaning both files so the map could display each shelter's address and phone number. The merged file had complete information for only 88 shelters.
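The kind of merge-and-clean step Pecherskiy describes can be sketched in a few lines. This is a minimal illustration, not his actual code: the file contents, column names, and the join key (facility name) are all hypothetical stand-ins for the real shelter and school datasets.

```python
import csv
import io

# Hypothetical sample data standing in for the two real files:
# a shelter list and a school-locations dataset, joined on facility name.
shelters_csv = """name,phone
Escuela Central,787-555-0101
Refugio Norte,
"""

schools_csv = """name,address
Escuela Central,123 Calle Sol
Escuela Sur,45 Avenida Luna
"""

def merge_complete(shelters_file, schools_file):
    """Join shelters to school addresses by name; keep only records
    that end up with both an address and a phone number."""
    addresses = {row["name"]: row["address"]
                 for row in csv.DictReader(schools_file)}
    complete = []
    for row in csv.DictReader(shelters_file):
        row["address"] = addresses.get(row["name"], "")
        if row["address"] and row["phone"]:
            complete.append(row)
    return complete

result = merge_complete(io.StringIO(shelters_csv), io.StringIO(schools_csv))
# Only "Escuela Central" survives: it has both an address and a phone.
```

The point of the sketch is the drop-off at the end: records missing either field fall out of the join, which is how 450 shelters shrank to 88 complete ones.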
He then had to run the location data through the Google Maps geocoding application programming interface, which converts physical addresses into geographic coordinates for mapping. Only half of the 88 shelters could be found using Google.
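A geocoding lookup of this sort follows a simple request/response pattern. The sketch below builds a request URL and parses the documented JSON response shape; "YOUR_KEY" is a placeholder, the coordinates are invented sample values, and no live request is made here.

```python
import json
from urllib.parse import urlencode

def geocode_url(address, key):
    """Build a request URL for the Google Maps Geocoding API."""
    base = "https://maps.googleapis.com/maps/api/geocode/json"
    return base + "?" + urlencode({"address": address, "key": key})

def parse_geocode(response_text):
    """Extract (lat, lng) from a Geocoding API JSON response,
    or None when the service found no match."""
    body = json.loads(response_text)
    if body.get("status") != "OK":
        return None
    loc = body["results"][0]["geometry"]["location"]
    return loc["lat"], loc["lng"]

# A trimmed example response in the documented format (sample values).
sample = json.dumps({
    "status": "OK",
    "results": [{"geometry": {"location": {"lat": 18.4655, "lng": -66.1057}}}],
})
coords = parse_geocode(sample)
```

Addresses the service cannot resolve come back with a non-OK status such as "ZERO_RESULTS" rather than coordinates, which is the failure mode behind half the shelters never appearing on the map.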
When trying to map flood zones, Pecherskiy used data from the Federal Emergency Management Agency, but there was a problem with the agency’s geographic information system’s server that prevented him from accessing the information directly. Instead, he had to access the data in a raw form, which led to a 15-hour process of filtering and cleaning the data.
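Filtering raw flood data by hand usually means walking a large GeoJSON-style feature collection and keeping only the zones that matter. The sketch below assumes FEMA-style features carrying a "FLD_ZONE" attribute, with codes beginning with "A" or "V" marking the highest-risk areas; the three sample features are hypothetical, and the real 15-hour job involved far larger files.

```python
import json

# Hypothetical raw flood-zone data in GeoJSON form ("geometry" omitted).
raw = json.dumps({
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature", "properties": {"FLD_ZONE": "AE"}, "geometry": None},
        {"type": "Feature", "properties": {"FLD_ZONE": "X"},  "geometry": None},
        {"type": "Feature", "properties": {"FLD_ZONE": "VE"}, "geometry": None},
    ],
})

def high_risk_zones(geojson_text):
    """Keep only features whose flood-zone code marks a high-risk area."""
    collection = json.loads(geojson_text)
    keep = [f for f in collection["features"]
            if f["properties"].get("FLD_ZONE", "").startswith(("A", "V"))]
    return {"type": "FeatureCollection", "features": keep}

filtered = high_risk_zones(raw)
# Two of the three sample features (zones AE and VE) remain.
```

When a GIS server is available, this filtering happens with a single attribute query; doing it against raw downloads is what turns a quick lookup into an hours-long cleaning job.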
Pecherskiy’s work on the project led him to ask how governments can make their data “disaster ready” and test it in a “time-constrained project.”
“Disasters require a different way of thinking than our regular day-to-day,” Pecherskiy said during a presentation at the Data-Smart Government summit on Nov. 8. “How do we make sure that our data is disaster ready and really ready to respond when it needs to be?”
New York City, which has had its share of disasters, has taken a proactive approach to deal with emergencies. Data drills are conducted each month that prepare agencies for different scenarios ranging from hurricanes to terrorist attacks, according to a Motherboard article.
The drills allow agencies to think about ways to provide the most important information expediently to enhance leadership’s decision-making abilities. For instance, after a major storm, traffic and demographic data can be used to prioritize tree removals from areas with heavy traffic or population centers with disabled residents, rather than assigning cleanup crews based on which calls came in first. And for issues that involve several agencies, the city is working to create an environment where accurate data can be quickly and easily shared.
For its disaster-response decision-making, FEMA developed the Hurricane Journal, a cloud-based dashboard that uses GIS technology from Esri to combine information from joint field offices, disaster recovery centers, shelters and other sources. It has replaced FEMA’s traditional practice of passing static maps among departments for analysis. The project won a GCN dig IT award in 2017.
Sara Friedman is a reporter/producer for GCN, covering cloud, cybersecurity and a wide range of other public-sector IT topics.
Before joining GCN, Friedman was a reporter for Gambling Compliance, where she covered state issues related to casinos, lotteries and fantasy sports. She has also written for Communications Daily and Washington Internet Daily on state telecom and cloud computing. Friedman is a graduate of Ithaca College, where she studied journalism, politics and international communications.
Friedman can be contacted at firstname.lastname@example.org or follow her on Twitter @SaraEFriedman.