A test of the Smart City Interoperability Reference Architecture demonstrated that any common operating picture technology must be simple and operation-ready for responders to seriously consider it.
The key to first responders sharing data during emergencies is operation-ready technology, a test of the Open Geospatial Consortium’s Smart City Interoperability Reference Architecture (SCIRA) showed.
More than 40 people participated in a January pilot in St. Louis, Mo., to test SCIRA as an interoperable framework that integrates commercial internet-of-things sensors for public-safety applications. At the event, participants conducted tabletop exercises and took part in five operational scenarios -- preparedness, street flooding, vulnerable populations stranded in floods, a building fire and a vehicle accident -- to demonstrate how first responders, emergency managers and other city officials could use these data-sharing technologies in real-life situations.
“It really showed us how simplified and bombproof those technologies need to be,” said Josh Lieberman, director of OGC’s Innovation Program, which works on SCIRA with the Department of Homeland Security’s Science and Technology Directorate. “There is this … stereotype of the grizzled old fireman who says, ‘All I need is my hose and my radio and I’m set,’ and that was absolutely not the case in St. Louis. We got a lot of enthusiasm and interest in new technologies and ideas for how things could work there, but it just really had to be operation-ready for them to seriously consider it.”
The test featured mobile apps -- some that were web-based and others that were Android- or iOS-specific -- and two 3D models. One model of the whole city was used for web-based dashboards showing what was happening in a scenario and for modeling street flooding. For the preparedness scenario, it predicted what areas of the city would be most likely to flood based on rainfall and Mississippi River levels so that public-safety officials could be ready.
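The preparedness model's core idea -- ranking areas by flood likelihood given forecast rainfall and river stage -- can be illustrated with a toy scoring function. This is a hypothetical sketch, not the pilot's actual model; the area names, elevations, and weights are invented for illustration.

```python
# Hypothetical sketch of a flood-preparedness ranking: score areas by
# predicted flood likelihood from forecast rainfall and river stage.
# Elevations and weights below are invented, not pilot data.

def flood_risk(rain_in, river_stage_ft, elevation_ft):
    """Toy score: more rain and a higher river stage raise risk;
    higher ground lowers it. Clamped at zero."""
    return max(0.0, 0.4 * rain_in + 0.6 * river_stage_ft - 0.5 * elevation_ft)

areas = {"Riverfront": 8, "Midtown": 25, "The Hill": 40}  # invented elevations (ft)
forecast = {"rain_in": 3.0, "river_stage_ft": 30.0}

# Rank areas from most to least at risk so officials can pre-position crews.
ranked = sorted(
    areas,
    key=lambda a: flood_risk(forecast["rain_in"],
                             forecast["river_stage_ft"], areas[a]),
    reverse=True,
)
print(ranked)  # ['Riverfront', 'Midtown', 'The Hill']
```

A real model would replace the linear score with hydrological simulation over the 3D city model, but the output -- a ranked map of likely trouble spots -- serves the same preparedness purpose.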
The other 3D model was of the interior of the T-REX Innovation Center, where responders in the fire scenario had to navigate to a blaze in the basement and also check each room for people in need of help.
“What we wanted to do was look at ways responders could navigate inside of the building and also prepare themselves,” Lieberman said. “There were literally police cars and fire engines driving from the stations to this building while the commanders, the supervisors, in each vehicle had tablets. … They were able to basically look through the building virtually and familiarize themselves with the layout of the building and who needed to go where.”
For the street-flooding scenario, the team combined the prediction model with sensor measurements. Smart cameras with views of streets used machine learning to gauge the water's depth from how high it rose on the wheels of passing cars, Lieberman said.
Observations of flooding also came in via mobile apps from first responders and the community. The data from humans and sensors were combined to identify streets that were impassable and dispatch crews to close them, he explained.
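The fusion step described above can be sketched in a few lines. This is a minimal, hypothetical illustration -- the street names, report format, and 30 cm depth cutoff are assumptions, not part of SCIRA:

```python
# Hypothetical sketch: fuse camera-sensor and human reports of street
# flooding to decide which streets to close. Thresholds and the report
# schema are illustrative assumptions.

IMPASSABLE_DEPTH_CM = 30  # assumed cutoff for closing a street

def impassable_streets(reports):
    """Flag a street when any source reports depth above the cutoff,
    or when at least two human observers independently report flooding."""
    by_street = {}
    for r in reports:
        by_street.setdefault(r["street"], []).append(r)
    closed = []
    for street, obs in by_street.items():
        deep = any(o.get("depth_cm", 0) >= IMPASSABLE_DEPTH_CM for o in obs)
        human_confirm = sum(1 for o in obs if o["source"] == "human") >= 2
        if deep or human_confirm:
            closed.append(street)
    return sorted(closed)

reports = [
    {"street": "Chouteau Ave", "source": "camera", "depth_cm": 35},
    {"street": "Market St", "source": "human"},
    {"street": "Market St", "source": "human"},
    {"street": "Olive St", "source": "camera", "depth_cm": 10},
]
print(impassable_streets(reports))  # ['Chouteau Ave', 'Market St']
```

The design point is that neither source alone suffices: sensors give depth but sparse coverage, while human reports give coverage but no calibrated depth, so the rule accepts either a confident sensor reading or corroborated human observations.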
Another scenario tested a dynamic rerouting service that responded to street closures in real time. “If one crew came in and said, ‘You can’t go through here’ and put in a street closure [notice], the next crew would receive a navigation [instructions] around that closure,” Lieberman said.
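The rerouting behavior Lieberman describes -- one crew posts a closure, the next crew's navigation avoids it -- amounts to shortest-path routing over a street graph with closed segments removed. A minimal sketch, assuming an invented street graph (the nodes and weights below are not pilot data):

```python
import heapq

# Hypothetical sketch of dynamic rerouting around reported street closures:
# Dijkstra over a weighted street graph, skipping any segment a crew has
# marked impassable. Graph and closures are illustrative assumptions.

def shortest_route(graph, start, goal, closed=frozenset()):
    """Return the shortest open route from start to goal, or None."""
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:  # reconstruct the route back to start
            route = [node]
            while node in prev:
                node = prev[node]
                route.append(node)
            return list(reversed(route))
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w in graph.get(node, {}).items():
            if (node, nbr) in closed or (nbr, node) in closed:
                continue  # a crew reported this segment impassable
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    return None

graph = {
    "Station": {"A": 1, "B": 4},
    "A": {"Station": 1, "Scene": 5, "B": 1},
    "B": {"Station": 4, "A": 1, "Scene": 1},
    "Scene": {"A": 5, "B": 1},
}
print(shortest_route(graph, "Station", "Scene"))  # ['Station', 'A', 'B', 'Scene']
# After one crew reports the A-B segment closed, the next crew is routed around it:
print(shortest_route(graph, "Station", "Scene", closed={("A", "B")}))  # ['Station', 'B', 'Scene']
```

In a deployed service the graph would come from the city's street network and closures from the shared common operating picture, so every crew's navigation reflects the latest reports.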
In the vulnerable populations scenario, responders had to use similar technology but with a different goal: to get human rescue crews out to save people stranded by floods.
Lastly, the vehicle accident setup involved a car that skidded on a street and knocked over a fire hydrant and brought down a power line. Five city departments had to respond at once.
The exercise was designed “to expose the volunteers to how their operations might be different with access to some of these technologies and with more of an ability to exchange and share information,” Lieberman said. “With a common operating picture, everybody could see the same observations, the same map of where things are flooded, and not be on their radios all the time trying to keep each other up to date,” he said.
A major lesson Lieberman said the OGC team took from the pilot was how long it takes to go “from an idea for data sharing to something that’s operational, how long that pipeline can be if you really are taking [into account] the needs of people on the ground.”
The team is now working to finish an engineering report -- due out at the end of the month -- that explains the implementation, configuration and operational experience, and lays out a solution architecture at city scale should a municipality want to implement SCIRA as a pervasive and sustainable capability.