Smart phones as sensors: locating snipers, or parking spots
- By Patrick Marshall
- Feb 04, 2013
First of four parts.
They’ve got more computing power than some PCs had only a few years ago. They can capture geocoded photos and videos, measure sound levels, and take temperature and humidity readings. They can be used by soldiers on patrol to locate snipers or by shoppers to find open parking spaces. And they fit neatly in your pocket. Oh, and they make phone calls, too.
Developers are rushing to take advantage of the rapidly enhancing capabilities of smart phones to create mobile networks that collect a wide range of data, process it and deliver it to those who can make use of it.
Community organizers and some in government see great potential for data collection from swarms of cell phones, especially for such purposes as providing situational awareness to first responders after disasters or monitoring environmental conditions.
“We’re very excited about where this technology is going,” said Richard Wayland, director of the Environmental Protection Agency’s Air Quality Assessment Division. What has him most excited, he said, is the potential the technology offers for community-based sensing that can supplement the data collected by the EPA.
“There is a lot of effort for communities doing their own monitoring,” Wayland said. “Sometimes it's difficult to get the high-end expensive equipment everywhere you need it to be. This kind of technology shows a lot of potential benefits down the road.”
One of the most dramatic implementations of smart-phone sensor networks is being developed by Akos Ledeczi, associate professor of computer engineering at Vanderbilt University. He and his team are putting the final touches on SOLOMON (Shooter Localization with Mobile Phones), a project funded by the Defense Advanced Research Projects Agency.
With SOLOMON, a group of Android phone-toting soldiers can quickly locate a sniper by processing data from the sound and shockwave of a gunshot. Each soldier wears a sensor paired to an Android cell phone. “When the sensor detects a gunshot it measures the time of arrival of the shockwave,” Ledeczi said. “Then it waits for the muzzle blast, which is the actual sound of gunfire, so it measures the times and some characteristics of the acoustic events and then sends it to the phone using Bluetooth.”
Next, the phones share the gathered data and, using algorithms developed by Ledeczi’s team, calculate the direction and range of the shot’s origin, which is then displayed on each phone using Google Maps.
According to Ledeczi, SOLOMON can determine the direction of the shooter within a few degrees and the range within about 10 meters.
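The physics Ledeczi describes can be sketched in miniature. Because a supersonic bullet outruns its own muzzle blast, the gap between the shockwave's arrival and the blast's arrival grows with distance. The single-sensor estimate below is an illustration only, with an assumed bullet speed and a bullet flying roughly straight toward the sensor; SOLOMON's actual algorithms fuse timing data from many sensors and are not reproduced here.

```python
# Simplified single-sensor range estimate from the gap between
# shockwave and muzzle-blast arrivals. Illustrative only: the
# bullet speed is an assumed value, and the model pretends the
# bullet travels straight toward the sensor at constant speed.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def range_from_time_gap(dt_seconds, bullet_speed=900.0):
    """Estimate shooter range in meters from the delay between
    shockwave arrival and muzzle-blast arrival.

    The shockwave arrives after about d / bullet_speed and the
    muzzle blast after about d / SPEED_OF_SOUND, so the gap grows
    linearly with range: dt = d * (1/SPEED_OF_SOUND - 1/bullet_speed).
    """
    if bullet_speed <= SPEED_OF_SOUND:
        raise ValueError("model assumes a supersonic projectile")
    return dt_seconds / (1.0 / SPEED_OF_SOUND - 1.0 / bullet_speed)

# A half-second gap puts the shooter a few hundred meters out.
print(round(range_from_time_gap(0.5), 1))
```

With one sensor this yields range only; combining arrival times from several spatially separated phones is what lets the real system recover direction as well.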
In fact, SOLOMON is just one part of a broad effort by DARPA to develop inexpensive sensor systems by adapting commercial technologies. As part of that effort, the agency’s Adaptable Sensor System program recently announced that it was actively recruiting smart-phone application developers.
Civilian agencies at all levels of government are looking into smart-phone sensor networks. But although a lot of development is underway, only a handful of systems have actually been deployed to date.
San Francisco and Los Angeles, for example, have deployed systems that help drivers find parking spots. In San Francisco, data from sensors implanted in 8,200 street parking spaces is broadcast to a mesh network. Drivers can access the data from a smart-phone application (as well as from an SFpark website) to see where the empty spots are.
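The driver-facing side of such a system can be pictured with a short sketch. The record format and field names below are assumptions for illustration, not the actual SFpark data feed or API.

```python
# Illustrative sketch of the consumer side of an SFpark-style
# parking feed: filter the sensor records down to open spots
# near the driver. Field names ("lat", "lon", "occupied") are
# assumed for this example, not taken from the real system.
import math

def open_spots_near(spots, lat, lon, radius_m=300.0):
    """Return unoccupied spots within radius_m of the driver.

    spots: list of dicts like {"lat": ..., "lon": ..., "occupied": bool}
    Uses an equirectangular approximation, adequate at city scale.
    """
    m_per_deg = 111320.0  # meters per degree of latitude
    nearby = []
    for s in spots:
        dy = (s["lat"] - lat) * m_per_deg
        dx = (s["lon"] - lon) * m_per_deg * math.cos(math.radians(lat))
        if not s["occupied"] and math.hypot(dx, dy) <= radius_m:
            nearby.append(s)
    return nearby

spots = [
    {"lat": 37.7749, "lon": -122.4194, "occupied": False},
    {"lat": 37.7750, "lon": -122.4195, "occupied": True},   # taken
    {"lat": 37.8000, "lon": -122.4194, "occupied": False},  # ~2.8 km away
]
print(len(open_spots_near(spots, 37.7749, -122.4194)))  # prints 1
```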
And researchers at universities have developed a surprising array of pilot applications that can do such things as automatically alert your emergency contact if your phone detects that you've had an auto accident.
In May 2012, researchers at the University of California at Berkeley sent a fleet of 100 smart-phone-equipped robots floating down the Sacramento River to map water currents. The GPS-equipped smart phones transmitted location data to servers at the university every few seconds, allowing researchers to record not only the path but the speed of each device.
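How path and speed fall out of periodic GPS fixes can be sketched briefly. The fix format below, a (timestamp, latitude, longitude) tuple, is an assumption for illustration rather than the Berkeley project's actual data format.

```python
# Hedged sketch: recovering per-interval speed from a stream of
# timestamped GPS fixes, as a drifter study might. The fix format
# (timestamp_s, lat, lon) is assumed for this example.
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speeds(fixes):
    """fixes: chronological list of (timestamp_s, lat, lon).
    Returns the average speed in m/s over each interval."""
    return [haversine_m(la0, lo0, la1, lo1) / (t1 - t0)
            for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:])]

# Two fixes a minute apart, about 111 m of drift downstream.
fixes = [(0.0, 38.000, -121.500), (60.0, 38.001, -121.500)]
print([round(v, 2) for v in speeds(fixes)])
```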
Researchers at Dartmouth College have set up a Smartphone Sensing Group to develop applications to take advantage of the geo-location and sensing capabilities of commercially available smart phones. The group’s MetroTrack system was designed to use smart phones to track noise sources, such as the voice of a lost child. The system works by activating sensors of all available phones in an area and using the data from those multiple sensors to triangulate the moving target.
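The triangulation idea can be sketched as a time-difference-of-arrival search: phones at known positions hear the same sound at slightly different times, and the source must sit where predicted arrival-time differences best match the observed ones. The coarse grid search below is an illustration under simplified assumptions (2-D plane, synchronized clocks), not MetroTrack's actual algorithm.

```python
# Hedged sketch of multi-sensor acoustic localization by
# time-difference-of-arrival (TDOA). Differencing against the
# first phone cancels the unknown emission time. A brute-force
# grid search stands in for the real estimation algorithms.
import math

SPEED_OF_SOUND = 343.0  # m/s

def locate_source(phones, arrival_times, extent=200.0, step=1.0):
    """Grid-search a 2-D area for the point whose predicted
    arrival-time differences best match the observed ones.

    phones: list of (x, y) positions in meters
    arrival_times: sound arrival time at each phone, in seconds
    """
    observed = [t - arrival_times[0] for t in arrival_times]
    best_point, best_err = None, float("inf")
    y = 0.0
    while y <= extent:
        x = 0.0
        while x <= extent:
            dists = [math.hypot(x - px, y - py) for px, py in phones]
            predicted = [(d - dists[0]) / SPEED_OF_SOUND for d in dists]
            err = sum((p - o) ** 2 for p, o in zip(predicted, observed))
            if err < best_err:
                best_point, best_err = (x, y), err
            x += step
        y += step
    return best_point

# Four phones at known positions hear a sound emitted at (50, 80).
phones = [(0, 0), (200, 0), (0, 200), (200, 200)]
true = (50.0, 80.0)
times = [math.hypot(true[0] - px, true[1] - py) / SPEED_OF_SOUND
         for px, py in phones]
print(locate_source(phones, times))  # close to (50, 80)
```

With noisy real-world timestamps the error surface is no longer zero at the true point, which is why a deployed system needs more sensors and more careful estimation than this sketch.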
The Dartmouth researchers have turned to the same technologies to create tracking applications for skiers and bicyclists. BikeNet, for example, allows users to share data, including average times for specified routes.
And researchers are testing ways of using smart phone sensors to help in emergency response.
NEXT: Take a picture, and 4D app gives you the details