DHS pushes screening of first responder mobile apps
- By Patrick Marshall
- Jan 30, 2018
While malware on a consumer's smartphone is an increasing irritant, the same problem on mobile devices used by first responders can cost lives.
Accordingly, a pilot project of the Department of Homeland Security’s Science and Technology Directorate that concluded last month aimed to assess and remediate vulnerabilities in apps used by public-safety professionals.
A joint effort of DHS, the Association of Public-Safety Communications Officials and Kryptowire, a private-sector developer of a mobile app-vetting platform, the pilot tested 33 popular apps from 20 different developers offered on APCO’s AppComm website, which hosts public-safety apps.
The apps were tested over three months using Kryptowire’s testing platform -- also developed with funding from DHS -- which was integrated with AppComm.
“We are able to test Android and iOS apps against the same standards that are used for classified national security systems,” said Tom Karygiannis, Kryptowire’s vice president for products. Those standards are specified by the National Information Assurance Partnership, a joint effort of the National Security Agency and the National Institute of Standards and Technology.
Karygiannis added that the testing was done without access to source code. “It’s standards-based,” he said. “We are looking for security vulnerabilities, not for malware.”
More specifically, the tests examined each app for vulnerabilities in its security measures, privacy protocols, access to information and device access.
“For example, if [the developers] haven’t implemented their cryptography properly, if they’re not checking their certificates or if they made any number of programming mistakes, we can detect them,” Karygiannis said. “Another thing we test is whether there is an insecure connection to a cloud infrastructure that might expose data. We can identify that vulnerability, but we do not monitor the cloud service or the device to see if anyone has actually attempted to exploit that.”
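The pilot itself relied on Kryptowire’s platform, but the certificate-checking mistake Karygiannis describes is easy to illustrate. As a minimal sketch only (not drawn from the pilot), the following Python fragment contrasts an insecure TLS configuration, which disables certificate and hostname verification and so would be flagged by this kind of testing, with the secure default:

```python
import ssl

# Insecure: a common developer shortcut to "make TLS work" --
# disabling hostname checking and certificate verification.
# Any server certificate is accepted, enabling man-in-the-middle attacks.
insecure = ssl.create_default_context()
insecure.check_hostname = False
insecure.verify_mode = ssl.CERT_NONE

# Secure: the default context validates the server's certificate
# chain against trusted CAs and verifies the hostname.
secure = ssl.create_default_context()
assert secure.check_hostname is True
assert secure.verify_mode == ssl.CERT_REQUIRED
```

Standards-based vetting can detect this class of flaw from the app binary alone, without source code, by observing that the app accepts invalid certificates when connecting to its backend.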
The results were, in a hyphenated word, eye-opening.
Of the 33 mobile apps tested (18 iOS and 15 Android), 32 had security or privacy issues such as allowing access to the device’s camera, contacts or SMS messages. What’s more, 18 of the apps had “critical flaws” affecting the device’s overall security.
According to the DHS report, iOS apps were somewhat more vulnerable than Android apps.
The test results of the pilot were shared with app developers, with half of the 20 developers using the information to repair their apps. According to the report, “Some companies and developers dropped out due to lack of time, perceived level of difficulty to fix identified concerns/issues or did not respond after the pilot’s kick-off.”
Of the developers who worked with the pilot, 90 percent told the DHS team that the testing process was not burdensome and that the time spent on remediation ranged from zero to eight hours. (One might suspect that developers of apps requiring more time to repair may have had an incentive to drop out of the pilot.)
“The goal was to improve the quality of the apps,” Karygiannis said. “Once we identif[ied] the errors, the developers were notified and they fixed all these problems.” He added that there is ongoing discussion about formalizing pre-screening of apps before they go public.
APCO, which hosts the AppComm app site, declined to comment.
According to the DHS report, the number of vulnerabilities found should concern the public safety community. Additionally, the high attrition rate of developers after the testing phase “demonstrates the need for a formal, ongoing app evaluation process with appropriate incentives for developer participation.”
“All this is new, even for larger corporate app developers,” Karygiannis said. “This is a first step to getting some objective metrics as to where people stand.”
Read the full DHS report, “Securing Mobile Applications for First Responders,” here.
Patrick Marshall is a freelance technology writer for GCN.