The Defense Advanced Research Projects Agency recently posted an online catalog designed to give the computer science community a central source for updates on DARPA software development, research results and technical publications.
The Open Catalog is “a curated list of DARPA-sponsored software and peer-reviewed publications,” the R&D agency said, one that makes available information “that may lead to experimental results and reusable technology to benefit multiple government domains.”
“Making our open source catalog available increases the number of experts who can help quickly develop relevant software for the government,” said DARPA program manager Chris White. “Our hope is that the computer science community will test and evaluate elements of our software and afterward adopt them as either standalone offerings or as components of their products.”
The initial Open Catalog offerings included software toolkits and peer-reviewed publications from the XDATA program in DARPA’s Information Innovation Office. The project aims to develop computational techniques and software tools for processing and analyzing large, imperfect and incomplete data sets.
DARPA said the catalog reflects its interest in building communities around government-funded software and research. If the R&D community shows sufficient interest, the agency will continue to make updates and other information available.
Today, the catalog includes licensing information for project software, links to the external project page or contact information, and a link to the project's code repository.
Programs in the catalog currently include:
Active Authentication, a program that seeks to develop novel ways of validating the identity of computer users by focusing on unique aspects of individuals through software-based biometrics.
Crowd Sourced Formal Verification, which aims to investigate whether large numbers of non-experts can perform formal verification faster and more cost-effectively than conventional processes. The goal is to transform verification into a more accessible task by creating fun, intuitive games that reflect formal verification problems. Playing the games would effectively help software verification tools complete corresponding formal verification.
Detection of Psychological Signals, which aims to develop novel analytical tools to assess psychological status of warfighters in the hopes of improving psychological health awareness.
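The software-based biometrics behind Active Authentication can be illustrated with keystroke dynamics, one commonly cited behavioral signal. The sketch below is purely illustrative and is not DARPA's software: the timing values, tolerance and matching rule are invented, and a real system would use far richer features and classifiers.

```python
import statistics

def keystroke_features(timestamps):
    """Inter-key intervals (seconds) from a list of key-press times."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.mean(intervals), statistics.stdev(intervals)

def matches_profile(sample_times, profile_mean, profile_std, tolerance=2.0):
    """Accept the sample if its mean inter-key interval lies within
    `tolerance` standard deviations of the enrolled user's profile."""
    mean, _ = keystroke_features(sample_times)
    return abs(mean - profile_mean) <= tolerance * profile_std

# Enrolled user types with roughly 0.20 s between keys (std 0.05 s).
print(matches_profile([0.0, 0.21, 0.40, 0.62], 0.20, 0.05))  # → True
print(matches_profile([0.0, 0.95, 1.90, 2.85], 0.20, 0.05))  # → False
```

The appeal of the approach is that the signal is collected passively as the user works, so identity can be re-validated continuously rather than only at login.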
Posted on May 22, 2014 at 11:17 AM
The four largest wireless telephone companies, AT&T, Sprint, T-Mobile, and Verizon, met their voluntary commitment to support 911 emergency text messages by May 15, according to the Federal Communications Commission.
But that does not mean people today can reach 911 by sending a text message, because most emergency call centers are not yet equipped to receive texts. In many instances, text-to-911 awaits upgrades to local 911 centers and coordination among phone companies, equipment vendors and public safety call centers.
In fact, for now, the FCC said, callers simply "should not rely on text to reach 911."
The FCC wants all 911 call centers to accept 911 texts as soon as possible, but doing so is not yet required. Today, 911 texts are accepted in limited areas in Colorado, Georgia, Illinois, Indiana, Iowa, Maine, Maryland, Montana, New York, North Carolina, Ohio, Pennsylvania, South Carolina, Texas, Vermont and Virginia.
If a caller attempts to send a text to 911 where the service is not available, he would receive a "bounce-back" message advising him to contact emergency services by another means, such as by making a voice call or using telecommunications relay services.
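The bounce-back behavior amounts to a routing check: deliver the text where the local 911 center supports it, otherwise return an advisory message. The sketch below is illustrative only; the center names are invented, and real carrier routing against public safety answering point databases is far more involved.

```python
# Hypothetical set of 911 centers that accept texts; an actual carrier
# would consult its text-to-911 routing database instead.
TEXT_READY_CENTERS = {"Burlington VT", "Frederick MD"}

def route_911_text(center, message):
    """Deliver the text if the serving 911 center supports text-to-911;
    otherwise return a bounce-back advising the sender to call instead."""
    if center in TEXT_READY_CENTERS:
        return f"DELIVERED to {center}: {message}"
    return ("Bounce-back: text-to-911 is not available in your area. "
            "Please contact emergency services by making a voice call.")

print(route_911_text("Burlington VT", "Need help at 12 Elm St"))
print(route_911_text("Somewhere Else", "Need help at 12 Elm St"))
```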
To support call centers deploying text-to-911 services, the FCC has posted best practices for text message providers and 911 call centers.
The webpage has materials from Vermont, Texas, and other state 911 public safety answering points that have already integrated text-to-911. The website includes links to lessons learned from Vermont’s “highly successful text-to-911 implementation” and informational videos on text-to-911.
Other sources of support are also available to help test new 911 features. TeleCommunications Systems Inc., an Annapolis, Md., provider of secure mobile systems, opened an Interoperability Lab in April to help developers of call-handling systems test new next-generation 911 applications before they are deployed by emergency call centers.
Posted on May 21, 2014 at 12:41 PM
A mechanical arm, developed as a Small Business Innovation Research project, will help reduce injuries to government employees who are responsible for testing thousands of rounds of ammunition weekly.
The Virtual Shooter’s mechanical arm and hand replicate major human bone and muscular structures during the firing process and should spare human shooters from stress injuries and chronic nerve and joint pain.
The Department of Homeland Security’s Armory Operations Branch tests more than 200,000 rounds of ammunition and a variety of handguns annually before they are approved for use in the field.
“This repetitive firing takes a toll on the shooters and results in stressed joints, debilitating pain, and other physical injuries. The Virtual Shooter will go a long way in reducing, if not eliminating, those injuries,” explained John Price, program manager of the Science & Technology Directorate’s First Responder Group.
The Virtual Shooter project was demonstrated in March 2014 at the Immigration and Customs Enforcement armory in Altoona, Pa., where agents tested multiple weapons and ammunition.
Radiance Technologies, the commercial partner that developed the prototypes, demonstrated single- and double-armed models that fired multiple handgun and ammunition types.
Over the next year, Radiance Technologies will develop and deliver the second and final prototype. The model will then be available commercially for government and industry use.
Posted on May 20, 2014 at 9:21 AM
Government website managers looking to get a better handle on their sites’ traffic and system performance have a new resource. Sarah Kaczmarek released a second edition of her Google Analytics for Government training manual.
As the digital communications manager for the Government Accountability Office, Kaczmarek develops and manages all digital communication projects for the 3,000-person federal agency. Her manual includes chapters on getting started, interpreting core reports, setting up conversion goals, customization, and a glossary of the terms used in Google Analytics.
According to the blog post announcing the manual, the second edition has gone through an extensive rewrite, based on Kaczmarek’s own experience and insights from people who have shared their stories with her. Some of the new sections include audience demographics and interests reports using real-time data, as well as campaign tracking. There’s also information on setting up accounts, managing users and standard reporting.
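Campaign tracking, one of the new topics, works by tagging inbound links with UTM parameters that Google Analytics recognizes and attributes to a campaign in its reports. A minimal sketch of building such a link (the base URL and campaign names here are invented for illustration):

```python
from urllib.parse import urlencode

def campaign_url(base_url, source, medium, campaign):
    """Append Google Analytics UTM campaign parameters to a link so
    visits through it are attributed to the campaign in GA reports."""
    params = urlencode({
        "utm_source": source,    # where the link appears, e.g. a newsletter
        "utm_medium": medium,    # the channel, e.g. email or social
        "utm_campaign": campaign # the specific campaign name
    })
    return f"{base_url}?{params}"

print(campaign_url("https://example.gov/reports", "newsletter", "email", "spring-release"))
# → https://example.gov/reports?utm_source=newsletter&utm_medium=email&utm_campaign=spring-release
```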
For website managers seeking more information, DigitalGov has links to a series of webinars showing how to create meaningful metrics from Google Analytics as well as using templates for reporting metrics on a weekly, quarterly and annual basis.
Posted on May 19, 2014 at 9:21 AM
FOSE's three-day conference and expo addresses the technology and management priorities and issues for government across cloud, cybersecurity, mobile, big data and more. Here are some of the more newsworthy announcements from among the keynotes, panels and conference sessions.
Cobert tees up IT management plans
Even as the federal government seeks to streamline IT acquisition to include more technology startups, a plan is afoot to capture existing knowledge about how to best use Federal Acquisition Regulation guidelines to support agile procurement, said Beth Cobert, deputy director for management at the Office of Management and Budget, during a keynote speech May 13 at 1105 Media's FOSE conference.
Some specifics include the development of a "digital services playbook" of best practices in IT procurement, design and deployment. The goal is to identify best practices and share them across the government.
Those lessons will be reinforced by the deployment of two specialized teams: the digital services team at the U.S. CIO's office, which is seeking funding to help agencies with high-profile IT projects, and the 18F team at the General Services Administration, which helps agencies build websites and other public-facing federal IT projects.
BYOD coming soon to NASA
NASA's bring-your-own-device policy is expected to be approved within weeks, said John Sprague, the agency's enterprise applications service executive.
Although NASA is a relative latecomer to the BYOD scene, other agencies with stringent security protocols, such as the Defense Department, still have not taken the leap.
Sprague said many NASA employees were already bringing their personal mobile devices to work, which created a need for officials to codify proper use.
"There was no previous BYOD policy," Sprague said. "There are telework agreements and things like that, but nothing that really touched on people bringing in their devices. People were just doing it."
Show, don't tell
Open source is no longer the novelty it was just a few years ago in government, but that doesn't mean agencies have shed all their doubts and hesitations. The solution, according to advocates at one FOSE session, is to just do it.
An Interior Department employee asked the panelists, "What do you do when your agency is just hell bent on using [commercial off-the-shelf] software?" He said his superiors seem hostile to the very idea of open-source solutions, even after he and his team produced a cost/benefit analysis comparing in-house development to COTS integration.
"Forget the cost/benefit analysis," said panelist Erie Meyer, an aide to U.S. Chief Technology Officer Todd Park. "The only way to move these conversations forward is to build what you're talking about."
Matthew Burton, the Consumer Financial Protection Bureau's former deputy CIO, agreed. "Mock something up," he said. "Pull up PowerPoint, and draw some boxes." People don't understand, really understand, a project until they see it. "And they don't need to see something fully functional," he added.
The panelists agreed that there is still fear, uncertainty and doubt about open-source solutions, but "sometimes these people don't disagree with you," Meyer said. "They literally have no idea what you're talking about."
NIST framework paying dividends
The White House is seeing payoff in the form of more secure supply chains because some financial-sector firms are implementing President Barack Obama's 2013 cybersecurity executive order, a top aide said.
"One of the areas that we've seen companies already really start to use the [cybersecurity] framework is in vendor management," said Ari Schwartz, a cybersecurity adviser on the National Security Council. The companies have mostly been in the financial sector, he added.
The National Institute of Standards and Technology released an initial framework for implementing the executive order in February. The document is a voluntary guideline by which operators of critical infrastructure, for example, can assess their cybersecurity posture and set goals for improving it.
"The key to the cybersecurity framework is it allows a baseline across different sectors," Schwartz said. "So if you can start to audit different sectors using the same framework, you can come up with a kind of baseline that works and that gives information to the CIO and gives information to boards."
He said a new marketplace was sprouting up for products that incorporate cybersecurity standards delineated by the NIST framework.
"I wouldn't say that we have seen it widespread yet, but we have heard…anecdotally that some sectors have really taken this on as an important goal," Schwartz said in reply to a question from FCW.
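Vendor management with the framework comes down to assessing suppliers against a common baseline. The framework's five core functions (Identify, Protect, Detect, Respond, Recover) are real; the vendors, scoring scale and numbers below are invented to illustrate how a shared baseline makes suppliers comparable:

```python
# The five core functions come from NIST's Cybersecurity Framework v1.0;
# the vendors and 1-4 maturity scores below are invented for illustration.
FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]

def baseline_score(assessment):
    """Average a vendor's maturity score across the five core functions,
    yielding one comparable number per supplier."""
    return sum(assessment[f] for f in FUNCTIONS) / len(FUNCTIONS)

vendors = {
    "Vendor A": {"Identify": 3, "Protect": 4, "Detect": 2, "Respond": 3, "Recover": 3},
    "Vendor B": {"Identify": 2, "Protect": 2, "Detect": 1, "Respond": 2, "Recover": 2},
}
for name, assessment in vendors.items():
    print(name, baseline_score(assessment))  # Vendor A 3.0, Vendor B 1.8
```

Because every vendor is scored against the same functions, the results can be rolled up for CIOs and boards the way Schwartz describes, regardless of sector.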
Posted on May 14, 2014 at 9:21 AM