With tax day just around the corner, the IRS has added new features to its IRS2Go mobile application, which provides users with tax tips and tools to navigate key IRS services.
Taxpayers can now check the status of their federal income tax refund using their Apple or Android mobile device. They can access the information by entering their Social Security number, filing status and the amount of their anticipated refund from their 2013 tax return.
Users who filed a return electronically can check their refund status within 24 hours after the return is received. The update also allows users to request their tax return or account transcript, which they will receive in the mail.
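The three inputs the lookup requires can be sketched as a simple validation step. This is a hypothetical sketch for illustration only; the function name, the status labels and the SSN format check are assumptions, not the IRS2Go implementation:

```python
import re

def validate_refund_query(ssn: str, filing_status: str, amount: float) -> bool:
    """Check that a refund-status query carries the three required fields:
    Social Security number, filing status and anticipated refund amount."""
    statuses = {"single", "married_joint", "married_separate",
                "head_of_household", "qualifying_widow"}
    if not re.fullmatch(r"\d{3}-\d{2}-\d{4}", ssn):
        return False                 # SSN must be in NNN-NN-NNNN form
    if filing_status not in statuses:
        return False                 # must be a recognized filing status
    return amount > 0                # anticipated refund from the tax return

print(validate_refund_query("123-45-6789", "single", 1200.00))  # True
```

A query missing any of the three fields, or carrying a malformed SSN, would be rejected before any lookup is attempted.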
In addition to IRS2Go, the IRS is moving rapidly to adopt electronic and social media services to assist taxpayers and lower the overhead cost of traditional customer service. Currently, the IRS has channels up and running for taxpayer tips and information on YouTube, Twitter and Facebook.
The federal government has been successful in getting a return on these investments in IRS electronic services, according to a recent report by Accenture. Of the 147.6 million tax returns filed as of September 2012, 113.8 million (77 percent) were filed electronically, it said.
The IRS2Go smartphone app in particular was a key reason for the agency’s success with e-services, according to Accenture, which put the value of potentially fraudulent refunds avoided by using the app at $4.2 billion.
The IRS first launched IRS2Go in January 2011. Within two months, 110,000 iPhone users and 135,000 Android users had downloaded the app. Total cost of the in-house development: $50,000, spent mostly for security features.
Posted on Mar 27, 2014 at 11:20 AM
Police and military dogs face many of the same dangers as their human partners. Many of these dogs, also known as K9s, fall victim to heat-related conditions such as heat stroke, which can be fatal.
To combat K9 casualties, Massachusetts, Arizona and Texas law enforcement units have invested in a wireless monitoring system to convey the dog’s internal body temperature to its human partner. Data Sciences International and Blueforce Development Corp. have partnered to develop the new system.
The system continuously measures the K9’s body temperature using a small surgically implanted sensor. The sensor then relays the temperature to a receiver attached to the dog’s protective gear, where it can be monitored by the human partners. The receiver relays the information to the K9 officer's smartphone and will instantly alert him if the K9's body temperature exceeds safe health limits.
"Our active involvement in public safety revealed that officers have serious K9 safety needs," said Blueforce CEO Mike Helfrich. "We expect this solution to help save K9 lives by communicating real-time temperature."
The telemetry is communicated to anyone subscribed to the animal through the Blueforce Tactical mobile application for Android or iOS, according to a Blueforce blog post. Those who are subscribed receive a notification when the dog’s body temperature exceeds or falls below prescribed values.
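The alert logic described above can be sketched as a simple threshold check. The safe-range bounds and the `notify` hook below are illustrative assumptions, not Blueforce's actual thresholds or API:

```python
# Assumed bounds for a dog's safe body temperature, for illustration only.
SAFE_LOW_F = 99.5
SAFE_HIGH_F = 104.0

def check_reading(temp_f: float, notify=print) -> bool:
    """Notify subscribers when a reading leaves the prescribed range.
    Returns True when an alert was sent, False otherwise."""
    if temp_f > SAFE_HIGH_F:
        notify(f"ALERT: K9 body temperature high: {temp_f:.1f} F")
        return True
    if temp_f < SAFE_LOW_F:
        notify(f"ALERT: K9 body temperature low: {temp_f:.1f} F")
        return True
    return False  # reading within prescribed values; no notification

check_reading(105.2)  # triggers the high-temperature alert
```

In the deployed system each sensor reading would pass through a check like this on its way from the receiver to every subscribed smartphone.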
Posted on Mar 26, 2014 at 11:17 AM
Scientists at the Naval Research Laboratory (NRL) have spent the past two years helping the Defense Threat Reduction Agency (DTRA) better predict pending epidemics and regional disease outbreaks.
The objective is at the heart of two linked programs at DTRA. First, the 24 Month Challenge is a multi-agency project to identify and develop diagnostic devices needed to make biosurveillance analytics a reality. Meanwhile, a parallel DTRA program is developing a cloud database that analyzes the incoming data, according to the NRL announcement.
In the first phase of the 24 Month Challenge, the NRL team solicited proposals for diagnostic technologies that met two core requirements: the ability to differentiate the cause of a febrile illness and the ability to send the diagnostic data to the cloud database. Evaluations over the past year whittled the original list down to four technologies, enabling NRL to work with three companies to develop prototypes that directly address the program's requirements.
"NRL has developed a relationship with two companies, InBIOS International Inc. and ChemBio Diagnostics Systems Inc., that make lateral flow immunoassay strips or LFIs. For reference, the best known example of a LFI is the home pregnancy test," said the NRL principal investigator Shawn Mulvaney. "We then challenged these companies to make their new LFIs capable of detecting the causative agents for malaria, dengue fever, melioidosis, and the plague using only a blood sample obtained from a finger prick. These are some of the most concerning diseases found in theater, particularly for our troops stationed in tropical climates."
NRL has also partnered with Fio Corp. to use the Deki Reader for test analysis and communications. The Deki Reader is a portable unit built around an Android smartphone. It can use its camera feature to take pictures of every test result, and the software can guide the user, analyze the outcomes, and upload the data over the cellular network.
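The capture, analyze and upload flow can be sketched roughly as follows. The function names, the intensity heuristic and the result schema are assumptions for illustration, not Fio Corp.'s software:

```python
from dataclasses import dataclass

@dataclass
class LfiResult:
    test_id: str
    disease: str
    positive: bool

def analyze_strip(image_pixels: list, test_id: str, disease: str) -> LfiResult:
    """Stand-in for image analysis: call the strip positive when the
    darkest pixel in the test-line region falls below a fixed threshold."""
    test_line_intensity = min(image_pixels)  # darkest pixel as a crude proxy
    return LfiResult(test_id, disease, positive=test_line_intensity < 100)

def upload(result: LfiResult, queue: list) -> None:
    """Stand-in for the cellular upload to the cloud database."""
    queue.append(result)

cloud_db = []
# A photographed strip with a dark test line (intensity 60) reads positive.
upload(analyze_strip([240, 230, 60, 225], "T-001", "malaria"), cloud_db)
```

The real reader additionally guides the user through the test procedure before analysis, which is what makes the device usable in the field by non-specialists.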
"This is a clever solution to multiple challenges," said NRL’s Mulvaney.
Based on the strong analytical data obtained during NRL's testing, the three technologies are set for field trials in South America, Africa and Southeast Asia.
Posted on Mar 25, 2014 at 9:56 AM
Virginia Tech has created a new high-performance data network exchange in Atlanta, Ga., which it says is dramatically improving access to national and international research networks. The facility, named the Mid-Atlantic Research and Education Exchange Atlanta, will also streamline access to major research centers throughout the southeastern United States.
The new facility complements another data network exchange already operated by Virginia Tech in the Washington, D.C.-metropolitan area, which was formerly known as the NatCap Aggregation Facility.
Modern science and engineering research depends on specialized high-performance networks, such as Internet2, to link distributed computing, storage, visualization, sensor and laboratory resources with research scientists and students collaborating over a global cyber-infrastructure.
Since 1998, Virginia Tech has operated statewide network and aggregation systems that link Virginia’s major research institutions to national research networks and provide a regional hub for data transfer.
In 2012, the primary network aggregation facility in the Washington, D.C.-area was rebuilt using the latest technology, raising transfer speeds to 100 gigabit/sec.
The establishment of a second data network exchange in Atlanta provides geographic diversity, backup connectivity, and direct peering connections with major research institutions such as Oak Ridge National Laboratory and the Centers for Disease Control, according to Virginia Tech.
The university partnered with the Georgia Institute of Technology to house the new facility and to establish the regional peering connections.
Both data exchange facilities are linked to Virginia Tech via high-speed fiber optic connections established through the university’s participation in the Broadband Technology Opportunities Program, which expanded the regional fiber network.
Having two connections, one going north and one going south, greatly enhances reliability and uptime, according to Virginia Tech; the university will remain connected even if a fiber is cut on one of the paths or a problem occurs at one of the facilities.
The university’s Network Infrastructure and Services unit designed both facilities and operates them under contract to the Mid-Atlantic Research Infrastructure Alliance. Executive Director William Dougherty says he considers the data exchange to be a critical program for the university.
“Providing the best possible access to national research networks is vital to our mission to enable the competitiveness of Virginia Tech research,” said Dougherty. “By allowing other regional universities to use these facilities, we create economies of scale and support Virginia Tech’s commitment to engagement and leadership.” Participating institutions all contribute to the cost of operating the data network exchange facilities.
Posted on Mar 25, 2014 at 10:04 AM
During the past decade information technologies have driven productivity gains that are essential to U.S. economic competitiveness, and computing systems now control a significant portion of the critical infrastructure.
As a result, tremendous public and commercial resources are devoted to ensuring that programs are correct, especially at scale. Yet, in spite of sizeable efforts by developers, software defects remain at the root of most system errors and security vulnerabilities.
To address the predicament, the Defense Advanced Research Projects Agency wants to advance the way software is built, debugged, verified, maintained and understood by combining principles of big data analytics with software analysis.
DARPA said its Mining and Understanding Software Enclaves (MUSE) program would open new ways to dramatically improve software correctness and help develop radically different approaches for automatically constructing and repairing complex software.
“Our goal is to apply the principles of big data analytics to identify and understand deep commonalities among the constantly evolving corpus of software drawn from the hundreds of billions of lines of open source code available today,” said Suresh Jagannathan, DARPA program manager, in the announcement.
“We’re aiming to treat programs—more precisely, facts about programs—as data, discovering new relationships (enclaves) among this ‘big code’ to build better, more robust software.”
Central to MUSE’s approach is the creation of a community infrastructure that would incorporate a continuously operating specification-mining engine, the agency said. This engine would use “deep program analyses and big data analytics to create a public database containing … inferences about salient properties, behaviors and vulnerabilities of software drawn from the hundreds of billions of lines of open source code available today.”
“The collective knowledge gleaned from this effort would facilitate new mechanisms for dramatically improving software reliability and help develop radically different approaches for automatically constructing and repairing complex software,” DARPA said in describing the project.
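The idea of treating facts about programs as data can be illustrated with a toy example that mines a small corpus of Python snippets for function calls that frequently co-occur. The corpus and the co-occurrence heuristic below are stand-ins for illustration, not the MUSE engine:

```python
import ast
from collections import Counter
from itertools import combinations

# A tiny stand-in corpus; MUSE would draw on billions of lines of open source.
corpus = [
    "f = open('a')\ndata = f.read()\nf.close()",
    "f = open('b')\nf.write('x')\nf.close()",
    "print('hello')",
]

def calls_in(src: str) -> set:
    """Collect the names of all functions and methods called in a snippet."""
    names = set()
    for node in ast.walk(ast.parse(src)):
        if isinstance(node, ast.Call):
            fn = node.func
            names.add(fn.attr if isinstance(fn, ast.Attribute) else fn.id)
    return names

# Count how often each pair of call names appears in the same snippet.
pair_counts = Counter()
for snippet in corpus:
    for pair in combinations(sorted(calls_in(snippet)), 2):
        pair_counts[pair] += 1

# ('close', 'open') co-occurs in 2 of 3 snippets: an inferred "open is
# paired with close" property, the kind of fact a mining engine would record.
print(pair_counts.most_common(1))  # [(('close', 'open'), 2)]
```

A real specification-mining engine would of course look at far richer facts than co-occurrence, such as call ordering, argument flows and error handling, but the shape of the computation, extracting program facts and aggregating them across a large corpus, is the same.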
Posted on Mar 24, 2014 at 9:20 AM