Pulse



U.S. organizations say a third of their data is bad

Agencies are relying on data aggregation and analytics to enhance citizen services and understand social, scientific and financial trends.  Given the meteoric rise in the uses of data aggregation, as well as a growing reliance on its methods, data accuracy is paramount.

Many organizations struggle with data inaccuracy despite having an established data quality strategy. In a startling increase from last year, the 1,200 respondents to a global study believe 26 percent of their data is inaccurate; U.S. respondents put the figure at 32 percent.

The Experian Data Quality study noted three common data quality errors: incomplete or missing data, outdated information and inaccurate data. Most organizations cited duplicate data as a contributor to overall inaccuracies, while human error is believed to be the biggest factor behind the bad data. Lack of automation – and a consequent dependence on manual data input – has also contributed to the problem, the study suggested.

One way to address these concerns is to adopt data auditing software, Experian suggested, noting that only 24 percent of the study’s respondents use such software. Organizations that do not deploy proactive error-detection software not only waste resources and damage productivity, but they also may be unable to derive accurate insights from their data.

Besides auditing technology, organizations can use data profiling or matching and linking technology to detect errors.
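
As a rough illustration only – not a description of Experian’s or any vendor’s product – a basic data-profiling pass might flag the error types the study names: missing fields, stale records and duplicates. The column names ("email", "last_updated") and the 18-month staleness cutoff in the sketch below are hypothetical placeholders.

    # Minimal data-profiling sketch: flag missing, outdated and duplicate records.
    # Column names and the ~18-month staleness cutoff are illustrative assumptions.
    from datetime import datetime, timedelta

    import pandas as pd

    def profile(df: pd.DataFrame) -> dict:
        cutoff = datetime.now() - timedelta(days=548)  # roughly 18 months
        return {
            # Incomplete or missing data: empty cells per column.
            "missing_by_column": df.isna().sum().to_dict(),
            # Outdated information: records not updated since the cutoff.
            "stale_records": int((pd.to_datetime(df["last_updated"]) < cutoff).sum()),
            # Duplicate data: repeated values in the matching key.
            "duplicate_emails": int(df.duplicated(subset=["email"], keep="first").sum()),
        }

    if __name__ == "__main__":
        sample = pd.DataFrame({
            "email": ["a@example.com", "a@example.com", None],
            "last_updated": ["2013-05-01", "2015-01-15", "2014-11-30"],
        })
        print(profile(sample))

A report like the dictionary returned above is the kind of output that auditing or profiling tools generate automatically, and it is a starting point for the matching and linking step that merges duplicate records.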

To make improvements, 89 percent of U.S. organizations plan to invest in some type of data management solution, Experian said, warning that without a coherent data management strategy, these types of errors will continue to increase.

Posted on Feb 03, 2015 at 2:05 PM


USGS releases open-source groundwater toolkit

Because the nation relies on groundwater for its drinking water, agriculture and industry, a robust monitoring network is needed to track water quality as well as contaminated wells and fluctuating water levels.

The U.S. Geological Survey recently introduced a new open-source Groundwater Toolbox that estimates base flow (the groundwater-discharge component of streamflow), surface runoff and groundwater recharge from streamflow data.

The geographic information system-based toolbox brings together various analytical methods used by USGS and the Bureau of Reclamation and automatically pulls data from more than 26,000 streamgage sites in the National Water Information System.

The GW Toolbox runs on any Microsoft Windows-compatible platform. A customizable interface includes data analysis programs and methods for estimating groundwater recharge, and users can glean water availability and hydrologic trends related to changes in the environment.
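
To give a flavor of what the toolbox automates, the sketch below pulls daily discharge for a single streamgage from the public NWIS daily-values web service and applies a crude sliding-minimum base-flow separation. The site number, date range and 30-day window are arbitrary examples, the JSON parsing assumes the standard WaterML-JSON layout, and the separation method is a simplification – not the USGS algorithms bundled with the GW Toolbox.

    # Illustrative only: fetch daily streamflow from NWIS and estimate base flow
    # with a crude sliding-minimum filter. This is NOT the GW Toolbox's method.
    import json
    import urllib.request

    SITE = "01646500"  # example gage: Potomac River near Washington, D.C.
    URL = ("https://waterservices.usgs.gov/nwis/dv/?format=json"
           f"&sites={SITE}&parameterCd=00060"  # 00060 = discharge, cubic feet per second
           "&startDT=2014-01-01&endDT=2014-12-31")

    def fetch_daily_discharge(url: str) -> list[float]:
        """Return daily mean discharge values, assuming the WaterML-JSON layout."""
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        points = data["value"]["timeSeries"][0]["values"][0]["value"]
        return [float(p["value"]) for p in points]

    def baseflow_sliding_min(q: list[float], window: int = 30) -> list[float]:
        """Crude base-flow estimate: minimum discharge within a centered window."""
        half = window // 2
        return [min(q[max(0, i - half): i + half + 1]) for i in range(len(q))]

    if __name__ == "__main__":
        discharge = fetch_daily_discharge(URL)
        base = baseflow_sliding_min(discharge)
        print(f"{len(discharge)} days; mean flow {sum(discharge)/len(discharge):.0f} cfs; "
              f"mean base flow {sum(base)/len(base):.0f} cfs")

The GW Toolbox performs this kind of retrieval and separation through its GIS interface using established USGS methods, so users do not have to script the NWIS calls themselves.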

The toolbox is free and available to the public. Engineers, academic researchers and government agencies can use the open-source GW Toolbox for independent assessments and research.

Posted on Feb 02, 2015 at 9:11 AM


NIST issues final guidance for mobile app security

Today’s mobile-enabled workers have access to a variety of apps that are designed to improve productivity, but an employee who downloads an unsafe app may unwittingly expose an organization’s computer network to security and privacy risks.

The National Institute of Standards and Technology’s Vetting the Security of Mobile Applications (SP 800-163) aims to help organizations assess the security and privacy risks associated with mobile apps, whether developed in-house or downloaded from mobile app marketplaces.

It is the final version of the Technical Considerations for Vetting 3rd Party Mobile Applications draft, which was published for comment in August 2014.

The guide offers plans for implementing the vetting process, considerations for developing app security requirements and descriptions of the types of app vulnerabilities and the testing methods used to detect them. The document also provides guidance for determining whether an app is acceptable for an organization to use.
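
As one small, hypothetical example of the kind of automated check a vetting process might include – not a procedure prescribed by SP 800-163 – the sketch below scans a decoded Android manifest (for example, one extracted with a tool such as apktool) for permissions an organization’s policy treats as high-risk.

    # Hypothetical vetting check: flag high-risk permissions declared in a decoded
    # AndroidManifest.xml. The risk list is an example policy, not NIST guidance.
    import sys
    import xml.etree.ElementTree as ET

    ANDROID_NS = "{http://schemas.android.com/apk/res/android}"
    HIGH_RISK = {
        "android.permission.READ_SMS",
        "android.permission.RECORD_AUDIO",
        "android.permission.READ_CONTACTS",
        "android.permission.ACCESS_FINE_LOCATION",
    }

    def risky_permissions(manifest_path: str) -> set[str]:
        root = ET.parse(manifest_path).getroot()
        declared = {
            elem.get(f"{ANDROID_NS}name", "")
            for elem in root.iter("uses-permission")
        }
        return declared & HIGH_RISK

    if __name__ == "__main__":
        flagged = risky_permissions(sys.argv[1])
        print("High-risk permissions:", ", ".join(sorted(flagged)) or "none")

Static checks like this are only one layer; the guide also covers dynamic testing and organizational criteria for deciding whether an app is acceptable to deploy.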

The publication is a guide for developers seeking to understand the types of vulnerabilities that can be introduced during an app’s software development cycle.

Posted on Jan 27, 2015 at 1:02 PM


NIST retires security standards

The National Institute of Standards and Technology is proposing to withdraw six Federal Information Processing Standards from its roster because of their obsolescence or lack of support from developers, according to a Jan. 16 notice in the Federal Register. The FIPS include:

FIPS 188, Standard Security Label for Information Transfer. This standard is now maintained, updated and kept current by the National Archives and Records Administration.

FIPS 191, Guideline for the Analysis of Local-Area Network Security. This standard is being withdrawn because new technologies, techniques and threats to computer networks have made the standard obsolete.

FIPS 185, Escrowed Encryption Standard. Released during the Clinton administration, this standard was based on a secret encryption algorithm called Skipjack that the National Security Agency began developing in 1985. Its goal was to hardwire an encryption standard into computers, communications networks and devices on a so-called Clipper chip that would be accessible to law enforcement agencies conducting lawful electronic surveillance. The system never caught on in the private sector and, according to the Federal Register notice, "is no longer approved to protect sensitive government information."

FIPS 190, Guideline for the Use of Advanced Authentication Technology Alternatives; FIPS 196, Entity Authentication using Public Key Cryptography; and FIPS 181, Automated Password Generator. These FIPS referenced withdrawn cryptographic standards, and newer guidance has been developed based on modern technologies.

Withdrawal means the FIPS would no longer be part of a subscription service provided by the National Technical Information Service, and federal agencies would no longer be required to comply with them. NIST said it will continue to provide relevant information on standards and guidelines through electronic dissemination methods.

Comments on the proposed withdrawal of the FIPS should be sent to fipswithdrawal@nist.gov by March 2, 2015.

Posted on Jan 20, 2015 at 10:06 AM



Harrisburg U builds cybersecurity center for state, local gov

The Harrisburg University of Science and Technology’s Government Technology Institute has established a new center focused exclusively on safeguarding government data and systems from unauthorized access.

The Security Center of Excellence (SCoE) is believed to be the first such center focused solely on securing data entrusted to state, county and local governments, the university said.

Cisco, Deloitte Consulting, IBM, Symantec and Unisys have all agreed to sponsor the SCoE and bring their global experts to HU to help GTI showcase the benefits of collaboration among cybersecurity experts from government, academia and the private sector.

“Our goal is to make this a national best practice for training and supporting those within government responsible for safeguarding sensitive data,” said Barb Shelton and Charlie Gerhards, co-directors of the GTI. 

Eric Darr, president of HU, said, “These are some of the best security companies in the world and they will clearly help this Center to achieve its goal and in turn help Pennsylvania’s governments safeguard citizen data.”

“It also is a tremendous opportunity for our faculty and students to work closely with government IT leaders and distinguished experts from the technology companies that have agreed to help Pennsylvania continually improve cybersecurity,” he added.

The educational program for security specialists in government is planned to begin in spring 2015 and will be followed by seminars, technology testing and collaboration among multiple levels of government.

Posted on Jan 15, 2015 at 9:53 AM