The International City/County Management Association (ICMA) and analytics software firm SAS have launched a platform that lets local governments compare program performance data with that of other jurisdictions using the tool.
The cloud-based ICMA Insights application runs comparisons on metrics gathered from police, fire, trash collection, permitting and other agencies. Insights analyzes data from 950 different measures across seven categories, according to ICMA.
As local government agencies add data to the system, they can scan similar jurisdictions, compare program outcomes and track areas for improvement.
ICMA Executive Director Bob O’Neill said the organization’s goal “is to grow the ICMA Insights database to the point at which participants can match their performance against hundreds of other communities.”
“They can then base evaluations not only on their own historical performance, but also on that of a universe of similar communities,” he said. So far, more than 100 local governments are using the system, according to ICMA.
Insights is organized around various processes local governments follow in analyzing service performance, including measurement, comparison, exploration, analysis and transformation. Analytics offered range from basic summary statistics to more customizable graphics, scorecards and forecasting dashboards, according to ICMA.
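As a rough illustration of that kind of peer comparison – not ICMA's actual schema or API – the sketch below benchmarks one jurisdiction's measure against similarly sized peers, assuming a simple CSV of per-jurisdiction measures with hypothetical column and jurisdiction names.

```python
# Minimal sketch of the kind of peer benchmarking the article describes.
# The CSV layout, column names and jurisdiction names are hypothetical,
# not ICMA Insights' actual schema.
import pandas as pd

# One row per jurisdiction per measure, e.g.:
# jurisdiction,population,measure,value
# Springfield,150000,fire_response_minutes,6.2
df = pd.read_csv("jurisdiction_measures.csv")

def benchmark(df, jurisdiction, measure, pop_band=0.25):
    """Compare one jurisdiction's measure against peers of similar population."""
    own = df[(df["jurisdiction"] == jurisdiction) & (df["measure"] == measure)]
    if own.empty:
        raise ValueError(f"no data for {jurisdiction} / {measure}")
    pop, value = own["population"].iloc[0], own["value"].iloc[0]

    # Peers: other jurisdictions reporting the same measure whose population
    # falls within +/- 25 percent of this jurisdiction's population.
    peers = df[(df["measure"] == measure)
               & (df["jurisdiction"] != jurisdiction)
               & (df["population"].between(pop * (1 - pop_band), pop * (1 + pop_band)))]

    peer_median = peers["value"].median()
    percentile = (peers["value"] < value).mean() * 100  # share of peers below this value
    return value, peer_median, percentile

value, median, pct = benchmark(df, "Springfield", "fire_response_minutes")
print(f"own value: {value}, peer median: {median}, percentile among peers: {pct:.0f}")
```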
“Comparative performance analytics on the scale of what ICMA envisions would be a major asset—not just to a local government looking to benchmark its performance against a similar community but to the cause of improving service delivery to citizens,” said Tony Gardner, a former county manager of Arlington, Va., and member of the leadership development faculty at the Weldon Cooper Center for Public Service at the University of Virginia.
User licenses for the Insights platform are available and are based on the population of the jurisdiction.
Posted on Feb 09, 2015 at 12:28 PM
FCW announced the winners of its 26th annual Federal 100 awards, which are presented to government, industry and academic leaders who have gone above and beyond to shape how the government manages, develops and acquires IT.
A sister publication to GCN, FCW compiles the list of winners from nominations received from readers and judges.
Profiles of the winners will be published in the March 30 issue of FCW and on FCW.com, and they will be honored at the March 26 Federal 100 awards gala.
Posted on Feb 06, 2015 at 10:38 AM
Information from sensors, video, satellites, genetic codes, social media and web tracking is contributing to a big data soup that web-connected devices are serving up. The vast amount of data now available, not to mention the storage and analytics technology that makes analysis of that data possible, is making “big data” look like the answer to every question.
But what is big data really?
According to the National Institute of Standards and Technology, big data consists of extensive datasets that require a scalable architecture for efficient storage, manipulation and analysis. Commonly known as the “Vs” of big data, the characteristics that call for these new architectures include the following (a rough measurement sketch follows the list):
Volume – the size of the dataset at rest, referring to both the data object size and number of data objects. Although big data doesn’t specify a particular data quantity, the term is often used in discussing petabytes and exabytes of data.
Velocity – the data in motion, or rate of flow, referring to both the acquisition rate and the update rate of data from real-time sensors, streaming video or financial systems.
Variety – data at rest from multiple repositories, domains or types, from unstructured text or images to highly structured databases.
Variability – changes in the data’s rate of flow, as when applications generate a surge in the amount of data arriving in a given period.
Veracity – the completeness and accuracy of the data sources, the provenance of the data, its integrity and its governance.
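As a rough, hypothetical illustration of how a few of these characteristics might be gauged on an incoming feed – the file name and field names below are invented for the example – volume, velocity and variety can each be estimated in a few lines of Python:

```python
# Rough illustration of estimating three of the "Vs" on a hypothetical
# newline-delimited JSON feed; the file name and fields are invented.
import json
import os
import pandas as pd

path = "sensor_feed.ndjson"
with open(path) as fh:
    records = [json.loads(line) for line in fh]
df = pd.DataFrame(records)

# Volume: size of the data at rest and the number of data objects.
volume_bytes = os.path.getsize(path)
volume_records = len(df)

# Velocity: acquisition rate, estimated from the span of record timestamps.
ts = pd.to_datetime(df["timestamp"])
span_seconds = max((ts.max() - ts.min()).total_seconds(), 1)
velocity = volume_records / span_seconds  # records per second

# Variety: how many distinct source types contribute to the feed.
variety = df["source_type"].nunique()

print(f"{volume_bytes} bytes, {volume_records} records, "
      f"{velocity:.1f} records/sec, {variety} source types")
```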
Posted on Feb 06, 2015 at 12:43 PM
The Food Safety and Inspection Service wants to partner with private-sector firms to disseminate English- and Spanish-language versions of its publications using a variety of web- and software-based publishing systems.
The agency’s Office of Public Affairs and Consumer Education (OPACE) noted in a request for information that while it is content with its current media software, it is interested in a commercial off-the-shelf dissemination tool for print and online publications.
Each year, OPACE distributes press releases to more than 100,000 media sources and contacts in metro areas across the country. OPACE has also significantly expanded its use of social media and other web-based channels.
To support its growing online publishing requirements, the food agency is looking for a web-based media public relations software system that is easily accessible through a secure desktop interface. The system should be able to accommodate multiple users to distribute, track, evaluate and report messaging in an efficient, 508-compliant manner.
Services and capabilities OPACE is seeking include press release distribution, a media sources and contacts database, database management, tracking of media calls, news monitoring, evaluation of impressions, training and support.
OPACE also requires effective methods of analyzing and evaluating system performance.
Eventually, OPACE envisions a system or a combination of systems to provide user-friendly publication distribution and analysis services. Interested parties have until Feb. 18 to respond.
Posted on Feb 05, 2015 at 10:11 AM
Agencies are relying on data aggregation and analytics to enhance citizen services and understand social, scientific and financial trends. Given the meteoric rise in the uses of data aggregation, as well as a growing reliance on its methods, data accuracy is paramount.
Many organizations struggle with data inaccuracy despite having an established data quality strategy. In a startling increase from last year, the 1,200 respondents to a global study believe 26 percent of their data is inaccurate; U.S. respondents believe 32 percent of their data is inaccurate.
The Experian Data Quality study noted three common data quality errors: incomplete or missing data, outdated information and inaccurate data. Most organizations cited duplicate data as a contributor to overall inaccuracies, while human error is believed to be the biggest factor in data spoilage. A lack of automation – and a consequent dependence on manual data input – has also contributed to the problem, the study suggested.
One way to address these concerns is to adopt data audit software, Experian suggested, noting that only 24 percent of the study’s respondents use such software. Organizations that do not deploy proactive error-detection software not only waste resources and damage productivity, but they may also be unable to derive accurate insights from their data.
Besides auditing technology, organizations can use data profiling or matching and linking technology to detect errors.
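A minimal sketch of what such profiling, duplicate detection and simple matching might look like on a hypothetical contact table – the column names here are illustrative, not drawn from the Experian study:

```python
# Minimal sketch of data profiling and duplicate detection on a hypothetical
# contact table; column names are illustrative, not from the Experian study.
import pandas as pd

contacts = pd.read_csv("contacts.csv")  # e.g. columns: name, email, phone, city

# Profiling: share of incomplete or missing data per column.
missing = contacts.isna().mean().sort_values(ascending=False)
print("Share of missing values per column:\n", missing)

# Exact duplicates: rows repeated verbatim.
exact_dupes = contacts[contacts.duplicated(keep=False)]
print(f"{len(exact_dupes)} rows are exact duplicates")

# Simple matching/linking: normalize the email field, then flag records
# that likely refer to the same contact.
contacts["email_norm"] = contacts["email"].str.strip().str.lower()
likely_dupes = contacts[contacts.duplicated(subset="email_norm", keep=False)]
print(f"{len(likely_dupes)} rows share a normalized email address")
```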
In order to make improvements, 89 percent of U.S. organizations will seek to invest in some type of data management solution, Experian said, warning that without a coherent data management strategy, these types of errors will continue to increase.
Posted on Feb 03, 2015 at 2:05 PM