Pulse


Boston tackles gridlock with Waze data

Boston Mayor Martin Walsh is continuing to take steps to ease the city’s sometimes discouraging traffic conditions. Walsh announced a data-sharing partnership with the mobile traffic application Waze, which lets users see real-time traffic flows based on crowdsourced transportation information.

The city’s partnership is expected to help relieve traffic in two ways: more users will have access to information about road closures, and the Boston Traffic Management Center (TMC) will be able to use transportation data from Waze to better organize the city’s traffic flow, including coordinating traffic signals more accurately. Boston has already been using data from Waze to supplement information received from hundreds of intersection cameras to help coordinate traffic signals.

This spring, the city will pilot several new approaches, such as evaluating traffic signal prioritization and its effectiveness along key routes. The city receives aggregated traffic speed data from more than 400,000 Waze users in the Boston area, which will allow it to measure before-and-after impacts on traffic speeds along targeted corridors.
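A minimal sketch of what that before-and-after measurement might look like, using hypothetical corridor names, dates and speed figures (the announcement does not describe the structure of the city's actual Waze feed, so the fields below are assumptions for illustration only):

```python
from statistics import mean

# Hypothetical aggregated speed records; field names and values are invented
# for illustration and do not reflect the actual Waze data feed.
records = [
    {"corridor": "Massachusetts Ave", "date": "2015-03-01", "speed_mph": 14.2},
    {"corridor": "Massachusetts Ave", "date": "2015-04-15", "speed_mph": 17.8},
    {"corridor": "Commonwealth Ave",  "date": "2015-03-02", "speed_mph": 19.5},
    {"corridor": "Commonwealth Ave",  "date": "2015-04-16", "speed_mph": 21.1},
]

SIGNAL_CHANGE_DATE = "2015-04-01"  # assumed date a new signal-timing plan took effect

def before_after(records, corridor):
    """Mean speed on one corridor before and after the signal change."""
    before = [r["speed_mph"] for r in records
              if r["corridor"] == corridor and r["date"] < SIGNAL_CHANGE_DATE]
    after = [r["speed_mph"] for r in records
             if r["corridor"] == corridor and r["date"] >= SIGNAL_CHANGE_DATE]
    return mean(before), mean(after)

for corridor in ("Massachusetts Ave", "Commonwealth Ave"):
    b, a = before_after(records, corridor)
    print(f"{corridor}: {b:.1f} mph before, {a:.1f} mph after ({a - b:+.1f} mph)")
```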

The partnership with Waze follows the city’s earlier announcement of a partnership with the popular rideshare application Uber. The Uber deal called for the ridesharing company to provide the city with its quarterly trip logs, which include time stamps as well as pick-up and drop-off data, distance traveled during trips and the duration of trips. The city plans to use this data to help its transportation system run more smoothly.

“This partnership will help engineers in the TMC respond to traffic jams, accidents and road hazards quicker,” Boston’s Transportation Department Commissioner Gina Fiandaca said of the city’s new partnership with Waze. “And, looking forward, the Waze data will support us in implementing – and measuring the results of – new congestion management strategies.”  

Posted on Feb 17, 2015 at 1:14 PM


Facebook launches social media tool for cybersecurity pros

On the heels of President Obama’s announcement of a Cyber Threat Intelligence Center, Facebook also announced it was launching a framework for sharing cybersecurity information.

Facebook’s ThreatExchange is a social media framework that lets security professionals share threat information more easily, learn from each other’s discoveries and make their own systems safer, according to the platform’s website.

Mark Hammel, Facebook’s manager of threat infrastructure, told the Financial Times that ThreatExchange had been developed from a system that Facebook was already using internally to make it easier to catalog threats to the site in real time.

Facebook’s ThreatData framework imports information about cybersecurity threats on the Internet in arbitrary formats, stores it efficiently and makes it accessible for both real-time defensive systems and long-term analysis, the company said.
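As a rough illustration of the normalization problem such a framework addresses, the sketch below folds two differently formatted feeds into one common record type. The record structure, field names and feeds are invented for the example; they are not Facebook's actual ThreatData or ThreatExchange schema.

```python
from dataclasses import dataclass

@dataclass
class ThreatRecord:
    """One normalized indicator, regardless of which feed reported it."""
    indicator: str  # e.g., a malicious domain or a file hash
    kind: str       # "domain", "hash", ...
    source: str     # name of the reporting feed

def normalize(raw: dict, source: str) -> ThreatRecord:
    """Map a raw feed entry, whatever its field names, onto the common record."""
    if "domain" in raw:
        return ThreatRecord(raw["domain"], "domain", source)
    if "md5" in raw:
        return ThreatRecord(raw["md5"], "hash", source)
    raise ValueError(f"unrecognized entry from {source}: {raw}")

# Two feeds describing threats in different shapes.
feed_a = [{"domain": "malware.example.com"}]
feed_b = [{"md5": "d41d8cd98f00b204e9800998ecf8427e"}]

catalog = [normalize(r, "feed_a") for r in feed_a] + \
          [normalize(r, "feed_b") for r in feed_b]
for record in catalog:
    print(record)
```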

Early partners for ThreatExchange include Bitly, Dropbox, Facebook, Pinterest, Tumblr, Twitter and Yahoo.

Posted on Feb 12, 2015 at 7:35 AM


ICMA, SAS offer government performance analytics tool

The International City/County Management Association (ICMA) and analytics software firm SAS have launched a platform that allows local governments to compare program performance data with that of other jurisdictions using the tool.

The cloud-based ICMA Insights application runs comparisons on metrics gathered from police, fire, trash collection, permitting and other agencies. Insights analyzes data from 950 different measures across seven categories, according to ICMA.

As local government agencies add data to the system, they can scan similar jurisdictions, compare program outcomes and track areas for improvement.

ICMA Executive Director Bob O’Neill said the organization’s goal “is to grow the ICMA Insights database to the point at which participants can match their performance against hundreds of other communities.”

“They can then base evaluations not only on their own historical performance, but also that of a universe of similar communities,” he said. So far, more than 100 local governments are using the system, according to ICMA.

Insights is organized around various processes local governments follow in analyzing service performance, including measurement, comparison, exploration, analysis and transformation. Analytics offered range from basic summary statistics to more customizable graphics, scorecards and forecasting dashboards, according to ICMA.
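At the summary-statistics end of that range, the core operation is comparing one jurisdiction's measure against the distribution of its peers. A minimal sketch of that comparison, with a made-up measure and made-up figures rather than actual ICMA Insights data:

```python
from statistics import mean, median

# Hypothetical average fire-response times (minutes) for peer jurisdictions
# in the same population band; these are not real ICMA Insights figures.
peers = {"City A": 5.8, "City B": 6.4, "City C": 7.1, "City D": 6.9, "City E": 5.2}
ours = 6.7  # our jurisdiction's average response time

peer_values = list(peers.values())
rank = sum(v < ours for v in peer_values) + 1  # 1 = fastest in the group

print(f"Peer mean: {mean(peer_values):.1f} min, peer median: {median(peer_values):.1f} min")
print(f"Our average: {ours:.1f} min (rank {rank} of {len(peer_values) + 1})")
```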

“Comparative performance analytics on the scale of what ICMA envisions would be a major asset—not just to a local government looking to benchmark its performance against a similar community but to the cause of improving service delivery to citizens,” said Tony Gardner, a former county manager of Arlington, Va., and member of the leadership development faculty at the Weldon Cooper Center for Public Service at the University of Virginia.

User licenses for the Insights platform are available and are based on the population of the jurisdiction.

Posted on Feb 09, 2015 at 12:28 PM


Big data: How to know it when you see it

Information from sensors, video, satellites, genetic codes, social media and web tracking is contributing to a big data soup that web-connected devices are serving up. The vast amount of data now available, not to mention the storage and analytics technology that makes analysis of that data possible, is making “big data” look like the answer to every question.    

But what is big data really?

According to the National Institute of Standards and Technology, big data consists of highly extensive datasets that require a scalable architecture for efficient storage, manipulation, and analysis. Commonly known as the ‘Vs’ of big data, the characteristics of data that require these new architectures include:

Volume – the size of the dataset at rest, referring to both the data object size and number of data objects. Although big data doesn’t specify a particular data quantity, the term is often used in discussing petabytes and exabytes of data.

Velocity – the data in motion, or rate of flow, referring to both the acquisition rate and the update rate of data from real-time sensors, streaming video or financial systems.

Variety – data at rest from multiple repositories, domains or types, from unstructured text or images to highly structured databases.

Variability – the rate of change of the data from applications that generate a surge in the amount of data arriving in a given amount of time.

Veracity – the completeness and accuracy of the data sources, the provenance of the data, its integrity and its governance.
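One rough way to make those characteristics concrete is to profile an incoming batch of messages along a few of the Vs. The sketch below is purely illustrative; the message formats, window length and metrics are invented and not drawn from the NIST definition.

```python
import json

# Hypothetical mixed-format messages arriving over a 10-second window.
window_seconds = 10
messages = [
    '{"sensor": "loop-17", "speed_mph": 22.4}',  # structured JSON
    'loop-18,19.8',                              # bare CSV fragment
    '{"sensor": "cam-03", "frame": "AAAA"}',     # JSON carrying an encoded video frame
]

volume_bytes = sum(len(m.encode()) for m in messages)  # Volume: size of the data at rest
velocity = len(messages) / window_seconds              # Velocity: arrival rate (messages/sec)

def message_format(m: str) -> str:
    """Classify a message's format; the count of distinct formats hints at Variety."""
    try:
        json.loads(m)
        return "json"
    except ValueError:
        return "csv"

variety = {message_format(m) for m in messages}        # Variety: distinct formats seen

print(f"volume: {volume_bytes} bytes, velocity: {velocity:.1f} msg/s, formats: {variety}")
```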

Posted on Feb 06, 2015 at 12:43 PM


FCW announces Federal 100

FCW announced the winners of its 26th annual Federal 100 awards, which are presented to government, industry and academic leaders who have gone above and beyond to shape how the government manages, develops and acquires IT.

A sister publication to GCN, FCW compiles the list of winners from nominations received from readers and judges.

Profiles of the winners will be published in the March 30 issue of FCW and on FCW.com, and they will be honored at the March 26 Federal 100 awards gala.

Posted on Feb 06, 2015 at 10:38 AM