Few commuters stuck in big city traffic haven’t thought about sending a note to city hall complaining about slow traffic lights, especially during rush hour. If only the city could tighten up traffic signal synchronization, that would speed things up. So most people would think, anyway.
Well, now there is real-world evidence. The city of Los Angeles finished work just last month on the Automated Traffic Surveillance and Control (ATSAC) system, a $400 million effort to computerize its entire traffic management system.
First started 30 years ago to help improve traffic around L.A. in preparation for the 1984 Olympic Games, the system today controls the synchronization of every one of the city’s 4,500 traffic lights, which handle the flow of 7 million commuters each day.
The signals are regulated through magnetic sensors planted at every intersection, which in turn are connected to a control center in downtown L.A. The system analyzes both historical and real-time traffic data flowing through the network to automatically adjust signal timing, according to a report in the Los Angeles Times.
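The article doesn’t detail the control logic, but adaptive systems of this kind generally blend a historical baseline with live detector counts when allocating green time. A minimal sketch in Python, with all function names, weights and bounds invented for illustration rather than taken from L.A.’s actual algorithms:

```python
# Minimal sketch of blending historical and real-time detector counts into a
# green-time decision. All names, weights and bounds are illustrative
# assumptions, not details of the ATSAC system.

def green_time(historical_avg, live_count, base=30.0, weight=0.5, step=0.75):
    """Return a green duration in seconds for one approach.

    historical_avg: typical vehicles per cycle at this approach and hour
    live_count:     vehicles counted this cycle by the magnetic sensors
    """
    expected = (1 - weight) * historical_avg + weight * live_count
    # Scale green time with demand, clamped to plausible signal-timing bounds.
    return max(15.0, min(90.0, base + step * (expected - historical_avg)))

# A quiet cycle vs. a rush-hour surge at the same intersection:
print(green_time(historical_avg=20, live_count=12))  # 27.0 seconds
print(green_time(historical_avg=20, live_count=45))  # 39.375 seconds
```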
But is the system improving commuting times for the average L.A. citizen? The best answer might be ... yes, somewhat. The average speed of traffic has risen from 15 miles per hour to 17.3 miles per hour citywide, according to the city’s transportation department. Delays at intersections are down 12 percent, according to the report, which described the difference as a “smoother kind of slow.”
Why hasn’t the system cut traffic congestion more dramatically? For one thing, more people are pouring into the city each day, with the overall population increasing about 20 percent since 1980. Another factor is that improved traffic speed leads to more people traveling, University of Southern California civil engineering professor James Moore told the New York Times. The “benefit is not speed, it’s throughput,” he said.
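Moore’s point follows from the standard traffic-flow identity, flow = density × speed: at the same vehicle density, a faster street carries more vehicles per hour, so the gains tend to get absorbed as extra trips rather than shorter ones. A quick illustration, where only the speeds come from the article and the density figure is an assumption:

```python
# Flow (vehicles/hour/lane) = density (vehicles/mile/lane) * speed (mph).
# The density value is assumed for illustration; the speeds are the
# before/after figures reported for L.A.
density = 40                       # assumed vehicles per mile per lane
flow_before = density * 15.0       # 600 vehicles/hour/lane
flow_after = density * 17.3        # 692 vehicles/hour/lane
print(f"throughput gain: {flow_after / flow_before - 1:.0%}")  # ~15%
```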
So might these findings appeal to other cities? Government workers in the Washington, D.C., area might have a chance to find out. The city is “considering” buying the ATSAC software from Los Angeles, according to a report in the New York Times.
Posted on Apr 04, 2013 at 9:39 AM
Microsoft says it’s making progress in moving cash-strapped public-sector organizations to the cloud, announcing that a group of eight local governments and universities is moving to its Office 365 platform. Office 365 is a subscription-based, multitenant service that delivers e-mail, calendar and collaboration applications via a community cloud.
The group includes Kansas City, Mo.; Seattle and King County, Wash.; and the San Diego Regional Airport Authority, Microsoft said. The universities of Miami and Colorado at Colorado Springs, the California Institute of Technology and Sacramento State University are also making the move.
Curt Kolcun, vice president of U.S. public sector at Microsoft, said the new sign-ups reflect the requirements of budget-conscious government and education organizations that also want access to some of the management conveniences promised by the cloud.
“Organizations are achieving significant cost savings through the cloud delivery model while gaining access to the latest collaboration tools, without sacrificing on security or privacy,” he said at the firm’s recent CIO summit.
The organizations turned to Office 365 for different reasons, the company said. Kansas City wanted to lower IT costs as well as its energy consumption. Kansas City CIO Mary Miller said the move to Office 365 “would enable our staff to be more efficient while reducing both the city’s IT costs and its energy footprint.”
The University of Miami, on the other hand, needed a cloud service that met federal health information privacy requirements. Microsoft was the “only vendor willing to offer additional security and privacy safeguards to meet this federal law,” according to the company. And Caltech wanted to “get out of the business of managing e-mail.”
King County CIO Bill Kehoe told InfoWorld the county had used Office 365’s forerunner, Microsoft Business Productivity Online Suite, in 2011 but made the switch to 365 last year.
"One efficiency has been that we don't have to build out an on-premise server environment for SharePoint and Lync," he told the magazine. "We rely on Microsoft's infrastructure, and they do the software upgrades and take care of the system maintenance."
More than 1 million government workers have made the move to Office 365 for productivity applications, at organizations including the Agriculture Department, the Federal Aviation Administration, the city of Chicago and the state of Texas.
Moving resource-strapped public-sector agencies off their legacy office applications is no small feat. In making its transition to Office 365, the Environmental Protection Agency said it had to move more than 25,000 employee mailboxes, some of which it discovered held more than a million e-mails. The transition is expected to save the EPA approximately $12 million over the four-year contract period.
In a separate announcement, Microsoft said 11 K-12 school districts and universities have signed on to use Microsoft’s cross-platform Windows 8 operating system. The group includes the Atlanta Public Schools, Barry University, Fargo Public Schools, Fresno Unified School District, Jackson-Madison County School System, Pace University, San Antonio Independent School District, Little Thomas College and Tuckahoe Common School District.
Posted on Apr 02, 2013 at 9:39 AM
The Homeland Security Department, in an effort to gain better control over its portable devices, has approved a line of secure USB drives for purchase by its component agencies. In an audit of portable device security policy last June, DHS’ inspector general concluded that its component agencies were not adhering to policies regarding their inventory of USB thumb drives.
In an effort to unify its components in following policy, DHS has awarded a three-year blanket purchase agreement to Promark Technology, distributor of the Kanguru Defender Elite secure USB drives. The Defender Elite’s encryption module is FIPS 140-2 certified by the National Institute of Standards and Technology, so it meets DHS requirements for handling sensitive information. Its onboard antivirus, tamper-resistant design and remote management capabilities make it a good choice for use in secure environments.
The agreement approves for purchase hardware-encrypted models of the Kanguru USB drive ranging from 2 GB to 64 GB in capacity. DHS also has approved Kanguru Remote Management Console Enterprise Edition 5.0 and other Kanguru management software for purchase.
The software lets administrators track and manage the USB drives anywhere in the world: they can force password changes, enforce strong password rules, restrict IP addresses, set permissions and disable or delete lost or stolen devices.
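The article doesn’t show what such centrally managed controls look like in practice. As a purely hypothetical sketch, not the Kanguru console’s actual interface, a per-device policy might boil down to a record like this:

```python
# Hypothetical sketch of a centrally managed USB-drive policy record.
# Field names and values are invented for illustration; they are not
# taken from the Kanguru Remote Management Console.
from dataclasses import dataclass, field

@dataclass
class DevicePolicy:
    device_id: str
    allowed_ip_ranges: list = field(default_factory=list)  # e.g. agency subnets
    min_password_length: int = 12
    password_rotation_days: int = 90
    disabled: bool = False        # set remotely if the drive is lost or stolen

    def report_lost(self):
        """Remotely disable a lost or stolen drive."""
        self.disabled = True

policy = DevicePolicy("drive-0042", allowed_ip_ranges=["10.0.0.0/8"])
policy.report_lost()
print(policy.disabled)  # True
```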
Posted on Apr 01, 2013 at 9:39 AM
When Congress tightened financial data reporting regulations following the Wall Street securities trading scandals, few would have thought that, once the rules were in place, regulators wouldn’t be able to find the London Whale.
That’s the nickname given to a JP Morgan trader who became infamous for making bets upwards of $1 billion on whether companies would default, exactly the kind of trades the new rules were designed to catch.
But while the Dodd-Frank Act called for increased public access to data related to credit swaps and other risky business, it left regulators on their own to manage the resulting data deluge. The problem comes down to one many public-sector agencies can relate to: a mash of formats that aren’t interoperable, and therefore resistant to search.
And that’s what brought Scott O’Malia, a commissioner at the Commodity Futures Trading Commission (CFTC), which regulates futures and swaps, in front of a legal society audience March 19 with a warning that “big data is the commission’s biggest problem.”
Dodd-Frank provided for greater transparency of financial trades so the commission could “look into the market and identify large swap positions that could have a destabilizing effect on our markets.”
However, the commission’s progress in understanding and using the data, O’Malia said, “is not going well,” according to a CFTC transcript of his remarks.
“The problem is so bad that staff have indicated that they currently cannot find the London Whale in the current data files,” O’Malia reported.
In its rush to promulgate the new reporting rules, CFTC “failed to specify the data format parties must use when sending their (records to the database),” O’Malia said. “In other words, the commission told the industry what information to report, but didn’t specify what language to use. This has become a serious problem.”
And it’s one that’s apparently mushrooming, because each of the 70-plus swap dealers can report each type of swap in its own format, yielding more than 70 different data formats for the same kind of trade.
“The permutations of data language are staggering,” O’Malia told the lawyers, adding, “Doesn’t that sound like a reporting nightmare?”
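The interoperability problem O’Malia describes is a familiar one: the same economic record arrives under different field names, layouts and units from each reporter. A minimal sketch of the kind of normalization layer a regulator would need, with all formats, field names and values invented for illustration:

```python
# Hypothetical sketch: normalizing the same swap position reported in two
# different dealer formats into one canonical, searchable schema.
# Formats, field names and values are invented, not actual CFTC data.

def from_dealer_a(rec):
    return {"dealer": rec["rpt_firm"], "counterparty": rec["cpty"],
            "notional_usd": float(rec["ntnl"]),  # dealer A reports in dollars
            "reference_entity": rec["ref"]}

def from_dealer_b(rec):
    return {"dealer": rec["reporting_entity"], "counterparty": rec["other_side"],
            "notional_usd": rec["notional"] * 1e6,  # dealer B reports in millions
            "reference_entity": rec["underlying"]}

records = [
    from_dealer_a({"rpt_firm": "Dealer A", "cpty": "Fund X",
                   "ntnl": "1000000000", "ref": "ACME Corp"}),
    from_dealer_b({"reporting_entity": "Dealer B", "other_side": "Fund X",
                   "notional": 1000, "underlying": "ACME Corp"}),
]

# Once normalized, "find the large positions" becomes a one-line query.
big = [r for r in records if r["notional_usd"] >= 1e9]
print(len(big))  # 2
```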
To make matters worse, CFTC anticipates additional incoming data streams once major swap participants and end-users begin filing. “The commission now receives data on thousands of swaps each day,” the commissioner said. “So far, however, none of our computer programs load this data without crashing.”
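A common mitigation for loaders that crash on bad input, though not one the commission described, is to quarantine malformed records instead of failing the whole batch. A minimal sketch under that assumption:

```python
# Hypothetical sketch: load heterogeneous swap filings without letting one
# malformed record crash the entire batch. All details are illustrative.
import json

def load_filings(lines):
    good, bad = [], []
    for n, line in enumerate(lines, start=1):
        try:
            good.append(json.loads(line))
        except json.JSONDecodeError:
            bad.append(n)          # quarantine line numbers for later review
    return good, bad

good, bad = load_filings(['{"dealer": "A", "notional": 1e9}', "not json at all"])
print(len(good), bad)  # 1 [2]
```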
Looking ahead, CFTC must “significantly improve its own IT capability,” said O’Malia. “Until such time, nobody should be under the illusion that promulgation of the reporting rules will enhance the Commission’s surveillance capabilities.”
In the meantime, O’Malia said, he would use his position as chairman of CFTC’s technology advisory committee “to leverage the expertise of this group to assist in any way I can.”
Posted on Mar 27, 2013 at 9:39 AM
CIA chief technology officer Gus Hunt told a tech industry audience March 20 that the agency had a nearly limitless demand for big data surrounding its intelligence targets.
Hunt’s remarks might help explain the impetus behind news broken by Federal Computer Week two days earlier that CIA had cut a $600 million deal with Amazon Web Services to build a private cloud infrastructure to help manage its big data demands.
"Since you can't connect dots you don't have … we fundamentally try to collect everything and hang on to it forever,” Hunt told a GigaOm conference audience, according to a Huff Post Tech report.
The CIA has made no secret of its aim to become a more data-driven organization, a goal Hunt has been citing for several years. “We are going to have to get analytics and visualization [tools] that are so dead-simple easy to use, anybody can take advantage of them, anybody can use them,” he said at a recent conference.
Posted on Mar 22, 2013 at 9:39 AM