Paul McCloskey


Smart budgets put their money on IT

In a time of budget brinkmanship on Capitol Hill, it’s worth noting the role IT could play in paring down the cost of government. For whether you’re a deficit hawk or tax dove, improving the value of the tax dollar will be critical once any budget deal is cut.

Through its efforts to improve government information transparency by setting up open-data websites, the Obama administration has already put key building blocks in place for using IT to identify cost anomalies in government.

These access points into the flow of federal financial information offer opportunities for citizens and regulators to account for public spending, identify fraud and waste, and reuse publicly funded research and intellectual property.

It's one thing to pry open the ledger for analysts to pore over, but it’s another to have the technology in place to flag wasteful spending, spot where costs for similar public works projects vary widely from one county to another, or plan when and where to spend public resources most productively.
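
The kind of cross-county cost comparison described above can be sketched in a few lines. The counties and dollar figures below are invented for illustration; a real system would draw on published procurement data:

```python
# Illustrative sketch: flagging a county whose cost per mile for a
# comparable road project deviates sharply from its peers.
# All figures are hypothetical, invented for demonstration only.

costs_per_mile = {            # county -> reported cost ($K per mile)
    "Adams": 910, "Baker": 880, "Clark": 940,
    "Dixon": 1720, "Evans": 895, "Floyd": 905,
}

values = sorted(costs_per_mile.values())
median = values[len(values) // 2]

# Median absolute deviation: a robust spread measure that one extreme
# value cannot distort the way a standard deviation can.
mad = sorted(abs(v - median) for v in values)[len(values) // 2]

# Flag any county more than three deviations from the median cost.
flagged = [county for county, cost in costs_per_mile.items()
           if abs(cost - median) > 3 * mad]
print(flagged)  # Dixon's $1.72M/mile stands out against ~$900K peers
```

Using a median-based cutoff rather than a mean keeps a single runaway project from masking itself by inflating the baseline it is measured against.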

Getting answers to those kinds of questions will become possible as open datasets are integrated, cloud computing is adopted and the use of software analytics advances in government.

Shifts in those areas are well under way. Last month, Recovery Board chairman Earl Devaney called for consolidating the data websites into a 'universal one-stop shop' to ultimately be applied "broadly over the whole spectrum of federal financial data collection, display and analysis."

That proposal will be easier to realize as the government’s push into cloud computing gathers steam. The resulting consolidation will make it easier to scope large datasets on a project-by-project basis or across local, state and federal government agencies.

But as Devaney said, "even the best data is useless if it cannot be easily interpreted and understood." Fortunately, a boom is under way in the creation of software tools and analytics capable of decoding government data, flagging trouble spots and optimizing existing resources to address them.

To cite a few random examples:

  • In New Brunswick, Canada, the highway department is using a predictive analysis tool to identify "sweet spots" in the deterioration curve of roadways, when maintenance could be done to avoid making more costly repairs later.
  • A predictive analytics system in Memphis, Tenn., has cut the city's crime rate by a third since 2006 by enabling police to analyze incident patterns, forecast the location of high-crime zones and dispatch police teams accordingly.
  • In a different kind of value proposition, the National Weather Service is experimenting with mounting sensors on the U.S. Postal Service's truck fleet to gather ongoing ambient environmental data. That’s a double dividend: revenue for USPS and a data cost cutter for NWS.
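
The New Brunswick "sweet spot" idea can be illustrated with a toy model: pavement condition declines nonlinearly, and cheap preventive treatments only work while the road is still in fair shape. The decay curve and cost tiers below are invented for illustration, not the actual parameters of any highway department's model:

```python
# Toy predictive-maintenance model. A road's condition index decays
# over time; once it slips below a threshold, only costlier fixes
# remain. The "sweet spot" is the last year the cheap option works.

def condition(year, initial=100.0, rate=0.9):
    """Hypothetical pavement condition index, decaying faster with age."""
    return initial * (rate ** year) - 0.5 * year ** 1.5

def treatment_cost(pci):
    """Hypothetical cost per lane-mile ($K): preventive seal,
    structural overlay, or full reconstruction."""
    if pci >= 70:
        return 40      # preventive seal coat
    if pci >= 50:
        return 180     # structural overlay
    return 600         # full reconstruction

# Sweet spot: the last year a preventive seal is still sufficient.
sweet_spot = max(y for y in range(30) if treatment_cost(condition(y)) == 40)
```

In this toy curve the window closes after year 3; deferring one more budget cycle multiplies the bill, which is exactly the trade-off such tools are built to surface.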

It’s the ingenuity behind projects such as these that, in the long run, has the power to root out wasteful spending, whether or not the politicians can agree on how much to cut.

About the Author

Paul McCloskey is senior editor of GCN. A former editor-in-chief of both GCN and FCW, McCloskey was part of Federal Computer Week's founding editorial staff.

