Feds falling behind in managing data: Heiner

ORLANDO, Fla. -- The federal government is losing the battle to organize data and make it accessible. In fact, according to one expert, agencies are moving backward when it comes to using technology to deliver information.

Nate Heiner, a former Coast Guard chief knowledge officer, said today that over the last 25 years the government, led mainly by the Defense Department, has been at the forefront of data search and accessibility R&D. But now the government is 'overbuilding data sets to define what data is.'

'It is a matter of flattening the data to make it accessible,' he said during the Information Processing Interagency Conference, sponsored by the Government Information Technology Executive Council. 'A minimalist interface can give you a good view of unstructured data. That is what Google has done. The government could get there with the technology that exists now.'

This concept of horizontal data could be applied to human resources, finance and operations, and across the analytics agencies use, the data warehouses they maintain and the reports they run. Horizontal data lets information flow across an agency's disparate systems.
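To make the idea concrete, the short Python sketch below flattens records from three hypothetical source systems into a single inverted index that one keyword query can search across. The sources, fields and sample data are invented for illustration and do not represent any system Heiner described.

    # Minimal sketch: flatten records from three hypothetical source systems
    # into one inverted index so a single keyword query can reach all of them.
    # The sources, fields and sample text are invented for illustration.
    from collections import defaultdict

    records = [
        {"source": "hr",         "text": "Jane Doe transferred to the Orlando field office"},
        {"source": "finance",    "text": "Q3 travel spending for the Orlando field office"},
        {"source": "operations", "text": "Orlando office network upgrade scheduled for March"},
    ]

    # Inverted index: token -> set of record positions.
    index = defaultdict(set)
    for pos, rec in enumerate(records):
        for token in rec["text"].lower().split():
            index[token].add(pos)

    def search(query):
        """Return every record, from any source system, containing all query terms."""
        terms = query.lower().split()
        if not terms:
            return []
        hits = set.intersection(*(index.get(t, set()) for t in terms))
        return [records[p] for p in sorted(hits)]

    for rec in search("orlando office"):
        print(rec["source"], "->", rec["text"])

A query for 'orlando office' returns matching records from all three sources, which is the kind of flat, minimalist view of disparate data Heiner described.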

Heiner said the government should revisit the goals DARPA set for ARPANet in the 1970s and try to achieve them at a high level.

The Defense Advanced Research Projects Agency wanted to:
  • Connect existing networks
  • Keep networks running
  • Support multiple types of services
  • Accommodate disparate network types

Heiner said the government's take on data is bleak, especially when you consider agencies still are using the Government Information Locator Service (GILS), which was developed in 1996 and decommissioned by the National Institute of Standards and Technology in 2005. The Office of Management and Budget had been considering moving beyond GILS, but no decision was made.

'NIST was right; it should have been decommissioned,' Heiner said. 'There are important standards that exist such as NASA's Global Change Master Directory for geospatial information. But the approach to assemble information has been disorganized, and agencies continue to struggle to predefine what this should look like.'

And things will get worse before they get better as more and more data becomes available on the Web. Heiner said some estimates put the number of image files on the Web at 10 billion to 20 billion, of which Google has indexed only about 2 billion. He used Thomas Jefferson as an example of the challenge: when he searched for Jefferson in Google Images, he got pictures of silverware, a squash patch and assorted other images, but not Jefferson. And with video, audio and other formats becoming more prevalent, agencies are being crushed by the scope of the data.

He said there are programs that could help solve this problem, including the ESP Game, which asks random people to label images as part of an online game.

'People spent 9 billion hours on Solitaire, so the inventors of this game figured out how they could get people to spend some of that time indexing images,' Heiner said. 'I'm not saying the government should do that, but there is creative thinking that can be applied to the federal side when it comes to the data problem.'
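As a rough illustration of the labeling idea Heiner referenced, the sketch below shows the consensus mechanic an ESP-style game relies on: two players tag the same image independently, and only the labels they agree on are kept as index terms. The function name and sample tags are hypothetical, not drawn from the actual game.

    # Minimal sketch of the consensus mechanic behind ESP-style labeling:
    # two players tag the same image independently, and only the labels both
    # players agree on are kept as index terms. This illustrates the idea,
    # not the actual ESP Game implementation; the sample tags are invented.
    def agreed_labels(player_a_tags, player_b_tags):
        """Keep only the tags that both players entered (case-insensitive)."""
        a = {t.strip().lower() for t in player_a_tags}
        b = {t.strip().lower() for t in player_b_tags}
        return sorted(a & b)

    # Hypothetical tags for a portrait of Thomas Jefferson.
    tags_a = ["Jefferson", "president", "portrait", "wig"]
    tags_b = ["portrait", "jefferson", "painting"]
    print(agreed_labels(tags_a, tags_b))   # ['jefferson', 'portrait']

Requiring agreement between independent players is what keeps the crowd-supplied labels reliable enough to use as search terms.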
