Environmental data centers urged toward 'bleeding edge'

Federal environmental data centers face increasing and complex demands for their information. A National Research Council committee formed to study the centers said the process of gathering environmental information into databases is 'well planned and executed.' But federal and nonfederal researchers need easier access to the stored information, it said.

The committee's report, released late last month, recommended more use of commercial and open-source software. It urged data centers to 'aggressively adopt newer, "bleeding edge" technical approaches where there might be significant return on investment.'

The committee said data centers rely too heavily on offline or near-line storage, slowing retrieval. Another hurdle is that many agencies do not have the resources to handle demands for their data. Requests for National Oceanic and Atmospheric Administration data rose from about 95,000 in 1979 to more than 4 million in 1999, while staff decreased, the report said.

The committee recommended that data centers:

  • Accelerate standardization and make formats more transparent for data and metadata

  • Store data sets in a directory hierarchy to ease access and distribution

  • Shift primary storage from tape to disk, which is now competitive with tape for long-term, archival-class storage

  • Provide direct, random online access to data and support queries against remote databases (a minimal sketch of such a query follows this list)

  • Adopt more sophisticated database technologies to improve search and query, access and acquisition, interoperability and retrieval

  • Use commodity hardware and commercial and open-source software as much as possible while concentrating staff efforts on problems unique to environmental data management.
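The report itself contains no code and prescribes no particular implementation. Purely as an illustration of the direct-query recommendation above, the sketch below shows how a center might answer a subset query against an online database rather than hand out whole archive files. It uses Python's standard sqlite3 module; the 'observations' table, its columns and the station identifiers are hypothetical, not drawn from the report.

    # Illustrative only: a toy online observation database that answers
    # subset queries instead of forcing users to retrieve entire files.
    # Table and column names (observations, station_id, obs_date, temp_c)
    # are hypothetical; the report does not specify any schema.
    import sqlite3

    conn = sqlite3.connect(":memory:")  # stands in for a data center's online database
    conn.execute(
        "CREATE TABLE observations (station_id TEXT, obs_date TEXT, temp_c REAL)"
    )
    conn.executemany(
        "INSERT INTO observations VALUES (?, ?, ?)",
        [
            ("KSEA", "1999-01-01", 4.2),
            ("KSEA", "1999-01-02", 5.0),
            ("KORD", "1999-01-01", -3.1),
        ],
    )
    # An index supports the random, query-style access the committee recommends,
    # so a request touches only the rows it needs.
    conn.execute(
        "CREATE INDEX idx_station_date ON observations (station_id, obs_date)"
    )

    def subset(station, start, end):
        """Return only the requested station/date slice, not the full data set."""
        return conn.execute(
            "SELECT obs_date, temp_c FROM observations "
            "WHERE station_id = ? AND obs_date BETWEEN ? AND ? ORDER BY obs_date",
            (station, start, end),
        ).fetchall()

    print(subset("KSEA", "1999-01-01", "1999-01-31"))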


The report is titled 'Government Data Centers: Meeting Increasing Demands.'
