The archiving tsunami

Future of IT: GPO's plan for handling the rise of new data and accessing old formats

Mike Wash, chief information officer at the Government Printing Office, expects GPO to have more than a petabyte of content available in five or 10 years. But to properly manage that data, the agency has to achieve a few critical transitions.

FOR THOSE RESPONSIBLE for collecting, archiving and managing government records, planning for the future is a daunting undertaking. We face a virtual tsunami of documents, reports, legislative and other official records, plus mounting volumes of audio and video material, all of which must be properly preserved for generations to come.

Tracking the anticipated growth of government publications and estimating the size of the legacy collection are among GPO's responsibilities.

As we look five to 10 years ahead, we anticipate having more than 1 petabyte of accessible content to maintain and manage. This assumes that government publications remain roughly consistent in form and volume.

When publications move more heavily into multimedia formats, including audio and video, content size will grow dramatically. And with the move to Web-based services, wikis, blogs and other collaboration tools for capturing the work of government, the job of archiving is about to get much more complicated.

One way to prepare for these expanding volumes is to create content and access models based on historical trends and anticipated shifts in the industries or markets in which agencies participate. One thing about predictive models is always certain: They are wrong. However, thinking about them leads to better strategies and solutions.
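A growth model of that kind can be very simple and still be useful for storage planning. The sketch below uses purely hypothetical starting sizes and growth rates, not actual GPO figures, to show how a compound-growth projection flags when a collection would cross the petabyte mark.

```python
# Minimal content-growth projection. The 200 TB starting size and
# 20% annual growth rate are illustrative assumptions, not GPO data.

def project_collection_size(current_tb, annual_growth_rate, years):
    """Compound-growth estimate of archive size in terabytes."""
    return current_tb * (1 + annual_growth_rate) ** years

# Print the projection for each of the next 10 years.
for year in range(1, 11):
    size_tb = project_collection_size(200, 0.20, year)
    print(f"year {year:2d}: {size_tb:8,.0f} TB")
```

Under these assumed numbers, the collection crosses 1 petabyte (1,024 TB) in roughly nine years; the point of the exercise is less the exact year than seeing how sensitive the crossing point is to the growth rate.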

Looking at the technology challenges, there are several major issues that need to be understood and overcome:

STORAGE: This issue is up-front and visible to almost everyone. Good progress continues to be made in this area with lower-cost technologies and distributed storage models.

COMPUTING: This is emerging as an area that could pay big benefits to organizations managing large quantities of information.

Monolithic computing infrastructures will quickly become unwieldy and difficult to maintain, particularly as increasingly complex information processes are required to meet access and preservation requirements. By contrast, cloud computing, which stores user data in the Internet cloud and relies on Web-based applications, and the virtualization of computing systems will make it easier to coordinate and distribute computation.

For GPO, we envision the possibility of using computing facilities at large academic library partners to perform some of the functions of FDsys, our content management infrastructure for federal publications.

Distributed processing could support advanced search, data parsing for digitized content, cataloging and other computational work. I like this idea because this could be one of the platforms for the Depository Library program of the future.

PRESERVATION PROCESSES: We cannot forget the problems of the past 25 years, in which file formats and applications have become obsolete, making access to a usable version of a file almost impossible.

Think of WordStar, a common word processing application that was available when the PC market started growing rapidly in the 1980s. Dealing with files in this format presents challenges, from reading the media on which the file is stored to converting it for use in modern applications.

In a world of virtual and nearly unlimited storage, the first challenge will pass as the old files are read and moved to a managed storage system.

The bigger and probably eternal challenge is the rapidly evolving world of applications and formats.

We need to develop methods to ensure this content remains usable over time.
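One common method is a format-migration registry: a record of which legacy formats the archive holds and which preservation target each should be converted to. The sketch below is a minimal illustration with hypothetical format names and targets, not an actual GPO or FDsys tool.

```python
# A hypothetical format-migration registry. Format names and target
# formats are illustrative only.

MIGRATION_PATHS = {
    "wordstar": "odt",     # 1980s word processor files
    "wordperfect": "odt",
    "gif87a": "png",
}

def plan_migration(source_format):
    """Look up the preferred preservation target for a legacy format."""
    target = MIGRATION_PATHS.get(source_format.lower())
    if target is None:
        raise ValueError(f"no migration path registered for {source_format!r}")
    return target

print(plan_migration("WordStar"))  # prints "odt"
```

The registry makes the preservation policy explicit and auditable, and raising an error for unregistered formats surfaces gaps in coverage before content becomes unreadable.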

GPO is looking at existing reference models, such as the Open Archival Information System (OAIS), and making small modifications primarily in the area of access.

We're also implementing the packaging infrastructure outlined in OAIS, developing self-describing packages that will be independent of the software applications used to create and manage these packages.

The idea is to have them survive a change in the principal applications in FDsys and still allow archive packages to recreate the original publications. It is a difficult task.
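A self-describing package of this kind bundles the content with the metadata needed to interpret it later, so it does not depend on any one application surviving. The sketch below is in the spirit of an OAIS Archival Information Package; the field names are illustrative assumptions, not the actual FDsys schema.

```python
import hashlib
import json

# A minimal sketch of a self-describing archive package. Field names
# are illustrative, not drawn from the real FDsys package format.

def build_package(doc_id, title, created, content_bytes, content_format):
    """Bundle content with descriptive, representation and fixity
    metadata so it can be interpreted without the creating application."""
    return {
        "identifier": doc_id,
        "descriptive_metadata": {"title": title, "created": created},
        "representation_info": {"format": content_format},
        "fixity": {"sha256": hashlib.sha256(content_bytes).hexdigest()},
        "content": content_bytes.decode("utf-8"),
    }

pkg = build_package(
    "GPO-TEST-001", "Sample Report", "2009-01-01",
    b"Report body...", "text/plain; charset=utf-8",
)
print(json.dumps(pkg, indent=2))
```

Because the package records its own format and a checksum, a future system can verify integrity and choose a rendering path without consulting the software that originally produced it.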

GPO also is actively pursuing three technology transformations. One involves moving our information systems to a modern infrastructure; a second will transform GPO from a print-centric agency to a content-centric agency; and the third will restructure GPO into a business-focused agency through the formation of business units.

All three transformations are necessary to prepare for the content tsunami.
