Library of Congress preservation program works with millions of items, terabytes of data in a full spectrum of formats
We are warned to be careful about what we put online because data on the Internet lives forever. But keeping random copies of files on servers, routers and databases is not the same as preservation, said Martha Anderson, director of program management for the Library of Congress’ National Digital Information Infrastructure and Preservation Program (NDIIPP). Digital data can be ephemeral. “That is the paradox,” she said.
Web sites can disappear in a matter of days or change repeatedly in a matter of hours. Files can become lost or corrupted, formats and hardware change, and physical media such as tapes and disks deteriorate far more quickly than anticipated.
So Congress charged the LOC in 2000 with preserving the nation’s digital heritage and, at the same time, making sure that its collection of 29 million books and 105 million other items gathered over the last 200 years remains available for the next 200 years. Toward that end, NDIIPP has developed specifications and tools for the transfer of large digital files; worked with government, academia and industry on best practices for digitizing and preserving data; established programs to use delivery platforms such as Flickr to make LOC content available; and partnered with the private sector to harvest content from the Web for archiving.
Across the board: The LOC’s Office of Strategic Initiatives is preserving everything from books to Dictabelt recordings.
The library now has three broad initiatives under NDIIPP: working with universities and libraries to understand the nature of digital content, working with state consortia to help preserve state government records, and working with commercial content providers to develop standards for digital preservation.
The challenges of wedding a physical past to a digital future are varied.
“The biggest difference is the element of time,” Anderson said. “Some physical artifacts can be put on a shelf and left for many years. Books from the 18th century are fine, for example. This is not so much the case with sound recordings and film.”
The technology changes and the media deteriorate. Finding playback equipment for a wax cylinder, an old movie or a Dictabelt can be difficult. And when they are available, the cylinder, film or belt might not be playable.
“The whole domain is looking to digital to carry this forward,” Anderson said.
But digital conversion is time-consuming, and each type of material requires its own technology and special handling. Although the library has been working since the 1990s on digitizing its collections and has made millions of files available online, Anderson estimates that only about 1 percent of the library’s holdings have been digitized.
And digital data can be tricky to handle. “Some formats are fairly stable,” Anderson said. Text and image files have not changed a lot in recent years, and there are plenty of PDF, TIFF and JPEG files that can be easily opened today. But sound and video formats tend to change more quickly. And the physical environment for storing and accessing files changes rapidly. “Servers and digital storage are a challenge. These turn over every three to five years and everything is moved off to another server.”
To accomplish its digital mission, the library takes advantage of work being done in industry and academia to establish standardized environments and tools rather than developing everything itself. Among the initiatives NDIIPP is participating in are:
- Development of the BagIt protocol for large data transfers.
- A collaborative Web site for federal partners developing guidelines for digitizing records.
- The National Digital Newspaper Program, in collaboration with the National Endowment for the Humanities, to digitize and preserve regional newspapers.
- State-of-the-art facilities at the library’s Packard Campus for preserving the world’s largest collection of audiovisual works.
- Partnering with universities and the Internet Archive to harvest and preserve more than 69 terabytes of content from the Web.
- Supporting standards development for digital content, including Office Open XML, PDF/A and JPEG2000.
- Development of open-source tools for receiving, archiving and accessing data in digital repositories.
It is not the technology that poses the greatest challenge to digital preservation, Anderson said. “The biggest challenge is social, getting organizations to understand the value of digital materials.”
Most organizations focus on day-to-day operations without concern for preservation. “We would like to make preservation a part of regular operations and workflow,” she said. Part of the problem is the complexity of the tasks. “It is very complicated even to archive your own e-mail at home,” so preservation has not yet become a part of everyone’s digital environment.
Another challenge in establishing long-term programs for digital preservation is the speed of change in the digital environment. “Our job as we saw it in 2001 was much simpler than we see it today,” Anderson said.
When Congress gave the library the job of digital preservation, there was no Wikipedia, Google Maps, Flickr or Facebook. Today, those tools and others like them have changed the way digital content is created and distributed.
“We worked for nine months to gather video from the Internet,” Anderson said. “During that nine months, YouTube came onto the scene and changed everything.”