Census to outsource

The Census Bureau won't be buying much new hardware for the next
decennial census. Bureau officials plan to rent their large iron and rely on existing PCs
and workstations for analysis work.


"Since technology changes so rapidly, we don't want to have equipment packed up in
mothballs, sitting on the shelf" after the 2000 census, said Arnold Jackson, the
bureau's IT director.


The bureau will hire a systems integrator to handle the data-crunching and to run the
Data Capture System 2000 for the bureau. Jackson said Census plans to issue a DCS 2000
request for proposals in June.


"We've never been in that position before," he said. "We've always been
a very customized, in-house kind of place. Now we move the risk to one of contract and
technology management rather than doing it ourselves. That's no mean feat."


It's such a feat, in fact, that the Commerce Department's inspector general has
expressed concern about the Census Bureau's ability to manage a complex contract while
overseeing a new data capture process that will turn hundreds of millions of paper forms
into usable digital data.


But Jackson said sound vendor support, combined with proof-of-concept testing and
piloting over the next four years, will result in a proven system for the decennial census
by the time it begins in April 2000.


The data flow will be primarily one-way, from regional data capture centers to the
headquarters database in Suitland, Md. There were 12 data capture centers for the 1990
census; the bureau hopes to handle the next census with just four.


The communications architecture has yet to be defined. "We are working with FTS
2000" for the long-distance portion, and off-the-shelf LAN technology will be used
for local networks, Jackson said. "We will do the design and configuration," he
said. "There is a team that is beginning to meet on that now."


The database also has not been selected. "We're looking at a number of
alternatives," Jackson said. Census probably will use a client-server environment for
analyses.


Whatever database is chosen, it should not require significant new hardware, he said,
adding, "We have a pretty rich inventory of workstations and PCs."


Applications software for DCS 2000 will be developed in-house, but the movement of data
during the processing and sampling work will be done with a commercial product.
"We're testing off-the-shelf work flow management systems that will allow us to move
the images," Jackson said.


Gathering census data remains primarily a paper-driven process, in which an estimated
115 million households fill out and mail in printed forms. But in 1990, 35 percent of the
forms were not returned. Follow-up visits from census takers brought the count up to 98.4
percent, but the uncounted 1.6 percent of the population was concentrated in ethnic and
economic minorities, so that errors in counting blacks, Hispanics and the homeless were
magnified.
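The undercount figures above can be checked with a bit of arithmetic. The sketch below simply applies the article's 1990 numbers (115 million households, a 35 percent mail non-return rate, and 98.4 percent final coverage after follow-up); treating the coverage rates as applying uniformly to households is a simplifying assumption for illustration.

```python
# Back-of-the-envelope arithmetic using the figures reported in the
# article; illustrative only, not official Census Bureau methodology.

households = 115_000_000        # estimated households mailed a form
mail_return_rate = 1 - 0.35     # 35 percent of forms were not returned

not_returned = households * (1 - mail_return_rate)

final_coverage = 0.984          # coverage after census-taker follow-up
uncounted = households * (1 - final_coverage)

print(f"Forms not mailed back: {not_returned:,.0f}")
print(f"Still uncounted after follow-up: {uncounted:,.0f}")
```

Roughly 40 million households required follow-up visits, and about 1.8 million remained uncounted, which is why small percentage errors translate into large absolute miscounts.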


These problems will increase in 2000, Jackson said. "The raw increase in
population is not a back-breaker," he said. "It's the complexity of who needs to
be counted."


To meet these difficulties and keep costs under control, information must be
manipulated more efficiently, Jackson said.


The bureau will rely on optical character readers to turn paper forms into electronic
data that can be handed off to the DCS 2000 contractor, where it will be checked and
refined so that statistical samples can be developed to account for the uncounted.


Then, armed with laptop PCs, census takers will hit the pavement and get additional
information, keying it in during interviews.


For the paper forms, the bureau conducted a proof-of-concept test of the data
collection process in March, gathering and scanning more than 120,000 forms in Oakland,
Calif., Paterson, N.J., and six rural Louisiana parishes. Results of that test, which
relied on Kodak scanners and software from several vendors, are being analyzed now.


"So far, all signals are go for migrating from the old system," Jackson said.


The old system, used for the last three decennial censuses, required converting
documents to microfilm and using a high-speed microfilm scanner to read the film. The data
then was uploaded to Census mainframes.


"It's built around components that are no longer available in the
marketplace," Jackson said. Many of the specialists who built and maintained the
system are retiring.


DCS 2000 will use off-the-shelf technology and be designed to run in an open
environment so that data will be stored in a platform-independent manner. "Open
system, that's our driving theme for the rest of the decade," Jackson said.


A Census Bureau acquisition review team is meeting weekly to develop specifications for
DCS 2000, and there is a parallel review process with Commerce officials to shorten the
acquisition time.


A vendors' meeting in March drew representatives from 300 companies who outlined ways
to meet Census needs. The bureau plans to host a second meeting in November and possibly a
third before the June RFP release, he said.


Much remains to be done to get DCS 2000 up and running, but Jackson said his office has
a firm grip on the process, and there will be plenty of time for a major dress rehearsal
later in the decade.


About the Author

William Jackson is a Maryland-based freelance writer.
