Agencies look to industry for integration help

Federal information technology managers rarely have an easy road,
especially as they work to integrate the hodgepodge of systems that populate their
agencies. Somehow, it has always been easier to create new systems than to make them work
together.


But demand for data and information has grown, and application integration has become one of the
biggest challenges facing IT managers.


Stovepipe systems quickly become outdated, thanks to new technology and changes in
agencies’ needs. The systems require upgrades and the ability to integrate with other
systems, or they become useless. Lately, the renovation needed to compensate for year 2000
date-code shortsightedness has diverted attention from upgrades and integration.


To get a perspective on the magnitude of the application integration challenge,
consider that federal IT spending has exceeded $30 billion per year in each of the past
five years and more than $25 billion in the years before that. That’s a massive
technology base for the government to manage. It seems that every system becomes a legacy
system. Some inventory systems still in daily use go back more than a decade.


A typical demand on those systems today might be to make records stored in a
proprietary, flat-file database available on the Web, or to compare the flat-file records
with records in a relational database at another agency. Needs like these are driving
agencies to integrate legacy systems. What’s a poor agency to do?
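To make that kind of work concrete, here is a minimal sketch of the comparison step only, assuming a hypothetical fixed-width record layout and using SQLite as a stand-in for the other agency’s relational system; the field names, column widths and the inventory table are invented for illustration.

import sqlite3

# Hypothetical fixed-width layout for a legacy inventory flat file
# (field names and column positions are invented for illustration).
FIELDS = [("item_id", 0, 10), ("description", 10, 40), ("quantity", 40, 50)]

def parse_flat_file(path):
    """Yield one dict per fixed-width record in the legacy flat file."""
    with open(path, encoding="ascii") as f:
        for line in f:
            yield {name: line[start:end].strip() for name, start, end in FIELDS}

def load_and_compare(flat_path, db_path):
    """Stage the legacy records in a relational table, then flag item IDs
    missing from an assumed existing 'inventory' table in the same database."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging "
        "(item_id TEXT, description TEXT, quantity TEXT)"
    )
    conn.executemany(
        "INSERT INTO staging VALUES (:item_id, :description, :quantity)",
        parse_flat_file(flat_path),
    )
    # Records present in the flat file but absent from the relational system.
    missing = conn.execute(
        "SELECT s.item_id FROM staging s "
        "LEFT JOIN inventory i ON i.item_id = s.item_id "
        "WHERE i.item_id IS NULL"
    ).fetchall()
    conn.commit()
    conn.close()
    return [row[0] for row in missing]

Once the records are staged in relational form, exposing them on the Web is a further step, but the staging and comparison above capture the core of the integration chore.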


From the viewpoint of two Cabinet-level federal agency managers I talked with recently,
there are no solutions—at least none that are proven in the marketplace and that you
can buy off the shelf.


Both managers know that consolidating old systems and technologies is no small task,
but they believe it is the best way to integrate functions and technologies and to reduce
operating costs. They are waiting for industry to develop something else that works.


Sounds like the centralized vs. distributed computing debate is swinging back toward
centralization.


Well, not exactly. There’s no way we can expect an entire industry to return to
the host-and-terminal configurations of the early years. But for some situations,
that’s exactly what will happen. The difficulty lies in the technical diversity of
government systems—some are old, some are new; some were developed under standards,
others were not.


Early data processing focused on glass-house mainframe operations outside of any user
control. Then came departmental minicomputers.


Later came distributed processing and the transformation of end-user gear from dumb
terminals to powerful workstations. Power shifted to the users, but integration
suffered.


Data warehousing and data mining are sophisticated attempts to resolve the differences
from the data record perspective, but this highly technical approach doesn’t address
idiosyncrasies between systems and their requirements. And data dictionaries are still the
source of unending dispute.


The decentralization of IT initiatives has returned integration to prominence. Although the
glass house won’t resemble what it was in the 1970s, the pundits who predicted the
demise of big iron were premature.


Lately, the growth of the Internet, intranets, extranets and virtual private networks
has given managers and planners fresh impetus to consider distributing their
applications. But this doesn’t equal integration.


And as more requirements and opportunities have appeared, agencies have focused
on new application development rather than legacy systems integration.


Problems such as year 2000 date processing further exacerbated the complexities of
federal data processing. Though it was portrayed as a looming disaster, many government
managers saw the year 2000 problem as an opportunity in disguise, and industry was more
than willing to step in and capitalize on that view.


In short, chaos has not only asserted itself but has become a kind of permanent guest.


Federal contractors are more interested in service contracts and less interested in
product distribution contracts and their low margins.


Congressional leaders want lower prices, and agency financial managers are prepared to
support awarding contracts for commercial products to gain both price and product
advantages.


Where in the commercial market is there an operating environment that has anything
close to the complexity of the federal government’s?


The application integration packages the federal government badly needs do not exist
commercially. Federal managers who wait for them to appear will have a frustrating
time handling integration needs.


Some industry leaders are developing application integration tools that are being tested
commercially and proven through successful deployments. The approach seems sensible. The
challenge is to make sure the tools stay current no matter what direction their users
take.


We need to be able to plug into future systems with upgrades that are also
standard—and we still don’t have the standards.


Endless analysis of the centralized vs. decentralized question isn’t going to
teach us much. The problem today is to integrate what’s in place.


Lofty promises of an electronic government cannot be fulfilled without integration.
Government IT managers know it, and industry is preparing to help.


But just as vendors prepare commercial tools for implementation, industry must also
realize that there are pieces of government programs whose pedigree is found in every
generation of IT. 


Robert Deller is president of Market Access International Inc., an information
technology market research, sales and support company in Chevy Chase, Md. His e-mail
address is bdeller@markess.com.
