Find answers to nagging compatibility questions
- By Florence Olsen
- Feb 08, 1999
Even all Microsoft Corp. products don't interoperate.
"How can we make all this stuff work together?"
It's a common refrain that echoes through systems shops. Now a group of agencies
and vendors is working to create an online clearinghouse to help government agencies
answer that question for specific products.
The proliferation of new products and standards is creating chaos in governments'
online initiatives because so little is known about present and future interoperability,
said John Weiler, government liaison for the Object Management Group of Alexandria, Va.,
and a leading participant in the new government clearinghouse initiative.
"Even all Microsoft Corp. products don't interoperate," Weiler said.
Chief information officers struggle to build information infrastructures out of old and
new pieces, he said, but "it's impossible to absorb and correlate all the
function points unless you're a Borg."
Participants in the clearinghouse initiative have agreed on the need for an Internet
configurator that provides accurate, validated information about how and which commercial
products interoperate.
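The configurator concept described here — validated, queryable compatibility data for specific product pairs — can be sketched as a simple lookup. The product names and validation notes below are invented for illustration; none of this comes from the actual clearinghouse:

```python
# Minimal sketch of an interoperability "configurator": a validated
# compatibility matrix queried per product pair. All product names and
# entries are hypothetical illustrations, not real clearinghouse data.

COMPATIBILITY = {
    # (product_a, product_b): validation note
    ("DirServerX 2.0", "MailGateY 1.1"): "validated: LDAP v3 bind tested",
    ("DirServerX 2.0", "AppServerZ 4.0"): "validated: CORBA IIOP tested",
}

def check(product_a: str, product_b: str) -> str:
    """Return the validation note for a pair, in either order."""
    for key in ((product_a, product_b), (product_b, product_a)):
        if key in COMPATIBILITY:
            return COMPATIBILITY[key]
    return "no validated interoperability data"

print(check("MailGateY 1.1", "DirServerX 2.0"))
```

The point of the design is the validation note itself: the value of such a repository lies less in listing products than in recording how a pairing was verified.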
"A clearinghouse would have incredible value, if we can get everybody
participating and make it extremely accessible," said Tim McCaffrey, enterprise
systems team manager for Microsoft federal systems in Washington.
Microsoft has agreed to provide information about its products and to offer technical
help on the clearinghouse itself, McCaffrey said. "I think it's something a lot
of people could benefit from if it was done accurately and we could be assured of
validity," he said.
Such configuration tools are nothing new. IBM Corp. in the 1960s developed a
configurator to manage the complexity of developing systems with its own products, and
Digital Equipment Corp. developed something similar in the 1970s using artificial
intelligence, Weiler said.
Faced today with a dearth of tools for modern architecture modeling, the
interoperability working group will develop the tools and a repository of working parts.
The interoperability clearinghouse project is similar in intent to some aspects of the
Defense Department's goals for a Global Networked Information Enterprise. The DOD
initiative, presented at an industry briefing last month, goes several steps further than
current Joint Technical Architecture profiles mandated by the Defense Information Systems
Agency.
Ron Turner, Navy deputy chief information officer for infrastructure, systems and
technology, said it would be naive to try to build a clearinghouse that could affect the
mass market unless there were participants from outside DOD.
"If we bump this up to a level where we're looking at the entire government
work force and the tools they use," Turner said, "an interoperability
clearinghouse would be a wonderful idea."
Interoperability lies well beyond standards, Weiler said. "It's about
software engineering to make it all work together," he said. The efforts of standards
bodies are routinely undercut by vendors who take standards and make proprietary products
from them, he added.
The hardest interoperability issues today, industry experts agree, are in the software
realm at layers 6 and 7 of the seven-layer Open Systems Interconnection model.
With funding from the Defense Advanced Research Projects Agency, the working group has
developed a prototype for a Java architecture modeling tool, or "architect's
workbench," called the Distributed Component-based Architecture Modeler. DCAM displays
reference implementations and lets users point and click on any object to check the
validity of each data point.
"There are some harder problems that still need to be solved," Weiler said,
"but DCAM is a good start." It functions as an inference engine, architecture modeler,
configurator, product and standard directory, and knowledge repository.
"You enter data into forms on the screen, and it automatically generates the
taxonomies," Weiler said.
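Weiler's description — form entries that automatically yield taxonomies — amounts to grouping each record under its classification fields. A minimal sketch of that idea (the field names and sample records are invented; DCAM's actual data model is not described in this article):

```python
from collections import defaultdict

# Sketch of auto-generating a taxonomy from form-style records, in the
# spirit of the DCAM description above. Field names and sample records
# are hypothetical.

records = [
    {"name": "ORB-A", "category": "middleware", "standard": "CORBA 2.2"},
    {"name": "ORB-B", "category": "middleware", "standard": "CORBA 2.2"},
    {"name": "DB-C",  "category": "database",   "standard": "SQL-92"},
]

def build_taxonomy(records, key):
    """Group record names under each value of the chosen classification key."""
    taxonomy = defaultdict(list)
    for rec in records:
        taxonomy[rec[key]].append(rec["name"])
    return dict(taxonomy)

print(build_taxonomy(records, "category"))
# → {'middleware': ['ORB-A', 'ORB-B'], 'database': ['DB-C']}
```

Regrouping the same records by "standard" instead of "category" yields a second taxonomy from the same form data, which is the practical appeal of generating classifications rather than maintaining them by hand.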
Most large organizations, including government and its systems integrators, acknowledge
they could do a better job of sharing configuration data, Weiler said.
But even if they succeed in establishing a common repository, it will take emerging
technologies such as the Extensible Markup Language and the XML Metadata Interchange (XMI)
format to keep the repository up to date.
OMG has proposed XMI as a standard for synchronizing metadata. When a vendor updates
specifications on its own Web site, the change would trigger a corresponding XMI update to
the interoperability database.
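The update flow described — a vendor's published metadata change triggering a corresponding change in the central database — can be sketched with plain XML handling. The element and attribute names below are invented placeholders, not the real XMI schema:

```python
import xml.etree.ElementTree as ET

# Sketch of the XMI-style synchronization idea: compare a vendor-published
# metadata record against the repository's copy and apply any difference.
# Element and attribute names are hypothetical, not the actual XMI schema.

vendor_xml = """
<product name="AppServerZ" version="4.1">
  <supports standard="CORBA" level="2.3"/>
</product>
"""

def sync(xml_text, repo):
    """Overwrite the repository entry when the vendor record differs."""
    root = ET.fromstring(xml_text)
    name = root.get("name")
    entry = {"version": root.get("version")}
    for sup in root.findall("supports"):
        entry[sup.get("standard")] = sup.get("level")
    if repo.get(name) != entry:
        repo[name] = entry  # the "triggered" update to the database
    return repo

repository = {"AppServerZ": {"version": "4.0", "CORBA": "2.2"}}
print(sync(vendor_xml, repository))
# → {'AppServerZ': {'version': '4.1', 'CORBA': '2.3'}}
```

The real proposal pushes updates from the vendor's site rather than polling it, but the core operation — diffing structured metadata and propagating the change — is the same.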
The interoperability working group has been evaluating knowledge management technology
from Lotus Development Corp. as one of the enabling technologies. Domino Release 5 is one
of the products the group is evaluating, Weiler said.
He estimated it could cost $10 million to build an operational system and populate the
repository. The sale of subscriptions might defray the costs of developing and maintaining
the metadata repository.
"Much of what we need to do already exists, but it's uncoordinated."
Government supporters of the interoperability clearinghouse include the Air Force
Electronic Systems Center, the Army Communications-Electronics Command, DARPA, the Defense
Information Systems Agencys Joint Interoperability and Engineering Organization, the
Energy and Justice departments, the National Imagery and Mapping Agency, the National
Security Agency, the Navy and the Office of the Secretary of Defense for Logistics.
Industry supporters include IBM, Knowledge Evolution Inc. of Washington, Microsoft, Sun
Microsystems Inc., Sybase Inc., Unisys Corp. and others.
Also participating are systems integrators and testing organizations such as
DISAs Joint Interoperability Test Command, the National Institute of Standards and
Technology, National Software Testing Laboratories Inc. of Conshohocken, Pa., Ernst &
Young LLP, Lockheed Martin Corp. and Science Applications International Corp. of San Diego.