Reality Check | UCore's giant leap

A collaboration of four of the biggest federal departments will release Universal Core 2.0

'That's one small step for a man, one giant leap for mankind.' With those famous words, Neil Armstrong summed up a strategic victory almost a decade in the making. That historic mission to the moon pushed the boundaries of science and galvanized the country.

Following the 9/11 Commission's challenge to improve information sharing, a key group of chief information officers is, in its own way, pushing the boundaries of consensus building and galvanizing the federal information technology community.

This month, a collaboration of four of the biggest federal departments will release Universal Core 2.0, an impressive achievement in its scope, impact and design. The Defense, Justice and Homeland Security departments and the intelligence community developed UCore.

The scope of this standard is reflected in its initial sponsors and intention. It is impressive that four large federal organizations have thrown their collective weight behind this standard. Having crafted standards myself, I know that such consensus-building should not be trivialized. I consider it a true service to citizens and a success story in its own right.

The intention of the standard is ambitious: Craft a universal core of the most common data elements across all possible exchanges. By definition, that universal core lies at the center of all possible domain intersections. The question is: What are the most common things everyone must agree on to have minimal interoperability? The answer embodied in the standard is refreshingly minimalist: who, what, when and where.

The impact of standards based on Extensible Markup Language cannot be measured in terms of past data standards. Why? Because XML data standards can be immediately implemented in IT systems. Past standards could not. An average developer can take an XML data standard and have working code in a matter of days and a fully tested interface live in a matter of weeks.
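To see why an XML standard is so quickly implementable, consider a minimal sketch of consuming an exchange message built around the who, what, when and where core. The element names and message shape below are illustrative only, not the actual UCore 2.0 schema:

```python
import xml.etree.ElementTree as ET

# A hypothetical exchange message in the spirit of UCore's core concepts.
# Element names here are illustrative, not the actual UCore 2.0 schema.
MESSAGE = """
<message>
  <who>Jane Analyst</who>
  <what>Suspicious package reported</what>
  <when>2009-06-15T09:30:00Z</when>
  <where>Washington, DC</where>
</message>
"""

def parse_message(xml_text):
    """Pull the four core elements out of an exchange message."""
    root = ET.fromstring(xml_text)
    return {tag: root.findtext(tag) for tag in ("who", "what", "when", "where")}

record = parse_message(MESSAGE)
print(record["who"])   # Jane Analyst
```

A few lines of standard-library code yield a working parser, which is the point: an XML standard is executable documentation, not shelfware.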

More than that, if an organization has even a rudimentary service-oriented architecture in place, the same XML code can be rapidly ported to a Web 2.0 enterprise mashup. That highlights another key aspect of modern data standards: Standardizing at the exchange layer does not threaten existing systems. You can implement UCore without a rip-and-replace strategy because your interface code can easily transform internal data structures into the external standard for sharing.
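The no-rip-and-replace point can be sketched concretely: the interface layer maps whatever field names the legacy system uses onto the shared exchange elements, leaving internal systems untouched. The internal field names and mapping below are invented for illustration:

```python
import xml.etree.ElementTree as ET

# An internal legacy record, in whatever shape the existing system uses.
internal_record = {
    "reporting_officer": "J. Smith",
    "incident_desc": "Road closure",
    "timestamp": "2009-06-15T14:00:00Z",
    "location_name": "I-95 Mile 42",
}

# Hypothetical mapping from internal field names to shared exchange elements.
FIELD_MAP = {
    "reporting_officer": "who",
    "incident_desc": "what",
    "timestamp": "when",
    "location_name": "where",
}

def to_exchange_xml(record):
    """Transform an internal record into the external exchange format
    at the interface layer -- the internal system itself is untouched."""
    root = ET.Element("message")
    for internal_name, core_name in FIELD_MAP.items():
        ET.SubElement(root, core_name).text = record[internal_name]
    return ET.tostring(root, encoding="unicode")

print(to_exchange_xml(internal_record))
```

All the standardization work lives in the mapping table and one small function; that is why exchange-layer standards coexist peacefully with existing systems.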

The UCore design excels in three areas: simplicity, context and packaging. The standard has a conceptual model that clearly and intuitively provides a logical basis for describing the concepts of who, what, when and where. For example, 'who' is characterized as a type of agent, which could be a person, organization or group. The other concepts are also on a solid logical footing.

By standardizing those root concepts across IT systems, you essentially create search threads throughout your information holdings. Think about the power of being able to traverse all the information in your organization by time, geography, individual and key subjects. And combining those powerful search axes opens up endless productivity enhancements.
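Once every record carries the same root concepts, a single query can traverse all holdings along any axis or combination of axes. A toy sketch, with invented records, shows the idea:

```python
# Toy records already normalized to the four core concepts. Because every
# system tags who/what/when/where the same way, one query spans them all.
records = [
    {"who": "Unit 7", "what": "Patrol report", "when": "2009-06-14", "where": "Sector A"},
    {"who": "Unit 7", "what": "Incident log",  "when": "2009-06-15", "where": "Sector B"},
    {"who": "Unit 9", "what": "Patrol report", "when": "2009-06-15", "where": "Sector A"},
]

def search(records, **criteria):
    """Combine search axes: each keyword argument names a core concept
    and the value it must match."""
    return [r for r in records
            if all(r.get(axis) == value for axis, value in criteria.items())]

# Traverse by time alone, or combine axes for a sharper cut.
print(len(search(records, when="2009-06-15")))               # 2
print(search(records, when="2009-06-15", where="Sector A"))
```

Each additional standardized concept multiplies the ways holdings can be sliced, which is where the productivity gains come from.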

For context, the UCore standard defines a metadata element and some taxonomies and relationships. The taxonomies and metadata element perfectly implement the concepts in the Federal Enterprise Architecture Data Reference Model. The standard relationships move UCore beyond the model into the realm of advanced semantic technologies.

I'd like to congratulate the UCore development working group, especially Dan Green of DOD, Jeremy Warren of Justice, David Martin McCormick of the intelligence community and Anthony Hoang of DHS for their impressive work on this important standard. Finally, I encourage every IT leader concerned about interoperability to examine this new standard and take a giant leap forward.

Daconta ([email protected]) is chief technology officer at Accelerated Information Management and former metadata program manager at the Homeland Security Department. His latest book is 'Information as Product: How to Deliver the Right Information to the Right Person at the Right Time.'
