
3 key steps to implementing the new open data policy

On May 9, the Obama Administration released an ambitious triple threat against the status quo in order to change the course of information management in the federal government. Specifically, the objective is to change the “default” behavior in the development of IT systems to create “open and machine-readable” data (a.k.a., standards-based and structured) as opposed to “closed and human-readable” data (a.k.a., proprietary and unstructured). 

The three-pronged attack includes a vision statement via an executive order, an open data policy document by the Office of Management and Budget and technical support via an online repository of tools and guidance (called Project Open Data).

On a personal level, it is exciting to see this initiative as a solid down payment toward fulfilling the promise of the Federal Enterprise Architecture Data Reference Model (DRM), which I stewarded to fruition during my stint in federal service. Let’s examine three focus areas agencies must pay special attention to:

1. “Know what you know,” or mastering catalogs through metadata. At the center of the policy are two separate but related catalogs: an enterprise data inventory and a public data listing. To create the enterprise data inventory, the policy states that “agencies should use the Data Reference Model from the Federal Enterprise Architecture.” The policy goes on to specify that data assets should be described by using a set of common core and extensible metadata.

It is amazing to see the mainstream media frequently refer to “metadata” in its reporting on the National Security Agency and its request for Verizon phone metadata. The power and utility of metadata has become clear in that case (link analysis) and in this case (data asset discovery). Metadata’s value is something I’ve written about frequently in GCN. 

Agencies designing these metadata catalogs should take care in three areas:

  • Design the metadata to enable discovery and to determine appropriate usage of data assets.
  • Derive the public data listing from the enterprise data inventory and explicitly link it to those entries.
  • Make the enterprise data inventory complete, link it to the agency’s FISMA system inventory and include field-level (attribute-level) descriptions.
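The relationship between the two catalogs can be sketched in a few lines of code. The following is an illustrative example only; the field names (such as `fismaSystemId` and `sourceEntry`) are hypothetical and not drawn from the policy's actual metadata schema, though `title`, `keyword` and `accessLevel` are in the spirit of the common core metadata:

```python
import json

def make_inventory_entry(title, description, keywords, access_level,
                         fisma_system_id, fields):
    """Build one data-asset record for the enterprise data inventory,
    linking it to a FISMA system and describing it to the field level."""
    return {
        "title": title,
        "description": description,
        "keyword": keywords,
        "accessLevel": access_level,       # e.g., "public" or "non-public"
        "fismaSystemId": fisma_system_id,  # link to the FISMA system inventory
        "fields": fields,                  # field-level (attribute) descriptions
    }

def public_data_listing(inventory):
    """Derive the public data listing from the enterprise inventory,
    keeping an explicit link back to each source entry."""
    return [
        {"title": e["title"], "description": e["description"],
         "sourceEntry": e["title"]}
        for e in inventory if e["accessLevel"] == "public"
    ]

inventory = [
    make_inventory_entry(
        "Grant Awards", "All grants awarded in FY2013",
        ["grants", "funding"], "public", "SYS-001",
        [{"name": "award_id", "description": "Unique award identifier"}]),
    make_inventory_entry(
        "Personnel Records", "Employee HR records",
        ["hr"], "non-public", "SYS-002",
        [{"name": "ssn", "description": "Social Security number"}]),
]

listing = public_data_listing(inventory)
print(json.dumps(listing, indent=2))
```

Note how the non-public asset stays in the enterprise inventory but never reaches the public listing; the derivation, not duplication, is what keeps the two catalogs consistent.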

2. “Stovepipes must die,” or creating a data services layer. The open data policy states that agencies should “build information systems to support interoperability and information accessibility” but is vague on how to architect systems in this manner. The answer is to follow the Model-View-Controller (MVC) pattern in systems design.

Software engineers have been leveraging the MVC pattern for more than three decades in the design of widgets, applications and even large systems because it works. It works by specifying a clean separation between the user interface, the business logic and the data. “Clean separation” is the key point, and the way to do that for the “model” portion of a system is to develop a Web services layer between the data storage and the applications.
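A minimal sketch of that separation might look like the following. This is an illustrative example, not the policy's prescription: the class and method names are hypothetical, and a real data services layer would expose these operations as web services rather than in-process calls.

```python
import json

class DataService:
    """Data services layer: the only component that knows how records
    are stored. Swapping the backing store (file, database, remote
    web service) does not touch the views or controllers."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        return self._store.get(key)

    def put(self, key, record):
        self._store[key] = record

class Controller:
    """Business logic: validates requests, then delegates to the
    data service. It never reaches into storage directly."""
    def __init__(self, service):
        self.service = service

    def publish(self, key, record):
        if "title" not in record:
            raise ValueError("record must have a title")
        self.service.put(key, record)

def json_view(record):
    """View: renders a record as open, machine-readable JSON."""
    return json.dumps(record, sort_keys=True)

service = DataService()
ctrl = Controller(service)
ctrl.publish("grants-2013", {"title": "Grant Awards", "year": 2013})
print(json_view(service.get("grants-2013")))
```

The payoff of the clean separation is that any number of views, public portals, internal dashboards or other agencies' systems can consume the same data service without knowing anything about the underlying storage.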

3. “Putting the ‘I’ back in CIO,” or enterprise information management gets real. By far the most risky part of this full-fledged assault on the status quo is agency heads giving CIOs the authority to implement open data correctly. I have led enterprise data management offices that had the backing of senior management and others that did not. An EDMO that lacks access to the data in the business units is bound to spin its wheels, waste money and accomplish little. Agency heads must view this new policy as an opportunity to centralize control of their data, which perfectly aligns with their objectives for transparency, information sharing, big data and the migration to cloud computing.

If implemented correctly, the above actions represent a sea change where information sharing with the public and other government agencies becomes a routine byproduct and not a special case. That difficult and elusive goal is finally within reach if federal agencies muster the courage and resources to change the “default” behavior to the “right” behavior.

About the Author

Michael C. Daconta ([email protected]) is the Vice President of Advanced Technology at InCadence Strategic Solutions and the former Metadata Program Manager for the Homeland Security Department. His new book is The Great Cloud Migration: Your Roadmap to Cloud Computing, Big Data and Linked Data.

