Michael Daconta | Standards: See the forest and the trees
- By Michael Daconta
- May 04, 2007
Data standardization involves gathering often-independent stakeholders to hammer out agreements on the syntax and semantics of data across a community. This process is notoriously difficult because of the intangible nature of the medium and the entanglement of technical and business needs.
Creating data structures is akin to creating virtual buildings in cyberspace where you don't have to obey the laws of physics, the conventions of tradition or even good sense. This lack of natural constraints gives standards bodies the rope to metaphorically hang themselves and the standard they are creating.
In addition, there inevitably is conflict between the ease of technical implementation for stakeholders and the business benefit of consensus.
This can lead to the cardinal sin of data standardization: favoring short-term technical convenience over cohesion.
Cohesion binds together all the individual design elements to the standard's primary purpose. Cohesion in design produces elegance. The lack of cohesion produces a disjointed base, compounded with every modification. And it starts the clock ticking on a standard's implosion.
During the past six months, I have witnessed several standards efforts losing cohesion. The Electronic Fingerprint Transmission Specification, used by the FBI, and the Electronic Biometric Transmission Specification, used by the Defense Department, are implementations of the ANSI/NIST-ITL 1-2000: Data Format for the Interchange of Fingerprint, Facial, and Scar Mark and Tattoo Information standard.
The current standard is binary, but the proposed revision will include an alternate Extensible Markup Language version. Unfortunately, the revision suffers from a kitchen-sink mentality: it makes the mistake of combining a data standard with a service standard, which it calls a transaction.
It made sense to combine those two when the standard was a poor man's distributed system in which you embed transaction processing in e-mails. But now we have a better way. It's called the Internet and the World Wide Web. When legacy constraints hinder a standard, it begins bleeding cohesion.
Another example is the National Information Exchange Model, whose governing body recently voted to eliminate universal data elements from its resulting messages.
This was a knee-jerk reaction to complaints by one implementer confused by the distinction between universal core (intersection of all domains) and common core (intersection of two or more domains). This decision has serious ramifications: The federal Information Sharing Environment will no longer be able to easily establish universal rules governing the flow and processing of messages.
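The distinction that confused the implementer can be sketched in a few lines of code. The domain names and element names below are invented for illustration, but the set logic matches the definitions in the text: universal core is the intersection of all domains, while common core is anything shared by two or more.

```python
from collections import Counter

# Hypothetical NIEM-style domains, each with a set of data elements.
# All names here are invented for the example.
domains = {
    "justice":      {"PersonName", "PersonBirthDate", "ArrestCharge"},
    "immigration":  {"PersonName", "PersonBirthDate", "VisaCategory"},
    "intelligence": {"PersonName", "ThreatLevel"},
}

# Universal core: elements present in every domain.
universal_core = set.intersection(*domains.values())

# Common core: elements shared by at least two domains.
counts = Counter(e for elems in domains.values() for e in elems)
common_core = {e for e, n in counts.items() if n >= 2}

print(sorted(universal_core))  # ['PersonName']
print(sorted(common_core))     # ['PersonBirthDate', 'PersonName']
```

Note that the universal core is always a subset of the common core, which is exactly why rules written against universal elements can be applied uniformly to every message crossing domain boundaries.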
A commercial example is occurring now at the International Organization for Standardization with Microsoft's Office Open XML and the OASIS OpenDocument specifications. The redundancy of two overlapping standards from the same organization screams technical, and political, convenience.
Favoring short-term technical convenience over cohesion is not seeing the forest for the trees.
A counterexample recently occurred at the Director of National Intelligence Data Management Committee meeting, where a 'proposal of convenience' for one implementer of the Terrorist Watchlist Person Data Exchange Standard was tabled by the DMC chair pending further investigation.
Such demonstrated leadership is needed in all standards bodies.
The solution to all of these examples is the same: strong technical leadership combining business acumen with deep technical expertise. If you don't have it, find someone who does.