Michael Daconta | The high cost of ambiguity
Reality Check | Commentary
Ambiguity is an unacceptable lack of precision that can cause immediate, blatant error. Or it can cause subtle, delayed error. Both are costly.
Take, for instance, the case of duplicate records in the terrorist watch list, where redundancies, omissions and vague entries stifle appropriate action even in the best of circumstances.
During a crisis, by contrast, every such flaw is magnified. What's worse, ambiguity costs precious time. There is no better example than the failed responses during Hurricane Katrina, where the delays at the Superdome were measured in hours and broadcast to the world by the national media. Numerous Government Accountability Office reports detailed ambiguity in roles and responsibilities, and the resulting confusion hampered coordination.
Let's switch gears to the data and information space, where many managers hope to find a solution to this problem. Unfortunately, instead of offering clarity, the technology industry has served up only a confusing array of jargon around metadata, data management, information sharing and knowledge management. Here are three egregious examples.

Is metadata really 'data about data'? This classic definition of metadata is ambiguous and counterproductive. Specifically, it blurs the distinction between data and metadata and has zero business value. Metadata is an external description of data. The best example is the set of characteristics that describe the music files on your iPod, such as Artist, Song Title and Genre. This matters because metadata is a critical part of transforming data into information. If government organizations are not clear on its meaning and purpose, it is not something Congress will invest in.

Are data management and information management synonymous? Many professionals use these terms interchangeably, but the answer is again no. Information is usable data, and therefore its focus must be on the consumer. Without clearly understanding this difference, how can you reliably produce information and deliver it to those who need it?

Is there a repeatable pattern to information sharing? I am often asked, 'Why did you do the federal enterprise architecture's data reference model differently than the business reference model?' The short answer is clarity. The DRM solves cross-organizational sharing by explaining the process of information sharing instead of providing a set of common information-sharing topics. In contrast, the BRM provides a set of common business functions to which you can map your information technology investments. The DRM approach follows the proverb about teaching a person to fish versus giving them a fish. Thus, the answer is yes.
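The data-versus-metadata distinction above can be made concrete with a short sketch. This is a minimal, illustrative example (the class, field names and sample tracks are my own assumptions, not anything from a real music player): the audio bytes are the data, opaque on their own, while the external descriptions — artist, title, genre — are the metadata that let a consumer find and act on the data, turning it into information.

```python
# Illustrative sketch: metadata as an external description of data.
# The "audio" field stands in for the data itself, which is opaque
# without context; the remaining fields are metadata that describe it.

from dataclasses import dataclass


@dataclass
class Track:
    audio: bytes   # the data: raw payload, meaningless in isolation
    artist: str    # metadata: external descriptions of that data
    title: str
    genre: str


# A hypothetical music library (sample values invented for the example).
library = [
    Track(b"\x00\x01", "Miles Davis", "So What", "Jazz"),
    Track(b"\x00\x02", "John Coltrane", "Giant Steps", "Jazz"),
    Track(b"\x00\x03", "Glenn Gould", "Goldberg Variations", "Classical"),
]


def titles_by_genre(tracks: list[Track], genre: str) -> list[str]:
    """Answer a consumer's question using only metadata.

    Note that the audio bytes are never inspected: the metadata alone
    is what transforms raw data into usable information.
    """
    return [t.title for t in tracks if t.genre == genre]


print(titles_by_genre(library, "Jazz"))
```

The point of the sketch is the division of labor: without the metadata fields, the only way to answer "show me the jazz" would be to analyze the raw audio itself, which is exactly the gap a clear definition of metadata is meant to close.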
Clarity, in the end, is having sufficient detail and distinction to create understanding. Such understanding affords action. And action is what brings results. Newsweek writer Jonathan Alter described how the 'unsexy' skill of database coordination cracked open the Washington sniper case in his article, 'Actually, the Database is God.'
Of course, what he calls 'database coordination' (how connecting the dots sped the investigation and saved lives) is one form of information sharing specified in the data reference model. Thus, knowing what information is and how to reliably produce it is the key to delivering the right information to the right person at the right time. Ambiguity is a fight we can win.

Michael Daconta is the former metadata program manager for DHS and the author of 'Information As Product: How to Deliver the Right Information to the Right Person at the Right Time.' Contact him at [email protected]