CIO OUTLOOK

Open reporting methods boost credibility

Mike Hale

Like many states, Georgia has been living and breathing the year 2000 readiness problem. We in government have had to respond to practically every readiness question and challenge imaginable. We have been probed from every quarter, managing in a fishbowl of public scrutiny.

I wonder about the real meaning of readiness and the accuracy of answers we give the citizens.

Georgia government has posted 4,000 readiness surveys of local infrastructures on the Web. Every public utility, health care facility, 911 service, fire department and police organization received a survey. Today, residents in, say, Covington can access the year 2000 survey results of their municipal electric company or of the state financial system.

By 1998, I realized that to inspire confidence, agencies' year 2000 progress reports would have to be certified to show that technicians performed remediation and testing using rigorous industry benchmarks.

Unlike financial reporting, year 2000 readiness reporting has had no standards. The default standard has been simply the percentage of code fixed. The news media has accepted this standard because nothing else was available or because the alternatives were too complex. Percent-complete figures for various agencies have been published side by side even in the face of serious differences among methodologies.
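To see why that comparison misleads, consider a minimal sketch, with hypothetical figures not drawn from any agency's actual reports, of two organizations that claim the same completion percentage while measuring entirely different work:

```python
# Hypothetical illustration: two agencies report "percent complete,"
# but their denominators measure different things.

# Agency A counts lines of mission-critical code already remediated and tested.
agency_a = {"lines_fixed": 900_000, "lines_in_scope": 1_000_000}

# Agency B counts systems that have merely been assessed, not yet tested.
agency_b = {"systems_assessed": 45, "systems_total": 50}

pct_a = 100 * agency_a["lines_fixed"] / agency_a["lines_in_scope"]
pct_b = 100 * agency_b["systems_assessed"] / agency_b["systems_total"]

# Both figures come out to 90.0, yet they describe very different progress.
print(f"Agency A: {pct_a:.1f}% complete (code remediated and tested)")
print(f"Agency B: {pct_b:.1f}% complete (systems assessed only)")
```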

Risks can remain hidden in organizations that don't expose the details of their methods. A focus on completion without a corresponding emphasis on quality standards is a disservice to both the organization and the public.

Georgia has used several readiness benchmarks that cover the progress, risk and quality of year 2000 preparation. Agencies report their progress and risk metrics for 378 mission-critical systems biweekly. Metrics include the dollars and work hours spent against the remediation plan. Agencies post their standards and procedures on the state's year 2000 Web site.
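A rough sketch of what one biweekly entry for a single system might look like; the field names are illustrative assumptions, not the state's actual reporting schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BiweeklyReport:
    """One agency's progress/risk entry for a single mission-critical system.

    Field names are hypothetical and shown only to illustrate the kind of
    progress, cost and risk data reported every two weeks.
    """
    system_name: str
    report_date: date
    percent_remediated: float   # progress metric
    dollars_spent: float        # spending against the remediation plan
    work_hours_spent: int       # labor against the remediation plan
    risk_level: str             # e.g. "low", "medium", "high"

# Example entry for one of the 378 mission-critical systems (invented values).
example = BiweeklyReport(
    system_name="State Financial System",
    report_date=date(1999, 6, 15),
    percent_remediated=92.5,
    dollars_spent=1_250_000.0,
    work_hours_spent=8_400,
    risk_level="low",
)
print(example)
```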

Georgia systems are bound by statewide standards that are monitored by the Year 2000 Project Office and documented in final certification reports. Each final report requires managers to delineate the methods and tools they used for remediation and testing.

Managers who cannot detail how they will assess a system don't get funding. When reporting completion, managers must certify that testing covered end-to-end system configuration.

For mainframe systems, managers must certify that testing included successive initial program loads on machines with their clocks turned forward. If testing has not been independently validated against pre-approved testing standards, testing stops. If the executive manager of the agency is not involved in the final compliance sign-off on a mission-critical system, that report is incomplete.
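Taken together, these requirements amount to a checklist that a final report must clear before it is accepted. Here is a hypothetical sketch of that gating logic; the field names are assumptions for illustration, not the Year 2000 Project Office's actual forms:

```python
# Hypothetical sketch of the certification gates described above.
# Report fields and their names are illustrative assumptions.

def certification_complete(report):
    """Return (accepted, problems) for a final compliance report."""
    problems = []
    if not report.get("methods_and_tools_documented"):
        problems.append("remediation and testing methods/tools not delineated")
    if not report.get("end_to_end_testing"):
        problems.append("testing did not cover end-to-end system configuration")
    if report.get("mainframe") and not report.get("clock_forward_ipls"):
        problems.append("no successive IPLs with system clocks turned forward")
    if not report.get("independently_validated"):
        problems.append("testing not independently validated against standards")
    if not report.get("executive_signoff"):
        problems.append("executive manager did not sign off")
    return (not problems, problems)

# Example: a report that fails the clock-forward IPL gate.
accepted, problems = certification_complete({
    "methods_and_tools_documented": True,
    "end_to_end_testing": True,
    "mainframe": True,
    "clock_forward_ipls": False,
    "independently_validated": True,
    "executive_signoff": True,
})
print(accepted, problems)
```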

I am convinced Georgia will be ready. Gov. Roy Barnes has provided both commitment and funding to ensure that no citizen will experience a disruption in services.

Still, the technical nitty-gritty of a readiness reporting philosophy is often lost when lawmakers or other judges compare large organizations across political jurisdictions, all the more so when there is no generally accepted reporting standard. That's why highlighting our standards and procedures alongside our reports has added to our credibility during this once-in-a-lifetime experience.



Mike Hale is chief information officer of Georgia. He previously was executive director of Florida's Information Resource Commission and is a retired Army colonel. His e-mail address is mhale@itpc.state.ga.us.
