E-gov leaders look for a measure of satisfaction

OMB, agencies establish consistent ways to gauge citizen impact

What goes into measuring e-government success?

The Office of Management and Budget and agency e-government project managers will measure the initiatives' outcomes in three areas:

Adoption/participation: The degree to which the relevant community (agencies, bureaus and other organizations) participates in the initiative. Participation is demonstrated by contributing information, taking part in governance, providing funds and other activities.

Usage: The level of use by the targeted end user.

Customer satisfaction: End-user satisfaction with the initiative's products and/or services.

OMB submitted the fourth annual Expanding E-Government report to Congress in December. In the report, OMB detailed some of its goals for 2007:

  • Increase the number of e-government projects using the fee-for-service model, such as E-Authentication. OMB hopes to see the amount of money agencies contribute to each project drop to $156 million from $189 million in 2006, while the amount of money spent on services from e-government projects increases to $300 million from $239 million.

  • Provide enterprise architecture guidance describing segment architecture, transition strategy and architecture principles for the government; update the consolidated reference models and the EA assessment framework.

  • Raise to 100 percent the share of agencies scoring at least a 4 out of 5 for completion and at least a 3 out of 5 for use and results on the EA assessment.

  • Improve to 90 percent the share of business cases rated acceptable. Last year, OMB approved 81 percent.

  • Ensure that at least 90 percent of all IT systems have been certified and accredited. Last year, only 88 percent met the requirements.

  • Ensure that at least 90 percent of all IT systems have privacy impact assessments posted publicly, and 90 percent of the required systems have systems of records notices.

  • Expand efforts to close the IT workforce gap. In 2006, 65 percent of agencies closed all identified skill gaps, and 58 percent have met or are meeting their IT hiring targets.

  • Improve to 75 percent the share of agencies using earned-value management to keep their IT projects within 10 percent of cost, schedule and performance goals. Last year, only 46 percent met this goal.
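The report doesn't spell out how "within 10 percent" is scored; a common earned-value reading compares the cost and schedule performance indices against a 10 percent band. The function name, inputs and thresholds below are illustrative assumptions, not OMB's actual scoring method:

```python
def within_tolerance(ev, pv, ac, tol=0.10):
    """Check earned-value indices against a +/-10 percent tolerance.

    CPI = EV/AC (cost performance index), SPI = EV/PV (schedule
    performance index); a project is 'within 10 percent' when both
    indices fall inside the band [1 - tol, 1 + tol].
    """
    cpi = ev / ac  # earned value vs. actual cost
    spi = ev / pv  # earned value vs. planned value
    return abs(cpi - 1) <= tol and abs(spi - 1) <= tol

# A project slightly over budget but on schedule still passes:
print(within_tolerance(ev=100, pv=100, ac=105))  # True
# A project 20 percent behind schedule does not:
print(within_tolerance(ev=80, pv=100, ac=100))   # False
```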


As the 25 e-government projects move into the third phase of their lifecycle, usage, the Office of Management and Budget is asking agencies to measure customer satisfaction.

While it's not a new concept, the way agencies will measure whether the Quicksilver initiatives meet their goals will be more consistent in 2007.

In its annual report to Congress on the benefits of e-government, OMB released the first set of measures for 18 of 25 Quicksilver projects focusing on three areas: customer satisfaction, adoption and participation, and usage (GCN.com/724).

'We are trying to measure what success means,' said Karen Evans, OMB's administrator for e-government and IT, during a meeting held by the Treasury Department's Federal Consulting Group on the results of the 2006 Customer Satisfaction Survey. 'We want measures that show results. We want to increase the usage of the 25 initiatives.'

For customer satisfaction, Evans said agencies pulled the metrics directly from the American Customer Satisfaction Index, which is put together by the University of Michigan and ForeSee Results of Ann Arbor, Mich.

ForeSee measures satisfaction by randomly surveying selected site visitors and then grading on a 100-point scale.
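ForeSee's full ACSI model is proprietary, but the basic idea of grading on a 100-point scale can be sketched as rescaling survey responses. The function name and the 1-to-10 response scale here are assumptions for illustration, not ForeSee's actual methodology:

```python
def satisfaction_score(responses):
    """Rescale 1-10 survey responses to a 0-100 satisfaction score.

    A rating of 1 maps to 0 and a rating of 10 maps to 100; the
    site's score is the rescaled mean across the sampled visitors.
    """
    if not responses:
        raise ValueError("no survey responses")
    mean = sum(responses) / len(responses)
    return (mean - 1) / 9 * 100
```

For example, a sample of ratings [8, 9, 7, 10, 6] averages 8.0, which rescales to about 77.8 on the 100-point scale.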

The real question is how agencies are serving citizens better, and how they are able to track that, said Andrew Ciafardini, OMB's Government-to-Citizen portfolio manager.

'One of the most important things about these metrics is that they are actionable,' he added. 'That is one of the guiding principles we built into the performance measures: making sure they are there to do something about them and make initiatives better.'

The projects have received mixed reviews by citizens, businesses and federal users. Under the E-Training initiative, for instance, almost every major agency is using a learning management system or has hired one of the E-Training providers, said Norm Enger, Office of Personnel Management's director of the Human Resources Line of Business Program Management Office.

But under E-Travel, only 25 percent of agencies have fully deployed one of three available systems, OMB said in its report.

GovBenefits.gov and GovLoans.gov also experienced low citizen payoff, with only 35 percent of the visitors transferred to agency-specific benefits sites. Recreation One-Stop found slightly more success, as 55 percent of all reservations for national parks in the fourth quarter of 2006 were made through the site.

Evans said projects for too long were focused on output measures such as Web site visitors, but the new metrics provide outcome-oriented goals.

'The metrics will put clear focus on uptake, so that mature projects will shift their energy to get the good capabilities that are in place used more,' said Keith Thurston, an assistant deputy associate administrator in the General Services Administration's Office of Governmentwide Policy. '[The metrics] are better in that they give external-view metrics about utilization that a high-level executive would ask.'

In part, Evans credits OMB's deputy director for management, Clay Johnson, for asking outcome-oriented questions that her staff or the agency project managers could not readily answer.

Johnson's questions became a key factor in how OMB and agencies develop consistent and new measures for each project, Ciafardini said.

'Part of our guiding principles in this: Put it in context, so when you are measuring metrics, what is the universe it is out of rather than just saying you want to get to 100,000 hits a month on a Web site,' he said. 'How many should you be getting, and what is the universe of people who could use the Web site? The other thing is putting it into plain language. How can we make sure people can clearly understand them from how you list them? So we don't get into technical jargon.'

