Everybody has a role in evaluating technology
- By Richard W. Walker
- Nov 16, 2004
DOD's Mark Krzysko says it's not whether a technology is good, but whether it's good for an agency's needs.
Before evaluating products, agency managers first must have a clear grasp of the problem they're trying to solve. 'It's later in the process that you worry about what technologies might contribute to solving the problem,' says Mary Mitchell of GSA's Office of Governmentwide Policy.
You just got out of a long meeting and could use a bit of a stretch. So you think you might take a stroll down to the IT administrator's office, shoot the breeze and check out the latest software the techies have on the bench. After all, it's been a while since you've had a chance to stop by and …
Hold on. What's wrong with this picture? What's wrong is that you don't seem to know what's happening in the IT shop. As an agency manager, you should be part of the process of making sure that your agency's technology is closely aligned with your agency's business goals.
For government agencies today, technology acquisition, including assessment, is no longer just the concern of the network manager or even the CIO's office.
By all accounts, the process should involve all functional components of the agency: not just the CIO or the CTO, but the financial office and the procurement office as well.
And you can't leave out the users, the ones who will have to deal with the technology on a daily basis once it is evaluated, acquired and deployed.
When Army officials were developing an information system for recruiters, user input in the evaluation stage proved critical.
'This was a system that was developed by the recruiters for the recruiters,' said John Miller, functional manager for the U.S. Army Recruiting Command. 'They had the experience to know what is needed out in the field at the time.'
And when officials at the Army's Program Executive Office for Command, Control and Communications-Tactical at Fort Monmouth, N.J., were evaluating a system to let managers track the status of weapons systems, they spent long hours interviewing potential users to come up with a list of requirements.
'We didn't want to interview just the management end,' said Kelly Lyman, senior program analyst for Robbins-Gioia LLC in the PEO's chief information office. 'You have to talk to the stakeholders at all levels.'
Before a technology gets to the test environment, agency managers have to have a comprehensive understanding of the problem they're trying to solve, from both a business and a user perspective, said Mary Mitchell, deputy associate administrator for e-government and technology in the General Services Administration's Office of Governmentwide Policy.
'It's later in the process that you worry about what technologies might contribute to solving the problem,' she said.
Once in the lab, a technology shouldn't be treated in isolation. Its risk has to be evaluated from a business perspective.
'It's really critical not to get caught up in 'That looks like a really cool technology,' ' said Valerie Perlowitz, president of Reliable Integration Services of Vienna, Va.
In addition, a technology should be put to real-world tests during the evaluation phase.
In assessing an acquisition cost analysis system, for instance, Defense Department officials required vendors to measure their software prototypes against a series of scenarios designed to test the capabilities that users really needed from the system.
'Benchmarks should be as realistic as possible and represent the intended environment as closely as possible,' said Arlie Barber, senior engineer for the Army Information Systems Engineering Command's Technology Integration Center at Fort Huachuca, Ariz.
In the end, evaluation isn't just about technology and it isn't just about business. It's about both.
Mark Krzysko, deputy director of defense procurement and acquisition policy for e-business at DOD, said: 'It's about how we use technology to improve our business and move forward.'