Army PEO goes for the edge

The PEO's test environment puts a heavy emphasis on integration, says Kelly Lyman, senior program analyst for contractor Robbins-Gioia.

Office attempts to balance research with need to adapt quickly

Emerson Keslar has memory sticks on the mind. Keslar, CIO of the Army's Program Executive Office for Command, Control and Communications-Tactical at Fort Monmouth, N.J., is pondering ways the Army can use memory sticks on the battlefield.

'We try to do good market research on what's successful on the commercial side and see how it can be leveraged in the federal government,' Keslar said. 'A simple example is those little memory sticks. They're pretty much everywhere on the commercial side of the house, but you don't really see them on the federal side right now.'

He thinks memory sticks might be an effective way for the Army to distribute software patches in the field.

'We've just started looking at it and plan to do some pilots,' he said. 'It's mature technology, but what's not mature is how to utilize it in an Army environment.'
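The article doesn't describe how such a pilot would work mechanically, but the basic pattern of staging a patch on removable media and verifying its integrity before installation can be sketched. The following is a minimal, hypothetical illustration only; the paths, digest file and installer command are assumptions, not anything described by the PEO.

```python
import hashlib
import subprocess
import sys
from pathlib import Path

# Hypothetical locations: a patch staged on a removable drive and a
# separately distributed SHA-256 digest to check it against.
PATCH = Path("/media/usb0/patches/radio-sw-1.4.2.run")               # assumption
EXPECTED_DIGEST = Path("/media/usb0/patches/radio-sw-1.4.2.sha256")  # assumption

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def main() -> int:
    expected = EXPECTED_DIGEST.read_text().split()[0].strip().lower()
    actual = sha256(PATCH)
    if actual != expected:
        print(f"Digest mismatch: refusing to install {PATCH.name}")
        return 1
    # Only run the installer once the media's contents check out.
    return subprocess.run([str(PATCH), "--install"]).returncode

if __name__ == "__main__":
    sys.exit(main())
```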

Keslar's slant on memory sticks illustrates the PEO's approach to evaluating technology to meet the program's mission: Look at technology creatively and stay on the cutting edge.

'Sometimes we bring in a technology that we may not see an immediate benefit for, but we believe that technology is going to have applicability at a later time,' he said. 'We accept the fact that we're going to have some failures as part of the game, but it's important to stay on top of that cutting-edge technology, especially in our area.'

It's part of the PEO's job to evaluate and apply state-of-the-art IT to support tactical weapons systems on the modern, digitized battlefield.

'We really try to promote an environment here of bringing in new products and technologies and trying to adapt them very quickly,' Keslar said.

A comprehensive test environment is central to the CIO office's methodology for evaluating technology.

'We use a test environment that's very critical to making sure that tools we're evaluating will integrate with existing tools,' said Kelly Lyman, senior program analyst for Robbins-Gioia LLC of Alexandria, Va., in the PEO's chief information office. 'We're trying to pull things together from an integrated standpoint as opposed to having too many standalones.'

A case in point is the PEO's recent rollout of Microsoft Project 2002, which lets PEO managers track the status of 35 different weapons programs across the organization. Previously, each program had its own system for managing data.

In the run-up to deployment, Project 2002 was put through the information office's rigorous evaluation procedures.

Robbins-Gioia officials first interviewed more than 30 PEO executives, project and program managers, schedulers and team members to develop a list of requirements.

'Interviewing the right people was very important in that phase,' she said. 'We didn't want to interview just the management end. You have to talk to the stakeholders at all levels.'
'We wanted to find out exactly what they needed, because this tool is so customizable,' she added. 'Once we had the defining requirements, we could put together a plan for building and customizing the tool and moving forward with the pilot projects.'

After the system was customized and built, it was piloted in a working environment, which generated some lessons.

'The pilot was successful and we ended up implementing it across the entire PEO,' Keslar said, adding that some work remains on standardizing data fields and ensuring data quality.

'Data is almost always overlooked in pilot programs because you're looking at it strictly from a technology standpoint, not a data and content perspective,' he said.

Nonetheless, Keslar is big on pilots as part of his strategy.

'I think the IT industry accepts a certain degree of things not being perfected,' he said. 'As a result, you've got to find out what these things can do before you make a big investment. The only way to understand what [the technology's] capabilities are is to get your hands dirty: bring it in, put it into your environment and do a pilot program.'

To give vendors an incentive to make their technologies work in the PEO's environment, Keslar gets them to split the pilot-phase costs 50/50, covering labor, resources, license fees and other investments.

'If it doesn't work, they've also lost,' he said. 'But they're also a stakeholder in the success [of the pilot].'

For Keslar, laying out a results-oriented standard at the outset is important to the evaluation process. 'We set up the criteria right up front on the definition of success, of what the objective of the pilot program is,' he said.
