Build a better buy
- By Trudy Walsh
- Sep 22, 2003
During the Lego robot tests, DCMA engineering technician John Erickson gets a taste of what it's like for a vendor to meet Defense contract specs.
DOD teams play with Lego robots to boost their acquisition savvy
His team tried so hard to follow the technical specs that 'we missed the big picture,' DCMA's Jerry Dosdall says.
The bright-yellow robot struggled to climb over a land mine, chirped like an excited R2-D2, collapsed, backed up and then paused to regroup.
No, it wasn't an Army mission in the mountains of Afghanistan, but a systems engineering and team-building exercise in a classroom this month at the Defense Acquisition University at Fort Belvoir, Va.
Twenty-six students, all civilian employees of the Defense Contract Management Agency, were demonstrating robots they had built using Lego Mindstorms Robotics Invention System kits as part of a weeklong class titled 'Understanding the Program Manager's Technical Management Role.'
The class was divided into five teams combining program managers, hardware and software engineers, and test and evaluation engineers. Their task: Build robotic land-mine detectors using the $200 Mindstorms kits from Legoland A/S of Billund, Denmark.
Housed in tackle boxes, the kits came with hardware, software, wheels, treads, a wireless remote-control joystick, Lego bricks and other pieces needed to build a small robot.
Although small in scale, the robots had to conform to the kind of technical specifications found in real Defense contracts. One class goal was to give the students a taste of what working on, and winning or losing, a government contract is like from a vendor's perspective.
Lego labyrinth
The land mines, in this case, were CD-ROM disks placed randomly on an obstacle course in the classroom. An infrared port in each robot's yellow Lego belly was supposed to detect the light reflected by the shiny CD-ROM surfaces. On detection, the robot was supposed to beep.
David Brown and Robert Lightsey, both faculty members at the university, designed and taught the course. A third faculty member, Martin Falk, helped in the design. Students spent about two days of the course on the robot exercise.
Lightsey and Brown scored the teams on factors that included unit cost, lifecycle cost, number of parts and how well the robots performed during the classroom test. They scored the students just as a real program office would judge a contractor's work.
The classroom test had five parts:
- Disassembly. Each team had to disassemble its robot and lay it out on a table for inspection.
- Reassembly. Each team had to reassemble its robot in less than seven minutes.
- Drop test. A team member dropped each robot from a height of 3 inches. Each robot then had to perform a series of maneuvers to see whether it had retained its capabilities.
- Lost communications. Students were required to put down the remote controls, push a button and let go of the robots. The robots then had to follow programmed instructions to move 4 feet and make 180-degree turns.
- Obstacle course. Students used wireless joysticks to maneuver their robots through a labyrinth marked on the classroom floor with masking tape. The course, about 10 feet long and 5 feet wide, was studded with CD-ROM disks, bricks, rocks, sticks and soda bottles. Each robot was supposed to chirp when it detected a CD-ROM 'land mine,' stay within the tape boundaries, make several turns and climb over a half-inch-tall stick.
Students used the software that came with the Mindstorms kits to program the land-mine identification. They ran the software under Microsoft Windows 98 on notebook PCs.
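In essence, the land-mine identification the students programmed is a threshold check: sample the reflected-light reading from the robot's underside and chirp when it jumps to a level typical of a shiny CD surface. A minimal sketch of that logic in Python; the students actually used the Mindstorms kit's own graphical programming tool, and the sensor values and threshold here are illustrative assumptions, not figures from the class:

```python
# Hypothetical sketch of a CD-detection loop. The real exercise used the
# Mindstorms Robotics Invention System software, not Python; readings and
# the threshold below are assumed values for illustration.

REFLECTANCE_THRESHOLD = 70  # assumed cutoff for a shiny CD surface (0-100 scale)

def detect_mines(samples, threshold=REFLECTANCE_THRESHOLD):
    """Record a detection (where the robot would chirp) each time the
    reflectance reading rises through the threshold."""
    detections = []
    over = False
    for i, value in enumerate(samples):
        if value >= threshold and not over:
            detections.append(i)   # robot would beep here
            over = True
        elif value < threshold:
            over = False           # rearm once the robot clears the disk
    return detections

# Simulated pass over the course: dull floor (~30), two CDs (~85-90)
readings = [30, 32, 85, 88, 31, 29, 90, 33]
print(detect_mines(readings))  # → [2, 6]
```

Rearming only after the reading drops back below the threshold keeps the robot from chirping continuously while it sits over a single disk.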
After all the testing was complete and the teachers had tallied up the scores, the students had a chance to become part of what Lightsey called 'a source selection advisory committee' by casting a vote for any team other than their own.
After the final scores were tallied, Team 1 was the winner. Its members credited good management and focus for their success.
'We were smaller than the other groups,' said Malcolm McDannell, a DCMA quality assurance specialist. 'We had four people, and some of the other groups had six. We didn't fight over design.'
Jason Digon, a support program integrator at DCMA, agreed. 'We considered ourselves a small business,' he said.
But if students had been given points for learning from their mistakes, Team 2 would have won hands-down.
Third wheel
Team 2's robot ran into trouble during the obstacle course test. The robot had a tripod design with a spindly rear swivel wheel that kept getting stuck on the CD-ROMs and the other obstacles.
'We lost focus,' said Jerry Dosdall, a DCMA support program integrator and program manager for Team 2. The team initially envisioned 'sort of a John Deere tractor concept,' he said, but that design cost too much, so they went back to the drawing board and came up with the tripod leg.
'We were working so hard to follow the key performance parameters in the tech manual, we missed the big picture,' Dosdall said.
Tish Perry, a DCMA engineer and Team 2 member, said the team-building part of the exercise was invaluable. Even though the team encountered difficulties, 'we didn't give up,' she said. 'I learned how important schedules and testing plans are to program managers.' And everybody kept a sense of humor, she said.
'For many of our students, this is the first time in their lives that somebody tells them: You have to get a contract and produce something by the end of the week,' said Brown, a retired Navy commander. 'And only one team is going to win.'
The class gave Kate Ryan, an industrial specialist at DCMA and the program manager for Team 3, 'an appreciation of what government and contractors go through to develop these systems,' she said. 'Even though it's on such a small scale, so much is involved. It's been a real eye-opener.'