How the Navy times its weather forecast computing workload
- By Joab Jackson
- Jul 29, 2004
Behind every Navy weather forecast is the complex task of coordinating a sea of applications at government weather centers. The job scheduling far exceeds the capability of the venerable cron daemon, the standard Unix scheduler that has automated timed events for decades.
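Cron's limitation is that it fires jobs purely on the clock, with no notion of whether upstream work has finished. A typical crontab entry looks like this (the script path is hypothetical):

```
# crontab entry: run the product generator at 03:00 every day,
# whether or not the upstream model run has finished writing its output files
0 3 * * *  /ops/bin/make_products.sh
```

If the model run is late, a purely time-triggered job reads stale or incomplete data, which is why dependency-aware schedulers are needed at this scale.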
The Navy's Fleet Numerical Meteorology and Oceanography Center instead uses a number of more sophisticated software tools to coordinate the operations of hundreds of programs that produce new weather models, said Mike Clancy, chief scientist and deputy technical director of the center in Monterey, Calif.
'We have a very complex and closely choreographed operational run, involving thousands of jobs that execute every day,' Clancy said.
The Navy center takes in about 7 million observations of temperature, humidity and cloud movement each day, roughly 1TB of data. The readings come from satellites, ships, aircraft, sensors on ocean buoys and other data centers.
After crunching this data, the center delivers to Navy and intelligence agencies forecast maps, weather warnings and direct digital data for other systems, such as weapons systems. The predictions are good for up to 10 days.

Fast turnaround
Many of the small, interdependent programs involved are written in Fortran and rely on models of the physics governing heat, moisture and momentum. One 512-processor SGI Origin 3000 system and two 256-processor Origins run the programs under the Trusted Irix operating system.
'There are lots of job dependencies where certain jobs have to complete and write data to a file before other jobs can run,' Clancy said. Any one job could occupy 100 processors or more, and multiple jobs have to run at the same time. Meeting deadlines is an additional challenge, because users expect to see the data at the same time each day.
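Dependency chains like the ones Clancy describes are commonly modeled as a directed acyclic graph and executed in topological order. A minimal Python sketch, using job names that are purely illustrative:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical slice of a forecast run: each job maps to the set of
# jobs that must finish and write their files before it can start.
dependencies = {
    "ingest_obs": set(),                    # read satellite/buoy/ship data
    "quality_control": {"ingest_obs"},      # needs raw observations on disk
    "global_model": {"quality_control"},    # needs quality-controlled fields
    "regional_model": {"global_model"},     # nested inside the global run
    "forecast_products": {"global_model", "regional_model"},
}

def run_order(deps):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(deps).static_order())

order = run_order(dependencies)
print(order)
```

A production scheduler additionally runs independent jobs in parallel and re-dispatches on failure; this sketch shows only the ordering constraint.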
'We may have one model running that requires 120 processors, another model that requires 80. The combination of LSF and RPM under the overall guidance of SMS decides how all that plugs in efficiently,' Clancy said.
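Fitting a 120-processor model and an 80-processor model onto a fixed pool of machines is a packing problem. A greatly simplified first-fit sketch, with machine capacities mirroring the hardware described above (job names and the placement policy are assumptions, not FNMOC's actual algorithm):

```python
# Machines mirror the article's hardware: one 512-CPU Origin and two
# 256-CPU Origins. Each job is (name, CPUs required); the scheduler
# must find a machine with enough free processors before dispatching.
machines = {"origin_a": 512, "origin_b": 256, "origin_c": 256}

def place(jobs, capacity):
    """First-fit decreasing: place the biggest jobs first."""
    free = dict(capacity)
    placement = {}
    for name, cpus in sorted(jobs, key=lambda j: -j[1]):
        for host, avail in free.items():
            if avail >= cpus:
                placement[name] = host
                free[host] -= cpus
                break
        else:
            placement[name] = None  # no room now; job must wait in queue
    return placement, free

jobs = [("global_model", 120), ("regional_model", 80), ("wave_model", 256)]
placement, free = place(jobs, machines)
print(placement)
```

Real schedulers such as LSF also weigh priorities, deadlines and memory, but the core decision is the same: match each job's processor demand against what is currently free.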
As if the process were not already complicated enough, the Navy center plans to offload some of its weather modeling duties to another Defense weather modeling system, run by the Air Force Weather Agency at Offutt Air Force Base, Neb.
'We will try to distribute production runs between here and the remote site, and the Air Force will do the same,' Clancy said. The trial should be under way by fall.
The two centers will install identical IBM Corp. server clusters linked by another Platform Computing product, Platform LSF Multicluster. The software will extend the job scheduling across the two centers and balance workloads across clusters in an architecture similar to grid computing.
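Balancing work across two sites can be reduced to routing each job to whichever cluster has the most idle capacity that can hold it. A toy sketch in that spirit (site names and capacities are illustrative, not the actual MultiCluster policy):

```python
# Toy cross-site load balancing: send each new job to the cluster
# with the most free processors that can still fit it. Capacities
# here are invented for illustration.
def route(job_cpus, clusters):
    """Pick the least-loaded cluster that can run the job, or None."""
    candidates = {site: free for site, free in clusters.items()
                  if free >= job_cpus}
    if not candidates:
        return None  # no site has room; job queues until capacity frees up
    site = max(candidates, key=candidates.get)
    clusters[site] -= job_cpus  # reserve the processors at that site
    return site

clusters = {"monterey": 300, "offutt": 500}
print(route(120, clusters))  # dispatched to the less-loaded site
```

The real product also handles cross-site data staging and failover, which this sketch omits.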
Joab Jackson is the senior technology editor for Government Computer News.