Grid accelerates research

Tools help deliver CERN data to labs worldwide

Whatever its success in the marketplace, grid computing has been a great success in the research community. Fermi National Accelerator Laboratory in Batavia, Ill., has been testing a grid network that will eventually distribute experimental data from CERN, the European laboratory for particle physics in Geneva, Switzerland, to multiple research laboratories around the globe. Thanks to grid tools, data created at CERN can be distributed to, and then analyzed at, facilities worldwide.

While electronically shepherding large amounts of information from one location to another is a difficult problem in itself, the task grows even more complex with multiple recipients, said Ian Fisk, Fermi associate scientist.

In 2007, CERN will switch on the Large Hadron Collider, which will be the world's largest particle accelerator. The physics community behind the project wants to channel the collision results to labs worldwide, where researchers could test advanced physics hypotheses such as supersymmetry and string theory.

This approach could tap the potential power of distributed computing. CERN itself has more computing capacity than any other laboratory in the project, yet that amounts to only 20 percent of the project's total computing capability; the remaining 80 percent is split across the other participating partners, Fisk said.

Grid tools are essential for the job, Fisk said, because they provide the storage interfaces. Fermilab uses Storage Resource Manager, grid middleware developed in part by Lawrence Berkeley National Laboratory. 'The SRM interface allows us to describe that large group of servers as an interface,' Fisk said. CERN sends the data from multiple servers, and another large batch of servers at Fermilab receives it. SRM also helps with load balancing, traffic shaping, performance monitoring, authentication and resource usage accounting.
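The core idea here, presenting a pool of transfer servers behind one storage interface, can be sketched in a few lines. The sketch below is purely illustrative, with invented node and file names; it is not the actual SRM middleware, only a minimal picture of the round-robin load balancing such an interface can perform:

```python
import itertools

# Hypothetical pool of storage nodes sitting behind a single
# interface, in the spirit of SRM fronting many transfer servers.
servers = ["node1", "node2", "node3"]
rotation = itertools.cycle(servers)

def assign_transfer(filename):
    """Assign the next server in round-robin order to a transfer."""
    return (filename, next(rotation))

# Five incoming files get spread evenly across the three nodes.
transfers = [assign_transfer(f"event_{i}.dat") for i in range(5)]
for fname, node in transfers:
    print(f"{fname} -> {node}")
```

A real SRM deployment layers authentication, monitoring and accounting on top of this kind of scheduling, but the principle of hiding many servers behind one interface is the same.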

Grid software also presents uniform interfaces to local computing resources, Fisk said. Someone could submit a job-processing request through the Condor workload management system, developed at the University of Wisconsin. 'The grid interface provides a consistent view of the batch system,' Fisk said. Fermilab also uses grid tools for resource monitoring and accounting. Components of the Globus Toolkit itself provide the gatekeepers Fermilab uses for processing submission. 'We have thousands of jobs a day through the Globus Toolkit,' he said.
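A Condor job is typically described in a plain-text submit file handed to the condor_submit command. The fragment below is a generic sketch in that file format; the program and file names are invented for illustration, not taken from Fermilab's actual workload:

```
# Hypothetical Condor submit description file.
# All program and file names below are invented.
universe   = vanilla
executable = analyze_events
arguments  = run42.dat
output     = run42.out
error      = run42.err
log        = run42.log
queue
```

The batch system decides where the job actually runs, which is what gives users the 'consistent view' Fisk describes: the same submit file works regardless of which machines are free.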

About the Author

Joab Jackson is the senior technology editor for Government Computer News.
