NSF seeds cloud research test beds
The National Science Foundation recently announced two $10 million projects to create cloud computing test beds – to be called Chameleon and CloudLab – that will help develop novel cloud architectures and new applications.
The awards complement private sector efforts to build cloud architectures that can support real-time and safety-critical applications like those used in medical devices, power grids, and transportation systems, NSF said in its announcement. They are part of the NSFCloud program that supports research into novel cloud architectures to address emerging challenges including real-time and high-confidence systems.
Chameleon, to be co-located at the University of Chicago and the University of Texas at Austin, will consist of 650 cloud nodes with 5 petabytes of storage. Researchers will be able to configure slices of Chameleon as custom clouds to test the efficiency and usability of different cloud architectures on a range of problems, from machine learning and adaptive operating systems to climate simulations and flood prediction.
The test bed will allow "bare-metal access," an alternative to the virtualization technologies currently used to share cloud hardware, enabling experimentation with new virtualization approaches that could improve reliability, security and performance.
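To make the bare-metal distinction concrete, here is a minimal sketch (assuming a Linux host) of how a program can tell whether it is running inside a virtual machine, the very layer that bare-metal access removes. The file path and CPU flag are standard Linux interfaces; the helper itself is purely illustrative and is not part of Chameleon's tooling.

```python
def running_under_hypervisor(cpuinfo_path="/proc/cpuinfo"):
    """Return True if the CPU reports the 'hypervisor' feature flag,
    which Linux exposes to guests running inside a virtual machine."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                # CPU feature flags appear on lines starting with "flags"
                if line.startswith("flags") and "hypervisor" in line.split():
                    return True
    except OSError:
        pass  # e.g. non-Linux host: treat as unknown/bare metal
    return False

if __name__ == "__main__":
    kind = "virtualized" if running_under_hypervisor() else "bare metal (or unknown)"
    print(f"This host appears to be: {kind}")
```

On a bare-metal node the flag is absent; inside a conventional cloud VM it is set, which is one reason experiments that modify the virtualization layer itself need testbeds like Chameleon.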
Chameleon is unique in its support for heterogeneous computer architectures, including low-power processors, graphics processing units and field-programmable gate arrays, as well as a variety of network interconnects and storage devices, NSF said.
Researchers can therefore mix and match hardware, software and networking components and test their performance. This flexibility is expected to benefit many scientific communities, including the growing field of cyber-physical systems or the Internet of Things, which integrates computation into physical infrastructure.
"Like its namesake, the Chameleon test bed will be able to adapt itself to a wide range of experimental needs, from bare metal reconfiguration to support for ready-made clouds," said Kate Keahey, a scientist at the Computation Institute at the University of Chicago and principal investigator for Chameleon.
"Furthermore, users will be able to run those experiments on a large scale, critical for big data and big compute research."
The CloudLab test bed is a large-scale distributed infrastructure based at the University of Utah, Clemson University and the University of Wisconsin, on top of which researchers will be able to construct many different types of clouds. Each site will have unique hardware, architecture and storage features, and will connect to the others via 100 gigabit/sec connections on Internet2's advanced platform. CloudLab will also support OpenFlow (an open standard that enables researchers to run experimental protocols in campus networks) and other software-defined networking technologies.
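The OpenFlow model mentioned above boils down to a controller installing match/action rules in a switch's flow table, with the switch applying the highest-priority matching rule to each packet. The toy flow table below illustrates that idea; the field names (`in_port`, `eth_dst`) mirror OpenFlow match fields, but this sketch is a simplified teaching example, not a real OpenFlow implementation.

```python
from dataclasses import dataclass

@dataclass
class FlowRule:
    match: dict          # header fields the packet must equal, e.g. {"in_port": 1}
    action: str          # e.g. "output:2" or "drop"
    priority: int = 0    # higher-priority rules are checked first

class FlowTable:
    def __init__(self):
        self.rules = []

    def install(self, rule):
        """Add a rule, keeping the table sorted by descending priority."""
        self.rules.append(rule)
        self.rules.sort(key=lambda r: -r.priority)

    def lookup(self, packet):
        """Return the action of the first rule whose match fields all
        equal the packet's header values; flood on a table miss."""
        for rule in self.rules:
            if all(packet.get(k) == v for k, v in rule.match.items()):
                return rule.action
        # Table miss: a real OpenFlow switch would forward the packet
        # to the controller; here we simply flood it.
        return "flood"
```

For example, after installing `FlowRule({"eth_dst": "aa:bb"}, "output:2", priority=10)`, a packet with that destination is sent out port 2 regardless of lower-priority rules, which is the programmability that lets researchers run experimental protocols on production campus networks.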
CloudLab will provide approximately 15,000 processing cores and in excess of 1 petabyte of storage at its three data centers. Each center will house different hardware, facilitating additional experimentation. To that end, the team is partnering with HP, Cisco and Dell to provide diverse platforms for research. Like Chameleon, CloudLab will feature bare-metal access.
Over its lifetime, CloudLab is expected to run dozens of virtual experiments simultaneously and to support thousands of researchers. "CloudLab will be a facility where researchers can build their own clouds and experiment with new ideas with complete control, visibility and scientific fidelity," said Robert Ricci, a research assistant professor of computer science at the University of Utah and principal investigator of CloudLab.
Ultimately, the goal of the NSFCloud program and the two new test beds is to advance the field of cloud computing broadly. The awards will help researchers develop new concepts, methods and technologies to enable infrastructure design and execution.