Microsoft's data center cooling a breath of fresh air
We all know that heat is the enemy of computers. In fact, if we think of computers in terms of an ecosystem, heat is probably their only naturally occurring predator. Ironically, they generate it themselves, though that is simple physics: you bring electricity into a system to do work for you, and it gets transformed into something else without changing the total amount of energy involved, in this case mostly heat.
Heat is a unique enemy of processors. It's actually more akin to a poison, first slowing its victim, then causing it lots of subtle damage and finally killing it. Data centers, which pack thousands of computers into a relatively small space, are uniquely susceptible to the risks of heat. They deal with it in various ways, and have in the past been criticized by some as inefficient beasts of burden for it.
As agencies consolidate data centers and virtualize servers, finding new ways to keep them cool could add one more way to reduce costs and increase efficiency.
Many companies offer examples, streamlining their data centers with new techniques and processes that IDC Government Insights says feds will need to emulate in order to increase their own effective use of computers, data centers and the cloud.
Recently, Google pulled back the curtain on how it manages the heat at some of the largest data centers in the world. The company's techniques involve stripping almost every unnecessary component and scrap of metal away from the processors inside its data centers. The computers in the racks at Google centers are little more than pallets holding motherboards.
Even the internal walls of the data center are constructed of fabric — enough to direct air flow in the proper direction, but not enough to add to the complex problems associated with heat management. They are also cheap, and easy to reconfigure on the fly. Then Google runs a lot of cool water into the facility, sometimes having the liquid-carrying pipes within inches of the processors themselves.
That's a pretty efficient model of doing things, but Microsoft is now going one step further, literally setting its servers outside in roofless data centers. According to Data Center Knowledge, the idea behind Microsoft's new billion-dollar roofless data center facility in Boydton, Va., came from Christian Belady, general manager of Microsoft Data Center Services. He thought that computers should be able to brave the outdoor elements, and set up a server rack in a pup tent back in 2008. It ran for eight months with 100 percent uptime. That demonstrated that outdoor computer cooling and housing was at least possible.
Now, it’s more than just dropping a computer in a field and hoping that it doesn’t get rained or snowed on, or marked by passing animals. Because if you leave it in the elements, that will happen. And it will break.
But Microsoft has been designing smaller and smaller containers to hold its servers for years, Data Center Knowledge reports. Called IT-PACs, for pre-assembled components, the shipping-crate-like boxes can each hold hundreds of servers. Cool outside air is drawn into the unit through vents on the side, where it passes through a wet membrane that cools it before it reaches the servers. This method reportedly uses just 10 percent of the water needed to cool most data centers of the same size.
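The wet-membrane step described above is a form of direct evaporative cooling, and its effect can be sketched with the standard effectiveness formula, where supply air approaches the wet-bulb temperature of the outside air. This is my own illustration, not Microsoft's published model; the effectiveness value and the example temperatures are assumptions for the sake of the sketch:

```python
def cooled_air_temp(dry_bulb_c, wet_bulb_c, effectiveness=0.8):
    """Estimate supply-air temperature (Celsius) after a direct
    evaporative stage: T_out = T_db - eps * (T_db - T_wb).
    Media coolers typically achieve an effectiveness of roughly 0.7-0.9."""
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

# A hot Virginia summer afternoon: 35 C dry bulb, 26 C wet bulb.
# The cooler can only approach the wet-bulb temperature, which is why
# humid climates (small dry-bulb/wet-bulb gap) are the hard case.
supply = cooled_air_temp(35.0, 26.0)
print(f"{supply:.1f} C")
```

With these assumed numbers the unit delivers air in the high 20s Celsius, warm by traditional chiller standards but within the intake range many modern servers tolerate, which is the bet the roofless design is making.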
Future Microsoft data centers may be little more than concrete slabs on the ground, with the IT-PACs sitting on top. And although there is some concern that Virginia might prove too hot for this method to work — the company also has experimented with outdoor cooling on a more limited basis in Washington State, Chicago and Ireland — Microsoft seems confident that it will do just fine in The Old Dominion.
I guess we’ll see what happens when the new sparse and efficient data center meets its first brutal southern summer. But in any case, this is a great example of one possible path for agencies to follow as they strive to increase their own efficiency with data centers.
Posted by John Breeden II on Feb 08, 2013 at 9:39 AM