GCN Tech Blog

By GCN Staff


New York Times creates a dustup over data centers

The New York Times created a dustup in the data center world with its investigative report on the environmental impact of cloud technology and data centers – a critical element in the administration's efforts to save money and energy and to increase efficiency and security. The "yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness," the article states.

The article, "Power, Pollution and the Internet," drew nearly universal criticism from IT insiders, who called out its generalizations about a growing industry that is as diverse as it is complex.

Rich Miller, editor at Data Center Knowledge, writes that the Times' first installment "does an artful, fact-laden job of telling half the story." He acknowledges that many data centers could be more efficient, but takes issue with the article's failure to mention how

companies like Google, Yahoo, Facebook and Microsoft have vastly improved the energy efficiency of their server farms by overhauling their power distribution systems, using fresh air instead of power-hungry chillers (“free cooling”) to cool their servers, and running their facilities at warmer temperatures. New design schemes for modular data centers have emerged, offering highly-efficient designs to customers with smaller operations than Google or Facebook. And we’re even seeing a growing focus on renewable energy, highlighted by Apple’s massive commitment to on-site solar energy and landfill-powered fuel cells. 

Dan Woods, a Forbes contributor, is more pointed in his criticism, taking the Times to task for a "confused and incomplete article that is unworthy of its reputation." He writes:

The next problem is the concept of utilization itself. What would be a good utilization? The article never says. It just says that utilization is 7 to 12 percent. The unstated implication is that it should be a lot higher. But how much higher? Should it be 100 percent? 75 percent? 50 percent? Knowing that number would be really excellent. The fact of the matter is that with very stable workloads it is possible to get high utilization and with variable workloads lower utilization would be expected, so you have room to handle spikes.
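Woods' point about workload variability can be made concrete with a little arithmetic: if capacity must be provisioned for the peak demand, then average utilization is bounded by the average-to-peak ratio of the workload. The sketch below is a hypothetical illustration with made-up example numbers, not data from any real data center.

```python
# Illustration of why spiky workloads force lower average utilization:
# capacity must cover the peak, so average utilization is capped at
# (average demand) / (peak demand). All workload numbers are invented.

def max_safe_utilization(samples, headroom=0.0):
    """Average utilization achievable when capacity is provisioned
    for the observed peak, plus an optional extra safety margin."""
    peak = max(samples)
    capacity = peak * (1.0 + headroom)
    return (sum(samples) / len(samples)) / capacity

# A stable workload: demand barely moves, so high utilization is feasible.
stable = [90, 92, 91, 93, 90, 92]
# A spiky workload: modest typical demand with one large spike.
spiky = [20, 15, 30, 25, 300, 18]

print(f"stable workload: {max_safe_utilization(stable):.0%}")
print(f"spiky workload:  {max_safe_utilization(spiky):.0%}")
```

With these sample numbers the stable workload can run near full utilization, while the spiky one is capped in the low twenties even before any safety margin is added, which is roughly the range the Times article criticized.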

For more information, see Data Center Knowledge's roundup of coverage.

Posted by GCN Staff on Sep 25, 2012 at 9:39 AM

