GCN Tech Blog

By GCN Staff

New York Times creates a dustup over data centers

The New York Times created a dustup in the data center world with its investigative report on the environmental impact of cloud technology and data centers – a critical element in the administration's efforts to save money and energy, and increase efficiency and security. The "yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness," the article states.

The article, Power, Pollution and the Internet, drew nearly universal criticism from IT insiders, who called out its generalizations about a growing industry that is as diverse as it is complex.

Rich Miller, editor at Data Center Knowledge, writes that the Times' first installment "does an artful, fact-laden job of telling half the story." He acknowledges that many data centers could be more efficient, but takes issue with the article's failure to mention how

companies like Google, Yahoo, Facebook and Microsoft have vastly improved the energy efficiency of their server farms by overhauling their power distribution systems, using fresh air instead of power-hungry chillers (“free cooling”) to cool their servers, and running their facilities at warmer temperatures. New design schemes for modular data centers have emerged, offering highly-efficient designs to customers with smaller operations than Google or Facebook. And we’re even seeing a growing focus on renewable energy, highlighted by Apple’s massive commitment to on-site solar energy and landfill-powered fuel cells. 

Dan Woods, a Forbes contributor, is more pointed in his criticism, taking the Times to task for a "confused and incomplete article that is unworthy of its reputation." He writes:

The next problem is the concept of utilization itself. What would be a good utilization? The article never says. It just says that utilization is 7 to 12 percent. The unstated implication is that it should be a lot higher. But how higher? Should it be 100 percent? 75 percent? 50 percent? Knowing that number would be really excellent. The fact of the matter is that with very stable workloads it is possible to get high utilization and with variable workloads lower utilization would be expected, so you have room to handle spikes.
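Woods' point about workload variability can be made concrete with a back-of-the-envelope calculation. The sketch below (illustrative only; the numbers and the 25% headroom figure are assumptions, not from the article) shows why two workloads with the same average load end up with very different utilization once capacity is sized to cover the peak:

```python
# Illustrative sketch: why variable workloads imply lower average
# utilization when capacity must be provisioned for peaks.

def required_capacity(loads, headroom=0.25):
    """Capacity sized to the peak load plus a safety margin."""
    return max(loads) * (1 + headroom)

def avg_utilization(loads, capacity):
    """Average load as a fraction of provisioned capacity."""
    return sum(loads) / len(loads) / capacity

# A stable workload: hourly load hovers around 40 units.
stable = [38, 40, 42, 39, 41, 40]
# A spiky workload: the same average load, but with a 3x traffic spike.
spiky = [20, 25, 30, 120, 25, 20]

for name, loads in [("stable", stable), ("spiky", spiky)]:
    cap = required_capacity(loads)
    util = avg_utilization(loads, cap)
    print(f"{name}: capacity={cap:.1f}, avg utilization={util:.0%}")
```

Both workloads average 40 units per hour, but the stable one runs at roughly 76% of its provisioned capacity while the spiky one sits near 27% – which is Woods' point: a low utilization number, by itself, doesn't tell you whether a data center is wasteful or simply keeping room to handle spikes.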

For more information, see Data Center Knowledge's roundup of coverage.

Posted by GCN Staff on Sep 25, 2012 at 9:39 AM
