Storage plays starring role in efficiency efforts
Although storage has always played a part in improving data center efficiency, it’s usually been treated as something of an afterthought. That’s beginning to change, though, particularly as data centers are increasingly virtualized. In most scenarios, virtualized storage becomes a point of mediation for enabling dynamic resource scheduling, failovers and other vital functions.
The importance of storage in the data center has been recognized by the Green Grid, an international industry consortium that developed the power usage effectiveness (PUE) metric behind so many of the improvements in data center efficiency over the past few years. Last year, the Green Grid proposed the Data Center Storage Efficiency (DCsE) metric as a way for data center operators to identify inefficiencies in their storage resources, in much the same way that PUE can be used to improve infrastructure energy efficiency.
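For readers unfamiliar with the metric, PUE is simply total facility energy divided by the energy that reaches the IT equipment, so a value of 1.0 is the theoretical ideal. A minimal sketch (the sample figures are hypothetical, not from any agency data center):

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power usage effectiveness: total facility energy over IT energy.

    Values above 1.0 reflect overhead such as cooling and power
    distribution; lower is better, with 1.0 the theoretical ideal.
    """
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,800 kWh to deliver 1,000 kWh to IT gear:
print(pue(1800, 1000))  # 1.8
```

DCsE applies the same logic to storage resources specifically, flagging capacity that draws power without doing useful work.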
Most organizations are at least sensitive to the notion that efficiency involves more than infrastructure costs, according to Mark Weis, vice president of federal sales at Spectra Logic. They increasingly look to older storage technologies such as tape libraries as a way of easing power and cooling constraints.
“The more power and cooling they can save with tape libraries, the more room they have to make decisions with other parts of the IT infrastructure that will inevitably consume more power,” he said.
From an overall cost perspective, agencies are looking to a variety of storage technologies to get the maximum performance from the investments they make in the data center. One example is the solid-state flash drive, which has come down in price to the point where it can be considered a regular part of the storage setup, particularly in virtualized environments.
“You can mix solid state with [cheaper] SAS [Serial Attached SCSI] rotating drives that enables you to hit both capacity and performance marks at a very good price point,” said Keenan Baker, inside solutions architect for servers and storage at CDW Government (CDW-G). “It ends up saving you a bundle of money. And technologies like that are emerging all around the data center as you start to virtualize the environment.”
Getting the right mix of technologies can also improve energy efficiency. Many organizations are now considering tiered storage arrangements as a way of reducing costs, moving frequently used data to faster, more power-hungry drives while relegating the rest of their data to slower, higher-density storage.
That approach worked well for one storage-hungry but space-deficient organization that David Cappuccio, a vice president at Gartner, worked with. He ran a model assuming that 40 percent of its data would be on high-performance drives, with the rest slowly migrating to high-density Serial ATA drives. The organization’s storage density went from 700 terabytes to 2.2 petabytes in the same rack space, with an overall drop in power consumption of around 19 percent.
“Then we went even more aggressive and the total storage went to around 7 petabytes for an overall power increase of just 3 percent,” he said. “So, just using different technology produced a massive increase in performance and density-per-square-foot of space.”
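The arithmetic behind that kind of model is straightforward to sketch. The drive counts, capacities and wattages below are assumptions chosen for illustration, not the figures from the Gartner engagement, but they show how shifting most drive bays to high-density disks can multiply capacity while cutting total power:

```python
# Illustrative tiered-storage model with hypothetical drive specs.

def tier_totals(tiers):
    """Sum capacity (TB) and power (W) across storage tiers.

    Each tier is a tuple: (drive_count, tb_per_drive, watts_per_drive).
    """
    capacity = sum(n * tb for n, tb, _ in tiers)
    power = sum(n * w for n, _, w in tiers)
    return capacity, power

# Baseline: 1,200 bays, all high-performance drives (0.6 TB, 12 W each).
base_cap, base_pwr = tier_totals([(1200, 0.6, 12)])

# Tiered: 40% of bays stay high-performance; the other 60% move to
# high-density SATA drives (3 TB, 8 W each).
tier_cap, tier_pwr = tier_totals([(480, 0.6, 12), (720, 3.0, 8)])

print(base_cap, base_pwr)  # 720.0 TB, 14400 W
print(tier_cap, tier_pwr)  # 2448.0 TB, 11520 W
print(1 - tier_pwr / base_pwr)  # 0.2 -- a 20% power drop
```

With these assumed specs, capacity more than triples in the same bay count while power falls by a fifth, the same direction of trade-off Cappuccio describes, even though the exact percentages depend entirely on the hardware chosen.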
What is clear is that agency demand for data storage will only increase in the coming years as agencies invest more in big data, full-motion video and other data-intensive applications. The federal storage market grew to more than $1 billion in fiscal 2011, according to GovWin Consulting, a 22 percent increase in just three years. With agencies under orders from OMB to cut IT spending, the drive to force data center storage efficiencies will only ratchet higher.