How to wring more performance out of data centers

Government IT managers under pressure to squeeze more performance out of their enterprise IT systems often have to rely on workarounds to get the job done.

That’s especially true in data center operations, which must support the full gamut of needs, from handling the data bursts common to virtual desktop programs, for example, to the more static demands of storing and packaging census or tax data.

To provide the range of services, data center managers have resorted to hybrid data center approaches, under which different types of systems are used: flash storage units to handle high input-output operations and more conventional disk drives for common storage.

Hybrid systems themselves are a kind of workaround in that the tradeoffs involved in providing more of either type of service can be expensive or inefficient, experts say.

Nevertheless, data center managers are experimenting with even more focused workarounds, including data compression techniques that enable managers to push more data onto their flash and disk drives.

One of the tricks that flash-storage provider Tegile Systems uses in its hybrid arrays is “deduplication,” a data compression technique that Tegile vice president of marketing Rob Commins calls part of the firm’s “secret sauce.”

"Its not traditionally used that way, but we have found a way to move data off of the flash drives in our hybrid systems onto spinning disks without hurting performance," Commins said.

With data deduplication, only one unique instance of a piece of data is stored. All other redundant instances of that same information are replaced with a tiny pointer showing where to look for the data.

An email system is a good example of how it works. A typical email might contain a header and other data that are repeated in 100 other emails, and that repeated information might amount to 1MB per message. In a deduplication system, only one instance is saved; the 100 others are replaced by a pointer. In that case, 100MB of storage is reduced to just over a single megabyte.
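In code, the idea can be sketched roughly like this. This is a toy illustration of content-addressed deduplication, not Tegile's actual implementation: each unique chunk of data is stored once under its hash, and every duplicate write returns a small pointer (the hash) instead of consuming new space.

```python
import hashlib

class DedupStore:
    """Toy block-level deduplication: each unique chunk is stored
    once; redundant copies are replaced by a tiny pointer (its hash)."""

    def __init__(self):
        self.chunks = {}  # hash -> unique chunk data, stored exactly once

    def write(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self.chunks.setdefault(key, data)  # store only if not seen before
        return key                         # the caller keeps this pointer

    def read(self, key: str) -> bytes:
        return self.chunks[key]

# 100 emails sharing the same 1MB header: physically stored once, not 100 times.
store = DedupStore()
header = b"x" * (1024 * 1024)
pointers = [store.write(header) for _ in range(100)]
print(len(store.chunks))                            # 1 unique chunk
print(sum(len(c) for c in store.chunks.values()))   # ~1MB physical, not 100MB
```

Real arrays deduplicate at a fixed or variable block size rather than per whole file, but the space accounting works the same way: logical capacity grows with the number of pointers while physical capacity grows only with unique chunks.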

"In our system we typically have 2 terabytes of flash, but we use deduplication to move data to traditional drives, resulting in a five-to-one reduction ratio," Commins said. "So it's like having 10 terabytes of flash memory storage."

The technique proved its worth for the Village of Niles, a mid-sized town of 30,000 residents north of Chicago where Steve Cusick is the systems engineer.

The village's data center supports email, a Microsoft Active Directory server, a SQL Server database and other services for a large part of the community, including the village hall, two fire stations, a fitness center, the office of the mayor and the historical society, among others.

The village was thinking about implementing a virtual desktop infrastructure (VDI) system but had other, more pressing needs and was looking for a way to expand its data center. At the same time, the town was running out of storage capacity.

"We are always looking for the most advanced technology, though we don’t have the biggest budget, so a hybrid system was a good option," Cusick said. "What we found was that with deduplication and compression, we got both performance and the storage space we needed."

Niles paid about $60,000 for a hybrid system from Tegile, which Cusick said might have been a little more expensive than getting a storage area network with a standard set of drives. However, he quickly found the benefits of combining performance flash with conventional storage in managing the village's security camera system.

"We were able to run a complete NVR (network video recorder) system off it in a way that we never could with a normal SAN," Cusick said. "There are 50 IP security cameras tied into the system and all of them are recording at 1080p resolution."

Even with all that performance demand, Cusick said the village was poised to push ahead with its VDI initiative and didn't anticipate needing any more data center capacity than it already had with its current hybrid system.

However, even with hybrid systems making great strides in government data centers, experts are unwilling to declare that the technique is here to stay. Instead, they say, the key is likely to be the development of better software that can manage multiple storage types and balance the application requirements.

"Multiple programs have different workloads," said Christian Shrauder, federal CTO for Fusion-io, which develops solid state, high performance I/O systems. "That mixes up and randomizes usage patterns. So we need to develop software to make better use of flash and other types of storage, or even multiple kinds of flash."

Commins agreed, calling the multiple demands of programs on storage an "IO blender problem," mostly brought on by the move to virtualized environments.

"I think [software] will be the focus in the near future, more so than the hardware," he said. "We've got to the point where we no longer have to throw disks at the problem, so I think for the next 10 years hybrids are the way to go, especially for small and mid-size government organizations. The advancements will come from the software."

About the Author

John Breeden II is a freelance technology writer for GCN.
