How to ratchet up data center storage performance
- By Chip George
- Aug 19, 2015
Many public sector CIOs who are at the start of their cloud journey know that the cloud should be part of their overall storage strategy, but are uncertain just how large a role it should play. As a result, they often find it difficult to figure out where to begin.
One of the first steps is conducting a thorough inventory of the current data center infrastructure, which is often a heterogeneous mix of server, network and storage platforms. While Gartner estimates that storage capacity demands are growing at 50 percent per year, storage budgets are expected to grow at less than 10 percent. In short, IT teams will have to do more with less.
One way CIOs can address this challenge is by looking at ways to maximize current investments in the storage infrastructure and aim for higher utilization. Underutilization is inefficient and wasteful, and there are ways to wring every ounce of performance and disk space out of existing investments before considering the cloud for storage.
There are three storage efficiency technologies that can reap tremendous benefit for an agency – in both cost and space savings. Making efficient use of storage already in place frees up resources for investments in new technologies, including cloud-enabled storage. It also lets agencies make a true comparison between the costs of cloud services and those of their own data centers. Applying these technologies is a great first step on the road to the cloud.
Some of the biggest clogs in any data storage system are related to redundant data. By applying the intelligent compression made possible by deduplication, only one instance of the data is retained on disk. The redundant data is replaced by a pointer that lets the system know where to find that single stored version of the data. Depending on the workload, deduplication can mean massive savings in terms of disk storage and network traffic. Deduplication can also shrink backup windows.
But don’t think of deduplication as only for backup or archive workloads. Deduplication in primary workloads can help agencies store more “hot data” – the business-critical information that needs to be accessed more frequently – and deliver higher performance. When looking for ways to maximize efficiencies within an existing storage architecture, deduplication is always a smart place to start.
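The pointer mechanism described above can be sketched in a few lines of Python. This is a toy illustration, not any vendor's implementation: it assumes fixed-size chunks and uses a SHA-256 hash as the pointer, where production systems often use variable-size chunking and more elaborate metadata.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for illustration; real systems often chunk variably

def deduplicate(data: bytes):
    """Store each unique chunk once; represent the data as an ordered list of pointers."""
    store = {}      # hash -> chunk: the single stored copy of each unique chunk
    pointers = []   # ordered hashes that reconstruct the original data
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # a redundant chunk is not stored again
        pointers.append(digest)
    return store, pointers

def reconstruct(store, pointers):
    """Follow the pointers to rebuild the original data byte-for-byte."""
    return b"".join(store[p] for p in pointers)

# Highly redundant data: the same 4 KB block repeated 100 times
data = (b"A" * CHUNK_SIZE) * 100
store, pointers = deduplicate(data)
assert reconstruct(store, pointers) == data
stored = sum(len(c) for c in store.values())
print(f"logical: {len(data)} bytes, physically stored: {stored} bytes")
```

The redundant copies collapse to one stored chunk plus cheap pointers, which is why dedup ratios on repetitive workloads (backups, VM images) can be dramatic.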
Every enterprise application is allotted a set amount of storage to operate – this is referred to as provisioning. Problems arise when these storage estimates come in too low, causing performance problems, or too high, leaving capacity sitting idle. Overestimating can also lead to fat provisioning, where IT shops buy more storage than is actually needed.
Thin provisioning lets storage teams allocate disk storage space in a flexible manner among multiple users, which saves money by protecting the storage spend from applications that ask for lots of storage up front and don’t use it. When app teams plan for several years of predicted usage, thin provisioning keeps them from locking or stranding storage where it isn’t being utilized. This results in better efficiencies and cost savings.
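A minimal sketch of the idea, with hypothetical names and numbers chosen purely for illustration: volumes are granted a generous logical size up front, but physical capacity in the shared pool is consumed only as data is actually written.

```python
class ThinPool:
    """Toy thin-provisioning pool: logical size is promised up front,
    physical capacity is drawn down only on writes."""

    def __init__(self, physical_gb: int):
        self.physical_gb = physical_gb
        self.used_gb = 0
        self.volumes = {}  # name -> {"logical": GB promised, "written": GB used}

    def create_volume(self, name: str, logical_gb: int):
        # Creation succeeds even if total logical size exceeds physical capacity
        self.volumes[name] = {"logical": logical_gb, "written": 0}

    def write(self, name: str, gb: int):
        vol = self.volumes[name]
        if vol["written"] + gb > vol["logical"]:
            raise ValueError("write exceeds the volume's logical size")
        if self.used_gb + gb > self.physical_gb:
            raise RuntimeError("pool out of physical space; expand before writing")
        vol["written"] += gb
        self.used_gb += gb

pool = ThinPool(physical_gb=100)
pool.create_volume("app1", logical_gb=80)  # each app team plans for 80 GB...
pool.create_volume("app2", logical_gb=80)  # ...160 GB promised on 100 GB of disk
pool.write("app1", 10)
pool.write("app2", 15)
print(f"physical capacity consumed: {pool.used_gb} GB")  # only 25 GB actually used
```

The point is that the 160 GB of "planned" storage does not strand 160 GB of disk: unwritten capacity stays in the pool, available to whichever application needs it next.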
Next up in the storage toolkit is compression. Compression reduces the number of bits and bytes required to represent data by encoding information more efficiently, and it can shrink a typical text file to half its original size. Smaller data is faster to transfer, cheaper to store and frees up precious network bandwidth. Compression can be enabled on primary storage volumes, secondary storage volumes or both.
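The effect is easy to demonstrate with Python's standard-library zlib module. This sketch uses deliberately repetitive sample text, so the ratio shown will be far better than the roughly 2:1 typical for ordinary documents; actual ratios depend heavily on the workload.

```python
import zlib

# Repetitive sample text compresses extremely well; real documents vary.
text = ("Public sector IT teams must do more with less. " * 200).encode("utf-8")

compressed = zlib.compress(text, level=9)  # level 9: best compression, slowest
ratio = len(compressed) / len(text)
print(f"original: {len(text)} bytes, compressed: {len(compressed)} bytes "
      f"({ratio:.0%} of original size)")

# Compression is lossless: decompressing recovers the data exactly.
assert zlib.decompress(compressed) == text
```

Because the round trip is exact, compression can safely sit underneath primary workloads, trading a little CPU for less disk and less network traffic.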
These three storage efficiency technologies – deduplication, thin provisioning and compression – can combine to significantly reduce the total amount of storage needed, lowering both capital and operating expenses. Total space savings can range up to 87 percent for compression alone, depending on the application. Implementing all three can take the savings even higher.
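The way the savings compound can be shown with a little arithmetic. The ratios below are hypothetical, chosen only to illustrate the math; actual results depend entirely on the workload.

```python
# Hypothetical ratios for illustration only; real savings vary widely by workload.
dedup_ratio = 2.0        # 2:1 reduction from deduplication
compression_ratio = 1.5  # 1.5:1 reduction from compression

# Reduction ratios multiply, so the combined effect exceeds either alone.
combined = dedup_ratio * compression_ratio   # 3:1 overall
savings = 1 - 1 / combined                   # fraction of disk no longer needed
print(f"combined ratio {combined:.1f}:1, space savings {savings:.0%}")
```

Under these assumed ratios, 100 TB of logical data would fit in roughly 33 TB of physical disk, with thin provisioning reclaiming still more of the allocated-but-unwritten space on top of that.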
While cloud-enabled storage can provide many avenues for savings, the best place to begin any cloud journey is with a comprehensive inventory of the data center to find places where the existing storage investments can do more. Getting back to basics is a great way to prioritize storage needs, maximize current storage assets and figure out how to eventually make cloud-enabled storage a key component of an overall storage plan.
Chip George is NetApp's director of state, local government and education.