4 ways to move from stovepipes to shared IT infrastructure
Stovepipe is one of those words in the government vernacular that simply refuses to fade away. While agencies have made great strides moving from private networks and proprietary systems to the cloud and shared services, IT infrastructure in many places remains walled off within individual agencies, between agencies, and even between bureaus and commands.
There have been legitimate reasons why IT infrastructure remains stovepiped, and unsurprisingly, the way agencies are funded is front and center. If funding flows to an agency program, the natural process is for the agency to take the money and build something specific to that need or mission. As a result, there was little incentive for an agency program to carve out budget to share those services with others. This process “worked” when funding was abundant and the pie was big enough to serve all agency programs, but today’s tight budget climate demands that agencies become more efficient with IT infrastructure expenditures. The result: more agencies and intra-agency departments are shifting to shared IT infrastructure.
For agencies seeking to open up their own IT infrastructure to others, or evaluating whether it makes sense to leverage the shared services model, there are a handful of strategies to consider:
1. Ensure efficient data storage
Moving from stovepipes, where every application brought its own infrastructure, to shared infrastructure is an architectural change that requires agency decision makers to reevaluate their approach to data storage. The need for efficient data storage is compounded as agencies consolidate their own dedicated data centers and share these services with others.
Because shared services bring a mix of technologies into play, it becomes more important than ever for the storage architecture to “stay on,” because it is not just one mission-critical application running on the IT infrastructure, but several. Within a shared services model, the architecture should support multiple types of storage depending on need, so that an agency or department can match the right level of storage to the right data requirements. For example, inexpensive spinning-disk storage is suitable for information an agency does not depend on regularly or that has lower performance demands, while mission-critical applications may require more expensive flash storage.
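The tier-matching idea above can be sketched as a simple policy function. The tier names, the access-profile inputs and the “archive” tier are illustrative assumptions, not any vendor’s actual configuration:

```python
# Hypothetical sketch: match a dataset to a storage tier based on its
# criticality and access profile. Tier names and criteria are
# illustrative assumptions, not a vendor-specific configuration.

def choose_tier(mission_critical: bool, accessed_daily: bool) -> str:
    """Return a storage tier for a dataset's access profile."""
    if mission_critical:
        return "flash"          # low-latency flash for mission-critical apps
    if accessed_daily:
        return "spinning-disk"  # inexpensive capacity for routine data
    return "archive"            # coldest tier for rarely touched data

print(choose_tier(mission_critical=True, accessed_daily=True))
print(choose_tier(mission_critical=False, accessed_daily=False))
```

In a real shared-services environment this decision would be driven by richer metadata (IOPS targets, retention rules), but the principle is the same: the requirement selects the tier, not the other way around.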
2. Efficiently manage and move data in the cloud
Historically, moving data across, within, and out of multiple clouds has not been a practical reality, because data has mass and isn’t easy to move. This makes data stewardship and governance one of the toughest aspects of the cloud, more so than networking or bandwidth.
Key to successful shared services is adopting a “data fabric” that allows agencies to keep the right data on-site while moving other data to the cloud, taking advantage of the tremendous capabilities offered by cloud providers. A data fabric arms agencies with the flexibility to control, integrate, move and consistently manage their data across a hybrid cloud environment, capturing the cloud’s economics and elasticity without surrendering control of their data.
Ultimately, data placement should be policy-driven, determined by cost, compliance, availability and performance requirements. With a data fabric, agencies can maintain data stewardship and retrieve their data when they need it.
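The policy-driven placement described above can be sketched as a small decision function. The policy fields, priority order and placement names here are hypothetical assumptions for illustration, not a data-fabric product’s API:

```python
# Hypothetical sketch of policy-driven data placement in a data fabric.
# Field names, priority order, and placement labels are assumptions.

from dataclasses import dataclass

@dataclass
class DataPolicy:
    must_stay_onsite: bool    # compliance/sovereignty requirement
    latency_sensitive: bool   # strict performance requirement
    monthly_budget_usd: float # cost constraint

def place(policy: DataPolicy) -> str:
    """Decide where a dataset lives under the stated policy."""
    if policy.must_stay_onsite:
        return "on-premises"        # compliance outranks cost
    if policy.latency_sensitive:
        return "on-premises"        # keep hot data close to applications
    if policy.monthly_budget_usd < 500:
        return "public-cloud-cold"  # cheapest elastic capacity
    return "public-cloud-standard"
```

The point of the sketch is the ordering: compliance and performance requirements gate the decision before cost optimization ever enters the picture.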
3. Ensure quality-of-service levels
Beyond the funding realities, some agency decision makers have been reluctant to share data centers, servers and systems for fear that doing so compromises security, privacy and, ultimately, control. Agencies must in effect deliver enterprise-class capabilities in a shared infrastructure environment: always-on availability, locked-down security and strong, predictable performance.
Service-level agreements and quality-of-service standards must be clear and stringent for shared services to succeed. User performance and availability needs will change dynamically, and the infrastructure must be flexible enough to scale with them. If agencies or departments cannot be assured that their data and applications will be secured and protected, they will be reluctant to move beyond existing stovepipes.
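Stringent SLAs only matter if they are continuously checked against measured service levels. A minimal sketch of such a check follows; the metric names and thresholds are hypothetical examples, not targets from any actual agency SLA:

```python
# Hypothetical sketch: compare measured service levels against SLA targets.
# Metric names and threshold values are illustrative assumptions.

def sla_violations(measured: dict, sla: dict) -> list:
    """Return the names of metrics that fall short of the SLA.

    'availability_pct' must meet or exceed its target;
    'p99_latency_ms' must not exceed its target.
    """
    violations = []
    if measured["availability_pct"] < sla["availability_pct"]:
        violations.append("availability_pct")
    if measured["p99_latency_ms"] > sla["p99_latency_ms"]:
        violations.append("p99_latency_ms")
    return violations

print(sla_violations(
    {"availability_pct": 99.5, "p99_latency_ms": 40},
    {"availability_pct": 99.9, "p99_latency_ms": 50},
))  # ['availability_pct']
```

In practice this kind of check would run continuously against monitoring data, feeding the capacity-scaling decisions the paragraph above describes.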
4. Extend shared infrastructure to the cloud
The cloud is a huge factor in today’s data center consolidation conversation. Agencies are determining whether they are able and willing to run private or public clouds, on-premises or off-premises. Ultimately, virtually all agencies will operate in a hybrid cloud environment, and many will begin to leverage shared services.
A hybrid cloud can pay off across multiple agencies, but the greatest near-term payoff may come from intra-agency shared services. To realize it, agencies must make cloud computing a seamless extension of their IT infrastructure: deploying an application to the cloud must be as easy as deploying it to the infrastructure in an agency’s physical data center.
The financial, business and bureaucratic case for IT infrastructure stovepipes – both within and across agencies – continues to weaken. Agencies that can effectively move to a shared services infrastructure that leverages the cloud are poised to reap significant cost, efficiency and organization-wide benefits.
Rob Stein is vice president, U.S. public sector at NetApp, a leading cloud storage provider.