Leaving Home: It’s Easier than You Think
One way or another, agencies need to get a handle on their data centers.
As the Office of Management and Budget marches toward its goal of closing 1,253 of the 3,133 federal data centers, or 40 percent of them, by the end of 2015, the agency is not quantifying those closures against its other goal of saving $3 billion during the same period, according to “Strengthened Oversight Needed to Achieve Billions of Dollars in Savings,” a report the Government Accountability Office released in May. GAO recommended that OMB’s federal chief information officer start tracking and reporting key performance measures.
According to the report, “Until OMB begins tracking and reporting on performance measures such as cost savings, it will be limited in its ability to oversee agencies' progress against key initiative goals. Additionally, extending the horizon for realizing planned cost savings could provide OMB and data center consolidation stakeholders with input and information on the benefits of consolidation beyond OMB's initial goal.”
Meanwhile, the number of data centers keeps rising, according to the Federal Data Center Consolidation Initiative. Last year the initiative counted about 3,000 data centers, including server closets, defined as rooms of less than 100 square feet. Recently, after further investigation and tracking, that count ballooned to 7,000, a figure disclosed during a joint House and Senate briefing. The upshot: The federal world is poised to see data center closures and consolidations accelerate.
A Daunting Challenge
When it comes to consolidating or moving data centers, information technology teams most often cite data volume as the top issue, according to analysts. “You can’t defy the laws of physics,” said Steve Duplessie, senior analyst at Enterprise Strategy Group. “If you have terabytes or petabytes, you may not have the controls in place to move it, and any provider you’re working with may or may not be able to help you, either.”
Virtualization and a solid archiving strategy can help reduce the amount of data that needs to be moved. Start by doing an application assessment to identify which applications are in use, which will be coming to the new location and servers, and how they interact with your existing data. This helps prevent overlooking an application sitting on a forgotten server that provides all the data for another mission-critical application.
To do this, IT staff will have to contact users and ask them what they consider key technology components, which data they access on a daily basis, and whether they keep data and applications in the cloud or on personal technology such as a tablet or home computer.
Once you know what needs to move and have virtualized and consolidated as much as possible, breaking up the data center move into multiple subprojects makes it more manageable and minimizes problems.
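The assessment-and-subproject approach above can be sketched in code: if the inventory records which applications depend on which others, grouping them into connected clusters yields migration waves in which an application and everything it feeds always move together. This is a hypothetical illustration, not an agency tool; the inventory data and function names are invented.

```python
# Hypothetical sketch: group applications into migration waves so that an
# application and everything it depends on move in the same subproject.
# The inventory data below is invented for illustration.

def migration_waves(deps):
    """deps maps each app to the set of apps it depends on.
    Returns apps grouped into connected clusters, one wave each."""
    # Build an undirected adjacency view: if A depends on B, the two must
    # move together, regardless of which direction the dependency runs.
    graph = {app: set() for app in deps}
    for app, needed in deps.items():
        for other in needed:
            graph.setdefault(app, set()).add(other)
            graph.setdefault(other, set()).add(app)

    seen, waves = set(), []
    for app in graph:
        if app in seen:
            continue
        # Walk outward from this app, collecting everything connected to it.
        wave, frontier = set(), [app]
        while frontier:
            current = frontier.pop()
            if current in wave:
                continue
            wave.add(current)
            frontier.extend(graph[current])
        seen |= wave
        waves.append(sorted(wave))
    return waves

inventory = {
    "payroll": {"hr_db"},   # mission-critical app fed by hr_db
    "hr_db": set(),
    "intranet": set(),      # standalone: can move in its own subproject
}
print(migration_waves(inventory))  # → [['hr_db', 'payroll'], ['intranet']]
```

The payoff is the overlooked-server scenario described above: even if nobody remembers that the payroll application reads from hr_db, the dependency record forces them into the same wave.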
In some cases, the IT team may want to evaluate whether cloud-based resources or service providers can help. In fact, federal mandates aside, one of the main reasons that organizations are looking to move out of one facility into a new one is the lure of a private or hybrid cloud, said Dale Wickizer, chief technology officer for NetApp’s public-sector business.
“Agencies can tap the cloud but still have control over their data,” he said. “They are confident with the security model and with the exit strategy that a private cloud provides.”
If you do select a cloud-based resource, you will need to factor that in during pre-migration testing. Benchmarking performance before the move means that when you run post-migration testing you can demonstrate your return on investment, especially if you are migrating to alleviate performance issues.
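The before-and-after benchmarking described here boils down to measuring the same workload twice and reporting the relative change. A minimal sketch, assuming response-time samples collected under an identical test load (the numbers are invented):

```python
# Hypothetical sketch: compare pre- and post-migration response-time
# benchmarks for the same workload. Sample values are invented.
from statistics import mean

def percent_change(before_ms, after_ms):
    """Relative change in mean response time; negative means faster."""
    b, a = mean(before_ms), mean(after_ms)
    return (a - b) / b * 100

pre_migration = [120, 135, 128, 142]   # ms, measured before the move
post_migration = [95, 101, 98, 104]    # ms, same workload after the move

change = percent_change(pre_migration, post_migration)
print(f"Mean response time changed {change:.1f}%")  # prints -24.2%
```

A negative figure here is the concrete evidence of return on investment the text calls for; a positive one flags a regression worth investigating before declaring the migration complete.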