Building a better backup
- By Greg Kushto
- Jun 05, 2017
For public sector employees who work in national security, emergency response or disease control, data access can literally mean life or death. In the event of data loss, the consequences can be substantial, even when the immediate implications seem less severe.
Fortunately, agencies can secure their information by having a backup in place, which can be accomplished quickly and with relative ease. Here are a few considerations for building and implementing an effective backup strategy.
How much data is actually required? Indiscriminately storing every email and file passing through an agency would be lunacy. Conversely, limiting the backup to only a few core components will inevitably leave out something important.
To find the sweet spot, calculate your risk tolerance level. Build a high-level list describing where data lives on the network -- just a basic outline of where information such as payroll or consumer data resides. With that roadmap you’ll have a better idea about how data is sorted and be better positioned to assess risk.
The next step is assessing risk to applications and systems. Which are custom coded? Are they easy to replicate? Agencies working with a shared drive, for instance, don’t need to waste time or energy replicating that server. They can quickly establish a new one and transfer the data and applications to it. Ultimately, it’s all about finding your risk threshold and determining what you need to keep and what you can let go.
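The inventory-and-triage step described above can be sketched in a few lines of code. The dataset names, locations and downtime tolerances below are illustrative assumptions, not a real agency inventory:

```python
# Sketch of a high-level data inventory. Each record notes where the data
# lives, whether the system is custom coded, and how long the agency can
# tolerate being without it. All entries are hypothetical examples.
inventory = [
    {"dataset": "payroll",       "location": "finance-db",  "custom": False, "max_downtime_hours": 24},
    {"dataset": "consumer data", "location": "crm-server",  "custom": True,  "max_downtime_hours": 4},
    {"dataset": "shared drive",  "location": "file-server", "custom": False, "max_downtime_hours": 72},
]

def risk_level(record):
    """Assign a rough risk label from the tolerated downtime."""
    hours = record["max_downtime_hours"]
    if hours <= 4:
        return "high"
    if hours <= 24:
        return "medium"
    return "low"

for record in inventory:
    print(f"{record['dataset']} ({record['location']}): {risk_level(record)} risk")
```

Even a toy triage function like this forces the useful questions: where does each dataset live, and how long can the agency operate without it?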
The right medium
Every school of thought, new or old, comes with its own set of biases. It’s critical to recognize those biases -- and ideally set them aside -- in order to develop a comprehensive, multitiered backup approach that accounts for multiple scenarios and priority levels:
- Immediate: For mission-critical data that must be restored almost instantly, deploy an easily accessible backup, likely on a server housed in the agency or in the cloud.
- Intermediate: Information that can wait a day or so to restore can be housed on network-attached storage that can be separated or replicated to another server.
- Long-term: Data housed offline, either because it's less important or because it's so critical it must be replicated separately from the network.
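The three tiers above can be captured as a small policy table. The media descriptions and restore targets here are assumptions for illustration, not prescribed agency policy:

```python
# Illustrative mapping of backup tiers to storage media and restore targets.
# Values are example assumptions; substitute your agency's actual policy.
BACKUP_TIERS = {
    "immediate":    {"medium": "on-site server or cloud replica", "restore_target": "minutes"},
    "intermediate": {"medium": "network-attached storage",        "restore_target": "about a day"},
    "long-term":    {"medium": "offline, air-gapped storage",     "restore_target": "days"},
}

def plan_for(tier):
    """Summarize the storage plan for a given tier."""
    info = BACKUP_TIERS[tier]
    return f"{tier}: store on {info['medium']}, restore within {info['restore_target']}"

print(plan_for("intermediate"))
```

Writing the policy down in one place, even informally, makes it easier to spot data that has no tier assigned at all.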
All federal agencies have restrictions around backups depending on the data’s importance and classification. Still, there are many possible routes requiring varying levels of effort.
Encryption has changed the way agencies can safely store data. For sensitive information with a limited lifespan, cloud storage is a perfectly viable option. Breaking strong encryption by brute force takes years at best, so even if the worst should happen, by the time the data is successfully (and maliciously) decrypted, it will be long outdated.
By better understanding how data is classified -- along with the risks of losing it or taking too long to restore it -- agencies can make more informed decisions about how to store and manage it.
Testing and using your backups
If you’re not testing your backups, you don’t truly have backups. The only way to know they work is to test them regularly.
Don’t wait for a critical outage to learn your backups are faulty. It’s better to spend a little extra time verifying their functionality than to face the daunting task of rebuilding from scratch.
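A basic verification drill can be automated with nothing but the standard library. This is a minimal sketch of the idea, using a checksum comparison and a trial restore on throwaway files; the file names are purely illustrative:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum used to confirm a backup copy matches its source."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_backup(source: Path, backup: Path) -> bool:
    """A backup only counts if it exists and matches the original byte for byte."""
    return backup.exists() and sha256(source) == sha256(backup)

# Minimal end-to-end drill: create a file, back it up, verify the copy,
# then prove a restore survives the round trip.
with tempfile.TemporaryDirectory() as tmp:
    source = Path(tmp) / "payroll.csv"
    source.write_text("id,amount\n1,100\n")

    backup = Path(tmp) / "backup" / "payroll.csv"
    backup.parent.mkdir()
    shutil.copy2(source, backup)
    assert verify_backup(source, backup)

    restored = Path(tmp) / "restored.csv"
    shutil.copy2(backup, restored)
    assert restored.read_text() == source.read_text()
    print("backup verified and restore tested")
```

Scheduling a drill like this, against real backup media rather than a temp directory, is what turns "we have backups" into "we have tested backups."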
Your backup strategy, your chosen medium and your process for triaging information all become moot without regular testing.
So choose how to manage your risk, decide on a medium, have an offsite failsafe and then test the system on a regular basis. It may take effort, but it’s well worth the peace of mind.
Greg Kushto is director of security and enterprise networking at Force 3.