4 steps to secure data management
- By Henry Newman
- May 13, 2016
Data is the currency that drives the global economy and keeps organizations running, and it needs to be treated as the considerable resource it is. This entails ensuring data is secure, available on demand and able to provide value. In today’s data-driven economy, deriving value from data currency is still a work in progress -- but it's one with great potential.
Perhaps nowhere is this more the case than in the federal space, where data serves as the backbone of operations ranging from weather forecasting and intelligence analysis to energy research. At the same time, agencies are under increasing pressure to efficiently manage and store growing amounts of data and to make sure it stays in the right hands. Agencies face a variety of security challenges, both internal and external, and must take these into consideration when formulating an adequate data management plan.
The hallmarks of traditional data storage methods include storing data on disparate systems, managing different security clearances and creating a “super user” who can access anything. However, these elements are no longer enough to address the needs of today’s federal agencies and the world in which they operate.
So what does such a secure data management plan look like, especially after Edward Snowden demonstrated the weaknesses associated with administrative access to various government systems?
While no silver bullet for securing data exists, establishing the right technology ecosystem can address the federal government’s data security challenges, including how and where agencies store data and how they can keep the wrong people from accessing it.
Following are key steps to consider for establishing this secure ecosystem.
1. Choose the right framework. Central to a comprehensive data security plan is the overall architecture to maintain security at all levels -- from the database and storage system to the network, compute and other supporting tools. Adopt a model that provides a security policy framework of integrated technology and keeps information in a central repository with automated mandatory access controls.
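The automated mandatory access controls mentioned above can be illustrated with a minimal sketch. The article does not specify an implementation, so the labels, levels and function names below are illustrative assumptions based on the classic "no read up" mandatory access rule:

```python
# Minimal sketch of label-based mandatory access control: a subject may
# read an object only if the subject's clearance level dominates the
# object's classification level. The check is enforced by central policy,
# not by the data owner. All names and levels are illustrative assumptions.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_clearance: str, object_label: str) -> bool:
    """Mandatory check applied automatically on every access."""
    return LEVELS[subject_clearance] >= LEVELS[object_label]

# An analyst cleared at "secret" can read "confidential" data...
assert can_read("secret", "confidential")
# ...but not "top_secret" data, regardless of any discretionary settings.
assert not can_read("secret", "top_secret")
```

Because the comparison is automated and centralized, no individual system administrator decides case by case who sees what; the policy framework does.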
2. Consolidate systems and organize data holistically. The aforementioned framework lends itself to a holistic approach that consolidates previously isolated systems, segregates data at different security levels, enforces security access controls and captures and consolidates audit trails, among other key functions. The payoff, besides heightened data security? Agencies avoid maintaining separate systems and copying information among them, which can lead to data duplication and wasted storage.
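Capturing and consolidating audit trails, as described above, can be sketched as a single append-only log that every system writes to instead of keeping per-silo trails. The field names and system names here are illustrative assumptions, not a specific product's schema:

```python
# Sketch of a consolidated audit trail: all systems append access events
# to one central, append-only log, so a single query replaces checking
# each isolated system's records. Names are illustrative assumptions.
from datetime import datetime, timezone

audit_log = []  # central repository; in practice a write-once store

def record_access(system: str, user: str, resource: str, action: str) -> None:
    """Append one access event to the consolidated trail."""
    audit_log.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "system": system,
        "user": user,
        "resource": resource,
        "action": action,
    })

record_access("weather-db", "analyst1", "/forecasts/2016", "read")
record_access("intel-store", "analyst1", "/reports/q2", "read")

# One pass over the central log covers activity across both systems:
assert [e["system"] for e in audit_log] == ["weather-db", "intel-store"]
```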
3. Combine compute and storage. Consolidating systems also lets agencies combine compute and storage, freeing them from the stovepipe model that created a separate compute and storage system for each data classification level and isolated time-sensitive critical information. A consolidated environment allows rapid information processing while still following strict security policies.
4. Establish “least privilege” access. With the right framework in place, “least privilege” access can be enforced throughout the entire IT ecosystem, ensuring that no individual, team, process (such as a computer virus) or rogue client (such as a hacker) has access to everything -- and that authorized people have access only to the information appropriate to them.
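One common way to enforce least privilege is role-based access control, where each role grants only the permissions it needs and no role grants everything. The roles and permission strings below are illustrative assumptions, sketched to show the principle rather than any particular agency's policy:

```python
# Sketch of "least privilege" via role-based access control: access is
# granted only when an assigned role explicitly allows the permission,
# so there is no "super user" role that can do everything.
# Roles and permission names are illustrative assumptions.

ROLE_PERMISSIONS = {
    "analyst":  {"read:reports"},
    "operator": {"read:metrics", "restart:service"},
    "auditor":  {"read:audit_log"},
}

def is_allowed(roles: set, permission: str) -> bool:
    """Allow only if some assigned role explicitly grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(r, set()) for r in roles)

# An operator can restart a service but cannot read intelligence reports:
assert is_allowed({"operator"}, "restart:service")
assert not is_allowed({"operator"}, "read:reports")
# Even holding every defined role grants nothing beyond what the roles
# explicitly list -- e.g., no one can delete the audit log:
assert not is_allowed(set(ROLE_PERMISSIONS), "delete:audit_log")
```

The key design choice is that denial is the default: a permission absent from every assigned role is refused, which is exactly the inversion of the traditional "super user who can access anything" model the article critiques.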
Federal agencies face a range of data management issues, from contractor access and intellectual property concerns to growing storage demands and data sharing with nations that have cooperative agreements with the United States. With the right technology infrastructure in place and an approach that incorporates role-based, automated, multilevel security, however, agencies can ensure adequate data security.
Henry Newman is CTO of Seagate Government Solutions.