
INDUSTRY INSIGHT

3 tips for multicloud migrations

While many government agencies are moving toward multicloud environments, some are still struggling to develop even basic cloud strategies. The question of where to start can induce the same paralysis that typically accompanies any disruption to the IT status quo. Factor in security concerns and the perceived headaches of using multiple cloud providers, and it becomes easy to understand the hesitation of even the most forward-thinking CIOs.

Still, agencies that use multiple clouds do so for many different reasons, and they are seeing numerous benefits. They have been able to achieve better reliability through redundancy as well as cost reductions. They have been able to successfully leverage the strengths of different cloud providers to get the most out of specific applications. They are connecting different cloud assets and creating unified toolchains to eliminate disparate cloud policies and wasteful development of tools that do not work across different environments.

But multicloud environments also introduce valid concerns. Lack of visibility as information passes between different cloud environments remains an ongoing problem, particularly in the government sector, which is used to keeping data under tight supervision. Insider threats are also increasing, and employees who are not well trained in working with multiple cloud providers can, intentionally or not, cause data to be compromised.

Building a solid multicloud strategy upfront can alleviate these concerns and simplify agencies’ cloud management procedures and policies. Here are three approaches that can help. 

1. Selective migration

Moving to a multicloud environment should not necessarily be an all-or-nothing proposition. There may be cases where it makes sense to move some applications to the cloud while keeping others on-premises.

As a rule, applications should only be migrated if the move will result in operational savings. Agency teams must carefully assess their current application environments and develop the most appropriate cloud strategy for each. When looking at total cost of ownership, for instance, agencies may find that certain applications are not worth moving to the cloud.
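As a rough sketch of that total-cost-of-ownership comparison, the calculation for a single application might look like the following. All dollar figures, the five-year horizon and the cost categories are hypothetical assumptions for illustration, not figures from any agency.

```python
# Hypothetical TCO comparison for one application over a planning horizon.
# Every dollar figure below is an illustrative assumption, not real data.

def tco_on_prem(years, hardware, annual_maint, annual_staff):
    """On-premises TCO: upfront hardware plus recurring maintenance and staff."""
    return hardware + years * (annual_maint + annual_staff)

def tco_cloud(years, migration, annual_subscription):
    """Cloud TCO: one-time migration effort plus a recurring subscription."""
    return migration + years * annual_subscription

years = 5
on_prem = tco_on_prem(years, hardware=120_000, annual_maint=15_000, annual_staff=30_000)
cloud = tco_cloud(years, migration=40_000, annual_subscription=35_000)

print(f"5-year on-premises TCO: ${on_prem:,}")  # $345,000
print(f"5-year cloud TCO:       ${cloud:,}")    # $215,000
print("Migrate" if cloud < on_prem else "Keep on-premises")
```

With different inputs, such as an application already running on paid-off hardware with low maintenance costs, the same arithmetic can just as easily argue for staying on-premises, which is the point of the assessment.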

Most likely, IT teams will find that moving to a multicloud environment will help, especially if their agencies are juggling multiple software licenses. Software-as-a-service solutions such as Microsoft Office 365 can eliminate the redundancy of maintaining a collection of separate licenses, saving significant money and the time involved in managing different solutions.

2. Hardware replacement

Legacy hardware can be insecure, difficult and costly to maintain. Replacing outdated hardware with cloud-based infrastructure-as-a-service solutions can eliminate a number of headaches, including inefficiencies and potential security loopholes.

Replacing legacy hardware should be handled in the same methodical manner as the move toward SaaS solutions. The areas where teams will see the greatest return on investment will be new-build applications and areas where they have already adopted DevOps techniques. IaaS solutions will complement these efforts by allowing teams to rapidly implement development/test resources and turn those assets off when they are not being used.
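One way to capture the savings from turning dev/test assets off when they are not in use is a simple working-hours schedule. The sketch below is a minimal illustration; the resource names and hours are assumptions, and in practice each action would be applied through the cloud provider's own API.

```python
from datetime import datetime

# Hypothetical dev/test resources; in a real deployment these would map
# to IaaS instance identifiers managed through the provider's API.
DEV_TEST_RESOURCES = ["build-server", "test-db", "staging-web"]

def desired_state(now: datetime, start_hour: int = 7, end_hour: int = 19) -> str:
    """Dev/test assets should run only on weekdays during working hours."""
    is_weekday = now.weekday() < 5          # Monday=0 .. Friday=4
    in_hours = start_hour <= now.hour < end_hour
    return "running" if (is_weekday and in_hours) else "stopped"

def plan_actions(now: datetime) -> dict:
    """Return the state each resource should be in at this moment."""
    state = desired_state(now)
    return {name: state for name in DEV_TEST_RESOURCES}

# A Saturday at midnight: everything should be stopped.
print(plan_actions(datetime(2018, 3, 10, 0, 0)))
```

Running a plan like this from a scheduled job means idle development and test capacity stops accruing charges on nights and weekends without anyone having to remember to shut it down.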

3. Security implications

Simplifying the information flow and allowing security policies to move between multiple clouds is essential. All applications should have the same level of security, whether they are hosted on-premises or in the cloud. Administrators should consider employing tools that unify legacy environments and multiple cloud platforms so they all adhere to the same consistent security policies.

Automation is also critical, as it simplifies security management and reduces the chance of human error. Agencies cannot maintain consistent, high-level security policies while relying on armies of IT specialists to manually translate those policies into specific controls for each cloud environment. That process would be inefficient and increase the likelihood of mistakes.
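The automated translation described above can be sketched as a single policy rendered into per-provider settings. The policy keys and the native setting names below are hypothetical placeholders, not any provider's actual configuration API.

```python
# One high-level security policy, expressed once.
POLICY = {"encrypt_at_rest": True, "public_access": False}

# Per-provider translation tables mapping each abstract policy key to a
# hypothetical native setting name. Real providers expose equivalents
# through their own configuration APIs.
TRANSLATIONS = {
    "aws":   {"encrypt_at_rest": "sse_enabled",
              "public_access": "acl_public"},
    "azure": {"encrypt_at_rest": "storage_encryption",
              "public_access": "anonymous_read"},
}

def controls_for(provider: str) -> dict:
    """Render the single high-level policy as provider-specific controls."""
    table = TRANSLATIONS[provider]
    return {table[key]: required for key, required in POLICY.items()}

print(controls_for("aws"))  # {'sse_enabled': True, 'acl_public': False}
```

Because the policy is written once and translated mechanically, a change to the policy propagates identically to every environment, with no specialist hand-translating it per cloud.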

Consider the recent AWS breaches. Those incidents resulted from human error and illustrate the hazards of moving large amounts of data to cloud providers. If IT managers are not careful, data can be left open and accessible to the outside world. But if they take care and set up protocols to carefully monitor their data, they will be able to enjoy the cost and efficiency benefits of the cloud without sacrificing security.
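A monitoring protocol of that kind can start as something very simple: periodically scanning storage metadata for anything world-readable that should not be. The bucket inventory below is hypothetical, standing in for whatever a real asset-management export would contain.

```python
# Hypothetical inventory of storage buckets, as an asset-management
# system might export it; "public" marks a world-readable access policy.
buckets = [
    {"name": "payroll-backups",  "public": False, "encrypted": True},
    {"name": "contractor-share", "public": True,  "encrypted": False},
    {"name": "web-assets",       "public": True,  "encrypted": True},
]

def audit(buckets, allowlist=("web-assets",)):
    """Flag publicly accessible buckets that are not deliberately public."""
    return [b["name"] for b in buckets
            if b["public"] and b["name"] not in allowlist]

print(audit(buckets))  # ['contractor-share']
```

Run on a schedule, a check like this catches the "left open by accident" scenario behind the breaches mentioned above before outsiders find the data first.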

While there is much to consider when moving to multiple clouds, the benefits are numerous and far-reaching. Agencies will be able to simplify their cloud management and policies and use automation, virtualization and orchestration to improve efficiencies, security and agility.

It just takes some planning and precaution. The end result will be well worth the effort.

About the Author

David Mihelcic is the Head of Federal Strategy and Technology supporting the Juniper Networks Federal sales, engineering, and operations teams. In this role, David is responsible for supporting the design and implementation of automated, scalable and secure networking solutions that meet government customer expectations, satisfy technical and certification requirements, and support global government missions.

David joined Juniper Networks in February 2017 following 18 years with the Defense Information Systems Agency (DISA), where he retired as Chief Technology Officer, a position he held for more than 12 years. He served as the DISA senior authority on scientific, technical, and engineering matters and developed the DoD’s enterprise-wide systems engineering (EWSE) process and plan. He also established DISA’s board for facilitating and governing cross-program integration and synchronization.

Prior to his appointment as CTO, David held positions of increasing responsibility, including Deputy Program Director and Chief Executive Engineer for the Global Information Grid Bandwidth Expansion (GIG-BE) Program. In this role he was the technical authority for the $800+ million expansion of DoD terrestrial communications and was responsible for defining the GIG-BE architecture and leading the technical aspects of the program. Previously he was Chief Executive Engineer for the Defense Information System Network (DISN), Commander of the Center for Horizontal Integration, and DISA Deputy Chief Executive Engineer for Information Processing.

David was appointed to the Federal Senior Executive Service at DISA in 1999 and in 2007 he was selected to receive the Presidential Rank Award in recognition of a sustained record of exceptional professional and technical performance. Before joining DISA, David led the Network Security Section of the Naval Research Laboratory and was a Senior Consultant with SRI Consulting.

David is a graduate of the University of Illinois at Urbana-Champaign, where he earned a Bachelor of Science degree in Electrical Engineering.



