The new cyber landscape
- By Patrick Marshall
- Apr 13, 2018
For IT managers at government agencies, it might seem like a perfect storm is gathering: tight budgets and demands to upgrade aging network infrastructure; mandates to move applications into the cloud and beyond their traditional network security perimeter; calls to support rapidly growing numbers of users on mobile devices; and, above all, warnings from intelligence agencies to expect ever greater cyberthreats.
“We are constantly figuring out how to whack the next mole and beat the bad guys at the next level of the game,” said Susannah Schiller, deputy CIO at the National Institute of Standards and Technology. “It is a constantly evolving thing, and it has been getting worse every year.”
According to agency staff, vendors and analysts alike, there has been a major shift in network security strategy. Trying to contain data behind network perimeters secured by firewalls and other tools is giving way to data-centric and user-centric strategies that aim to protect data wherever it resides -- whether it’s within the agency’s perimeter, in the cloud or on a mobile device.
“As we move to more of a distributed data mode, with users on nongovernment-furnished mobile devices, we have to move the security protection closer to the asset, and we have to tie security policies to the user,” said Chris Townsend, Symantec’s vice president for federal. “That approach aligns more to a risk mitigation strategy where we quantify the value of the assets [and] risk of loss and then align scarce security resources accordingly.”
While pushing the federal government in that direction, the 2017 “Report to the President on Federal IT Modernization” noted that “agencies have attempted to modernize their systems but have been stymied by a variety of factors, including resource prioritization, ability to procure services quickly, and technical issues.”
Still, the report recommends that agencies “prioritize modernization of legacy IT by focusing on enhancement of security and privacy controls for those assets that are essential for federal agencies to serve the American people and whose security posture is most vulnerable.” The report also urges agencies to adopt a layered defensive strategy and emphasize application and data-level protections.
Although the report’s goals were well received, some analysts were disappointed by the lack of specific measures agencies should take. “It’s a great shift in focus,” said Rick Holgate, a former federal agency CIO who is now research director for the public sector at Gartner. “But in terms of how they’re going to get there and how they’re going to make that happen, there is some detail missing.”
Even agencies that are committed to adopting new cybersecurity tools and policies as called for in the White House’s IT modernization report face significant challenges.
Tight budgets, of course, remain a constant concern. And according to Holgate, the complexity of securing hybrid networks has blurred the lines between agencies’ network administrators and DevOps teams.
In a hybrid environment, he said, “responsibility shifts significantly more to the DevOps team to be more responsible for security. They need to have the skills that enable them to do that well.”
NIST cybersecurity specialists say guidance from legislation such as the Federal Information Security Management Act (FISMA) and programs such as the Federal Risk and Authorization Management Program (FedRAMP) is a big help.
“FISMA and FedRAMP provide frameworks that we can work from,” said James Fowler, NIST's acting deputy CIO. He added that his agency had already begun screening web platforms and services, “so when FedRAMP rolled out, we were just thrilled that what we were doing was now being done by some other agency.”
At the same time, “FedRAMP is not going to get every tool that NIST wants, but we already have processes in place on how to deal with that so we are not limited to just what is in FedRAMP,” he said.
Identity in hybrid networks
A major challenge that comes with agencies moving more activities to the cloud and mobile devices is ensuring that users are who they say they are. The new infrastructure landscape “is certainly going to require some evolution of the concept behind identity and access management,” Holgate said.
Although many on-premises networks have long relied on two-factor authentication for identifying users and controlling access, it’s not clear that traditional multifactor authentication is practical with users accessing the network via mobile devices and the cloud.
“You can either have an identity as a service that operates in the cloud environment, or you can have kind of a blended model where you connect your on-premises Active Directory with a cloud-hosted version of Active Directory,” Holgate said. Choosing the best fit “requires some thought based on the portfolio of applications and the extent to which you need to have something like a single sign-on.”
According to NIST officials, despite the agency's use of cloud services for applications and platforms, it still relies on on-premises Active Directory so that it can use two-factor authentication. That strategy, however, has limitations. Until NIST extends authentication services between on-premises databases and cloud and mobile users -- which officials are considering -- those users are not allowed to reach back inside the perimeter to access or change data in the on-site data center.
Agency cybersecurity officials also say the multiple levels of security required in today’s computing environments and the plurality of tools involved at each level can be problematic. “Working to integrate them is always a challenge,” Schiller said. “There is complexity and there is cost. But I think the thing that our users get most aggravated about is length of time to implement new technologies.”
Holgate said that is certainly true of the suite of some 169,000 tools being assembled by the Department of Homeland Security for the Continuous Diagnostics and Mitigation program. Although the tools are welcome, there are shortcomings in integration and interoperability, he added. And in some cases, the tool development is not keeping pace with changes in technology.
In fact, according to a DHS official, although the CDM program has always focused on moving from protecting data in on-premises networks to protecting data wherever it is located, those data-centric tools are part of Phase 4 of CDM, which is still in the planning stages.
In the meantime, agencies must do the best they can with the best tools available.
“This is the modern-day arms race,” said Brian Hussey, vice president for cyberthreat protection and response at Trustwave. “It’s a multibillion-dollar criminal enterprise that we are fighting, and they are constantly coming up with new and innovative ways to penetrate the networks. We are constantly coming back against them.”
Patrick Marshall is a freelance technology writer for GCN.