Using the NSA Intrusion Lifecycle to bolster security
- By William Jackson
- Mar 31, 2016
IT systems in both the public and private sectors are woefully unprepared for an environment in which cyberthreats are becoming more constant and complex, according to Curtis Dukes, director of the National Security Agency’s Information Assurance Directorate.
Dukes, speaking at the recent Cyber Resilience Summit hosted by the Consortium for IT Software Quality, gave disappointing grades for the nation’s cybersecurity. The government’s national security systems -- his primary customers -- are at 70 to 75 percent, a C, he said. The government as a whole receives only a D, and the nation as a whole, including industry, gets a failing grade.
“We’re never going to be 100 percent effective, no matter how good we are,” he said, but there are ways to improve.
In a threat landscape in which attacks cannot be prevented and successful intrusions are almost inevitable, IT systems must be resilient enough to mitigate damage and minimize the impact of attacks. Resilience involves more than well-executed code; it requires software components that are designed to work securely, efficiently and reliably as part of a complex system.
Participants from government, industry and academia discussed the opportunities and challenges in achieving cyber resilience at the summit.
“We’ve learned a great deal about the adversary” by investigating recent breaches, Dukes said. Among the fruits of this education is a set of Top 10 Mitigations -- including a description of the intrusion lifecycle -- to help organizations “reduce the chance of a significant intrusion occurring and evolve more resilient networks overall.”
Although both the techniques used by intruders and the mitigations change over time, the steps of the lifecycle persist, providing defenders with a model for mapping the appropriate mitigation at each step in the process.
The essential steps of an intrusion described in the NSA lifecycle are:
- Scout the target: Collecting basic information about networks, systems and users that could be exploited.
- Initial exploit: Using social engineering or malware -- including zero-day exploits if necessary -- to gain a foothold in a device or system.
- Establish persistence: Escalating privileges or exploiting services to “burrow” into a computer to make detection and removal difficult.
- Install tools: Implanting backdoors and enabling communications with command and control servers that allow attackers to install malicious tools.
- Move laterally: Gathering administrative credentials and exploiting trust relationships between machines and networks to expand attacks to other target computers or networks.
- Execute the mission: Collecting, exfiltrating and/or destroying data.
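The lifecycle's value to defenders is that it is ordered: each stage suggests its own countermeasures. As a rough illustration only, the stages can be modeled as an enumeration paired with candidate controls. The pairings below are a hypothetical sketch for discussion, not NSA's official Top 10 Mitigations list.

```python
from enum import IntEnum

class IntrusionStage(IntEnum):
    """Ordered stages of the NSA intrusion lifecycle."""
    SCOUT_TARGET = 1
    INITIAL_EXPLOIT = 2
    ESTABLISH_PERSISTENCE = 3
    INSTALL_TOOLS = 4
    MOVE_LATERALLY = 5
    EXECUTE_MISSION = 6

# Illustrative stage-to-mitigation pairings (assumptions, not an
# official NSA mapping).
MITIGATIONS = {
    IntrusionStage.SCOUT_TARGET: ["limit publicly exposed network and user information"],
    IntrusionStage.INITIAL_EXPLOIT: ["anti-exploitation toolkits", "user awareness training"],
    IntrusionStage.ESTABLISH_PERSISTENCE: ["least privilege", "secure baseline configuration"],
    IntrusionStage.INSTALL_TOOLS: ["application whitelisting", "antivirus and reputation services"],
    IntrusionStage.MOVE_LATERALLY: ["credential hygiene", "network segmentation"],
    IntrusionStage.EXECUTE_MISSION: ["egress monitoring", "data-loss prevention"],
}

def defenses_for(stage: IntrusionStage) -> list:
    """Return the candidate mitigations mapped to a given stage."""
    return MITIGATIONS[stage]
```

Walking the enumeration in order gives defenders a checklist: for each stage an attacker must complete, at least one control should be in place to detect or disrupt it.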
Each of these steps can be countered with the appropriate tools, techniques and practices, most of which are probably in use by or available to organizations already. These include:
- Enabling vendor utilities such as Microsoft’s Enhanced Mitigation Experience Toolkit.
- Enlisting basic tools such as whitelisting, reputation services and antivirus software.
- Controlling the implementation of software architecture.
- Practicing prompt software patching.
- Setting a secure baseline configuration.
- Properly managing access control.
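Of the controls above, whitelisting is the most mechanical to describe: execution is permitted only for binaries on a pre-approved list, typically identified by cryptographic hash. A minimal sketch of that idea (the allowlist contents here are placeholders, not a real policy):

```python
import hashlib

# Hypothetical allowlist of SHA-256 digests for approved binaries.
# The single entry is the well-known digest of the empty byte string,
# used here purely as a placeholder.
APPROVED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_allowed(executable_bytes: bytes) -> bool:
    """Permit execution only if the binary's hash is on the allowlist."""
    digest = hashlib.sha256(executable_bytes).hexdigest()
    return digest in APPROVED_HASHES
```

Because any change to a binary changes its hash, an implanted or tampered tool fails the check even if it keeps a legitimate filename -- which is what makes whitelisting useful against the "install tools" stage of the lifecycle.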
The NSA's basic message is to be aware and stay up to date -- with both patches and the most current software versions.
“Companies are more and more taking the quality of products to heart,” Dukes said. New applications and new versions of operating systems are more secure and reliable as security is being built in from the earliest stages of development.
Keeping patches current is critical to protecting systems, not only because patches fix vulnerabilities, but because once a patch is publicly released it is available to malicious actors as well as to legitimate users. Dukes said that adversaries have reverse-engineered security patches within 48 hours of release to identify the root vulnerabilities they fix.
Yet testing and deploying a patch across an enterprise can take weeks or months, because patches are developed in isolation and can have unintended consequences as they interoperate with other critical applications. The result is that known vulnerabilities continue to be exploited, giving criminals a window in which to gain a foothold in target systems.
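The gap Dukes describes -- a patch weaponized within roughly 48 hours but deployed over weeks -- can be expressed as a simple exposure-window audit. The sketch below uses hypothetical inventory records, not a real package-management API, to flag software still unpatched past a policy deadline:

```python
from datetime import date, timedelta

# Hypothetical inventory: (package, installed version, latest version,
# date the latest patch was published).
INVENTORY = [
    ("openssl", "1.0.2g", "1.0.2h", date(2016, 2, 1)),
    ("bash",    "4.3.42", "4.3.42", date(2016, 1, 15)),
]

def overdue_patches(inventory, today, max_age_days=30):
    """Return packages still unpatched beyond the policy window.

    Every day between a patch's public release and its deployment
    is time in which the underlying flaw is exploitable.
    """
    overdue = []
    for name, installed, latest, published in inventory:
        if installed != latest and today - published > timedelta(days=max_age_days):
            overdue.append(name)
    return overdue
```

Run against the sample data on March 31, 2016, only openssl is flagged: its patch has been public for 59 days while bash is already current.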
And although newer versions of operating systems generally are more secure, updating an OS across an enterprise is no trivial matter. Organizations often maintain long outdated operating systems because they support critical operations and cannot be easily replaced.
“Nobody runs legacy software because they are lazy,” said Phyllis Schneck, deputy undersecretary in the Homeland Security Department’s National Protection and Programs Directorate. They run them, she said, because they have to.
Dukes acknowledged that although software security is improving, backwards compatibility -- the ability of new software to operate effectively with legacy applications -- remains a challenge. This is a major hurdle in achieving the resiliency needed to protect systems and data in complex environments.
The recent decision by the Defense Department to upgrade more than 4 million seats to Windows 10, the latest Microsoft OS, by 2017 will be a major test of how well Microsoft has addressed the challenge of backwards compatibility and resiliency. The goal of the transition is to establish a more secure baseline of software across the department. In theory, a more secure standard operating system should result in better enterprise security.
Whether it can provide the reliability, performance, efficiency, security and maintainability needed to achieve resilience in a massive production environment remains to be seen. Dukes is optimistic, although he said he does not expect the department to achieve a 100 percent upgrade to Windows 10 by the end of the year. He said he would be happy to see 80 percent of the national security systems upgraded, which should make them more secure. “That means I will only have to worry about 20 percent.”
Note: Jackson, a former GCN staff writer who covers cybersecurity for a wide range of publications, was commissioned by the Consortium for IT Software Quality to cover this conference.
William Jackson is a Maryland-based freelance writer.