The path to outsmarting advanced cyberattacks

It is a truism that static defenses are not effective against agile, dedicated attackers. This was the case with France’s Maginot Line in 1940, and it is true today of signature-based perimeter cybersecurity tools.

What many administrators and security professionals are looking for today is actionable intelligence that can predict attacks, spot them in progress and react to them in near-real time. The challenge for government and private-sector organizations is how to extract that intelligence from the huge amounts of data that already are being produced by sensors and monitors on global networks.

“From a theoretical standpoint, it sounds like a wonderful idea,” said Dave Ryan, chief technology officer of the Navy and Air Force division at General Dynamics IT. It already is being done in the command and control and intelligence domains for the Defense Department, he said. “We are using that model to build that loop into the cybersecurity domain.”

Related stories:

Energy lab releases open-source tool for tracking cyberattacks

Offense must be the new defense, RSA chief says 

How cloud can improve intell community's big data analyses

The technology to gather and analyze data that can identify potentially troublesome patterns already exists. What is lacking at this point, Ryan and others say, is a level of confidence that would allow systems and administrators to act quickly on that analysis.

“Many security tools are good at generating alerts,” said Rob Gillen, research scientist at the Energy Department’s Oak Ridge National Laboratory in Tennessee. “But providing actionable intelligence to network administrators is a different question.”

Gillen is a network analyst specializing in process automation and a member of the Oak Ridge Cyber Analytics team, a research and development project funded in part by Lockheed Martin. ORCA is developing tools and algorithms to automate advanced analysis for cybersecurity. One of the project’s tools is the Attack Variant Detector, which evaluates network traffic based on patterns of “good” and “bad” behavior and makes decisions based on what it sees. AVD is addressing the problem of confidence with what Gillen calls “pretty good success.”
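ORCA has not published AVD's internals, but the behavior-based approach Gillen describes can be illustrated in miniature: build averaged feature profiles from flows labeled "good" and "bad," then label new traffic by whichever profile it sits closer to. The feature names and thresholds below are invented for illustration only.

```python
# Illustrative sketch only -- AVD's actual algorithms are not public.
# Classify a traffic sample by comparing its numeric features against
# averaged profiles built from labeled "good" and "bad" flows.

def profile(flows):
    """Average each numeric feature across a set of labeled flows."""
    keys = flows[0].keys()
    return {k: sum(f[k] for f in flows) / len(flows) for k in keys}

def distance(sample, prof):
    """Euclidean distance between a flow and a behavior profile."""
    return sum((sample[k] - prof[k]) ** 2 for k in prof) ** 0.5

def classify(sample, good_prof, bad_prof):
    """Label the sample by whichever behavior profile it sits closer to."""
    if distance(sample, good_prof) <= distance(sample, bad_prof):
        return "good"
    return "bad"

# Hypothetical labeled training flows (feature names are made up).
good_flows = [{"pkts_per_sec": 40, "failed_conns": 0},
              {"pkts_per_sec": 55, "failed_conns": 1}]
bad_flows = [{"pkts_per_sec": 900, "failed_conns": 45},
             {"pkts_per_sec": 700, "failed_conns": 30}]

good_prof, bad_prof = profile(good_flows), profile(bad_flows)
label = classify({"pkts_per_sec": 820, "failed_conns": 38}, good_prof, bad_prof)
```

A production system would use far richer features and a trained model; the point is only that the decision is driven by learned behavior patterns rather than fixed attack signatures.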

“We’ve proved that on a small scale, it can work,” he said. But AVD is not quite ready as a production tool. Operating at network speeds of 100 megabits/sec., AVD can produce results pretty quickly, within 30 to 60 seconds. “As far as being operational on a real network, I would consider it interesting, but not practical,” he said.

The immediate goal of the AVD project is to bring down the level of false positives it produces while increasing the scale at which it operates, Gillen said. “Our target in the next year or two is to get it working on a network with thousands of nodes at speeds in the 1 to 10 gigabit range.”

At that point, it could be incorporated into commercial products. “If we get it to scale, I think that the commercialization folks would have an easy time of it,” he said.

Actionable intell

The focus on actionable intelligence as a cybersecurity tool coincides with a series of high-profile attacks over the past two years. These have ranged in sophistication from simple smash-and-grab data thefts and defacements such as that of the CIA website to what are being called advanced persistent threats against leading security companies such as RSA.

Although many simpler breaches could be prevented by the elimination of known vulnerabilities, it is becoming apparent that it is difficult for organizations that have been targeted by a dedicated adversary to defend themselves. This has led embarrassed officials to proclaim that compromise is inevitable and that attention should be focused on detection and mitigation inside the perimeter, as well as on spotting attacks as they form outside the perimeter.

DOD has been using tools to analyze and detect complex patterns of data in command and control applications for some time now, Ryan said, and would like to expand this capability to cyber defense. The department is on the front line of targeted cyberattacks where traditional signature-based tools are least effective, he said.

“The worst case is zero-day attacks,” for which no attack signatures or prior behavior patterns are available. “We tend to get hit first” with those, he said of General Dynamics IT’s DOD customers.

But for intelligent cybersecurity to be most effective, attacks need to be identified well outside an organization’s own networks. Data analysis in near-real time is not impossible inside the enterprise with existing tools, but the scale can become unmanageable when the view is external and global.

New models needed

“You really can’t use the same model you used in the past,” Ryan said. “You want to be as all-encompassing as possible,” but the task of gathering data, identifying significant elements and providing meaningful analysis in time to respond might be more than any one organization can handle.

Ryan said he envisions a collaborative hierarchy with a number of organizations gathering and analyzing data, sharing both raw data and results of analysis as needed.

The greatest challenge in a collaborative model might not be timely analysis but access to data, said Dmitry Kagansky, CTO at Quest Software Public Sector. The computing power to sift information out of data is becoming more affordable, and organizations are getting better at consolidating data for analysis.

But securely and rapidly sharing the data remains difficult. The challenge is magnified because sharing copies of the data across multiple organizations would mean that some of the data would quickly become out of date and different organizations would be working with different datasets. Real-time analysis means that multiple organizations would need simultaneous access to the same repository of data, rather than shared copies of it.

“I don’t think we’re there yet,” Kagansky said of this model.

Kagansky said he thinks the current model of using proprietary and expensive tools for security data analysis should shift to an open-source commodity infrastructure that costs less and allows wider access. This model calls for nonrelational databases in which all data is replicated in each entry rather than having common data elements consolidated separately and referenced as needed. This would ease the use of the data by various parties asking different questions. “This is where the industry hasn’t gone yet,” he said.
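The denormalized model Kagansky describes can be sketched with plain Python dictionaries; the record fields and values below are hypothetical, chosen only to contrast the two layouts.

```python
# Hypothetical illustration of the denormalized model described above:
# instead of referencing a shared "sensor" table, every event record
# carries its own copy of the common sensor details.

# Normalized (relational) style: common data consolidated separately
# and referenced by ID, requiring a join to answer most questions.
sensors = {"s1": {"site": "Oak Ridge", "sensor_type": "netflow"}}
events_normalized = [
    {"sensor_id": "s1", "ts": 1, "alert": "port scan"},
    {"sensor_id": "s1", "ts": 2, "alert": "beaconing"},
]

# Denormalized (document) style: each entry is self-contained, so any
# party can ask its own questions without a lookup table or join.
events_denormalized = [
    {"site": "Oak Ridge", "sensor_type": "netflow", "ts": 1, "alert": "port scan"},
    {"site": "Oak Ridge", "sensor_type": "netflow", "ts": 2, "alert": "beaconing"},
]

# "What alerts came from Oak Ridge?" -- answered from the records alone.
oak_ridge_alerts = [e["alert"] for e in events_denormalized
                    if e["site"] == "Oak Ridge"]
```

The trade-off is storage and update cost: replicating common fields in every entry uses more space, in exchange for letting many independent consumers query the same data without coordinating on a shared schema of references.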

Focus on actors

Another advocate of using big data for cybersecurity is George Kurtz, former CTO of McAfee, whose latest startup venture is CrowdStrike. The company, which expects to launch its first product late this year, will focus not on attacks, but on the humans behind them, Kurtz said.

“A lot of the security industry has been malware-focused,” he said, concentrating on blocking traffic through the use of signatures and known malicious addresses. “That model is being taxed” by malware that can quickly morph and sites that can shift quickly from one host to another. “Big data is a natural tool that can fill in the gaps.”

The company intends to use that data to identify the human “fingerprint” behind attacks, said CrowdStrike CTO Dmitri Alperovitch. Inventing new attacks and finding new channels is time-consuming and difficult. “Human beings are creatures of habit,” he said. “When something works, they will use it again and again.”

That can make the attacker vulnerable. By identifying and recognizing the characteristics of a particular attacker, “you can focus on the things behind the technologies and raise the costs [of an attack] dramatically,” he said.
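CrowdStrike has not disclosed its method, but the idea of linking incidents through an attacker's recurring habits can be sketched with a simple set-overlap measure. The trait names and the 0.5 threshold below are invented for illustration.

```python
# Hypothetical sketch: link a new incident to a known actor by measuring
# how much their observable tradecraft traits overlap (Jaccard similarity).

def jaccard(a, b):
    """Overlap of two trait sets: |A intersect B| / |A union B|."""
    return len(a & b) / len(a | b)

# Traits previously attributed to one actor (made-up labels).
known_actor = {"spearphish-doc", "c2-domain-pattern-x",
               "utc+8-working-hours", "packer-y"}

# Traits observed in a fresh incident.
incident = {"spearphish-doc", "c2-domain-pattern-x",
            "utc+8-working-hours", "new-dropper"}

score = jaccard(known_actor, incident)  # 3 shared traits of 5 total -> 0.6
likely_same_actor = score >= 0.5        # illustrative threshold
```

Because "creatures of habit" reuse what works, repeated traits accumulate into a distinctive profile; forcing an adversary to abandon that profile is exactly the cost increase Alperovitch describes.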

This will not necessarily identify the attacker or stop the activities, but increasing the effort raises the costs of an attack, and it can buy targets some breathing space as new attacks are prepared. Although it is not a real-time process, “the goal is to be predictive,” Alperovitch said. “It’s a strategic approach rather than a tactical response.”

Of course, how to identify individual attack patterns is the challenge. “We believe we understand how to do it,” said Kurtz, who added that the company’s first offering will be a product rather than a service. He would not provide details yet but said “big data is the tool used to do it. We think we’ve got it sorted out.”

Despite the difficulty of assembling and analyzing data to quickly and reliably identify threats, companies are pursuing solutions, said Quest Software’s Kagansky.

The organizations that face the greatest risks tend to be those with the most resources to apply to the problem, he said. “People are experimenting with this stuff very actively today.”

Within a few years the capabilities will appear in commercial products, although on what scale and for what environments is not yet clear. But, “it’s something that we have to do,” Kagansky said. “We’re going to figure it out.”
