NEWS FROM THE 2009 BLACK HAT BRIEFINGS

New weapon revealed for defense against zero-day attacks

LAS VEGAS — Signature-based malware detection tools are passé. “Everyone knows how to do it,” said Brad Wilson, vice president of product development for Trusted Computer Solutions Inc.


That does not mean they are unnecessary. No network or PC is likely to be secure without signature-based tools that can quickly detect and block known malicious code for which signatures are available. But even the best-maintained systems remain vulnerable to zero-day exploits and to the window of opportunity between the time a threat is identified and the time its signature is distributed. Those gaps are addressed by behavioral anomaly detection tools, which identify previously unknown threats because they are behaving badly.

TCS is announcing at the Black Hat security conference this week that it will release this fall the first new version of the CounterStorm network anomaly detection tools since it acquired the product's parent company a year ago.

Quantifying the relative risks of zero-day attacks versus known threats, which have signatures available but still penetrate networks through increasingly sophisticated delivery systems, probably is not possible, Wilson said. But any threat a network is not protected against presents a risk.

CounterStorm was created from technology developed at Columbia University, with funding from the advanced research projects agencies at the Defense and Homeland Security departments. It works by “learning” what is normal on a network and alerting administrators to behavior outside those parameters. One of its primary selling points is the ability of its algorithms to detect anomalies in near real time.

“That’s the biggest strength of the product,” said TCS Chief Operating Officer Ed Hammersla. “If you’re putting together a system to detect first-time attacks, you’d better be fast.”

The company also claims a false-positive rate of less than 10 percent for the product. False positives, legitimate behavior wrongly flagged or blocked as malicious, are a primary concern with behavior-based tools.
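That threshold tuning is at the heart of any behavior-based tool. As a rough illustration only, and not a description of TCS's algorithm, a toy baseline detector in Python might learn the normal range of some per-host metric and flag outliers; raising the hypothetical threshold_sigmas parameter trades fewer false positives for more missed attacks.

    import statistics

    class BaselineDetector:
        def __init__(self, threshold_sigmas=3.0):
            # A higher threshold cuts false positives but misses subtler anomalies.
            self.threshold_sigmas = threshold_sigmas
            self.samples = []

        def train(self, value):
            # Record an observation of normal behavior during the learning phase.
            self.samples.append(value)

        def is_anomalous(self, value):
            # Flag values more than N standard deviations from the learned mean.
            mean = statistics.mean(self.samples)
            stdev = statistics.pstdev(self.samples) or 1.0  # guard against zero spread
            return abs(value - mean) / stdev > self.threshold_sigmas

    # Learn normal hourly outbound-connection counts, then test a spike.
    detector = BaselineDetector(threshold_sigmas=3.0)
    for count in [12, 15, 11, 14, 13, 16, 12]:
        detector.train(count)
    print(detector.is_anomalous(14))   # False: within the learned norm
    print(detector.is_anomalous(90))   # True: far outside the baseline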

The current version of CounterStorm primarily uses two engines to detect anomalies: the Volumetric Anomaly Detector and the Enhanced Behavioral Engine. The volumetric engine identifies clients or servers producing unusually high levels of network activity and looks for traits characteristic of insider activity and of exploited or compromised systems. The behavioral engine detects patterns of malicious network activity, such as worm-like malware, and also provides visibility into attackers targeting specific high-value systems.
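The article does not describe the volumetric engine's internals, but its description suggests the familiar pattern of comparing a host's current activity against its own history. The sketch below is an assumption-laden illustration of that idea, with a made-up surge_factor multiplier, rather than the product's actual logic.

    from collections import defaultdict

    class VolumetricDetector:
        def __init__(self, surge_factor=5.0):
            self.surge_factor = surge_factor   # assumed tuning knob, not a TCS parameter
            self.history = defaultdict(list)   # host -> past per-interval byte counts

        def observe(self, host, byte_count):
            # Surge if this interval's volume dwarfs the host's own average so far.
            past = self.history[host]
            surged = bool(past) and byte_count > self.surge_factor * (sum(past) / len(past))
            past.append(byte_count)
            return surged

    detector = VolumetricDetector()
    for volume in [10_000, 12_000, 9_500, 11_000]:   # baseline intervals
        detector.observe("10.0.0.5", volume)
    print(detector.observe("10.0.0.5", 500_000))     # True: a worm-like surge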

The Version 4.0 release in late September will include two new engines. A Statistical Payload Analysis Detection Engine uses deep packet inspection to examine the bytes in network traffic and build models of normal content. It detects malicious or atypical traffic that falls outside those norms.

“When you see something that is embedded in the data stream, it will jump out at you,” Wilson said.
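Payload models of this kind are often built from byte-frequency distributions: learn what the byte histogram of normal traffic looks like, then measure how far a new payload strays from it. The following sketch assumes a Manhattan-distance comparison and toy training data; it illustrates the principle, not CounterStorm's implementation.

    from collections import Counter

    def byte_distribution(payload):
        # Relative frequency of each byte value in a payload.
        counts = Counter(payload)
        total = len(payload) or 1
        return {b: c / total for b, c in counts.items()}

    def distance(model, observed):
        # Manhattan distance between two byte-frequency distributions.
        keys = set(model) | set(observed)
        return sum(abs(model.get(k, 0.0) - observed.get(k, 0.0)) for k in keys)

    # Train on a typical plain-text request, then compare two payloads.
    normal = byte_distribution(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n")
    similar = byte_distribution(b"GET /page.html HTTP/1.1\r\nHost: example.com\r\n\r\n")
    binary = byte_distribution(bytes(range(256)) * 4)  # uniform, shellcode-like bytes
    print(distance(normal, similar))  # small: looks like normal content
    print(distance(normal, binary))   # large: embedded data that "jumps out"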

A Rogue Detection Engine searches for botnet activity inside a network and for data improperly leaving it by looking for clients communicating with servers that they do not normally access. It also can detect clients that exhibit unauthorized behavior that could indicate they have been compromised.
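Conceptually, that amounts to remembering which servers each client normally contacts and flagging first-time destinations. A minimal sketch, with hypothetical host names and none of the whitelisting or aging a real product would need, might look like this.

    from collections import defaultdict

    class RogueDetector:
        def __init__(self):
            self.known_peers = defaultdict(set)   # client -> servers seen during training

        def train(self, client, server):
            self.known_peers[client].add(server)

        def check(self, client, server):
            # True if the client has never contacted this server before.
            return server not in self.known_peers[client]

    detector = RogueDetector()
    detector.train("workstation-7", "mail.corp.local")
    detector.train("workstation-7", "files.corp.local")
    print(detector.check("workstation-7", "mail.corp.local"))   # False: a normal peer
    print(detector.check("workstation-7", "203.0.113.66"))      # True: possible botnet or exfiltration traffic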

About the Author

William Jackson is a freelance writer and the author of the CyberEye blog.

Reader Comments

Thu, Jul 30, 2009 Eirik Iverson Chantilly, Virginia

As someone who's been in network security for over a decade, and who works for a company called Blue Ridge Networks, which has secured network communications for over a decade, I believe the enterprise must shift its focus away from network-centric tools and toward endpoint-centric tools. That is not to say that one should go to extremes. Rather, there is perhaps too much emphasis on network-centric this and that at present, whereas more attention must be applied to the endpoints.

Monitoring network communications is a good practice. However, compromised endpoints are employing ever stealthier and often encrypted communications.

The endpoints hold loads of information assets. The endpoints interact with the most critical of mission-critical servers. And many endpoints are mobile, exposed to unknown risks, and used by end users with little to no understanding of security risks and mitigations. For the enterprise to better counter risks to its information, it must improve its protection, control, and auditing of all of its endpoints in near real time, wherever they are located.
