Dennis Cox, co-founder of BreakingPoint and formerly with TippingPoint, says IDS and IPS have become commodities and have gone stale while threats against the network continue to evolve.
Dennis Cox, co-founder and chief technology officer of BreakingPoint, has 15 years of experience in developing high-performance network devices such as intrusion detection and prevention systems and DSL devices. He also has worked with TippingPoint/3Com and NetSpeed/Cisco Systems. He is an authority on content-aware networks and network equipment testing and is the author of several patents on subjects including IP spoofing, denial of service to network processors, dynamic construction of packet classification rules, and multilevel packet screening. Cox spoke recently with GCN senior writer William Jackson about the challenges of network defense.

GCN: What is a content-aware network?
Cox: It is a network that makes a decision based on data. This is most networks nowadays. Whereas old networks would make decisions on the header, nowadays any [quality of monitoring] device, IPS device, e-mail device, firewall, they all make decisions on data. Even routers do. They use content to make decisions rather than standard headers.
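The distinction Cox draws between old header-based networks and content-aware ones can be sketched in a few lines of code. This is a hypothetical illustration only; the `Packet` class, port rules, and payload checks are invented for the example and do not reflect any particular vendor's device:

```python
# Hypothetical sketch: header-based vs. content-aware packet handling.
# The Packet fields and the rules below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Packet:
    src_port: int
    dst_port: int
    payload: bytes

def header_decision(pkt: Packet) -> str:
    """Old-style handling: decide using header fields only."""
    if pkt.dst_port == 25:           # SMTP traffic
        return "route-to-mail-server"
    return "forward"

def content_decision(pkt: Packet) -> str:
    """Content-aware handling: also inspect the payload itself."""
    if pkt.dst_port == 25 and b"viagra" in pkt.payload.lower():
        return "drop-as-spam"        # spam filtering on mail traffic
    if pkt.payload[:2] == b"MZ":
        return "quarantine"          # looks like a Windows executable
    return header_decision(pkt)      # otherwise fall back to header rules

pkt = Packet(src_port=40000, dst_port=25, payload=b"Buy VIAGRA now")
print(content_decision(pkt))         # the payload, not the header, drives the decision
```

The point of the sketch is that both functions see the same packet; only the second one opens the payload, which is what makes spam filtering, virus stripping, and data-loss checks possible at all.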
Why does it matter to a network what the content of the traffic is?
If the network is content aware, it can remove spam. Spam grew in volume, so you have e-mail devices that will look inside and throw away spam. You can have devices that will look inside and throw out viruses in attachments. When you combine voice and data on a network, you want [quality of service] to make sure the voice is good quality. You are also going to want to make sure that the wrong information does not go out across your network, so you look at outgoing traffic to make sure it is not carrying that information, and you have firewalls to block things from coming in. All of these devices have been built up around content-aware services.
What are the trade-offs in making a network aware and how do you overcome them?
There are trade-offs in every one of these things I have talked about. The more things you look for, the slower the network gets; that’s just a fact of life. With spam filtering, for example, the trade-off is: How important is your e-mail? There is a level of how much spam you are willing to put up with. On security, it’s the same thing. If you are a bank, you probably care more about security than anything else, so you are willing to put up with latency and not giving people access to all services. But if you are Google, you are probably going to be less secure because you want everybody to have access to everything.
Each business has to make the decision on what they are willing to accept on their own, and each device changes that equation. As for how good content-aware devices are nowadays, they are all over the map. There are some that are amazing and there are some that are just a joke. The worst thing is there is not even consistency within vendors. A company might come out with one device that is great and then bring out another device that might be horrible. So that is tricky.
What is the current state of deep packet inspection (DPI)? Is that just another way of saying “content-aware,” or is there something more specific about it?
It is a marketing term, more than anything else. It is a way of saying “content-aware plus we’re going deeper.” There are two modes of it: There is out-of-line processing and in-line processing. In terms of in-line processing there are certain vendors that are phenomenal at it, and they have been around a long time, six to 10 years. The younger guys probably have a little trouble with it. The out-of-line guys are pretty impressive in terms of what they can get at. That technology is not new — it is 20 years old, and it’s very mature. So even when a new company comes out, everyone knows how to process it correctly. Speeds are the only thing that throw it off. When you start getting to 10 gigabit speeds with big pipes, they tend to fall down a little bit. The out-of-band guys tend to mostly be software. They use a little hardware acceleration to do buffering, but when they get to 10-gig speeds, they have trouble. So they are looking to go hardware much more than they already are.
On the in-line side there are hardware and software vendors. With software vendors, it's only going to run as fast as the software does; it's going to cause your network to slow down, and it will have all sorts of bugs. The hardware guys don't have any trouble keeping up with the network, and they can look for really good stuff, especially the dirty-word stuff, like looking for the words "airplane" and "bomb," etc. But recording a voice call and figuring out what was said, the hardware guys aren't good at that.
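The "dirty-word" matching Cox describes is, at heart, multi-pattern string search over packet payloads. A minimal sketch follows; the watch-word list and payload are invented for the example, and real in-line devices do this in dedicated hardware at line rate rather than in Python:

```python
# Hypothetical sketch of dirty-word scanning over a packet payload.
# Real in-line hardware does this at line rate; this shows the idea only.
import re

# Compile all watch-words into one alternation so each payload is
# scanned in a single pass rather than once per word.
WATCH_WORDS = [b"airplane", b"bomb"]
PATTERN = re.compile(b"|".join(re.escape(w) for w in WATCH_WORDS), re.IGNORECASE)

def flag_payload(payload: bytes) -> list:
    """Return the watch-words found in a payload, lower-cased."""
    return [m.lower() for m in PATTERN.findall(payload)]

print(flag_payload(b"the Airplane carries a BOMB"))
```

Hardware implementations typically use parallel pattern-matching engines for exactly this kind of fixed-string search, which is why, as Cox notes, they handle dirty-word lookups well but struggle with tasks like transcribing a voice call.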
How can privacy concerns be reconciled with DPI and content awareness?
To be honest, there is no issue. It’s a good story, but it isn’t reality. People think with DPI [that] someone is tapping into the network. No, they’re not. No one is making decisions on it, no one could store that much data. It’s not really a scary thing, and the faster things go and the more things become content-aware, the more they get routed differently, which makes it even harder to spy on stuff — much more difficult than people realize. I don’t think there is any system on the market that can keep up, so it isn’t a problem as long as the speeds keep getting faster. It just won’t happen. And no one can look through 10 gigabit speed data.
What is the current state of the art for intrusion detection and prevention? Why do we still see so many intrusions?
They have gotten worse, not better, over time. And this is coming from a guy who started an IPS company, TippingPoint. The reason is they got adopted. It’s like any industry: As soon as people adopt it and it is considered normal, they stop innovating in that area. Sales takes over and engineering ceases. They give up on innovating. It’s sad, but firewalls are the same way. There is a myth that firewalls and IPSes will one day merge, and so far, that technology has not really done very well. And the reason why is that firewalls stopped innovating years ago, so now you have two people who aren’t innovating any more. They don’t really go together. Someone’s got to lead the way.
What needs to be done to improve the situation, and where do we look for innovation?
There has to be some kind of standardization. The security industry is unregulated. Health care is regulated, and finance, but the IPS vendors and the firewall vendors and the network equipment vendors are unregulated. They can put out anything they want. You plug it in and get attacked, and you have no recourse. There is no Underwriters Laboratories for network equipment. You couldn't buy an appliance for your house and plug it in without a UL seal, but you can get network equipment without any seal at all. The only way to stop it is for the government to pass a law. It might be unpopular, and it might cost money, but we should start testing these devices, validating that they live up to their marketing, and give them a stamp.
How can government and industry defend their systems from what appear to be sophisticated attacks by other countries?
That’s tough, because with criminals you know their motive, whereas with governments the motive is not so straightforward. The only way to do it is to create a security posture, a set of policies that you will follow and not waver from. You’re going to have to set some minimum boundaries that you are willing to put up with. For example, outsourcing. I’m not worried about China as much as I’m worried about India. If everybody outsources to India, where is your source code? Say the American government buys equipment from Cisco to protect its national networks, and Cisco outsources to India. It’s really scary, and there is no regulation around that. For example, IBM sold its laptop business to Lenovo, and the prime minister of Germany ordered a laptop from IBM, and the Chinese government installed a keylogger on it. How do you know what you’re getting is good? The only way to do it is testing. Trust, but verify.
Government IT security requirements are moving toward more continuous monitoring and real-time risk management. Is this a good move?
I think they’re doing the right thing. Continuous monitoring is the way to go. The snapshot approach doesn’t work.