We are living in a world of increasingly smart devices. Not really intelligent; just smart enough to be dangerous.
As more devices become IP-enabled, they add to the pool of things that can be recruited into botnets or other platforms used for distributed attacks. Distributing an attack makes it more difficult to trace the source and easier to overwhelm a target. In the past year, distributed denial of service has become the attack of choice for activists and blackmailers.
Prolexic, a DDOS security company, has published a white paper on Distributed Reflection Denial of Service (DrDOS) attacks that focuses on a handful of protocols, including the Simple Network Management Protocol. SNMP is an application layer (Layer 7) protocol commonly used to manage devices with IP addresses.
“Unlike other DDOS and DrDOS attacks, SNMP attacks allow malicious actors to hijack unsecured network devices — such as routers, printers, cameras, sensors and other devices — and use them as bots to attack third parties,” the report points out.
This is a concern not only because it increases the number of possible devices that can be compromised, but also because remote devices such as printers and sensors of every kind often are less likely to be properly managed and secured, leaving them open to exploitation.
For public-sector agencies, this can include such devices as sensors used in weather observations, control valves at power plants, door locks in prisons, traffic signals and any number of other connected devices. A search engine such as Shodan can reveal those connected devices, many of which are completely without security.
SNMP uses the User Datagram Protocol, a stateless protocol that is subject to IP spoofing. A reflection DOS attack using SNMP is a type of amplification attack, because an SNMP request generates a response that typically is at least three times larger. Boiled down to its basics: an attacker port-scans a range of IP addresses to identify exploitable SNMP hosts, sends those hosts SNMP requests using the spoofed IP address of the target server, and the hosts’ replies saturate the target’s bandwidth, making it unavailable.
“The raw response size of the traffic is amplified significantly,” the report says. “This makes the SNMP reflection attack vector a powerful force.”
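The arithmetic behind that amplification can be sketched in a few lines of Python. The packet sizes and host counts below are illustrative assumptions, not measurements from the Prolexic report; the 3x multiplier is the report's "at least three times larger" floor.

```python
# Back-of-the-envelope sketch of SNMP reflection amplification.
REQUEST_BYTES = 60       # hypothetical size of one spoofed SNMP request
AMPLIFICATION = 3        # responses are at least ~3x the request size
RESPONSE_BYTES = REQUEST_BYTES * AMPLIFICATION

def flood_mbps(reflectors, requests_per_second):
    """Traffic arriving at the spoofed victim, in megabits per second."""
    bytes_per_second = reflectors * requests_per_second * RESPONSE_BYTES
    return bytes_per_second * 8 / 1_000_000

# 5,000 exploitable hosts each answering 100 requests per second:
print(flood_mbps(5000, 100))  # → 720.0 Mbps aimed at the target
```

The attacker's own upstream cost is only the request traffic, which is why a modest botnet of reflectors can generate a flood far larger than its own bandwidth.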
The best way to protect yourself from being shanghaied into such an attack is to identify all of the devices accessible on your network, whether or not they appear to be sensitive, and properly manage them. Prolexic offers a list of mitigations in its paper.
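A first step in that inventory is simply checking which hosts answer SNMP at all. The sketch below, a minimal and hedged example, hand-encodes an SNMPv1 GetRequest for sysDescr.0 and probes a single host on UDP port 161; the community string "public" is an assumption (though it is a common factory default on unmanaged devices), and the address shown in the usage comment is hypothetical.

```python
import socket

# SNMPv1 GetRequest for sysDescr.0 (OID 1.3.6.1.2.1.1.1.0),
# community "public", hand-encoded as BER bytes.
SNMP_GET_SYSDESCR = bytes.fromhex(
    "3026"                  # SEQUENCE, 38 bytes of content
    "020100"                # version: SNMPv1 (0)
    "0406" "7075626c6963"   # community: "public"
    "a019"                  # GetRequest PDU, 25 bytes of content
    "020101"                # request-id: 1
    "020100"                # error-status: 0
    "020100"                # error-index: 0
    "300e300c"              # varbind list holding one varbind
    "06082b06010201010100"  # OID 1.3.6.1.2.1.1.1.0
    "0500"                  # NULL placeholder value
)

def snmp_responds(host, timeout=1.0):
    """Return True if `host` answers an SNMPv1 GetRequest on UDP/161."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(SNMP_GET_SYSDESCR, (host, 161))
        try:
            s.recvfrom(2048)   # any reply means the service is exposed
            return True
        except socket.timeout:
            return False

# usage (hypothetical address): snmp_responds("192.0.2.10")
```

Any host that answers this probe from outside your perimeter is a candidate reflector and should be firewalled, reconfigured or decommissioned.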
Remote management of and access to otherwise dumb devices can be a great convenience, but the trade-off is that it adds to the list of things that must be managed and secured.
Posted by William Jackson on May 03, 2013 at 9:39 AM
Arati Prabhakar, director of the Defense Advanced Research Projects Agency, has an interesting description of DARPA’s mission: Its job is to prevent technological surprises to the U.S. military and to create surprises of its own.
In its 55-year history DARPA occasionally has surprised itself, and the agency now is working to stay ahead of its own technologies. Take the Global Positioning System, for example. DOD’s weapons systems depend on GPS today, but everyone else has it as well. Over the last 30 years it has evolved from an exclusive and exotic military tool to a consumer service embedded in millions of smartphones.
“This dependency creates a critical vulnerability for many U.S. munitions systems,” DARPA says. And now the agency is searching for an alternative.
DARPA was created in 1958 in the wake of Russia’s launch of Sputnik, a surprise that the United States did not want repeated. Its job is to keep this country ahead of the game.
Coming off a decade of two concurrent wars and rapid technological advances, the agency took some time to assess its role going forward.
At a recent press briefing Prabhakar outlined three trends shaping the new environment DARPA finds itself in:
- The threats facing the country have shifted from a monolithic nation-state adversary to a complex of nations, terrorist and criminal organizations and individuals, all with access to advanced cyber technology.
- The U.S. military is critically reliant on this technology, which is being produced globally.
- Money for national security is likely to be tight for the foreseeable future.
“These three factors create a very challenging environment,” Prabhakar said. And this puts pressure on DARPA to keep producing asymmetric technologies — tools that have an impact far beyond the cost of development.
A case in point is a next-generation positioning system that would supplement, if not replace, GPS. The Micro-Technology for Positioning, Navigation and Timing program aims to produce self-contained chip-based systems that do not depend on GPS signals. A significant step toward this has been produced by DARPA researchers at the University of Michigan who have developed a small timing and inertial measurement unit that integrates many of the needed functions in a device smaller than a penny.
DARPA also is working to get out in front in cyberwarfare. The ominously named Plan X is an effort to move offensive cyberwar capabilities beyond the current generation of handcrafted weapons (nobody mentioned Stuxnet during the briefing) and fully integrate them into the portfolio of tactical options on the battlefield.
“I think that will be extraordinarily powerful,” Prabhakar said.
Ironically, such offensive weapons, whether launched by or against the United States, will use the Internet, another technology developed by DARPA. Once again, the agency is racing to keep ahead of itself.
Posted by William Jackson on Apr 25, 2013 at 9:39 AM
Attacks against government systems dropped sharply in 2012 compared with the year before, according to the latest Internet Security Threat Report from Symantec, but that does not mean that the pressure is off. Attackers are just changing their tactics by targeting upstream companies in the government supply chain.
“There has been a marked shift” in targeting, said Paul Wood, Symantec’s cybersecurity intelligence manager. Attackers seem to be shifting their sights to the manufacturing sector, and often to smaller companies that offer softer targets, he said.
The most recent report analyzes attack data gathered during the 2012 calendar year from Symantec’s Global Intelligence Network and its cloud-based Web and e-mail security services.
The shift is evident in the lists of most commonly targeted sectors for the last two years. In 2011 government was the most-targeted sector, with 25 percent of identified attacks. In 2012 it moved to fourth place, with just 12 percent. In the same period, the manufacturing sector went from third place to the top of the list, accounting for 24 percent of attacks last year.
But “manufacturing” is a broad classification and the figures become more interesting when you break them down. “The vast majority seem to be in the defense realm,” Wood said. Six of the 10 most frequently targeted companies are defense industry contractors.
In an increasingly global, off-the-shelf IT environment, supply chain security has become a major concern for agencies and steps are being taken to identify trusted suppliers. In addition to the risk of counterfeit or compromised products and components, vendors and private-sector partners also can be back doors into well-defended government systems. Homeland Security and the Defense Department address this issue in the Defense Industrial Base program to streamline the sharing of intelligence with supply chain partners.
But protecting an entire chain that handles sensitive information can be difficult. The percentage of small to medium-sized businesses being targeted has increased sharply in the last year, from 18 percent in 2011 to 31 percent in 2012.
“When you look at the supply chain, the small business is perhaps the weakest link,” Wood said. A small upstream partner could provide the access and information an attacker could use to successfully social engineer an attack against a larger partner.
It is difficult, if not impossible, to identify the source of many attacks, and because those being analyzed were the ones that were identified and blocked, it is hard to say for sure what the attackers would have done had they been successful. But the shift shows that the attackers are motivated, disciplined and persistent. The worst kind of attacker.
Posted by William Jackson on Apr 17, 2013 at 9:39 AM
The evolution of IT can take place at revolutionary speed, and when systems don’t keep up with the pace of change they can become vulnerable to serious risks, says retired Lt. Gen. William T. Lord, former Air Force CIO.
“I think that the next Achilles’ heel is legacy software,” Lord said.
A combination of unsupported software, well-known vulnerabilities and new applications that expose old platforms to networks can create unnecessary complexity and open critical systems to threats, he said.
Not every piece of old software is a risk, however. “Some of the things we use in our nuclear command and control are so old, but so reliable and unconnected to anything else, that it probably does not pose a threat,” Lord said. “But our problem is that most of our legacy systems in government are 20 or 30 years old,” and need to be updated.
Fixing this installed problem will require more flexible contracting to let government take advantage of smaller, more nimble contractors. Lord, who now is an IT systems and services consultant, is making legacy software something of a crusade in his post-military career, calling it the greatest obstacle to IT progress in government.
Defining “legacy software” can be difficult. Some would argue that any software in use can be called legacy, because if you’re using it, it’s already old. Most would agree that any software still in use that is not supported by its developer or vendor could be classed as legacy. There is a huge installed base of this. A recent analysis by the Web security company Websense, for example, found that three-quarters of government computers are running unsupported versions of Java.
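An audit for that kind of exposure can start small. The sketch below, a hedged illustration rather than anyone's official tool, parses the banner line printed by `java -version` and flags anything below an assumed support floor; both the floor chosen here and the sample banners are illustrative, though the `1.major.0_update` version style was the convention of that era.

```python
import re

# Hypothetical support floor: treat anything below Java 7 update 21
# as unsupported. Adjust to whatever the vendor currently supports.
SUPPORTED_FLOOR = (1, 7, 21)

def parse_java_version(banner):
    """Extract (major, minor, update) from a `java -version` banner line."""
    m = re.search(r'version "(\d+)\.(\d+)\.\d+(?:_(\d+))?"', banner)
    if not m:
        return None
    major, minor, update = m.groups()
    return (int(major), int(minor), int(update or 0))

def is_unsupported(banner):
    """Flag unparseable banners as well as versions below the floor."""
    version = parse_java_version(banner)
    return version is None or version < SUPPORTED_FLOOR

print(is_unsupported('java version "1.6.0_45"'))  # → True
print(is_unsupported('java version "1.7.0_25"'))  # → False
```

Run across an inventory of hosts, even a crude check like this turns "most of our machines run old Java" from an anecdote into a list of systems to patch or retire.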
Getting rid of legacy software is even harder than defining it. Wholesale programs can be expensive and often end in failure. The Air Force in 2004 began a program to replace 240 outdated systems in its Expeditionary Combat Support System with an Enterprise Resources Planning system. A contract was awarded to Computer Sciences Corp. in 2006 and terminated six years and $1 billion later. “The effort got stopped,” Lord said.
The problems included “budget doldrums,” which complicate almost any kind of project, and the difficulty of finding a good time to replace operational systems. This can be particularly difficult with combat support systems when the combat never stops, Lord said. “In my experience in the Air Force, there was no end to the battle.”
The skills needed to update, modernize or replace legacy software can come from non-traditional service providers, he said — smaller software companies that often do not have the resources to compete in the government market. It would help to have major league contractors partner with the minor league companies for government contracts, but there often is little government incentive for this.
Agencies are supposed to make small and minority-owned business contracts, but accounting policies give contracting officers little credit for acquiring services from small companies through a larger contractor, Lord said.
Another problem is a lack of dedicated money for fixing vulnerabilities in old applications. The Air Force sets aside money for hurricane damage but not for software bugs, which makes maintaining old software difficult. Government needs to realize that vulnerabilities are as inevitable as bad weather, Lord said. “We haven’t caught up with that kind of thinking.”
Posted by William Jackson on Apr 09, 2013 at 9:39 AM