The need for timely sharing of information about both potential and actual attacks has been a prime focus for government and industry cybersecurity for at least the past decade. The 9/11 Commission report first brought to light the lack of intelligence sharing among agencies, for example, and that lack was seen extending into the cybersecurity realm.
The language used in the report, though aimed at terrorism, speaks as much to the problems surrounding cybersecurity today. The events of 9/11 showed "an enemy who is sophisticated, patient, disciplined, and lethal," and also the "fault lines within our government (and the) pervasive problems of managing and sharing information across a large and unwieldy government."
The Obama administration's most recent push to improve U.S. cybersecurity tries to ratchet up efforts to improve information sharing both within government and with the private sector. Shortly after those proposals were unveiled, the administration announced the formation of a new Cyber Threat Intelligence Integration Center that's intended to be the government's focal point for rapid collection and dissemination of information on cyberthreats.
How far this will go is an open question. While some have welcomed the new proposals, others wonder if the new center will just add to the organizational confusion. The National Security Agency, the Department of Homeland Security, the FBI and the military already have responsibility for collecting this kind of information and, after years of acrimony and pushback, they've managed to develop cohesion about sharing it.
Technically, the tools for sharing have also progressed, leading to a number of acronymic specs such as TAXII (the Trusted Automated eXchange of Indicator Information), STIX (the Structured Threat Information eXpression) and CybOX (the Cyber Observable eXpression). Joining them recently is the Data Aggregation Reference Architecture (DARA), a first response to the 2012 National Strategy for Information Sharing and Safeguarding.
These and other tools all perform important roles. DARA, for example, is aimed at providing a model for how various groups can pull data sets together in order to improve security while also protecting individual privacy, which has been one of the big stumbling blocks to sharing of information.
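To make the idea concrete, here is a minimal sketch of the kind of structured indicator record that formats like STIX are built to carry. The field names and values below are illustrative only, not the actual STIX schema, and the domain is a placeholder:

```python
import json

# A hypothetical, simplified threat indicator record. Real STIX documents
# define a much richer schema; this only illustrates the general shape of
# the data that sharing formats are designed to exchange.
indicator = {
    "title": "Phishing domain observed in spear-phishing campaign",
    "type": "domain-watchlist",
    "observable": {"domain": "malicious.example.com"},  # placeholder domain
    "confidence": "High",
    "valid_from": "2015-02-01T00:00:00Z",
    "handling": "TLP:AMBER",  # traffic-light marking that limits redistribution
}

def serialize(ind):
    """Serialize an indicator for exchange over a transport such as TAXII."""
    return json.dumps(ind, sort_keys=True)

print(serialize(indicator))
```

The point of the structure is machine readability: a receiving organization's defenses can act on the `observable` field automatically, while the `handling` marking governs how far the record may be re-shared.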
But is all of this enough? If 2014 showed anything, it's that cybersecurity efforts are falling behind the speed and the level of sophistication attackers apply to the way they get threats into the cyber infrastructure. President Obama in fact mentioned the attack on Sony Pictures late last year as just the latest reason behind his new legislative proposals.
Industry looks to the government for a lead on many aspects of cybersecurity, but the fact is that government is not noted for its speed in dealing with cyber threats, or for convincing industry to share information about attacks with it. However, it is trying. The FBI, for example, released an unclassified version of its Binary Analysis Characterization and Storage System (BACSS) as an additional incentive to public/private sharing.
Now industry seems to be expanding its own efforts to improve sharing. Facebook has launched a framework for "importing information about [threats] on the Internet in arbitrary formats, storing it efficiently, and making it accessible for both real-time defensive systems and long-term analysis." Early partners already include Bitly, Dropbox, Pinterest, Tumblr, Twitter and Yahoo.
Microsoft last year also introduced Interflow, its own attempt to collaborate more closely with the cybersecurity community. That adds to a number of other international collection and sharing efforts, as well as the global infrastructures that individual security companies have established to collect information about threats.
There are still major barriers to sharing, particularly privacy and the need for encryption. How government manages to live within, and profit from, this growing sharing ecosystem while improving how fast it reacts to threats is the real question it has to address.
Posted by Brian Robinson on Feb 13, 2015 at 9:41 AM
It’s become an article of faith that you can’t accomplish real advances in cybersecurity until you get the executive suite involved, and that applies as much to government as to private industry. Well, you can’t get much higher on the ladder than the rarefied atmosphere at Davos in Switzerland, where the elite’s elite gather yearly to discuss issues affecting the world’s economic health.
Cybersecurity was a main topic this year. It’s been on the agenda before, or at least has been talked about in the hallways, but after last year’s horrendous breaches and Black Hat successes, the consensus was that cyber requires an urgent focus.
Among the most critical topics, security experts at Davos warned about the increasing dangers from the Internet of Things – dubbed the Internet of Threats by Kaspersky Lab’s Eugene Kaspersky – and the fact that cybercrime is becoming much more professionalized, with both criminal and terrorist actors now in the game.
As much as technology offers promise for economic growth, they seemed to say, it also dramatically increases the ways attackers can threaten the viability of public and commercial enterprises.
The same message can be taken from a number of wide-ranging studies recently published. Cisco’s 2015 Annual Security Report warned that attackers are, indeed, getting better at what they do and that users are becoming unwitting enablers.
For example, said Jason Brvenik, principal engineer with Cisco’s Security Business Group, online spam had been decreasing recently but showed a 250 percent increase in 2014 compared to the previous year. Attackers are now also using it for phishing to directly target users.
“With the emphasis by organizations now on protecting data and IT assets, attackers are increasingly challenged in attacking those kinds of targets,” he said. “So, they are paying more attention to users who may have the (network) credentials that will let them get inside the enterprise.”
Despite the warnings, organizations are still not up to speed with what’s needed for this emerging ecosystem of cyber threats. Only 38 percent of those surveyed by Cisco said they use patching and system configuration updates to boost their defensive capabilities, which Brvenik said is considered an effective security practice. And more than half of all installed versions of OpenSSL – whose vulnerabilities led to the catastrophic Heartbleed bug – were found to be older than 50 months, and therefore still wide open to attack.
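The "50 months" figure is simple version-age arithmetic. A small sketch: the release date below is that of OpenSSL 1.0.1 (the first Heartbleed-affected branch, released in March 2012); the survey date is an assumption for illustration:

```python
from datetime import date

def age_in_months(release: date, as_of: date) -> int:
    """Whole months between a software release and a given audit date."""
    return (as_of.year - release.year) * 12 + (as_of.month - release.month)

release = date(2012, 3, 14)   # OpenSSL 1.0.1 release
as_of = date(2014, 12, 1)     # assumed audit date

print(age_in_months(release, as_of))  # 33 months - well past typical patch cycles
```

By this measure, an installation older than 50 months in late 2014 would predate mid-2010 – several major branches and many security fixes behind.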
Here’s another frightening statistic: As organizations actually start moving their data to the cloud, they may be opening up other avenues for attackers to exploit, according to the Cloud Security Alliance. In its 2014 Cloud Adoption, Practices and Priorities Survey Report (CAPP), it found that three quarters of respondents were aware of the need for security, but nearly as many admitted that they didn’t know the number of ‘shadow IT’ apps within their organization.
Shadow IT is a very modern dilemma. It refers to the users within organizations who tend to use whatever technology they can find to make themselves more productive. So they reach for software that is easily downloadable and configurable – without informing the IT department. What’s more, each line of business uses what apps it deems appropriate, without coordinating with other groups in the same organization.
That leads to situations such as one described by Kamal Shah, vice president of CSA member Skyhigh Networks. One of its customers ended up with 27 different file sharing services in use among its more than 80,000 employees, many of those services failing to meet the company’s security and use policies.
“We’re seeing more and more CSOs and CIOs having cloud security discussions (with senior executives) in both the government and private sectors,” Shah said. “However, while there’s been a lot of talk recently about the increase in sophistication of threats, and cloud security is starting to get to where it needs to be, we’re only in the early innings of this.”
One of the problems could be that, according to a survey by EMC Corp., data protection is still treated as a standalone function for making sure data is always available, rather than as a practice that’s part of overall cybersecurity measures.
“But we definitely think it should be, since we are talking about (overall) protection of data, and cybersecurity is a part of that,” said Gregg Mahdessian, EMC’s director of federal sales, Data Protection Solutions.
What’s the takeaway from all of this? As Davos and the various surveys show, there’s now no shortage of awareness of the magnitude of the problem. Executives at the highest level in both public and private organizations know that attackers are getting better at what they do, and therefore security also needs to get better. The disconnect is in understanding how technologies and tools can best be deployed to make that happen.
“The trend in days past was that the more invisible security was, the more effective it was being,” said Cisco’s Brvenik. “That led, in many cases, to organizations putting security technology in place and then forgetting it was there.”
That needs to change, he said. Now there needs to be high visibility into what security tools provide and why, which will lead to a greater understanding – from the executive level down to individual users – of those tools and the secure processes they embody.
If the user is the new focus for attackers, an informed user will be the best defense.
Editor's note: This blog was changed Feb. 2 to correct the spelling of Gregg Mahdessian's name.
Posted by Brian Robinson on Jan 30, 2015 at 10:54 AM
If there’s any one thread that can be pulled from the cybersecurity stories of 2014, it has to be the increasing sophistication of attacks being made against both public and private organizations. That only looks to continue in 2015, with potentially staggering losses for the victims.
A recent study commissioned by EMC Corp., with research carried out in August and September last year, found that companies on average had lost 400 percent more data since 2012, with losses and downtime costing enterprises some $1.7 trillion.
The “good” thing was that the number of data loss incidents had decreased over time, though this was balanced by the fact that the volume of data lost in each incident “is growing exponentially.”
In what is becoming a core focus for cybersecurity, the report pointed out that a confluence of factors, including big data, mobile and hybrid cloud technologies – all of which are central to most organizations’ business plans – are creating new challenges for data protection.
And it seems many organizations are not ready for these inroads, with over half saying they have no disaster recovery plan for any of the three environments, and nearly two-thirds rating them as difficult to protect.
One of the more recent cyber attacks to be uncovered showed just how clever attackers have become and consequently how dangerous it is for organizations to be unprepared. At the same time that bad guys were siphoning off Sony Pictures’ secrets, Blue Coat researchers discovered new Android malware targeting high-profile victims in government, finance, military and engineering in at least 37 countries.
The techniques used in the attack go well beyond those typically seen in Android malware: they were designed to record the audio of mobile phone calls. Given the list of government and embassy targets, the attack "appears to be a well-executed plan to get access to confidential or insider information from high-profile targets across critical sectors."
This comes just weeks after the announcement of the Regin Trojan, another highly complex and very patient attack aimed at monitoring the phones and networks that use the Global System for Mobile Communications (GSM) standard, which has more than a 90 percent share of the world’s mobile market.
What’s clear is that the cyber threat ecosystem is becoming much more diverse and much deeper than in the past, with criminals as expert on some levels as the state-sponsored threats that have received so much press lately.
It’s also become much easier and cheaper for attackers to get hold of malware and cyber weapons they can use, with the rise of a professionalized marketplace for cybercrime tools and stolen data.
The high-profile Sony attack actually says little about what the future holds for cybersecurity. Whether or not North Korea, China or other state actors were involved, the attack itself does not appear to have been highly sophisticated. The difference is that it was aimed at holding the company ransom over a film’s release rather than the monetary gains targeted by other high-profile attacks against Target, Home Depot and the like.
The Sony attack also succeeded because the company itself was unprepared, slow to detect it once it was underway and then slow to react and close it down. As the EMC report showed, if organizations are not prepared to ward off relatively simple attacks such as this, what is going to happen once the far more sophisticated and focused attacks on big data/mobile/cloud infrastructures, with their much greater potential payback, are let loose?
The story for 2015 and beyond will be what both public and private sector organizations can do to shore up their defenses against what attackers see as increasingly attractive, and highly vulnerable, targets.
Posted by Brian Robinson on Jan 09, 2015 at 8:27 AM
President Obama this month proposed a $263 million program for training and equipment to help make police departments more accountable after recent high-profile incidents of police violence in Ferguson, Mo., New York City and elsewhere. It includes $75 million over three years to help purchase 50,000 wearable body cameras for officers.
The cameras are lightweight, high-resolution alternatives to dashboard-mounted video systems already in use in many police cruisers. But the new cameras are only the first step in supporting a new video system for police. Once the cameras have been bought, departments will have to store, manage and secure terabytes of data, sometimes for decades. And because the quality of the video the cameras produce is improving, the amount of storage needed is likely to be much greater.
Many departments are familiar with the requirements of older black and white surveillance and dash-mounted video systems. But new body cameras can produce high-definition, full motion, wide-area color images. This is great for evidence but can quickly overwhelm current storage systems.
One camera can produce 2.3 gigabytes per hour, or 18.4 gigabytes per shift, said Dave Frederick, senior director of product marketing for Quantum, a storage solution provider. Because most cameras will be activated only during an incident, they probably will not be used eight hours at a stretch. But they still could produce 9 gigabytes of data per shift, he said.
How much storage this will require depends on a number of variables: The number of cameras in use, their format and resolution and the video retention policies. Department policies can call for saving video for anywhere from a month to a year, but if the video is used in court, rules of evidence can require that it be kept for years and – in the case of a conviction – possibly for the lifetime of the defendant.
One department with 1,500 officers found that it would need 700 terabytes of storage to accommodate body cameras, Frederick said, more than double what was needed by its older dash-cam system.
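The arithmetic behind those figures can be sketched as follows. The per-camera rates come from Frederick's estimates; the duty-schedule and retention assumptions are hypothetical, chosen only to show how retention policy drives the total:

```python
GB_PER_HOUR = 2.3                  # per camera, per Quantum's estimate
HOURS_PER_SHIFT = 8

worst_case_per_shift = GB_PER_HOUR * HOURS_PER_SHIFT
print(worst_case_per_shift)        # 18.4 GB if recording for the whole shift

# Cameras activate only during incidents, so assume ~9 GB per shift instead.
gb_per_shift = 9.0
officers = 1500
retained_shifts = 52               # hypothetical: ~90-day retention, ~4 shifts/week

storage_tb = officers * gb_per_shift * retained_shifts / 1000
print(round(storage_tb))           # ~702 TB, in line with the 700 TB figure
```

The same formula shows why evidence holds are so costly: stretch `retained_shifts` from a 90-day window to a multi-year court requirement and the total grows by an order of magnitude.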
Quantum proposes a tiered-storage solution that balances the cost of storage with performance. Such a system typically would include a high-performance ingest tier, using more expensive spinning disks to quickly take in new video and make it accessible.
Subsequent storage tiers could include lower-performance spinning disks and tape for archival storage, which is not as fast but is less expensive. Making such a system practical depends on the ability to automatically move data from one tier to another as needed, without violating chain of possession rules for video used as evidence.
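A tiering policy of the kind described above can be sketched as a simple age-based rule. The tier names, thresholds and the evidence-hold flag here are hypothetical, not any vendor's actual policy engine:

```python
def storage_tier(age_days: int, evidence_hold: bool) -> str:
    """Assign a video file to a storage tier by age.

    Evidence under a legal hold is never deleted, only demoted to cheaper
    media; everything else ages out per department retention policy.
    """
    if age_days <= 30:
        return "fast-disk"       # high-performance ingest tier
    if age_days <= 365:
        return "nearline-disk"   # cheaper spinning disk
    return "tape" if evidence_hold else "eligible-for-deletion"

print(storage_tier(5, False))      # fast-disk
print(storage_tier(1000, True))    # tape
```

In a real deployment the migration between tiers would also have to be logged, since an unbroken audit trail is what preserves the chain of custody the post mentions.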
Another consideration is that video being archived as evidence might have to be readable 25 years or more from now, when technology is likely to have changed dramatically. Anyone who has been stuck with a shelf of Betamax tapes can appreciate the challenge of future-proofing video archives.
The cloud could be an attractive short-term solution for storing police video, as long as security and management requirements can be met. But at some point, the long-term cost of renting storage space is likely to overwhelm the initial savings, and any department expecting to use video for the long haul will have to decide how best to acquire and manage its own storage system.
These challenges do not mean that police departments cannot or should not take advantage of new video technology to better document activities on the street. But they should remember that the camera is only the front end of a larger system, and once the “record” button is pushed, there will be an obligation to manage the video for years to come.
Posted by William Jackson on Dec 19, 2014 at 11:39 AM