2015 augurs newer, more devastating exploits on the unprepared

If there’s any one thread running through the cybersecurity stories of 2014, it is the increasing sophistication of the attacks being made against both public and private organizations. That trend looks set to continue in 2015, with potentially staggering losses for the victims.

A recent study commissioned by EMC Corp., with research carried out in August and September last year, found that companies on average had lost 400 percent more data since 2012, with losses and downtime costing enterprises some $1.7 trillion.

The “good” thing was that the number of data loss incidents had decreased over time, though this was balanced by the fact that the volume of data lost in each incident “is growing exponentially.”

In what is becoming a core focus for cybersecurity, the report pointed out that a confluence of factors, including big data, mobile and hybrid cloud technologies – all of which are central to most organizations’ business plans – are creating new challenges for data protection.

And it seems many organizations are not ready for these changes: more than half said they have no disaster recovery plan for any of the three environments, and nearly two-thirds rated them as difficult to protect.

One of the more recent cyber attacks to be uncovered showed just how clever attackers have become and, consequently, how dangerous it is for organizations to be unprepared. At the same time that bad guys were siphoning off Sony Pictures secrets, Blue Coat researchers discovered new Android malware targeting high-profile victims in government, finance, military and engineering in at least 37 countries.

The techniques used in the attack go well beyond those typically seen in Android malware. The malware was designed to record the audio of mobile phone calls, and given the list of government and embassy targets, the attack “appears to be a well-executed plan to get access to confidential or insider information from high-profile targets across critical sectors.”

This comes just weeks after the announcement of the Regin Trojan, another highly complex and very patient attack aimed at monitoring the phones and networks that use the Global System for Mobile Communications (GSM) standard, which has more than a 90 percent share of the world’s mobile market.

What’s clear is that the cyber threat ecosystem is becoming much more diverse and much deeper than in the past, with criminals as expert on some levels as the state-sponsored threats that have received so much press lately.

It’s also become much easier and cheaper for attackers to get hold of malware and cyber weapons they can use, with the rise of a professionalized marketplace for cybercrime tools and stolen data.

The high-profile Sony attack actually says little about what the future holds for cybersecurity. Whether North Korea, China or other state actors were or were not involved, it seems the attack itself was not highly sophisticated. The difference is that it was aimed at holding the company ransom over a film’s release rather than the potential monetary gains targeted by other high-profile attacks against Target, Home Depot and the like.

The Sony attack also succeeded because the company itself was unprepared, slow to detect it once it was underway and then slow to react and close it down. As the EMC report showed, if organizations are not prepared to ward off relatively simple attacks such as this, what is going to happen once the far more sophisticated and focused attacks on big data/mobile/cloud infrastructures, with their much greater potential payback, are let loose?

The story for 2015 and beyond will be what both public and private sector organizations can do to shore up their defenses against what attackers see as increasingly attractive, and highly vulnerable, targets.

Posted by Brian Robinson on Jan 09, 2015 at 8:27 AM


Police body cameras are only one piece of the video equation

President Obama this month proposed a $263 million program for training and equipment to help make police departments more accountable after recent high-profile incidents of police violence in Ferguson, Mo., New York City and elsewhere. It includes $75 million over three years to help purchase 50,000 wearable body cameras for officers.

The cameras are lightweight, high-resolution alternatives to the dashboard-mounted video systems already in use in many police cruisers. But the new cameras are only the first step in supporting a new video system for police. Once the cameras have been bought, departments will have to store, manage and secure terabytes of data, sometimes for decades. And because the quality of the video the cameras produce is improving, the amount of storage needed is likely to be much greater.

Many departments are familiar with the requirements of older black-and-white surveillance and dash-mounted video systems. But new body cameras can produce high-definition, full-motion, wide-area color images. This is great for evidence but can quickly overwhelm current storage systems.

One camera can produce 2.3 gigabytes per hour, or 18.4 gigabytes per shift, said Dave Frederick, senior director of product marketing for Quantum, a storage solution provider. Because most cameras will be activated only during an incident, they probably will not be used eight hours at a stretch. But they still could produce 9 gigabytes of data per shift, he said.
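Those figures are easy to sanity-check. A minimal sketch of the arithmetic (the 50 percent activation rate is an illustrative assumption, chosen to show how an incident-only estimate in the 9-gigabyte range can arise):

```python
# Back-of-the-envelope body-camera storage math, using the
# figures quoted above: 2.3 GB per hour over an 8-hour shift.

GB_PER_HOUR = 2.3
SHIFT_HOURS = 8

# Continuous recording for a full shift.
full_shift_gb = GB_PER_HOUR * SHIFT_HOURS            # 18.4 GB

# Incident-only recording; assume the camera is active for
# roughly half the shift (an illustrative assumption).
incident_only_gb = full_shift_gb * 0.5               # 9.2 GB

print(full_shift_gb, incident_only_gb)
```

The continuous-recording figure matches the 18.4 gigabytes per shift quoted above, and the half-shift assumption lands close to the 9-gigabyte incident-only estimate.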

How much storage this will require depends on a number of variables: the number of cameras in use, their format and resolution, and video retention policies. Department policies can call for saving video for anywhere from a month to a year, but if the video is used in court, rules of evidence can require that it be kept for years and – in the case of a conviction – possibly for the lifetime of the defendant.

One department with 1,500 officers found that it would need 700 terabytes of storage to accommodate body cameras, Frederick said, more than double what was needed by its older dash-cam system.
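A rough sketch shows how a fleet-level figure like that can arise. The per-shift and retention parameters below are illustrative assumptions, not the department's actual policy:

```python
# Rough fleet-wide storage estimate for body-camera video.
# All parameters are illustrative assumptions chosen to show
# how a total on the order of 700 TB can arise.

OFFICERS = 1500
GB_PER_SHIFT = 9       # incident-only recording, per the estimate above
SHIFTS_RETAINED = 52   # assume roughly one shift's video retained per week, per year

total_gb = OFFICERS * GB_PER_SHIFT * SHIFTS_RETAINED
total_tb = total_gb / 1000

print(total_tb)  # 702.0 -- in line with the 700 TB figure quoted
```

The point of the exercise is not the exact numbers but how quickly the total grows: every variable (officers, per-shift volume, retention period) is a multiplier.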

Quantum proposes a tiered-storage solution that balances the cost of storage with performance. Such a system would include a high-performance ingest tier, typically using more expensive spinning disks to quickly take in new video and make it accessible.

Subsequent storage tiers could include lower-performance spinning disks and tape for archival storage, which is not as fast but is less expensive. Making such a system practical depends on the ability to automatically move data from one tier to another as needed, without violating chain of possession rules for video used as evidence.
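The tiering logic described above can be sketched as a simple age-based policy. The tier names and age thresholds here are hypothetical illustrations, not any vendor's actual defaults:

```python
# Minimal sketch of an age-based storage-tiering policy for police video,
# of the kind described above. Tier names and thresholds are hypothetical.

def choose_tier(age_days: int, is_evidence: bool) -> str:
    """Pick a storage tier for a video clip based on its age.

    Evidence clips are never deleted here; they simply migrate to
    cheaper tiers, which is what makes automatic movement compatible
    with chain-of-custody requirements.
    """
    if age_days <= 30:
        return "fast-disk"      # high-performance ingest/review tier
    if age_days <= 365:
        return "capacity-disk"  # lower-cost, lower-performance spinning disk
    if is_evidence:
        return "tape-archive"   # cheapest tier for long-term retention
    return "expired"            # non-evidence video past its retention period

assert choose_tier(5, False) == "fast-disk"
assert choose_tier(400, True) == "tape-archive"
assert choose_tier(400, False) == "expired"
```

In a real system the migration would be driven by a scheduler and logged, so every move of an evidence clip remains auditable.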

Another consideration is that video being archived as evidence might have to be readable 25 years or more from now, when technology is likely to have changed dramatically. Anyone who has been stuck with a shelf of Betamax tapes can appreciate the challenge of future-proofing video archives.

The cloud could be an attractive short-term solution for storing police video, as long as security and management requirements can be met. But at some point, the long-term cost of renting storage space is likely to overwhelm the initial savings, and any department expecting to use video for the long haul will have to decide how best to acquire and manage its own storage system.

These challenges do not mean that police departments cannot or should not take advantage of new video technology to better document activities on the street. But they should remember that the camera is only the front end of a larger system, and once the “record” button is pushed, there will be an obligation to manage the video for years to come.

Posted by William Jackson on Dec 19, 2014 at 11:39 AM


Cyberattack ‘platforms’ call for defense in depth – and breadth

It’s getting a lot harder to be impressed by the latest piece of malware or cyber threat that hits the streets, given the already formidable arsenal that has been created for hackers to choose from. The everyday distributed denial of service (DDoS) threat now seems almost quaint. Then along comes Regin.

To be more precise, along comes Backdoor.Regin, recently discovered and described in detail by Symantec. What astounds about this Trojan is not just its complexity, but the time it’s taken for it to mature into its current state.

Symantec has traced attacks back to at least 2008, and some reports suggest components of Regin go as far back as 2003.

That takes the definition of Advanced Persistent Threat (APT) to a new level. And it may go even further since Symantec warns that analysis of it will probably reveal much more.

“Threats of this nature are rare and only comparable to the Stuxnet/Duqu family of malware,” it said. “Many components of Regin remain undiscovered, and additional functionality and versions may exist.”

The company describes Backdoor.Regin as a multi-staged threat, with all but the first stage hidden and encrypted. It also uses a modular approach and can be tailored with custom features for specific targets. Based on what’s been discovered so far, it has dozens of potential payloads.

In its own analysis, security firm Kaspersky Lab said malware is not an accurate description of Regin; it is better seen as a cyberattack platform, which attackers deploy to gain total remote control of networks at all levels. According to Kaspersky, Regin is one of the most sophisticated attack platforms it has analyzed.

“The ability of this [Regin] group to penetrate and monitor [Global System for Mobile] networks is perhaps the most unusual and interesting aspect of these operations,” the company said. “Although GSM networks have mechanisms embedded that allow entities such as law enforcement to track suspects, there are other parties which can gain this ability and then abuse it to launch other types of attacks against mobile users.”

GSM is the most widespread mobile standard, with more than a 90 percent share of the world’s mobile market. Apart from the United States, which primarily uses the Code Division Multiple Access (CDMA) standard, most countries with mobile networks use GSM.

At first glance, that would seem to make the United States safe from Regin attacks. The list of infections discovered so far includes big countries such as Russia and Germany, along with some smaller ones; the United States is notably absent.

But as it turns out, that shouldn’t necessarily offer any comfort. As a recent column here indicated, many of the most sophisticated attacks now come through the exploitation of privileged network accounts. That means that while government organizations may not be direct victims of an attack, if attackers get into the network of a trusted partner, they can eventually get to government data.

With the kind of global reach that government agencies now need to do business – even at the state and local level – no one should presume they are safe from bad guys getting into their networks and systems and stealing data.

And even if they haven’t been directly attacked, that doesn’t mean their partners have not been, nor the trusted partners of those partners and so on down the line.

Defense-in-depth has become the solution du jour for protecting data from malware and Trojans such as Regin that organizations now have to assume will penetrate their networks. Perhaps that should now be extended to a “defense-in-breadth” in order to cover vulnerabilities posed by threats outside the organization.

Modern organizations, including government agencies, have to do business with those lateral partners, so it should make sense to have such protections in place.

Posted by Brian Robinson on Dec 12, 2014 at 10:27 AM


Cybersecurity’s not done until the paperwork is finished

The Veterans Affairs Department has been dinged once again by the Government Accountability Office for lack of follow-through in its cybersecurity operations. In a recent report, VA Needs to Address Identified Vulnerabilities, the GAO warned that unless VA’s security weaknesses are fully addressed, “its information is at heightened risk of unauthorized access, modification and disclosure, and its systems at risk of disruption.”

The problem cited in the report is not so much that VA is doing a bad job securing its networks and systems, but that it has not properly documented security activities and has not developed action plans and milestones for correcting problems.

Documentation and planning are more than busywork. Although it is true that checking boxes and creating reports will not by themselves improve IT security, without them it can be difficult if not impossible to verify what has been done, that it has been done properly and that it can be repeated if necessary.

These processes can make the difference between constantly fighting brushfires and being able to effectively protect an agency enterprise and improve its security posture.

To quote a rule well-known to every government worker: The job’s not finished until the paperwork is done.

Because of its size and the amount of personal and other sensitive information it maintains, the VA is a high-value target. In January, a defect in VA’s web-based eBenefits system exposed personal data of thousands of veterans and their dependents. And in 2010, a nation-state-sponsored attack took advantage of weak technical controls to gain “unchallenged and unfettered access” to VA systems, the GAO said.

These were fairly recent hits, but the fact remains that development of an effective information security program has been a major management challenge for the department since the late 1990s.

This does not mean that VA has no information security. VA’s Network Security Operations Center in 2012 responded to an attack by outsiders, analyzing the scope of the incident and documenting its responses. Even so, “VA could not provide sufficient documentation to demonstrate that these actions were effective,” GAO said.

This problem is not limited to VA. A recent governmentwide review by GAO found that agencies were unable to document the effectiveness of their incident response about 65 percent of the time.

In the case of the 2012 VA incident cited, forensics analysis data was not available because of a lack of storage space. The department’s incident response policies also did not provide the incident response team with access to systems logs needed to fully assess the extent of the breach, which raises questions about the effectiveness of the response.

The problems are part of a vicious circle in government cybersecurity. Incident response teams are stretched thin, and their top priority is responding to the problem at hand. Documentation and policy enforcement often take a back seat. But without effective documentation and policies, it can be hard to move beyond crisis management to effectively managing risk.

As I have said before, regulatory compliance does not equal security, but it can provide an essential baseline for achieving more effective security.

Posted by William Jackson on Dec 05, 2014 at 1:08 PM