What’s worse: Living with legacy systems or replacing them?

The recent revelation of a breach at the Office of Personnel Management, which may have exposed the personal information of millions of government employees, also highlights the broader problem government has with legacy systems: whether it’s worth spending the money to secure them.

Not that securing the OPM’s systems would have done much good in this case. According to Department of Homeland Security Assistant Secretary for Cybersecurity Andy Ozment, the systems were not directly penetrated; instead, attackers obtained OPM users’ network credentials and reached the systems and data from the inside.

Donna Seymour, the OPM’s CIO, told a recent hearing of the House Committee on Oversight and Government Reform that the agency was implementing database encryption, but that some of its legacy systems were not capable of accepting encryption.

Some of the OPM’s systems are more than 20 years old and written in COBOL, she said, and would require a full rewrite to add encryption and other security measures such as multifactor authentication.

This is a government-wide problem. Many of the financial and administrative systems central to agencies’ daily operations are written in nearly 60-year-old COBOL. Most agency CIOs have targeted those systems for replacement, but it’s not a simple rip-and-replace job: any mistake could severely impair an agency’s ability to fulfill its mission.

For that reason, many agencies have chosen to maintain those systems for now, but that’s not cheap either. The OPM itself said last year that maintaining its legacy systems could cost 10 to 15 percent more each year as people with the right kind of expertise retire. And across government, legacy systems account for over two-thirds of annual IT spending.

That expertise is unlikely to be replaced. Colleges aren’t turning out COBOL-trained coders anymore, and with COBOL far down the list of popular languages, that won’t change. Agencies could bring in consultants to rewrite the code, but again, that’s not cheap.

And COBOL is unlikely to disappear anytime soon. Because of its ubiquity and utility, many organizations will continue to use COBOL until it’s pried out of their cold, dead hands. Meanwhile, old mainframe companies that have recently refocused on the cloud continue to update their COBOL tools to keep pace with current IT trends.

It’s not as if problems with legacy systems were the only reason for the breaches at OPM. Lawmakers also berated agency officials for their lack of attention to security governance issues that had been brought up years ago and were highlighted yet again last year in an OPM Inspector General report.

But the legacy issues are real and, according to some reports, extend even to “legacy” security systems such as signature-based firewalls, intrusion prevention systems and other widely installed devices that are just not capable of stopping modern, fast, sophisticated and chameleon-like threats.

However, at least the situation with the federal government is probably not as bad as that of a public school district in Grand Rapids, Mich., which is still running the air conditioning and heating systems for 19 schools using a Commodore Amiga -- as in the 1980s-era personal computer that was popular for home use -- because a replacement system reportedly will cost up to $2 million.

At least, we hope not.

Posted by Brian Robinson on Jun 19, 2015 at 10:55 AM


IT security’s blind spot

Good network and data security is made up of several parts. Technology and culture are certainly important elements, but so is perception: if you don’t know what’s going on in your IT infrastructure, how can you be sure you’re as well protected as you think? Hubris has been behind many of the most serious breaches of the past few years.

That seems equally true for emerging infrastructures that include cloud services. Many organizations and government agencies have not yet moved to the cloud, or have done so only hesitantly, and they may be fooling themselves in thinking they have the transition under control and can manage the security implications.

Skyhigh Networks took a look at this, using anonymized, real-world usage data collected from public sector organizations in the United States and Canada. In its Cloud Adoption and Risk in Government report for the first quarter of this year, Skyhigh discovered, among other things, that government on average was underestimating its employees’ use of cloud services by more than tenfold.

“That’s startling because we tend to think the government sector is very locked down,” said Kamal Shah, vice president of product and marketing at Skyhigh. “In reality, employees are finding and using cloud services to help them get their jobs done, regardless of what the official policies are.”

When Skyhigh asked government IT officials how many services they thought employees were using, the estimates ranged between 60 and 80, Shah said. The study found the average public sector organization actually uses 742 distinct cloud services.

If that sounds like a lot, consider that Skyhigh already tracks some 12,000 unique services in its database and adds around 500 new ones each month. There’s plenty of room for that average to climb higher.

This is all part of the unregulated shadow IT mess government already faces, and it threatens to get much worse over the next few years with the rise of the Internet of Things -- dubbed the Internet of Threats by some -- a prospect that is spooking organizations around the world into searching for answers. If agencies thought they had a problem with BYOD, they haven’t seen anything yet.

The kind of overconfidence Skyhigh surfaced is showing up in other reports as well. Cybersecurity company RedSeal recently surveyed 350 C-level executives, 60 percent of whom said they could “truthfully assure the board beyond a reasonable doubt” that their organization was secure. As RedSeal pointed out, those assurances came even as multiple reports put the incidence of network breaches at up to 97 percent of all companies.

What seems clear from the RedSeal survey is that most executives have little idea what’s really happening in their networks. In a clear repudiation of that “beyond a reasonable doubt” claim, 86 percent of respondents acknowledged gaps in their network visibility, and almost as many admitted those gaps make it impossible to secure their networks effectively.

Even apart from the security implications, executives’ lack of insight into how IT is used in their organizations causes problems. Skyhigh found, for example, that 120 of the 742 services in use at the average agency were for collaboration. Delivering that many overlapping services puts a heavy load on IT and injects confusion into what should be a well organized affair; fewer services would actually aid collaboration, since more people would be working on the same platforms.

As far as security is concerned, all this shadow IT greatly increases the chance that networks will be breached and, in particular, that users will have their network identification and authentication credentials stolen. The average public sector employee’s online movements are monitored by 2.7 advertising and web analytics tracking services, Skyhigh pointed out, and those services are increasingly used by cybercriminals as a base for so-called watering hole attacks.

This is important, Shah said, because attackers increasingly use the cloud to move data out of organizations. At one agency, the company spotted more than 86,000 tweets coming from a single IP address in one day. When the machine at that address was isolated, the agency found a bot that was exfiltrating data 140 characters at a time.

It’s an example, Shah said, of a simple fact: if you can’t analyze the data that would reveal anomalous behavior, you won’t find it. “That’s a blind spot for most organizations,” he said.
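As a rough illustration of the kind of analysis Shah is describing, here is a minimal sketch that flags hosts whose daily request counts to a single outside service dwarf the rest of the fleet. The log format, threshold and function names are invented for the example; real tools work on far richer telemetry.

```python
# Minimal sketch of one anomaly check: flag hosts whose outbound request
# counts to a single external service dwarf the fleet's norm.
from statistics import median

def flag_outliers(counts_by_host, k=10):
    """Return hosts whose daily count sits more than k median absolute
    deviations above the fleet median."""
    counts = list(counts_by_host.values())
    med = median(counts)
    mad = median(abs(c - med) for c in counts) or 1  # guard against zero
    return {host: c for host, c in counts_by_host.items()
            if (c - med) / mad > k}

# A host posting 86,000 times a day stands out immediately.
daily_twitter_posts = {"10.0.0.12": 14, "10.0.0.27": 9,
                       "10.0.0.31": 86_000, "10.0.0.44": 22}
print(flag_outliers(daily_twitter_posts))  # -> {'10.0.0.31': 86000}
```

The point is not the arithmetic but the prerequisite: none of this is possible unless outbound traffic is being logged and aggregated in the first place.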

All of this is fueling a new category of security services focused on visibility into cloud data traffic: cloud access security brokers (CASBs). The category emerged in 2012, Gartner said, and CASBs are set to become an essential component of software-as-a-service deployments by 2017.

Hype aside -- and not forgetting that Skyhigh, RedSeal and others have a vested interest in selling these services -- these reports and surveys at a minimum indicate that overconfidence among those charged with securing IT is at least as big a problem as the technological challenges they face. It might also be the toughest to fix.

Posted by Brian Robinson on Jun 05, 2015 at 12:36 PM


More bad news: The bad guys are getting better

If there’s one lesson to be gained from all the security breaches and revelations of major bugs in security protocols in 2014, it’s that attackers are upping their game and finding more opportunities. That’s only reinforced by several new studies.

German security company G Data, for example, reported a huge increase in the number of new malware strains in the second half of the year -- on average, a new type was discovered every 3.75 seconds! For the year as a whole, just under six million new malware strains were seen in the wild, some 77 percent more than 2013's total.

Not all kinds of malware saw an increase. Malware exploiting backdoor vulnerabilities in software fell, for example, and worms and spyware remained relatively flat. But rootkits, while still a very small share of overall malware, jumped more than tenfold in the second half of the year.

Rootkits are components bundled with malware that embed the malicious payload deep in a system and hide it from the scanners and monitors used to detect it, helping the attack persist.

Not surprisingly, malware developers are mainly targeting the ubiquitous Microsoft platforms, with malware programmed as .NET applications continuing to rise. Overall, Windows malware made up 99.9 percent of the new variants.

More problems could arise with Microsoft’s withdrawal of support for Windows XP in April last year, G Data said, because systems still using this operating system are “unprotected against attacks on existing or newly discovered security holes going forward.”

Akamai Technologies' most recent State of the Internet report similarly found that distributed denial of service attacks more than doubled in the first quarter of 2015 compared with the first quarter of 2014, and rose more than 35 percent over the final quarter of 2014.

DDoS attacks may not be such a big deal for the public sector, which draws only around two percent of the total. But Akamai noted a potentially dangerous trend in the 2015 attacks: peak attack sizes of 100 Gbps or more made up a significantly bigger share of the total, suggesting attackers have been developing better ways to maximize the impact of their work.

At the rate attacks are progressing, Akamai said, security researchers are concerned about what attackers may be able to accomplish by this time next year. Add the fact that employing current attack techniques “has not required much skill,” and even relatively inexperienced attackers could cause major damage as more potent tools enter the picture and attack bandwidth increases.

And what, then, to make of the recent news that the Defense Department is going to take a “no holds barred” approach with users who threaten security with sloppy cyber habits? Bad cyber hygiene “is just eating our shorts,” according to David Cotton, deputy CIO for Information Enterprise at the Pentagon.

Users will be given a very short time to comply with DOD password-security policies or to change behavior that invites phishing attacks through third-party social media accounts. The Pentagon is also pushing vendors to deliver more timely patches for security vulnerabilities, though recent research points to the need to apply those patches to all hosts at the same time.

The DOD, along with the intelligence agencies, is considered better at security than most other parts of the government, so it’s a little startling to read that the Pentagon’s crackdown is aimed at giving department leadership “a consolidated view of basic network vulnerabilities.”

Isn’t this supposed to be the very first thing organizations do when assessing security needs? And if the DOD doesn’t even have this bit of the puzzle sorted out, how is it ever going to successfully defend against the threats indicated by the G Data and Akamai reports?

Perhaps it’s finally time for government organizations to give up on user-focused security. The Cloud Security Alliance’s “Dark Cloud” project could be one way of doing that.

Posted by Brian Robinson on May 22, 2015 at 8:37 AM


The new perimeter and the rise of IDaaS

Identity management has been a major focus in security for a long time, and in government that stretches at least as far back as the implementation of HSPD-12 in 2005. The Obama administration ratcheted the effort even higher in 2012 when it released the National Strategy for Trusted Identities in Cyberspace (NSTIC).

Strong identity solutions have become even more vital following the rash of high-profile breaches of both public- and private-sector sites last year. An executive order from President Barack Obama duly followed late last year, requiring agencies to cut down on identity-related crime by issuing credentials with stronger security.

And identity will become even more of an issue as agencies finally start moving more of their IT needs to the cloud. Critical data will stay behind agency firewalls in private clouds, but other services and applications will migrate to the public cloud. And “extending an organization’s identity services into the cloud is a necessary prerequisite for strategic use of on-demand computing resources,” according to the Cloud Security Alliance.

That’s easier said than done. Agencies are tightly wedded to their onsite identity and access management (IAM) systems, which generally use Active Directory (AD) and the Lightweight Directory Access Protocol (LDAP) and over time have been shaped by each agency’s individual policies and specific needs. What’s needed is federated identity management for hybrid clouds, allowing agencies to extend those AD/LDAP systems into the cloud.
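To make that handoff concrete, below is a minimal, hypothetical sketch of the on-premises half of a federated login: before a cloud service issues its own token, it asks the agency directory to vouch for the user’s credentials. The server address, directory layout and use of the third-party ldap3 Python library are illustrative assumptions, not a description of any agency’s setup.

```python
# Hypothetical on-prem half of a federated login: verify a user against
# an existing AD/LDAP directory before any cloud-side token is issued.
# The hostname and DN layout below are placeholders.
from ldap3 import Server, Connection, ALL

def verify_against_directory(username: str, password: str) -> bool:
    """Attempt a simple bind as the user; a successful bind means the
    on-premises directory vouches for the credentials."""
    server = Server("ldaps://dc.agency.example", get_info=ALL)
    user_dn = f"uid={username},ou=people,dc=agency,dc=example"
    try:
        with Connection(server, user=user_dn, password=password) as conn:
            return conn.bound
    except Exception:
        return False  # bad credentials or unreachable directory
```

In a real federation, a check like this would sit behind a standard protocol such as SAML or OpenID Connect rather than being called directly by the cloud provider.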

Cue the rise of identity-as-a-service (IDaaS). It’s a generic term that, according to the CSA, covers a number of services needed for an identity ecosystem -- policy enforcement points, policy decision points and policy access points -- as well as related services that supply entities with identity and reputation.
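Those components divide responsibility in a consistent way: a decision point evaluates requests against policy, and an enforcement point sits in the request path and consults it. The sketch below illustrates the split with an invented policy format and attribute names; real deployments would use a standard policy language such as XACML.

```python
# Illustrative PEP/PDP split. The policy entries and attributes are
# invented for this example, not drawn from any real system.

POLICIES = [  # entries like these would be managed by an administration point
    {"resource": "payroll-db", "role": "hr-admin", "action": "read"},
    {"resource": "payroll-db", "role": "hr-admin", "action": "write"},
    {"resource": "intranet",   "role": "employee", "action": "read"},
]

def decide(role: str, resource: str, action: str) -> bool:
    """Policy decision point: evaluate a request against the policy set."""
    return any(p["resource"] == resource and p["role"] == role
               and p["action"] == action for p in POLICIES)

def enforce(user: dict, resource: str, action: str) -> str:
    """Policy enforcement point: intercept the request and ask the PDP."""
    if not decide(user.get("role", ""), resource, action):
        raise PermissionError(f"{user.get('name')} may not {action} {resource}")
    return f"{action} on {resource} permitted"

print(enforce({"name": "alice", "role": "hr-admin"}, "payroll-db", "read"))
```

The value of the split is that enforcement can live anywhere -- a gateway, an agent, a cloud service -- while the policy logic stays in one place.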

Cloud providers such as Microsoft and Amazon already offer cloud-based directories that sync with on-premises systems. But Gartner expects full-blown IDaaS to make up a quarter of the total IAM market in 2015, versus just four percent in 2011, as cloud computing accounts for the bulk of new IT spending by 2016.

That’s driving development of new, natively cloud-based identity solutions. Centrify, for example, which already counts a fair number of government agencies as customers for its current “cloud savvy” identity management product, recently launched Centrify Privilege Service, which it claims is the first purely cloud-based privileged identity solution.

Privileged accounts in particular have become a favorite target of cyberattacks because, once compromised, they give attackers almost unlimited freedom to roam across an organization’s systems, steal data or disrupt operations. Centrify said CPS offers a way to manage and secure privileged accounts in hybrid IT environments that legacy IAM cannot match.

However, the company still doesn’t expect it to be an easy sell, particularly in government. Though fears about the security of cloud solutions are easing, and budget pressures make the cloud an increasingly attractive answer, agencies remain wary of entrusting key assets such as privileged accounts to the cloud.

Centrify chief marketing officer Mark Weiner said that, so far, seven or eight agencies have begun playing with CPS to see what it might do for them, “though not the largest military or intelligence agencies.”

Parallel to the growing demand for IDaaS is the spread of the phrase “identity is the new perimeter” to describe the brave new world of IT. The phrase was coined years ago, but as mobile devices proliferate and the cloud becomes the primary way to deliver apps and services, the formerly hard edge of the network is growing much fuzzier.

Single logons that grant users access across these soft-edged enterprises will become ubiquitous as agencies work toward business efficiency. Making sure the identities used for that access stay secure will be key.

Posted by Brian Robinson on May 08, 2015 at 10:14 AM