Windows XP: The undead OS

It’s been one of the longest retirement parties in the IT world, but we should finally be able to say that Windows XP is gone. Except that it isn’t, and what that means for the security of a still-large slab of government XP users is an open question.

Microsoft officially ended its support for XP in April 2014, meaning no more security patches for the venerable operating system, introduced way back in 2001. At that time it also stopped offering its Microsoft Security Essentials anti-malware software for XP, though it continued to deliver signature updates for a time.

That very last grace period finally came to an end on July 14, when Microsoft stopped shipping XP signature updates and ended support for its Malicious Software Removal Tool on XP. If your XP machine gets infected with malware from now on… tough.

OK, you say, why should that bother me since every agency must have figured this out a long time ago and ditched XP in favor of another operating system that is regularly updated?

If only that were so. There still seem to be plenty of these old systems around. Market analyst firm Net Applications earlier this year said XP accounts for nearly 17 percent of worldwide desktop operating system market share. Other analysts come in lower, but they still suggest over 10 percent of desktop users work with XP.

There are no overall figures for government, but occasional revelations indicate it’s not insubstantial. The Labor Department’s CIO was quoted earlier this year as saying there were still some 10,000 XP users in her agency, while the Navy last month signed a two-year, $9.1 million contract with Microsoft for direct support of 100,000 mission-critical systems, including thousands of XP computers.

No one expects these systems to be used forever. Labor and the Navy are both trying to transition away from these kinds of legacy systems, and so must the other agencies still running the aged operating system. However, no one knows how vulnerable the machines are.

And given the example of the recently announced breaches at the Office of Personnel Management, where OPM executives admitted the attacks on their systems could have been going on for at least a year, there’s a good chance that at least some XP systems still in use have been successfully penetrated. Attackers need only one infected machine to access other systems in the enterprise, from which they can cause damage and grab valuable data.

So, you say, at the very least agencies must be targeting these old XP systems as a priority for replacement? Again, that’s hard to say. And some recent reports and surveys indicate that desire and reality make for a hard union to consummate in government IT.

The Professional Services Council in its recent CIO survey, for example, reported that cybersecurity remains the top priority for government CIOs, but that modernizing the IT environment in a way that could aid their cybersecurity efforts remains a challenge for many, because the predominant portion of their IT budgets goes to maintaining legacy systems. The Defense Department, for one, said only 20 percent of its budget is available for investing in next-generation solutions.

The situation out in state and local government is no better. In a study of state IT investment management strategies, a National Association of State CIOs report said nearly half of state CIOs spend 80 cents of every IT dollar on maintaining existing systems.

What all of this suggests is that old Windows XP systems, particularly if they get lost in the intense competition of IT priorities, could be a problem for cybersecurity for some years yet.

According to Microsoft, Windows XP is now dead, dead, dead. Except when it’s not.

Posted by Brian Robinson on Jul 17, 2015 at 9:51 AM


Government credentials show up on paste sites

While much attention has been paid to the very public attacks on government agencies, particularly the breach at the Office of Personnel Management, less has been made of the whereabouts of the exfiltrated data. So how easy is it for John Doe to get his hands on the information let loose in these attacks? Extremely, it seems, according to one recent report.

Threat intelligence firm Recorded Future, using open source intelligence gathered from 17 paste sites over a single year ending in November 2014, discovered possible exposure of 47 U.S. government agencies across 89 unique domains. The Energy Department alone had email/password combinations posted for nine different domains.

A paste site gives users -- usually programmers and coders -- a place to store and share short items in plain text. Pastebin is the best known of these, though there are dozens of others. Anyone on the web can access them, and large companies such as Facebook have started to mine them for information to make their own sites more secure.
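
What that kind of mining looks like in practice can be sketched in a few lines of Python. The snippet below is purely illustrative (it is not Recorded Future’s tooling), and the sample paste text, the .gov-only filter and the email:password format are all assumptions:

```python
import re
from collections import Counter

# Hypothetical example: scan raw paste text for "email:password" pairs on
# .gov domains and tally apparent exposures per domain. The credential
# format and sample text are assumptions for illustration only.
CRED_RE = re.compile(r"([\w.+-]+@([\w-]+\.gov))\s*[:|,]\s*(\S+)", re.IGNORECASE)

def tally_exposures(paste_text: str) -> Counter:
    """Count apparent credential exposures per .gov domain."""
    counts = Counter()
    for _email, domain, _password in CRED_RE.findall(paste_text):
        counts[domain.lower()] += 1
    return counts

if __name__ == "__main__":
    sample = """
    alice@energy.gov:hunter2
    bob@example.com:letmein
    carol@labor.gov | Spring2015!
    """
    for domain, hits in tally_exposures(sample).items():
        print(f"{domain}: {hits} exposed credential(s)")
```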

Credentials that grant access to agency networks have become a major target for black hats because they offer an easier way into an organization’s data. In fact, most of the sophisticated attacks on government agencies have been enabled by attackers who held privileged account information.

Hackers in search of credentials often target agency contractor or business partner sites, as those organizations' employees are given agency access privileges for certain uses. And Recorded Future, in fact, found that most of the exposures at the paste sites were from these kinds of third-party websites, along with government employees using their government email accounts to register for web-based services -- a growing security concern in itself.

The Recorded Future study can’t specify the actual damage from all of this posted information, but it’s easy to infer the possibilities.

Much of the potential damage could be significantly lessened with the use of fairly simple security steps such as requiring two-factor authentication for network access. However, as the Recorded Future report pointed out, OMB has found that many major agencies don’t employ this safeguard for privileged access. The OPM breach was directly tied to this lack of two-factor authentication.
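
The second factor itself is not the hard part. As a rough, hypothetical sketch, here is what TOTP-style verification looks like using the open source pyotp library; enrollment, secret storage and account lockout, which are where the real work lies, are omitted:

```python
# A minimal TOTP sketch using the open source pyotp library
# (pip install pyotp). Only the code-verification step is shown.
import pyotp

# Generated once per user at enrollment and stored server-side.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# In real life the user reads this six-digit code from an authenticator app.
code = totp.now()

# At login, the server checks the submitted code against the shared secret.
print("second factor accepted:", totp.verify(code))
```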

Recorded Future shared the results of its analysis with the government and agencies last year, well before it made them public. It also made a list of helpful suggestions for agencies to protect themselves against the effects of the paste site exposures:

  • Enable multi-factor authentication and/or VPNs.
  • Require government employees to use stronger passwords that change with greater regularity.
  • Gauge and define use of government email addresses on third-party sites.
  • Maintain awareness of third-party breaches and regularly assess exposure.
  • Ensure the Robot Exclusion Standard (robots.txt) is set for government login pages to keep webmail and web services out of search engine listings (see the sketch after this list).
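
On that last point, a quick way to check whether a login page is already excluded from crawling is Python’s built-in robots.txt parser. The sketch below uses placeholder URLs rather than any real agency pages:

```python
# Check whether a (placeholder) webmail login page is excluded from
# crawling by the site's robots.txt, using only the standard library.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.gov/robots.txt")  # placeholder domain
parser.read()  # fetches and parses the robots.txt file

login_page = "https://www.example.gov/webmail/login"  # placeholder path
if parser.can_fetch("*", login_page):
    print("Crawlers are allowed here; add a Disallow rule for this path.")
else:
    print("Login page is already off-limits to well-behaved crawlers.")
```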

All good suggestions. How many would you guess will be standard operating procedure at agencies a year from now?

Mudge to the rescue!

One of the other problems that plague government, along with industry at large, is being able to gauge the quality and reliability of the software it acquires. As last year’s OpenSSL Heartbleed affair showed, even well-established software can be vulnerable.

Peiter Zatko, known affectionately in security circles by his hacker handle Mudge, is leaving his job at Google to help the government create a CyberUL, a cyber version of the famous Underwriters Laboratories, whose mark is considered a stamp of approval for the worthiness of many products. He first made the announcement on Twitter.

Zatko went to Google via the Defense Advanced Research Projects Agency, where he was developing technical skills and techniques for use in cyber combat. Before that he was with BBN Technologies and other security research companies.

Not much is yet known about what Zatko will be doing for the government, but his background offers a clue: he was a member of the L0pht hacker collective in the 1990s, which published a paper describing a possible model for a CyberUL.

Posted by Brian Robinson on Jul 06, 2015 at 11:06 AM


What’s worse: Living with legacy systems or replacing them?

The recent revelation of a breach at the Office of Personnel Management, which could have resulted in the theft of personal information of millions of government employees, also points up the broader problem government has with legacy systems -- whether it’s worth spending the money to secure them.

Not that securing the OPM’s systems would have done much good in this case -- according to Andy Ozment, the Department of Homeland Security’s assistant secretary for cybersecurity, the systems were not directly penetrated. Instead, attackers obtained OPM users’ network credentials and got to the systems and data from the inside.

Donna Seymour, the OPM’s CIO, told a recent hearing of the House Committee on Oversight and Government Reform that the agency was implementing database encryption, but that some of its legacy systems were not capable of accepting encryption.

Some of the OPM’s systems are over 20 years old and written in COBOL, she said, and adding encryption and other security measures such as multi-factor authentication would require a full rewrite.

This is a government-wide problem. Many of the financial and administrative systems that are central to agencies’ daily operations use the nearly 60-year-old COBOL. Most agency CIOs have targeted those systems for replacement, but it’s not a simple rip-and-replace job -- any mistake could have a severe impact on an agency’s ability to fulfill its mission.

For that reason, many agencies have chosen to maintain those systems for now, but that’s not cheap, either. The OPM itself said last year that maintaining its legacy systems could cost 10-15 percent more a year as people with the right kind of expertise retire. And throughout government, legacy systems account for over two-thirds of the annual IT spend.

That expertise is unlikely to be replaced. Colleges aren’t turning out COBOL-trained coders anymore, and, with COBOL way down the list of popular languages, that won’t change. Agencies could bring in consultants to rewrite the code. But, again, not cheap.

And COBOL is unlikely to disappear anytime soon. Because of its ubiquity and utility, many organizations will continue to use COBOL until it’s pried out of their cold, dead hands. Meanwhile, old mainframe companies that have recently refocused on the cloud continue to update their COBOL tools to keep pace with current IT trends.

It’s not as if problems with legacy systems were the only reason for the breaches at OPM. Lawmakers also berated agency officials for their lack of attention to security governance issues that had been brought up years ago and were highlighted yet again last year in an OPM Inspector General report.

But the legacy issues are real and, according to some reports, extend even to “legacy” security systems such as signature-based firewalls, intrusion prevention systems and other widely installed devices that are just not capable of stopping modern, fast, sophisticated and chameleon-like threats.

However, at least the situation with the federal government is probably not as bad as that of a public school district in Grand Rapids, Mich., which is still running the air conditioning and heating systems for 19 schools using a Commodore Amiga -- as in the 1980s-era personal computer that was popular for home use -- because a replacement system reportedly will cost up to $2 million.

At least, we hope not.

Posted by Brian Robinson on Jun 19, 2015 at 10:55 AM


IT security’s blind spot

Good network and data security is made up of several parts. Technology and culture are certainly important elements, but so is perception. If you don’t know what’s going on in your IT infrastructure, how can you be sure you’re as well protected as you think? Hubris has been a big reason for many of the most serious breaches over the past few years.

That seems to be true also for emerging infrastructures that include cloud services. Many organizations and government agencies have not made the move to the cloud yet, or have done so only hesitantly. And perhaps they are fooling themselves in thinking they have this transition under control, and that they’ll be able to manage the security implications.

Skyhigh Networks took a look at this, using actual, anonymized usage data collected from public sector organizations in both the United States and Canada. In its Cloud Adoption and Risk in Government report for the first quarter of this year, Skyhigh discovered, among other things, that government agencies on average were underestimating their employees’ use of cloud services more than tenfold.

“That’s startling because we tend to think the government sector is very locked down,” said Kamal Shah, vice president of product and marketing at Skyhigh. “In reality, employees are finding and using cloud services to help them get their jobs done, regardless of what the official policies are.”

When Skyhigh asked government IT officials how many cloud services they thought employees were using, the answers typically fell between 60 and 80, he said. The study found the average public sector organization actually uses 742 unique cloud services.
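
The gap between perception and reality shows up as soon as someone actually counts. The sketch below is a simplified, hypothetical version of that kind of discovery, assuming outbound proxy logs in CSV form and a small catalog mapping destination domains to cloud services; commercial products such as Skyhigh’s work from far richer data:

```python
# A simplified sketch of shadow-cloud discovery from outbound proxy logs.
# The log format and the domain-to-service catalog are assumptions; real
# products work from far larger catalogs and richer telemetry.
import csv
from collections import Counter

# Hypothetical catalog: destination domain -> cloud service name.
CLOUD_CATALOG = {
    "dropbox.com": "Dropbox",
    "drive.google.com": "Google Drive",
    "slack.com": "Slack",
}

def discover_services(proxy_log_csv: str) -> Counter:
    """Tally requests per recognized cloud service in a proxy log."""
    seen = Counter()
    with open(proxy_log_csv, newline="") as log:
        for row in csv.DictReader(log):  # expects a 'dest_domain' column
            service = CLOUD_CATALOG.get(row["dest_domain"].lower())
            if service:
                seen[service] += 1
    return seen

if __name__ == "__main__":
    usage = discover_services("proxy_log.csv")  # hypothetical log file
    print(f"{len(usage)} distinct cloud services observed")
    for service, hits in usage.most_common():
        print(f"  {service}: {hits} requests")
```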

If that sounds like a lot, consider that Skyhigh already tracks some 12,000 unique services in its database and is adding around 500 new services each month. There’s a lot of room for that average to climb still higher.

This is all part of the unregulated shadow IT mess that government already faces. That mess threatens to become much worse over the next few years with the rise of the Internet of Things -- dubbed the Internet of Threats by some -- and that prospect is spooking many organizations around the world into trying to figure out answers. If agencies thought they had a problem with BYOD, they haven’t seen anything yet.

The kind of overconfidence surfaced by Skyhigh is showing up in other reports as well. Cybersecurity firm RedSeal recently produced its own survey of 350 C-level company executives, of whom a solid 60 percent said they could “truthfully assure the board beyond a reasonable doubt” that their organization was secure. As RedSeal pointed out, those assertions were made at the same time that many reports showed network breaches at up to 97 percent of all companies.

What seems clear from the RedSeal survey is that most executives have no clue about what’s really happening in their networks. In a clear repudiation of that 60 percent “beyond a reasonable doubt” figure, 86 percent of respondents acknowledged they had gaps in their network visibility, and almost as many admitted those gaps make it impossible to effectively secure their networks.

Even outside of the security implications, this lack of knowledge by executives about how IT is being used in their organizations causes problems. As part of its study, for example, Skyhigh found that 120 of those 742 services used on average by agencies were for collaboration. That puts a lot of overhead on IT to deliver all of those unique services and injects confusion into what should be a very organized affair. Fewer services would actually aid collaboration, since more people would likely be on the same page.

As far as security is concerned, all this shadow IT greatly increases the chance that networks will be breached and, in particular, that users will have their network identification and authentication information stolen. The average public sector employee’s online movements are monitored by 2.7 advertising and web analytics tracking services, Skyhigh pointed out, and those trackers are increasingly being used by cyber criminals as a base for so-called watering hole attacks.

This is important, Shah said, because the cloud is increasingly being used by attackers to move data out of organizations. In one agency, the company noted over 86,000 tweets coming from a single IP address in one day. When the machine behind that address was isolated, the agency found a bot that was exfiltrating data 140 characters at a time.

The lesson, Shah said, is that if you can’t analyze the data that would reveal anomalous behavior, you won’t find it. “That’s a blind spot for most organizations,” he said.
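
For a sense of how little analysis it takes to surface that kind of outlier once the data is collected, here is a toy sketch; the per-host request counts and the ten-times-median threshold are invented for illustration:

```python
# A toy outbound-traffic check: flag any internal host whose request count
# to a single service sits far above the fleet's typical level. The counts
# and the ten-times-median threshold are invented for illustration.
from statistics import median

requests_per_host = {
    "10.0.0.11": 310,
    "10.0.0.12": 290,
    "10.0.0.13": 305,
    "10.0.0.14": 86000,  # the kind of outlier a tweet-by-tweet exfiltration bot leaves
    "10.0.0.15": 275,
}

typical = median(requests_per_host.values())  # robust to the outlier itself

for host, count in requests_per_host.items():
    if count > 10 * typical:
        print(f"anomaly: {host} made {count} requests (typical is about {typical})")
```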

All of this is fueling the emergence of new security services, focused on visibility into cloud data traffic, called cloud access security brokers (CASBs). They emerged in 2012, Gartner said, and are set to become an essential component of software-as-a-service deployments by 2017.

Hype aside -- and not forgetting Skyhigh, RedSeal and others have a vested interest in selling these services -- these reports and surveys at a minimum indicate that overconfidence by those charged with providing security for IT is at least as big a problem as the technological challenges they face. It might also be the toughest to fix.

Posted by Brian Robinson on Jun 05, 2015 at 12:36 PM