
With shutdown, can agencies manage insider threats?

Well, they’ve done it; Congress has shut down the federal government. On the bright side, it means less traffic on the streets and highways for Washington-area workers who do have to go into the office. But managing traffic on networks with a skeleton staff could be more of a challenge, especially if your organization has let the number of accounts with elevated access privileges get out of hand.

The insider threat has received a lot of attention in the wake of leaks of embarrassing information from the State Department and the National Security Agency. Following disclosures of classified information by contractor Edward Snowden, NSA Director Gen. Keith Alexander announced that the agency was reducing the number of its systems administrators by 90 percent from around 1,000. A reasonable move, if maybe a little late. But it raises the question: Why did NSA have 1,000 administrators in the first place?

“Everyone in the world has the same problem,” said John Worrall, chief marketing officer for CyberArk, which sells tools to help manage privileged accounts. “It’s not just the NSA.”

Privileged accounts tend to accrete over time. Expanded access is granted and never revoked; people leave and their accounts remain. An organization can eventually find itself with a one-to-one ratio of users to elevated accounts.

“It’s a huge challenge,” said Eric Noonan, chief executive officer of CyberSheath Services International. Often it results from a desire by administrators to be helpful. “A lot of times it is easier to provide elevated access to end users,” to allow them to fix their own problems, he said.

“It creates a multiplicity of accounts you didn’t know you had, and each one becomes an attack vector,” Worrall said.

Any account can be an attack vector, of course, but privileged or administrative accounts create more risk because they give users the ability to make fundamental changes to the configuration of network and enterprise elements. This kind of access is necessary for administrators to keep systems up and running, but it also can be abused to open and close doors, install and remove software, access and export data, and cover tracks afterward.

People with these accounts now are being sent home, most likely with their accounts and privileges intact. There will be a skeleton staff on duty at most IT shops to provide support for exempted workers who remain on the job, and a minimum level of staffing is required for security monitoring and incident response. But it will be a tough job for them to monitor and lock down all of the accounts that could be open to abuse.

The solution is to begin managing the proliferation of accounts before there is a crisis. The most direct way to do this is what Noonan calls the brute force method: Eliminate all elevated access, and give the privileges back one at a time only as they are needed. “It’s painful,” he said. “A lot shy away from the problem.” But if the issue becomes serious enough, organizations can be compelled to use brute force.
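To make the idea concrete, here is a minimal sketch of the brute-force approach on a single Linux host, assuming elevated access is conveyed by membership in one sudo group. The break-glass account names and log file are hypothetical, and a real cleanup would also have to cover service accounts, directory services and other platforms.

import grp
import subprocess
from datetime import datetime, timezone

ELEVATED_GROUP = "sudo"                    # "wheel" on Red Hat-style systems
BREAK_GLASS = {"root", "emergency-admin"}  # hypothetical accounts to keep

def revoke_all(dry_run=True):
    # Strip every member from the elevated group, logging each removal
    # so access can be re-granted one justified case at a time.
    members = grp.getgrnam(ELEVATED_GROUP).gr_mem
    with open("revoked_accounts.log", "a") as log:
        for user in members:
            if user in BREAK_GLASS:
                continue
            log.write(f"{datetime.now(timezone.utc).isoformat()} {user}\n")
            if not dry_run:
                # gpasswd -d removes a user from a group on most Linux systems
                subprocess.run(["gpasswd", "-d", user, ELEVATED_GROUP], check=True)

if __name__ == "__main__":
    revoke_all(dry_run=True)  # review the log before running for real

Running it first as a dry run and reviewing the log is the painful part Noonan describes: every removed name becomes a conversation about whether the access was ever needed.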

A more managed way is to first discover all of the accounts and audit them for need. To do this, agencies need a comprehensive policy defining what privileges are to be granted under what conditions. But policy without an enforcement mechanism is meaningless. Controls must be put in place and activity monitored -- not only to enforce policy, but to investigate incidents after the fact. Tools for discovering, monitoring and managing accounts are available (CyberArk is just one vendor; there are others as well).
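The discovery pass can start simply. The sketch below, assuming a Linux host where elevated access lives in UID 0 and a handful of well-known administrative groups, flags accounts that fall outside a hypothetical approved list; a real audit would also cover directories, databases and applications.

import grp
import pwd

APPROVED = {"root", "backup-admin"}  # hypothetical policy-approved accounts

def discover_elevated():
    # Accounts with UID 0, plus members of common administrative groups.
    elevated = {u.pw_name for u in pwd.getpwall() if u.pw_uid == 0}
    for group in ("sudo", "wheel", "admin"):
        try:
            elevated.update(grp.getgrnam(group).gr_mem)
        except KeyError:
            pass  # group not defined on this system
    return elevated

if __name__ == "__main__":
    for name in sorted(discover_elevated() - APPROVED):
        print(f"audit: {name} has elevated access but is not in policy")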

Access control, like all good security practices, is an ongoing process. If management is confined to discovering and cleaning, it is inevitable that the number of accounts with elevated privileges will creep back up.

This is not much help for anyone babysitting a network with reduced staff now. But it is one more thing for the to-do list when things return to normal.

Posted by William Jackson on Oct 01, 2013 at 1:12 PM



The slow but steady progress of FISMA

The Federal Information Security Management Act, the framework for cybersecurity in the federal government, has come in for a lot of criticism since its enactment in 2002. Some say it is hopelessly out of date; others that it never was adequate. But the law has proved remarkably resilient in the face of an IT landscape and threat environment that has changed almost beyond recognition in the last 11 years.

This is due in large part to the continually and rapidly evolving body of cybersecurity guidance being produced by the National Institute of Standards and Technology – the meat on the bones of FISMA.

Assessments of FISMA’s success remain cautious, at best. A recent report from the Government Accountability Office shows “mixed progress” from fiscal 2011 to 2012. Some security elements improved across agencies while some declined, and “23 of 24 of the major federal agencies had weaknesses in the controls that are intended to limit or detect access to computer resources.”

Government IT security professionals questioned in a recent survey by MeriTalk gave a positive but cool assessment of the law. Although just 27 percent of respondents reported being fully compliant with FISMA, 62 percent believe increased compliance would improve security, and 53 percent say it already has improved security.

But they still have reservations about FISMA. Twenty-eight percent said it focuses on compliance rather than fixing problems, 21 percent said it is insufficient for today’s threats and 11 percent said it is antiquated. Still, 27 percent said it is improving with requirements such as continuous monitoring.

So, how to shift opinions of FISMA from cautious to enthusiastic? GAO focuses its recommendations for improvement on metrics. Current reporting does not address all FISMA requirements and is focused on compliance rather than outcomes. GAO recommends looking at periodic assessments of risk and developing metrics for inspectors general so that they can report on the effectiveness of security programs.

NIST will need to continue updating its guidance to reflect new demands and capabilities, such as continuous monitoring of IT systems and automation of assessments.

And everyone will have to accept that cybersecurity is a moving target and that even the best-protected systems will become outdated if ignored for even a short time.

“You are never done, you are never there,” said Vincent Berk, CEO of FlowTraq. “We are talking about an amazingly complex problem.” But government has made great strides in addressing the task by making security a priority.

If gaps remain, it is not necessarily the fault of FISMA. If the Office of Management and Budget and the Homeland Security Department can learn to measure the right things and give credit for what works, the existing legal framework can continue to help.

Posted by William Jackson on Sep 27, 2013 at 1:36 PM



Getting harder to trust Alexander's NSA

“I need you,” National Security Agency Director Gen. Keith Alexander said several times to his audience at the National Press Club Wednesday. He needs the support of industry and the public in order to protect the nation from cyberattacks and terrorism in the face of growing concern over his agency’s wholesale collection of domestic data.

Alexander also spoke about the need to migrate the Defense Department’s 15,000 network enclaves to a more defensible architecture based on a thin, virtual cloud environment and about the need for legislation spelling out clear rules of engagement for protecting civilian cyber infrastructure and for cyber threat information sharing. But most of his talk focused on troublesome media leaks that threaten to hogtie the agency.

Data culled from the nation’s telephone and Internet carriers is crucial to thwarting foreign attacks, he said, but these programs are being threatened by what he called sensationalized and inflamed stories coming from the leaks.

“Talk about the facts,” he pleaded. “We need to get the facts out about why we need these tools.”

He then proceeded to give his latest version of the facts. But it is getting harder to trust him when his version has to be updated every month in the wake of new revelations about NSA activities. This is a shame, because it is getting in the way of the NSA’s genuinely important work of gathering foreign intelligence and protecting the government’s cyber infrastructure.

“I promise you the truth,” Alexander said back in July during his opening keynote address at the Black Hat Briefings. One of those truths was that “no one at NSA has ever gone outside the boundaries we’ve been given” in its collection and analysis of domestic data.

Well, not exactly. Two months later, speaking at the Billington Cybersecurity Summit in Washington on Sept. 25, he acknowledged 12 willful violations of the agency’s legal authority. However, “we held ourselves accountable and we reported it,” he said. But not to the American people or to Congress until after it was publicly reported in August.

And then there were the 2,776 “incidents” that came out in the August release of declassified secret court records. These were just mistakes, he said in September, and “if we make a mistake, we self-report it in every case.”

“Self-reporting” at the NSA apparently means reporting to itself, because it didn’t report this to the public or to Congress.

Alexander, as usual, mentioned his 15 grandchildren during his talk. If you can’t trust a guy with 15 grandchildren, who can you trust? But he seemed unusually subdued. It could have been the recent dental surgery that had left one side of his jaw a little swollen. But it also might have been the three months of stress from the drip, drip, drip of revelations from those leaks. Publicly defending an agency that has spent decades in the shadows must be unnerving.

“We do the right thing in every case,” he said. “We’re trying to be more transparent.” That would be easier to believe if he didn’t have to update his version of the truth every month.

Posted by William Jackson on Sep 26, 2013 at 8:18 AM



Congress to IT security: Happy fiscal New Year

Priorities for securing government’s IT infrastructure for the coming fiscal year include defending against insider threats posed by unmanaged privileged access and expanded continuous monitoring to address the growing complexity of outsider threats. But these issues could be dwarfed by the challenge of just keeping the lights on come Oct. 1.

“Security is probably the biggest issue we’ve got, because it underlies so much of the other things we are trying to do,” said Paul Christman, public sector vice president at Dell Software. “It can’t go on hiatus.”

Yet the fools on the Hill see the world spinning ’round toward the new budget year without any serious plans for enacting a budget to support critical operations. No doubt essential personnel will remain at their desks in the event of a shutdown, but without updated technology to support them, security will suffer.

“We’re finding it very challenging to assess and predict priorities, because our customers cannot assess and predict their priorities,” Christman said. “Funding has become chaotic and erratic.”

If there is any budget for fiscal 2014, insider threats are likely to be top-of-mind for administrators. A steady drumbeat of stories raises the question of how to manage the physical and logical access given to people agencies have decided to trust. On the IT side, systems administrators and others with privileged accounts often have way too much freedom, putting systems and the information they contain at risk.

The first step in controlling this access is effective policy. Most agencies and offices probably already have a good policy in place, Christman said. But there often are few if any controls to enforce it. Technology must match policy with the ability to monitor, track and audit the activity of those who are given the keys to the kingdom. This has been driven home by the activities of Chelsea (née Bradley) Manning and Edward Snowden. The National Security Agency, smarting from the Snowden leaks, has responded by reducing the number of systems administrators and instituting a two-man rule requiring separate sets of credentials for access to sensitive resources.
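As an illustration only, not NSA’s actual mechanism: a two-man rule reduces to requiring two distinct, independently authenticated administrators before a sensitive action proceeds. The credential store and account names below are hypothetical stand-ins; a real system would verify against a directory service, smart cards or separate hardware tokens.

import hmac

CREDENTIALS = {"admin-a": "s3cret-a", "admin-b": "s3cret-b"}  # stand-in store

def authenticate(user, secret):
    # Constant-time comparison avoids leaking credential contents by timing.
    return hmac.compare_digest(CREDENTIALS.get(user, ""), secret)

def two_person_action(first, second, action):
    # Release the sensitive action only when two distinct administrators
    # each present valid credentials.
    (u1, s1), (u2, s2) = first, second
    if u1 == u2:
        raise PermissionError("two distinct administrators required")
    if not (authenticate(u1, s1) and authenticate(u2, s2)):
        raise PermissionError("both sets of credentials must be valid")
    return action()

# Example: both sign-offs are present, so the export is allowed to run.
two_person_action(("admin-a", "s3cret-a"), ("admin-b", "s3cret-b"),
                  lambda: print("sensitive export approved"))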

Such a two-man rule would be burdensome and unnecessary for most agencies, which could effectively monitor activity with software. But that requires money, and money requires a budget.

The government also is in the process of moving from static assessments of IT security to continuous monitoring -- or continuous diagnostics and mitigation. This process is necessary to respond to a rapidly evolving threat landscape, and suites of automated tools are available to enable it. The Homeland Security Department is offering continuous monitoring as a service through blanket purchase agreements. But here again, a budget will be necessary to allow agencies to take advantage of the service in fiscal 2014.
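At its core, continuous monitoring replaces a point-in-time snapshot with a repeating scan-and-compare loop. A toy version, assuming the only assets worth watching are a couple of configuration files readable by the monitor, might look like the following; the CDM tool suites on the blanket purchase agreements do vastly more.

import hashlib
import time

WATCHED = ["/etc/passwd", "/etc/sudoers"]  # illustrative targets
INTERVAL = 300                             # seconds between scans

def digest(path):
    # Hash each watched file so any change shows up as a new digest.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def snapshot():
    return {path: digest(path) for path in WATCHED}

baseline = snapshot()
while True:
    time.sleep(INTERVAL)
    for path, current in snapshot().items():
        if current != baseline[path]:
            print(f"ALERT: {path} changed since last baseline")
            baseline[path] = current  # re-baseline after investigating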

Budget uncertainties are being compounded by the attrition of experienced procurement personnel. Because of retirements and sequester-powered furloughs, there is a shortage of officials with the know-how to effectively wend their way through acquisition regulations to take advantage of needed technology.

“I think this is going to make the next two weeks really, really strange,” Christman said of the year-end rush to spend out 2013 budgets. “I don’t see it getting any better next year.”

Posted by William Jackson on Sep 20, 2013 at 12:07 PM