Most government IT professionals, by a wide margin, would rather be trapped in an elevator for 24 hours than have their networks hacked, according to a recent survey.
This could explain why cybersecurity is listed as the top area for expanded IT spending in the coming year, with 59 percent saying they expect increased security spending, topping cloud computing by 14 percentage points.
The results from a survey of 400 federal, state and local government officials conducted for Cisco underscore the foundational importance of cybersecurity. Being stuck in an elevator would ruin your day. A breach of your network or data could ruin your career, and 71 percent said they’d rather be stuck in the elevator. If your security does not work, nothing else really matters.
Feds tend to be more conscious of this than those in state and local government. Improving security is the second-place technology goal in the overall survey at 22 percent, behind reducing costs (28 percent), but security is tops in the federal sector. Budget constraints are the top threat to IT infrastructure, at 35 percent overall, and cyberattacks come in second, at 17 percent, but attacks are seen as a bigger threat in the federal sector than among state and local organizations. This does not necessarily mean that federal networks are more vulnerable than those in state and local systems, but the U.S. government is a high-profile target for hacktivists, criminals looking for valuable intellectual property and other nations engaged in espionage.
Cybersecurity professionals are in an almost no-win situation. In just about every assessment of security they come up looking bad. If they are graded on compliance with regulations, they are told that they are ignoring real-world security. If they focus on practical security, compliance is likely to slip. And complete security is impossible in a dynamic environment in which the functionality and configuration of hardware and software change on a daily basis. The best they can do is manage an acceptable risk. But no risk looks acceptable after a breach.
The professionals surveyed know that there is no simple answer to improving cybersecurity. Twenty-one percent of them listed better technology as the most effective way to improve security, followed by better enforcement of policies at 18 percent and better employee training at 15 percent. But most of them refused to single out one factor for improvement; 42 percent said that all three were equally important.
One factor not addressed in the survey is stability. It is hard to secure a system while ensuring its operational availability to users when you don’t know from day to day, let alone year to year, what financial and manpower resources are going to be available. The chaotic state of government over the last few years, illustrated most recently by the government shutdown forced by political hostage-taking, erodes IT security along with every other measure of performance. I imagine that if it had been offered as a choice in the survey, a rational Congress would top the wish list for IT professionals.
Posted by William Jackson on Oct 07, 2013 at 11:21 AM
Well, they’ve done it; Congress has shut down the federal government. On the bright side, it means less traffic on the streets and highways for Washington-area workers who do have to go into the office. But managing traffic on networks with a skeleton staff could be more of a challenge, especially if your organization has let the number of accounts with elevated access privileges get out of hand.
The insider threat has received a lot of attention in the wake of leaks of embarrassing information from the State Department and the National Security Agency. Following disclosures of classified information by contractor Edward Snowden, NSA Director Gen. Keith Alexander announced that the agency was reducing the number of its systems administrators by 90 percent from around 1,000. A reasonable move, if maybe a little late. But it raises the question, why did NSA have 1,000 administrators in the first place?
“Everyone in the world has the same problem,” said John Worrall, chief marketing officer for CyberArk, which sells tools to help manage privileged accounts. “It’s not just the NSA.”
Privileged accounts tend to accrete over time. Expanded access is granted and never revoked. People leave and their accounts remain. Over time, an organization can find itself with a one-to-one ratio of users to elevated accounts.
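The kind of drift described above can be caught with a periodic sweep for orphaned and stale elevated accounts. The sketch below uses hypothetical account records and field names, not any particular directory service or vendor API:

```python
from datetime import date

# Hypothetical account records; in practice these would come from a
# directory service or a privileged-account discovery tool.
accounts = [
    {"user": "asmith", "privileged": True,  "last_login": date(2013, 9, 30), "active_employee": True},
    {"user": "bjones", "privileged": True,  "last_login": date(2012, 1, 15), "active_employee": False},
    {"user": "cdoe",   "privileged": False, "last_login": date(2013, 9, 29), "active_employee": True},
    {"user": "dlee",   "privileged": True,  "last_login": date(2013, 2, 1),  "active_employee": True},
]

def flag_risky(accounts, today, stale_after_days=90):
    """Flag privileged accounts that are orphaned (owner is gone) or stale (long unused)."""
    risky = []
    for a in accounts:
        if not a["privileged"]:
            continue
        if not a["active_employee"]:
            risky.append((a["user"], "orphaned"))
        elif (today - a["last_login"]).days > stale_after_days:
            risky.append((a["user"], "stale"))
    return risky

print(flag_risky(accounts, today=date(2013, 10, 1)))
# bjones left but kept elevated access; dlee has not used the account in months
```

Each flagged account is exactly the sort of forgotten attack vector the accretion problem creates.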
“It’s a huge challenge,” said Eric Noonan, chief executive officer of CyberSheath Service International. Often it results from a desire by administrators to be helpful. “A lot of times it is easier to provide elevated access to end users,” to allow them to fix their own problems, he said.
“It creates a multiplicity of accounts you didn’t know you had, and each one becomes an attack vector,” Worrall said.
Any account can be an attack vector, of course, but privileged or administrative accounts create more risk because they give users the ability to make fundamental changes to the configuration of network and enterprise elements. This kind of access is necessary for administrators to keep systems up and running, but it also can be abused to open and close doors, install and remove software, access and export data, and cover up tracks afterward.
People with these accounts now are being sent home, most likely with their accounts and privileges intact. There will be a skeleton staff on duty at most IT shops to provide support for exempted workers who remain on the job, and a minimum level of staffing is required for security monitoring and incident response. But it will be a tough job for them to monitor and lock down all of the accounts that could be open to abuse.
The solution is to begin managing the proliferation of accounts before there is a crisis. The most direct way to do this is what Noonan calls the brute force method: Eliminate all elevated access, and give the privileges back one at a time only as they are needed. “It’s painful,” he said. “A lot shy away from the problem.” But if the issue becomes serious enough, organizations can be compelled to use brute force.
A more managed approach is first to discover all of the accounts and audit them for need. To do this, agencies need a comprehensive policy defining what privileges are to be granted under what conditions. But policy without an enforcement mechanism is meaningless. Controls must be put in place and activity monitored, not only to enforce policy but also to investigate incidents after the fact. Tools for discovering, monitoring and managing accounts are available (CyberArk is just one vendor; there are others as well).
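In outline, the audit step compares what each account actually holds against what policy says its role needs. This is a minimal sketch; the policy format, roles and privilege names are hypothetical, standing in for whatever an agency's own policy defines:

```python
# A comprehensive policy maps each role to the privileges it legitimately needs.
# Anything an account holds beyond its role's entry is a candidate for revocation.
POLICY = {
    "sysadmin":  {"root_shell", "install_software", "manage_users"},
    "developer": {"install_software"},
    "analyst":   set(),
}

def audit(accounts, policy):
    """Return, per user, the privileges held beyond what the user's role allows."""
    findings = {}
    for user, info in accounts.items():
        allowed = policy.get(info["role"], set())
        excess = info["privileges"] - allowed
        if excess:
            findings[user] = excess
    return findings

accounts = {
    "asmith": {"role": "developer", "privileges": {"install_software", "root_shell"}},
    "bjones": {"role": "analyst",   "privileges": set()},
}

print(audit(accounts, POLICY))
# asmith holds root_shell, which the developer role does not allow
```

Running such a comparison on a schedule, rather than once, is what turns a policy document into the enforcement mechanism the paragraph above calls for.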
Access control, like all good security practices, is an ongoing process. If management stops at a one-time discovery and cleanup, the number of accounts with elevated privileges will inevitably creep back up.
This is not much help for anyone babysitting a network with reduced staff now. But it is one more thing for the to-do list when things return to normal.
Posted by William Jackson on Oct 01, 2013 at 1:12 PM
The Federal Information Security Management Act, the framework for cybersecurity in the federal government, has come in for a lot of criticism since its enactment in 2002. Some say it is hopelessly out of date; others that it never was adequate. But the law has proved remarkably resilient in the face of an IT landscape and threat environment that has changed almost beyond recognition in the last 11 years.
This is due in large part to the continually and rapidly evolving body of cybersecurity guidance being produced by the National Institute of Standards and Technology – the meat on the bones of FISMA.
Assessments of FISMA’s success remain cautious, at best. A recent report from the Government Accountability Office shows “mixed progress” from fiscal 2011 to 2012. Some security elements improved across agencies while some declined, and “23 of 24 of the major federal agencies had weaknesses in the controls that are intended to limit or detect access to computer resources.”
Government IT security professionals questioned in a recent survey by MeriTalk gave a positive but cool assessment of the law. Although just 27 percent of respondents reported being fully compliant with FISMA, 62 percent believe increased compliance would improve security, and 53 percent say it already has improved security.
But they still have reservations about FISMA. Twenty-eight percent said it focuses on compliance rather than fixing problems, 21 percent say it is insufficient for today’s threats and 11 percent say it is antiquated. Still, 27 percent say it is improving with requirements such as continuous monitoring.
So, how to shift opinions of FISMA from cautious to enthusiastic? GAO focuses its recommendations for improvement on metrics. Current reporting does not address all FISMA requirements and is focused on compliance rather than outcome. GAO recommends looking at periodic assessments of risk and developing metrics for inspectors general so that they can report on the effectiveness of security programs.
NIST will need to continue updating its guidance to reflect new demands and capabilities, such as continuous monitoring of IT systems and automation of assessments.
And everyone will have to accept that cybersecurity is a moving target: even the best-protected systems become vulnerable if ignored for even a short time.
“You are never done, you are never there,” said Vincent Berk, CEO of FlowTraq. “We are talking about an amazingly complex problem.” But government has made great strides in addressing the task by making security a priority.
If gaps remain, it is not necessarily the fault of FISMA. If the Office of Management and Budget and the Homeland Security Department can learn to measure the right things and give credit for what works, the existing legal framework can continue to help.
Posted by William Jackson on Sep 27, 2013 at 1:36 PM
“I need you,” National Security Agency Director Gen. Keith Alexander said several times to his audience at the National Press Club Wednesday. He needs the support of industry and the public in order to protect the nation from cyberattacks and terrorism in the face of growing concern over his agency’s wholesale collection of domestic data.
Alexander also spoke about the need to migrate the Defense Department’s 15,000 network enclaves to a more defensible architecture based on a thin, virtual cloud environment and about the need for legislation spelling out clear rules of engagement for protecting civilian cyber infrastructure and for cyber threat information sharing. But most of his talk focused on troublesome media leaks that threaten to hogtie the agency.
Data culled from the nation’s telephone and Internet carriers is crucial to thwarting foreign attacks, he said, but these programs are being threatened by what he called sensationalized and inflamed stories coming from the leaks.
“Talk about the facts,” he pleaded. “We need to get the facts out about why we need these tools.”
He then proceeded to give his latest version of the facts. But it is getting harder to trust him when his version has to be updated every month in the wake of new revelations about NSA activities. This is a shame, because it is getting in the way of the NSA’s genuinely important work of gathering foreign intelligence and protecting the government’s cyber infrastructure.
“I promise you the truth,” Alexander said back in July during his opening keynote address at the Black Hat Briefings. One of those truths was that “no one at NSA has ever gone outside the boundaries we’ve been given,” in its collection and analysis of domestic data.
Well, not exactly. Two months later, speaking at the Billington Cybersecurity Summit in Washington on Sept. 25, he acknowledged 12 willful violations of the agency’s legal authority. However, “we held ourselves accountable and we reported it,” he said. But not to the American people or to Congress until after it was publicly reported in August.
And then there were the 2,776 “incidents” that came out in the August release of declassified secret court records. These were just mistakes, he said in September, and “if we make a mistake, we self-report it in every case.”
“Self-reporting” at the NSA apparently means reporting to itself, because it didn’t report this to the public or to Congress.
Alexander, as usual, mentioned his 15 grandchildren during his talk. If you can’t trust a guy with 15 grandchildren, who can you trust? But he seemed unusually subdued. It could have been the recent dental surgery that had left one side of his jaw a little swollen. But it also might have been the three months of stress from the drip, drip, drip of revelations from those leaks. Publicly defending an agency that has spent decades in the shadows must be unnerving.
“We do the right thing in every case,” he said. “We’re trying to be more transparent.” That would be easier to believe if he didn’t have to update his version of the truth every month.
Posted by William Jackson on Sep 26, 2013 at 8:18 AM