
How can cybersecurity improve if the problem can't be measured?

How can you tell if you are making any progress if you don’t know where you are or where you’re going? That is the situation cybersecurity professionals find themselves in, according to a paper being released this week by the EastWest Institute (EWI).

With no reliable measurements of the scope of the cybersecurity problem, it is impossible to know whether security is working. “We do not have even an order-of-magnitude estimate of some of the most basic aspects of the cybersecurity problem that can be validated,” say the authors of the paper, “Measuring the Cybersecurity Problem.” The paper proposes an international voluntary scheme for gathering and interpreting meaningful statistical data about attacks, breaches and incidents in cyberspace.

“While these recommendations are primarily for the private sector, governments can benefit significantly from their implementation,” the authors say.

The paper is being released at the World Cyberspace Cooperation Summit IV, being held this week at Stanford University in California. Although this is the fourth annual summit produced by EWI, the name has changed this year to reflect a change in focus: what had been a cybersecurity summit is now a cyberspace cooperation summit.

“We are discussing key areas of cyberspace cooperation,” said Harry Raduege, chairman of Deloitte LLP's Center for Cyber Innovation. “We are discussing what is possible.”

The EastWest Institute is an international think tank focusing on multilateral cooperation. Cybersecurity was identified about four years ago as a critical international issue, and the cyber summits were initiated in Dallas in 2010. High-level industry and government officials attended subsequent summits in London and New Delhi, and this year’s summit returns to the United States, in Silicon Valley.

The gatherings have produced a number of papers on subjects ranging from the reliability of undersea cables to rules for government conflicts in cyberspace, but the most important result to date has been the relationships established, said Raduege, a retired Air Force general and former director of the Defense Information Systems Agency. “Just the fact that we’re getting to know each other is important,” he said. “The first step is figuring out who the key players are who can make things happen.”

He described the summits as track 2 diplomacy, informal talks that identify areas of international agreement that can be passed on to traditional diplomatic channels for development. Issues this year include critical infrastructure protection as well as the economic and legal impacts of cybersecurity.

Determining impact requires metrics, and despite the billions of dollars being spent on cybersecurity, there are no adequate metrics for it. That lack spurred the proposal to set up a way to measure the problem. The paper makes three recommendations:

  • The private sector should establish a trusted environment for gathering worldwide statistical data that supports measurements of the cybersecurity problem.
  • Private-sector companies should voluntarily provide statistical data to this trusted entity, which could use the data to produce meaningful statistics.
  • Qualified subject-matter experts should develop statistical methods for analyzing this data. This could provide a quantitative framework for reliable benchmarks; a rough illustration of the idea follows below.
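
The paper stops short of prescribing a method, but the basic arithmetic is not exotic. As a purely illustrative sketch, assuming contributors submit nothing more than anonymized monthly incident counts to the trusted entity (the counts and figures below are hypothetical), a first-cut benchmark could be computed like this:

```python
import math
import statistics

# Hypothetical anonymized monthly incident counts, one per contributing
# organization, submitted through the trusted environment the paper proposes.
reports = [12, 3, 40, 7, 0, 19, 5, 110, 8, 2]

n = len(reports)
mean = statistics.mean(reports)
sd = statistics.stdev(reports)

# Rough 95 percent confidence interval for the mean monthly incident count,
# using a normal approximation; adequate only for an order-of-magnitude view.
margin = 1.96 * sd / math.sqrt(n)

print(f"contributing organizations: {n}")
print(f"incidents per organization per month: {mean:.1f} +/- {margin:.1f}")
print(f"order of magnitude: 10^{round(math.log10(max(mean, 1)))}")
```

Even something this simple, fed by enough contributors, would start to provide the validated order-of-magnitude baseline the authors say does not exist today.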

One of the most interesting topics likely to come up at this year’s summit is not on the formal agenda: Friction between the United States and much of the rest of the world generated by reports of National Security Agency surveillance of cyberspace. “What the impact of these reports will be has yet to be learned,” Raduege said. “It will be very revealing to see and hear from those who are attending.”

Posted by William Jackson on Nov 06, 2013 at 11:34 AM



Lesson from HealthCare.gov: A launch is no time for a beta test

So, just how bad was the roll-out of the Affordable Care Act portal, HealthCare.gov? It depends.

As a production launch, it was really bad. Failures on the site frustrated users and burned up a lot of goodwill that the administration could ill afford to lose.

As a beta test, it still was pretty bad; but that’s what beta tests are for. There is no disgrace in finding problems in an application during testing. “This is pretty standard,” said David Lindsay, senior security product manager at Coverity, a development testing company. The failure with HealthCare.gov is that the beta test started Oct. 1 rather than two months earlier.

Any large application project requires testing throughout the planning and development phases. This begins with the individual components as they are developed and continues as they are assembled into working parts. “There is still going to be a need for production testing,” when the entire application is completed, Lindsay said. “A final pass before things go out the door.”

There is a paradox in large project developments, however. The more complex the application, the more testing it needs. But the more complex it is, the later all of the parts come together and the less time there is for final testing. And it is not enough to squirt bits at the software on a laboratory testbed. It should be exposed to real users who can put it through its paces to expose the unintended consequences of development decisions. This is the beta test, the last stop before production release and the last chance to fix problems before you disappoint users and delight antagonists.

HealthCare.gov certainly qualifies as a complex application. It serves as a gateway for 36 state programs and must securely access sensitive information in multiple state and federal databases. The results will determine whether and what kind of health care millions of people have access to, so the stakes are high.

Doing a beta test of a large-scale public Web application is not simple, but the site could have been rolled out quietly and gradually in a limited number of locations. Given that the projected fix date of Nov. 30 implies roughly two months of repair work, a testing phase of the same length should have begun no later than Aug. 1 to be ready for production by Oct. 1. But the current fix is being done on a crash basis, which is not a good way to develop and fix software, so a beta release in July would have made better sense.
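
There is no public account of how such a location-based beta would have been gated. As a minimal sketch, assuming a simple schedule of pilot states that widens over the summer (the dates, state groupings and function names below are hypothetical), the gate itself is trivial:

```python
from datetime import date

# Hypothetical phased-beta schedule: each wave opens the portal to more of
# the 36 states relying on the federal exchange. Dates and groupings are
# illustrative only.
ROLLOUT_WAVES = [
    (date(2013, 7, 1), {"TX", "FL"}),                    # small pilot wave
    (date(2013, 8, 1), {"TX", "FL", "OH", "GA", "PA"}),  # wider beta
    (date(2013, 10, 1), None),                           # None means all states
]

def user_in_beta(state, today):
    """Gate a visitor by state of residence during the phased beta."""
    current = set()                     # before the first wave, nobody is in
    for start, states in ROLLOUT_WAVES:
        if today >= start:
            current = states
    return current is None or state in current

print(user_in_beta("OH", date(2013, 7, 15)))   # False: Ohio not yet in the beta
print(user_in_beta("OH", date(2013, 8, 15)))   # True: the second wave has opened
```

The hard part is not the gate but the schedule behind it: every wave requires the participating states, insurers and federal data services to be ready months ahead of the public launch date.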

This would not have been simple to do. All of the states involved and the insurance companies offering coverage would have needed to have their parts ready three months early. That kind of scheduling has to be built into the project from the beginning, not brought up as an afterthought. Synchronizing the release date with the launch date is a recipe for failure, and that is what happened with HealthCare.gov.

There was time to do it right. The Affordable Care Act was passed in early 2010. The task was complicated, however, by opponents of the program who helped to talk many states out of offering their own healthcare exchanges, making the federal portal that much more complex. Those opponents share much of the blame for the botched launch of the site. But that does not excuse those who planned and developed the system from the responsibility of doing it right.

Posted by William Jackson on Nov 01, 2013 at 6:20 AM



Security meets the enemy, and it's the users

It is no shock to learn that end users and IT security people often do not see eye to eye. If the security shop had its way, everything would be locked down, and there would be no end users. Users see security as an impediment to doing their jobs. And a recent survey indicates that the divide between users and defenders could be undermining federal cybersecurity.

The survey — of 100 federal security professionals and 100 end users in agencies — was conducted by MeriTalk in August and contains a few telling data points:

  • 31 percent of end users admit to regularly circumventing what they see as unreasonable security restrictions.
  • Security people estimate that 49 percent of agency breaches are caused primarily by a lack of user compliance.
  • User frustration equals security risks. The greatest pain points for users — Web surfing and downloading files — produce the most agency breaches.

The sample size isn’t large, but the survey claims a margin of error of less than 10 percent and a 95 percent level of confidence.
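
That claim is at least arithmetically plausible. Assuming simple random sampling, which the survey write-up does not spell out, the worst-case margin of error for a group of 100 respondents at a 95 percent confidence level works out to just under 10 percentage points:

```python
import math

n = 100    # respondents in each surveyed group
p = 0.5    # worst-case proportion, which maximizes the margin of error
z = 1.96   # z-score for a 95 percent confidence level

margin = z * math.sqrt(p * (1 - p) / n)
print(f"margin of error: +/- {margin * 100:.1f} percentage points")  # +/- 9.8
```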

The results are not surprising, said Tom Ruff, public sector vice president at Akamai Technologies, which commissioned the study. It confirms a disconnect that has long existed. Ensuring a user-friendly experience ranked last among the priorities of security professionals, and that probably is as it should be, Ruff said. “At the end of the day the cyber team has got to protect the agency’s mission. That’s job one.”

But with 50 percent of the threat coming from insiders, either intentionally or accidentally, bridging the gap between users and defenders is becoming more important to the security of government networks and systems.

This is not a new idea. Government cybersecurity policy has been moving toward a closer integration of security with IT operations in an effort to provide better real-time visibility into the activity and status of systems. This is, in part, what the focus on continuous monitoring is all about. But the integration also could help move the security shop closer to the users, giving it a better view of just what it is the users are trying to do, what their pain points are and why they are responsible for so many breaches.

It is not a one-way street, of course. The users are going to have to learn to accommodate security when necessary. Just because something can be done doesn’t mean that it should be, and some inconveniences are legitimate trade-offs for improved security.

Awareness training is supposed to be a part of agency cybersecurity programs, and lack of awareness does not seem to be the root of the problem. According to the MeriTalk/Akamai survey, 95 percent of users believe that cybersecurity is an absolute necessity. As long as users understand the reason for a specific policy or process, they probably will accept it.

“The more transparent the security policy is, the easier it will be to address the divide,” Ruff said.

Bridging the divide at a time when challenges are growing faster than budgets and everyone is struggling to make ends meet is not easy. But if agencies can find time to focus on this challenge it could be a cost-effective way to help improve security.

Posted by William Jackson on Oct 25, 2013 at 1:22 PM



Is the shutdown promoting a false sense of cybersecurity?

We’re approaching the end of the second week of the federal shutdown and so far there have been no cyber crises. This is the point in the movie where the hero says, “It’s quiet out there. Almost too quiet.”

We should not assume that because we haven’t seen major actions against our IT systems that nothing is happening. If we have learned anything from experience it is that the breaches we don’t see are far worse than the ones we do, and there’s no reason to believe that stealthy intrusions are less likely now that staff, funding and other resources have been cut to the bone.

The United States is the number one target in an ongoing global cyber cold war and that is not going to stop because Congress will not pass a budget.

“It is wishful thinking that in the current environment we are not going to be targeted and that a few people can manage all of that infrastructure,” said Vijay Basani, CEO of EiQ Networks, which provides security intelligence tools and services to the government.

Since Oct. 1, shuttered websites have been sending the wrong message to our enemies and our friends about our commitment to cybersecurity. A particular concern: Online versions of the National Institute of Standards and Technology’s cybersecurity guidance are unavailable and NIST’s work on a cybersecurity framework for critical infrastructure, due Oct. 10, has been halted, unfinished.

Yet our IT systems have not disappeared. Patching and monitoring cannot get the same level of attention as during normal operations, and dealing with cybersecurity as a crisis rather than a process is bad policy and bad security.

Essential crews remain at work, but the morale of IT and security professionals still on the job without pay cannot be very good and the prospect of hiring qualified professionals in the future becomes bleaker by the day. What competent worker would choose to go to work for a dysfunctional government that won’t pay its bills as long as there are jobs in the private sector?

Basani warned that the impact of gridlock began even before the shutdown. The sequester cut into budgets before the end of the fiscal year, when many procurements and acquisitions are done. And contracts that were in place by the end of the year cannot be implemented, so upgrades and replacement of systems, components and security tools are delayed. Meanwhile, the Homeland Security Department’s Continuous Diagnostics and Mitigation program, which was to be spurred by the award of 17 blanket purchase agreements in August, has been essentially put on hold until government can get back to business.

In short, as Basani said, “as much as politicians talk about cybersecurity, I don’t think they really understand the implications of the shutdown on cybersecurity.”

The best we can hope for is that those in charge learn from this experience and realize that cybersecurity should be outside the scope of political spitting matches.

The worst we can fear is that nothing is learned because there is no obvious cyber Armageddon and we do not see the cancer working its way through our systems.

Posted by William Jackson on Oct 11, 2013 at 1:00 PM