So, just how bad was the roll-out of the Affordable Care Act portal, HealthCare.gov? It depends.
As a production launch, it was really bad. Failures on the site frustrated users and burned up a lot of goodwill that the administration could ill afford to lose.
As a beta test, it still was pretty bad, but that’s what beta tests are for. There is no disgrace in finding problems in an application during testing. “This is pretty standard,” said David Lindsay, senior security product manager at Coverity, a development testing company. The real failure with HealthCare.gov is that the beta test started Oct. 1 rather than two months earlier.
Any large application project requires testing throughout the planning and development phases. This begins with the individual components as they are developed and continues as they are assembled into working parts. “There is still going to be a need for production testing” when the entire application is completed, Lindsay said. “A final pass before things go out the door.”
There is a paradox in large project developments, however. The more complex the application, the more testing it needs. But the more complex it is, the later all of the parts come together and the less time there is for final testing. And it is not enough to squirt bits at the software on a laboratory testbed. It should be exposed to real users who can put it through its paces to expose the unintended consequences of development decisions. This is the beta test, the last stop before production release and the last chance to fix problems before you disappoint users and delight antagonists.
HealthCare.gov certainly qualifies as a complex application. It serves as a gateway for 36 state programs and must securely access sensitive information in multiple state and federal databases. The results will determine whether millions of people have access to health care, and what kind, so the stakes are high.
Doing a beta test for a large-scale public Web application is not simple, but it can be quietly and slowly rolled out in a number of locations. Given the current projected fix date of Nov. 30 for the site, this testing phase should have begun no later than Aug. 1 in order to be ready for production by Oct. 1. But the current fix is being done on a crash basis, which is not a good way to develop and fix software. So a July date for beta release would have made better sense.
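One common way to run that kind of quiet, gradual rollout is to assign users to the beta deterministically and ramp up the enabled percentage over several weeks. The sketch below is illustrative only; the function name and hash-bucket scheme are assumptions for the sake of example, not details of the actual HealthCare.gov project.

```python
import hashlib

def rollout_cohort(user_id: str, percent_enabled: int) -> bool:
    """Deterministically decide whether a user is in the beta cohort.

    Hashing the user ID yields a stable bucket in [0, 100), so the same
    user always gets the same answer as the rollout percentage is
    ramped up.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent_enabled

# Ramp the beta from 5 percent of users toward full availability.
for pct in (5, 25, 50, 100):
    enabled = sum(rollout_cohort(f"user-{i}", pct) for i in range(10_000))
    print(f"{pct:3d}% target -> {enabled} of 10000 users enabled")
```

Because the bucket assignment is stable, a user enabled at 5 percent stays enabled at 25 percent; ramping up never flips anyone back to the old experience, which keeps the beta results consistent.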
This would not have been simple to do. All of the states involved and the insurance companies offering coverage would have needed to have their parts ready three months early. That kind of scheduling has to be built into the project from the beginning, not brought up as an afterthought. Synchronizing the release date with the launch date is a recipe for failure, and that is what happened with HealthCare.gov.
There was time to do it right. The Affordable Care Act was passed in early 2010. The task was complicated, however, by opponents of the program who helped to talk many states out of offering their own healthcare exchanges, making the federal portal that much more complex. Those opponents share much of the blame for the botched launch of the site. But that does not excuse those who planned and developed the system for failing to do it right.
Posted by William Jackson on Nov 01, 2013 at 6:20 AM
It is no shock to learn that end users and IT security people often do not see eye to eye. If the security shop had its way, everything would be locked down, and there would be no end users. Users see security as an impediment to doing their jobs. And a recent survey indicates that the divide between users and defenders could be undermining federal cybersecurity.
The survey — of 100 federal security professionals and 100 end users in agencies — was conducted by MeriTalk in August and contains a few telling data points:
- 31 percent of end users admit to regularly circumventing what they see as unreasonable security restrictions.
- Security people estimate that 49 percent of agency breaches are caused primarily by a lack of user compliance.
- User frustration equals security risks. The greatest pain points for users — Web surfing and downloading files — produce the most agency breaches.
The sample size isn’t large, but the survey claims a margin of error of less than 10 percent and a 95 percent level of confidence.
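Those figures are consistent with the usual worst-case formula for a simple random sample: with n = 100 respondents and a 95 percent confidence level (z ≈ 1.96), the margin of error works out to roughly 9.8 percent, just under the “less than 10 percent” the survey claims. A quick check (the function name is mine, not the survey’s):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst-case margin of error for a simple random sample.

    p = 0.5 maximizes p * (1 - p), giving the conservative bound;
    z = 1.96 corresponds to a 95 percent confidence level.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Each group in the MeriTalk survey had 100 respondents.
print(f"{margin_of_error(100):.1%}")  # prints 9.8%
```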
The results are not surprising, said Tom Ruff, public sector vice president at Akamai Technologies, which commissioned the study. They confirm a disconnect that has long existed. Ensuring a user-friendly experience ranked last among the priorities of security professionals, and that probably is as it should be, Ruff said. “At the end of the day the cyber team has got to protect the agency’s mission. That’s job one.”
But with 50 percent of the threat coming from insiders, either intentionally or accidentally, bridging the gap between users and defenders is becoming more important to the security of government networks and systems.
This is not a new idea. Government cybersecurity policy has been moving toward a closer integration of security with IT operations in an effort to provide better real-time visibility into the activity and status of systems. This is, in part, what the focus on continuous monitoring is all about. But the integration also could help move the security shop closer to the users, giving it a better view of just what it is the users are trying to do, what their pain points are and why they are responsible for so many breaches.
It is not a one-way street, of course. The users are going to have to learn to accommodate security when necessary. Just because something can be done doesn’t mean that it should be, and some inconveniences are legitimate trade-offs for improved security.
Awareness training is supposed to be a part of agency cybersecurity programs, and lack of awareness does not seem to be the root of the problem. According to the MeriTalk/Akamai survey, 95 percent of users believe that cybersecurity is an absolute necessity. As long as users understand the reason for a specific policy or process, they probably will accept it.
“The more transparent the security policy is, the easier it will be to address the divide,” Ruff said.
Bridging the divide at a time when challenges are growing faster than budgets and everyone is struggling to make ends meet is not easy. But if agencies can find time to focus on this challenge, it could be a cost-effective way to help improve security.
Posted by William Jackson on Oct 25, 2013 at 1:22 PM
We’re approaching the end of the second week of the federal shutdown and so far there have been no cyber crises. This is the point in the movie where the hero says, “It’s quiet out there. Almost too quiet.”
We should not assume that because we haven’t seen major actions against our IT systems that nothing is happening. If we have learned anything from experience it is that the breaches we don’t see are far worse than the ones we do, and there’s no reason to believe that stealthy intrusions are less likely now that staff, funding and other resources have been cut to the bone.
The United States is the number one target in an ongoing global cyber cold war and that is not going to stop because Congress will not pass a budget.
“It is wishful thinking that in the current environment we are not going to be targeted and that a few people can manage all of that infrastructure,” said Vijay Basani, CEO of EiQ Networks, which provides security intelligence tools and services to the government.
Since Oct. 1, shuttered websites have been sending the wrong message to our enemies and our friends about our commitment to cybersecurity. A particular concern: Online versions of the National Institute of Standards and Technology’s cybersecurity guidance are unavailable and NIST’s work on a cybersecurity framework for critical infrastructure, due Oct. 10, has been halted, unfinished.
Yet our IT systems have not disappeared. Patching and monitoring cannot get the same level of attention as during normal operations, and dealing with cybersecurity as a crisis rather than a process is bad policy and bad security.
Essential crews remain at work, but the morale of IT and security professionals still on the job without pay cannot be very good and the prospect of hiring qualified professionals in the future becomes bleaker by the day. What competent worker would choose to go to work for a dysfunctional government that won’t pay its bills as long as there are jobs in the private sector?
Basani warned that the impact of gridlock began even before the shutdown. The sequester cut into budgets before the end of the fiscal year, when many procurements and acquisitions are done. And contracts that were in place by the end of the year cannot be implemented, so upgrades and replacement of systems, components and security tools are delayed. Meanwhile, the Homeland Security Department’s Continuous Diagnostics and Mitigation program, which was to be spurred by the award of 17 blanket purchase agreements in August, has been essentially put on hold until government can get back to business.
In short, as Basani said, “as much as politicians talk about cybersecurity, I don’t think they really understand the implications of the shutdown on cybersecurity.”
The best we can hope for is that those in charge learn from this experience and realize that cybersecurity should be outside the scope of political spitting matches.
The worst we can fear is that nothing is learned because there is no obvious cyber Armageddon and we do not see the cancer working its way through our systems.
Posted by William Jackson on Oct 11, 2013 at 1:00 PM
Federal efforts to create cybersecurity frameworks for government and for critical private infrastructures have had an impact on international views about cybersecurity, says J. Paul Nicholas, Microsoft’s senior director of global security and diplomacy.
“When I meet with customers in other parts of the world, it always surprises me how much they know about FISMA and FedRAMP,” Nicholas said, referring to the Federal Information Security Management Act and the Federal Risk and Authorization Management Program.
But there still is no common template for cyber policies, and various international development efforts are progressing separately. In the United States, the National Institute of Standards and Technology is creating the Cybersecurity Framework, a set of voluntary security recommendations for critical infrastructure. Across the ocean, the European Commission is creating the Network and Information Security Platform. And as nations develop strategies for securing their cyber environments, there is a risk that unaligned policies could create a fragmented or poorly secured global infrastructure.
Some differences among national policies are inevitable, Nicholas said. “Cybersecurity is going to vary country by country,” because each nation faces a unique set of risks and has its own needs. To help create a common foundation on which policies can work together, Microsoft has produced a whitepaper, “Developing a National Strategy for Cybersecurity.” The paper advises focusing on the basics and building on established best security practices. It advises that any strategy be:
- Outcome focused
- Respectful of privacy and civil liberties
- Globally relevant
Although the Government Accountability Office has rated federal IT security as a high-risk area since 1999, Nicholas, co-author of the Microsoft paper, praised the progress being made in this country to establish a regulatory regime for cybersecurity, including FISMA.
“FISMA has really been a journey,” and important work is being done under it, he said. “Could it be better? Yes. But it is being fine-tuned to improve risk management.”
NIST has come through in providing guidance in its 800-series of reports on IT security, Nicholas said. Although FISMA and the NIST guidance are aimed at the U.S. government, their influence extends well beyond. “There is a framework and mentality that did not exist 10 years ago. FISMA better enables the U.S. government to have a risk dialog with the private sector. They are able to discuss things with a similar set of experiences.”
This is not to say that FISMA, which is far from perfect, is or should be the model for national strategies. The challenge of coming up with some kind of functioning global system for securing cyberspace involves as much diplomacy as technology. “It’s about deciding what needs to be done and how to move forward,” Nicholas said.
Posted by William Jackson on Oct 09, 2013 at 11:39 AM