FISMA from the inside out

Roundtable focuses on the details of improving security

'Talking about the grades, we think it's a way for us to set a benchmark and measure milestones. It's also a way of catching the agencies' attention.'

– Victoria Proctor, House Government Reform Committee

'At USAID, the cost of getting on our network is you have to read a tip and answer a question. Every day.'

– Phil Heneghan, AID CISO

'FISMA is the furthest thing in the world from a paperwork drill. These are very serious standards. Security controls are being implemented.'

– Ron Ross, NIST

'You're as secure as the next vulnerability. So if you don't have a good process in place, and you don't constantly monitor the environment, when a big hack or vulnerability comes out ... you're very vulnerable.'

– Karen Evans, OMB

Back in March, it happened again. The House Government Reform Committee issued its annual security report card per the Federal Information Security Management Act. And for the fifth straight year, the government as a whole earned the kind of mark that might get a school-age kid grounded [GCN.com/583].

But lost in the consternation over feds' cumulative D+ is the fact that some agencies actually pulled up that average. What are the Agency for International Development, Environmental Protection Agency, Labor Department, Office of Personnel Management and Social Security Administration (all A+ recipients) doing right in securing their systems?

In late April, federal officials convened at GCN's offices in Washington to discuss the role of FISMA and what agencies need to do to improve their grades.

Around the table were Karen Evans, administrator of e-government and IT for the Office of Management and Budget; Glen Schlarman, OMB's chief of the information policy and technology branch; Victoria Proctor, a professional staff member for the House Government Reform Committee; Ron Ross, a senior computer scientist and information security researcher at the National Institute of Standards and Technology's FISMA implementation project; and Phil Heneghan, chief information security officer at AID.

Offering industry's perspective were Terri Allen, a senior vice president at Cloakware Corp. of Vienna, Va., and Dave Steidle, director of product management at netForensics Inc. of Edison, N.J.

The discussion was wide-ranging, but frequently came back to management issues. In short, participants agreed, for security (and FISMA grades) to improve, business leaders within each agency, not technologists, must understand what's at stake and drive security efforts.

As Evans put it, 'The real motivating factor is not ending up on the front page of the Washington Post.'

Ultimately, there may be better days ahead for FISMA. As Ross pointed out, NIST has only just finished all the guidance mandated by the act.

When agencies absorb those controls and best practices, he said, security should improve. Just don't call FISMA a paperwork exercise, at least not in this company.

The state of FISMA

GCN: Recently Rep. Tom Davis indicated he was open to updating or recalibrating FISMA. What's the state of FISMA?

Proctor: We started opening up the discussion. What we're trying to do is reach out to the active stakeholders in FISMA and talk about where improvements need to be made ... and if there are ways to clarify the act. We're actually planning on talking to Karen and Glen a bit later on, but wanted to do some background work on our own, then reach out to them and see if there are areas of the legislation that need to be changed, or if something needs to be done with the guidance. We're very much at the beginning stages right now.

GCN: Congressman Davis has addressed the feeling that FISMA has been seen too much as a paperwork exercise. What's the committee's reaction?

Proctor: It's certainly a comment that has come from different places. We've heard it from private-sector companies and others. For us, we see FISMA as a major cultural change in the agencies, and if an agency is viewing FISMA as a paperwork exercise, then clearly there is a problem and they haven't grasped the fundamental essence of the act.

Evans: It is a cultural issue. If you look at FISMA as a paperwork exercise then you haven't really grasped the intent of the legislation, which is that you have to manage the risk associated with your information and the technology that you're deploying.
Allen: There was an interesting quote in GCN: 'FISMA compliance does not necessarily mean improved security, but improving security will lead to FISMA compliance,' and I think that's the essence of what FISMA is about, not the other way around.

Ross: Exactly. From our perspective at NIST, we were responsible for developing all the implementing standards and guidelines that make compliance possible. The legislation is broad, it's sweeping, it is changing the culture across the entire federal government, and I think we don't have a lot of patience sometimes. We're only three years in, and we're just now completing the standards and guidelines.

It takes an awful long time for it to be understood by the agencies, propagated, assimilated into their culture, and so this is a long-term process, and what I see from our perspective is people working real hard. It's the furthest thing in the world from a paperwork drill. These are very serious standards. Security controls are being implemented. And it's a major investment, building this security foundation for what is the largest IT infrastructure in the world. So it's a tremendous undertaking, and we are making good progress, I believe.

The risk issue

GCN: Phil, AID has done exceptionally well under FISMA. What are the challenges you've identified and handled correctly, and what others might you be struggling with?

Heneghan: It is a risk issue for us. The big problem we had was communicating the risk to the business owners and putting all the decisions on them. When we began in earnest, three and a half years ago, to deal with the FISMA stuff, our primary focus was getting technologies in place to collect data so we could inform business owners of that risk.

I deal with our CFO on a regular basis; she's a big stakeholder in the risk business. And I communicate with about 100 people every month on the status of their risk. These aren't techies; these are mission directors overseas, CFOs, HR and folks like that. Initially, we didn't have the tools to give anyone real concrete data on what the situation was, but that was how we turned it around. We can manage risk, because we understand a lot more of what's happening.

GCN: Phil mentioned getting technologies in place. What is industry's role in this?

Allen: It's interesting being on the vendor side, because sometimes when we go and talk to customers and say we're here to help you get to green, or help you improve on your report card, oftentimes what we get is, 'Please don't wave my report card at me, because that's not what this is about.'

The battlefield is changing every single day. Initially we're trying to protect the perimeter, the network and the devices. Now we're trying to protect the software. [Government's] job is so tough because the field just keeps changing.

Steidle: Our organization is big on security metrics, so one of the things we talk about is how control frameworks have to adapt. What is often missed is that control frameworks also have to morph. What we set out to initially accomplish may not be what we are getting, so we have to step back and reinvent the wheel.

Evans: But it's not software we're protecting, it's information. The government has more information than any other entity.

We manage people's privacy; we have all kinds of information. ... We use software to manage it and make decisions, we use networks to transport it around ... so we have to have good configuration management in place, and we have to have good lifecycle methodologies.

All of these things have been around ever since we started using computers. It's just about making sure those disciplines are in place and then having the discussion, 'If I want to do this service, and I'm collecting this type of information, how much risk am I willing to live with?'

Once you make the decision that it's network-enabled, all the other pieces come into place. You really have to get the most senior managers of your organization focused on protecting information, managing people's privacy. Information is an asset; everything else is just the tools of the trade.

Leadership buy-in

GCN: So when agencies get Ds or Fs in FISMA, how much of it is a technology issue and how much of it is process?

Evans: A fool with a tool is still a fool. It's not about the tools. It's really about understanding what you're managing. ... The real motivating factor is not ending up on the front page of the Washington Post. That's happened to me [at the Justice Department].

When you're a technologist providing operations, you don't want policies or laws'all of that gets in your way because you've got to get those services out there. ... But when the FBI investigates, they ask where are your security policies? Were they documented? Who made this decision to do what?

When you go through that and you have to answer those questions, the reports and the paperwork are just a byproduct of really understanding what you need to do.

Heneghan: As long as it's not just you and your team making the decisions. A culture shift for us was that the [designated approving authorities] and the accreditors of all the systems are not technologists. They are the business owners.

I'm the certifier across the enterprise to make sure there's acceptable risk at the enterprise level. And I'm usually willing to accept a lot more risk. But then it goes to the DAA, who's the business owner, and they say, 'Get it fixed before I sign this.' And all of a sudden stuff starts happening. And it's not the techies saying, 'You have to do this; you need these tools.' It's the DAA saying, 'I'm worried about the Washington Post and I'm willing to pay to fix it.'

Evans: The DAA is the business owner who says, 'This service is important to me, and I've analyzed everything and I understand the risk.' Plus, what's really powerful, is that when you have the discussion, it's not that we're not going to be compromised, it's that when we are compromised, here's what we will do to recover.

When your senior managers know that, they're willing to put their names on the line ... because they know who is accountable and responsible for what. That's why we make sure we look at who's testing their contingency plans. It's great to have them on paper, but when [something] happens is not a time to be testing the plan.
Heneghan: To make sure we don't have a rubber-stamp process, it's a sit-down session with every DAA on accreditation and I'm explaining the risks to them. It's all there on paper, but I don't want to treat this as a paperwork exercise either, and they understand the risks.

How secure are we, really?

GCN: Whenever the FISMA grades come out, it seems the government as a whole hovers around a D+. But we don't hear about things like Social Security numbers walking out the door, or mission-critical systems being taken down by hackers. How secure are government systems really?

Evans: That's the heart of the issue. Are the grades really representative of the state of affairs in the federal government? You have to look at what we're measuring in FISMA. There's a lot of detail that goes into the grades. To [Davis'] credit, he doesn't just take our word for it. They look at all the work that the CIO organization says they're doing, and what the [inspector general] says.

There are varying levels of expertise in the IG community and that's one of the controversies associated with this. If IGs are focused on a particular area that they know or like, they might not necessarily evaluate everything else. We've tried to normalize this, but you have varying skill levels in IGs just like you have varying skill levels of CIOs.

Proctor: Talking about the grades, we think it's a way for us to set a benchmark and measure milestones. It's also a way of catching the agencies' attention. But we understand there has to be a cutoff point for the grading period, and sometimes agencies are improving in certain areas after the fact. ... So we understand when agencies sometimes get frustrated.

Evans: One of the things you should look at when you look at the [FISMA] report is the plan-of-action and milestones process. The controversy is over what we would term situational awareness. You have to have a process of constant monitoring. ...

You're as secure as the next vulnerability. So if you don't have a good process in place (even if you've done the paperwork exercise) and you don't constantly monitor the environment, when a big hack or vulnerability comes out ... you're very vulnerable.

Most of the things that happen to agencies, because we've hardened our perimeter, are internal [based on] rules of behavior. What kind of services are you allowing agencies to [execute]? Are you allowing laptops to plug into your network? Are you allowing [flash] drives to be connected to your network? When you do that, there's a certain amount of risk associated with that service. How do you mitigate that risk? It's a simple question but a complex answer.

Heneghan: How we solved the laptop issue was that in our remote locations, when people need to [connect to the network], we scan every box every three days. People found out that when they plugged it in, they'd get scanned and we'd send grades to the mission directors. After one month of that, mission directors now enforce the rule of no laptop gets plugged in until after the sys admin people have patched it.

FISMA standards and guidelines

Ross: It's the combination of policies and procedures and technology. We have the technology and we've had it for a long time, but until there is effective management oversight enforcing those kinds of policies (like who can plug a laptop in), it all has to come together.

To the question of whether the networks are secure, we have a whole bunch of security controls that we've identified in the new NIST standards and guidelines, [Federal Information Processing Standard] 200, and those are mandatory.

We've tried to recommend a set of controls that is appropriate, in line with [OMB Circular] A-130. What is adequate security? Well, it depends on what that system does and how it supports a particular mission. So FIPS-199, which was the first standard Congress asked us to produce, is really important because it requires every agency to categorize and prioritize their systems.

From that point, everything else flows downhill. ...

But this notion about the grades is an important one because when you put those standardized controls into place, you then have to evaluate. Are those controls effective in their application, are they implemented correctly, are they operating as intended, supporting your security policy?

That becomes important, because what criteria do you use to assess the controls? We just put out the second draft of the assessment procedures ... to help agencies really test and evaluate those controls but have some standard basis for doing it.

Right now every IG, every auditor, uses their own criteria, so it's no wonder the grades can vary. One of the factors is there's no consistency in grading criteria. Hopefully the NIST 800-53a publication [released May 4] will start to normalize that a little bit.
Schlarman: There is now portability across the federal government. You don't have to hire and train up an individual'each and every individual'and certify them as being security experts. But when you have a clearly defined set of controls and test procedures, it's portable. You learn that and you apply it to any organization across the federal government.

Then you know what to address, what the issues are, what to fix. Right now we see a patchwork quilt of capabilities. Part of that is thinking, 'What should I test, and how should I test it?' We're removing that unknown.

Perhaps a more interesting question is how are we, the federal government, secure relative to the world? I think we are as open as anybody at revealing our problems.

Evans: We're very transparent on this. ... It's very global when this goes out. There are all kinds of articles written all over the world. They're shocked that we also rank all our agencies on the President's Management Agenda, and that we're very open about what agencies are doing and what they're not doing.

We really believe in a lot of transparency for our citizens so that they know what's going on. Private industry isn't going to do that because that's going to affect their stockholder price. Who's going to invest in a company that's vulnerable?

Ross: About international comparison, I don't think there is any set of standards anywhere in the world that is as integrated or comprehensive [as what] we have now.

Evans: [to Ron] They're all using yours.

Ross: From the definition of security controls, to the testing controls, to the [certification and accreditation] process. That was the power of legislation. It really empowered NIST to do the framework that integrates all these things.

Training and trust

GCN: The Security Line of Business is going to focus first on user training and FISMA reporting tools. What is the significance of that?

Evans: The most effective tools are the people. The people who are doing the work, dealing with the information, that is our most effective defense. If our own employees are aware of issues, like viruses, and we have a common base of training across the board, [we can be more secure]. We saw when we studied the lines of business, training ranged from 'I read an e-mail, click here' to 40 hours of extensive offsite training. We have to normalize that across the board.

Heneghan: At AID, the cost of getting on our network is you have to read a tip and answer a question. Every day.

If you are a business executive, you are going to get questions dealing with risk issues, things that are actionable and relevant to you. If you're a regular user, you're going to get the virus stuff, the password stuff. These are IT security tips for every single person. ... I saw that as people started getting these, in my interactions with accreditors, they started to better communicate and understand, because they did it every day.

GCN: Are there guidelines to deal with the actions and behaviors of people you trust?

Ross: Everything we do is reflected in our security controls. We view training and awareness as a very high priority.

It's a main requirement of FIPS-200 and there's an entire family of controls to support it. ... We have a family to support rules of behavior, talking about personal responsibility. What is the person's responsibility to do due diligence within the context of an information system?

Heneghan: Something we've done that's been effective is [memoranda of understanding] between systems. As an example, the CFO controls who has access to her systems. We would detect an anomaly given our other monitoring processes. But when someone wants the data from there and they set up an MOU, we have lively discussions about the risk that poses, and I let them fight it out as business owners.

They've both done their security plans and they both have to reveal to each other all of their faults.

How much risk are you willing to inherit from somebody else? Because now there's a new insider threat, if you will. That is the biggest risk we all face. The insider. The trusted individual who will act differently.

GCN editor in chief Thomas R. Temin, news editor Jason Miller and technology editor Brad Grimes conducted the roundtable interview.
