Security efforts misguided, experts say

Rep. Tom Davis calls FISMA 'a step in the right direction.'

The government's keystone information assurance law is misdirecting scarce resources and failing to secure legacy systems, an industry expert believes.

The 2002 Federal Information Security Management Act 'runs the risk of becoming a paperwork exercise,' said Kenneth Ammon, president of NetSec Inc. of Herndon, Va. 'If you look at the reporting that is being done under FISMA, there are virtually no objective measures of agencies' real-world security posture.'

Ammon testified late last year at a House Government Reform Committee hearing on the state of Internet security. Also testifying was F. Thomas Leighton, chief scientist at Akamai Technologies Inc. of Cambridge, Mass., who suggested that .gov Web sites should not be hosted on government servers.

FISMA requires agencies to include security in budget proposals for new systems and programs and to periodically evaluate the effectiveness of security policies.

The Office of Management and Budget also requires that existing systems be certified and accredited.

Karen Evans, OMB's administrator for e-government and IT, called FISMA a 'critical mechanism to enforce protection of federal systems' because it requires security to be considered at every stage of planning and implementation.

'No decision is made without assessing what the impact of the security investment will be,' Evans said.

But Ammon criticized FISMA's certification and accreditation process, saying it is valuable for new systems but 'provides little value when applied to existing systems. Agencies are slavishly spending scarce resources to produce reports that merely state the obvious (the legacy system is not secure and can't be secured) in page after page of gory detail.'

Committee chairman Rep. Tom Davis (R-Va.), who sponsored FISMA, said the law 'is a step in the right direction. But the threat is still great.'

Ammon showed examples of sensitive government personnel information and detailed data about suspected terrorists accessible through the Google Internet search engine.

Access to such data can be blocked with simple configuration changes, Ammon said, but 'only through thorough end-to-end application testing can the full scope of such vulnerabilities be identified.'
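
The testimony did not describe what that testing looks like in practice. As a minimal sketch, assuming only a hypothetical list of pages an agency believes should be restricted, a first pass might simply check whether those pages answer an anonymous request. The URLs and script below are illustrative, not drawn from the hearing.

    #!/usr/bin/env python3
    """Crude exposure check: flag supposedly restricted URLs that
    answer an anonymous GET. A sketch only, with hypothetical URLs."""

    import urllib.error
    import urllib.request

    # Hypothetical paths an agency believes require authentication.
    SENSITIVE_URLS = [
        "https://www.example.gov/personnel/roster.xls",
        "https://www.example.gov/reports/internal/",
    ]

    def is_publicly_readable(url: str, timeout: float = 10.0) -> bool:
        """Return True if the URL serves content to an anonymous request."""
        request = urllib.request.Request(
            url, headers={"User-Agent": "exposure-check"}
        )
        try:
            with urllib.request.urlopen(request, timeout=timeout) as response:
                return response.status == 200
        except (urllib.error.HTTPError, urllib.error.URLError):
            # 401/403/404 or a network failure: not openly served.
            return False

    if __name__ == "__main__":
        for url in SENSITIVE_URLS:
            flag = "EXPOSED" if is_publicly_readable(url) else "ok"
            print(f"{flag:8} {url}")

A real assessment would go much further, exercising the application end to end and checking what search engines have already indexed and cached, but even a check this simple would catch the kind of openly indexed documents the witnesses demonstrated.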

One way to prevent access to sensitive information would be to change the hosting of government sites, Leighton said.

'It could make sense to remove public-facing sites from government networks altogether,' he said.

About the Author

William Jackson is a freelance writer and the author of the CyberEye blog.
