Challenges still face federal administrators in implementing established priorities for IT

Regardless of the direction of the new administration, unfinished work remains

By William Jackson

GCN Staff

The new year brings with it new challenges and opportunities. But regardless of what the calendar says or the agenda of the incoming administration, work remains from the old year that will have to be addressed.

Some of the initiatives facing IT administrators in the coming year come in the form of mandates from Congress or the Office of Management and Budget and have deadlines associated with them. Some are technical issues that could help agencies to more effectively carry out their missions. Here is a short list of items that likely will be demanding the attention of administrators and executives in 2009:

· DNSSEC: DNSSEC is a tough nut to crack, and IT administrators within the .gov top-level domain have until the end of the year to crack it. DNS Security Extensions help ensure the integrity of Internet names and addresses resolved by the Domain Name System by digitally signing requests and responses from servers, and they must be enabled on .gov name servers this year. But doing it right will not be easy, said Paul Parisi, CTO of DNSStuff. “The first couple of times you try it, it’s not going to work.”

· Encrypting Data at Rest: The Office of Management and Budget warned civilian agencies back in 2006 that they should be encrypting sensitive data on all mobile devices “within the next 45 days.” But more than a year later, only about 30 percent of laptops and handheld devices had been protected. DOD had a Dec. 31, 2008, deadline for encrypting all laptops and removable media, which David Hollis, program manager for a governmentwide effort to implement encryption, said the department would not completely meet. “There has been a lot of progress,” Hollis said. “That being said, there is still a long way to go.”

· Information Sharing: The failure to “connect the dots” has been cited repeatedly as one of the faults that led to the terrorist attacks of Sept. 11, 2001. But the federal Information Sharing Environment, mandated by Congress in 2004, remains a work in progress four years later. It is overseen by a program manager in the Office of the Director of National Intelligence, supported by a council of senior representatives from 16 departments and agencies, and the ISE implementation guide sets a goal of June 2009 for completion. But the ISE is behind schedule. “There is no ‘school solution’ to the problem of information sharing,” said Program Manager Thomas E. McNamara. “One size does not fit all, and implementation plans must be flexible and dynamic to adjust to the unforeseen and the unintended.”

· Identity Management: The ability to identify a user and associate that identity with access privileges is necessary for effectively protecting and sharing data. The government has established a technical platform for this with its PIV and CAC cards, but the technology is the easiest part, said Gary R. Gordon, a senior scholar in identity management at the Indiana University School of Law. Issues of policy, process and education, as well as technology, still must be addressed, and a trusted environment is needed in which law enforcement and government can share data. “Government is looking for game-changing ideas to move forward in the next couple of years,” Gordon said.

And finally, one opportunity rather than a requirement:

· Putting IPv6 to work: The deadline is past and government backbones are enabled for IPv6. How can you use it to get your job done? Think in terms of video, VOIP, and RFID; anything that benefits from multicasting, mobility, quality of service, peer-to-peer connections and more endpoints. “The intent is to use IPv6 so that you can be expansive,” said David Rubal, regional manager for federal unified communications at Cisco Systems.

This list is not comprehensive, but it covers a number of the primary areas of concern administrators are likely to be dealing with in the coming year as they wrap up work from last year (and before).

Enabling DNSSEC

The Domain Name System is a critical element underlying the Internet, translating plain-language domain names into IP addresses so that requests and data can be routed. Unfortunately, it can be vulnerable to attacks that allow hackers to poison cached data on name servers or intercept requests so that traffic can be misdirected.

Vulnerabilities in some implementations of DNS software have been around for quite a while, but the discovery by researcher Dan Kaminsky last summer of a flaw in the DNS protocols themselves raised concerns about the viability of the system. The fix for this vulnerability, which adds randomization to requests to make them harder to spoof, is admittedly a stopgap, which has prompted the federal government to implement DNSSEC within the .gov domain that it administers. DNSSEC uses digital certificates to digitally sign requests and responses so that they can be trusted.
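
Back-of-the-envelope arithmetic shows why the stopgap helps, and why it is still only a stopgap. The figures below are illustrative, not drawn from any particular attack:

```python
# Illustrative arithmetic only: rough odds of blindly forging a DNS response.
# Real attack math also depends on timing windows and how many spoofed
# packets an attacker can send before the legitimate answer arrives.

txid_space = 2 ** 16            # 16-bit DNS transaction ID: 65,536 values
port_space = 64000              # a rough figure for usable ephemeral source ports

# Before the fix, a blind attacker only had to guess the transaction ID.
odds_before = txid_space

# After the fix, the source port is randomized too, multiplying the search space.
odds_after = txid_space * port_space

print(f"guesses needed before: 1 in {odds_before:,}")
print(f"guesses needed after:  1 in {odds_after:,}")
```

Even a search space in the billions can eventually be flooded by a determined attacker, which is why cryptographic signatures, not bigger guessing games, are considered the durable fix.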

The Office of Management and Budget will sign the authoritative .gov root zone servers in January, and agencies must have plans in place to deploy DNSSEC to all of their systems by December.

“It’s going to be bad,” Parisi said. “Unfortunately, DNSSEC even without a deadline is very complex,” and there is not a lot of expertise with DNS in most IT shops because it typically requires little attention. “DNSSEC once it runs will run fairly automatically. Getting it to run is not trivial.”

The industry needs to produce step-by-step guides to deployment to help customers along. “We’re working on that,” he said.

Using DNSSEC pretty much requires using version 9 of BIND, the most widely deployed DNS server. “Windows doesn’t really support DNSSEC right now,” he said. Few organizations use Windows for outside DNS servers, but he fears that when administrators start looking at their DNS infrastructure, they will find more Windows servers running inside their enterprises than they expected.

One company has eased the problem by building DNSSEC into DNS servers. SolidDNS from Infoweapons Inc. comes as an appliance running BIND and includes support for IPv6 as well as DNSSEC, said Infoweapons chief software architect Lawrence Hughes.

“You can set up signing requirements when setting up the appliance,” Hughes said. “Check one box, the domain is signed. You’re done.” The task of managing digital certificates and signing keys is simplified by using the parent key used to sign the root servers. “We have it set up to inherit the parent key, so only the root certificate is needed.”
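
What Hughes describes amounts to DNSSEC’s chain of trust: each parent zone vouches for its children’s keys, so a resolver needs only the root’s key to validate everything below it. The sketch below illustrates that delegation pattern with plain hash digests standing in for real DNSKEY, DS and RRSIG records; the zone names and key material are made up:

```python
import hashlib

# Simplified sketch of DNSSEC's chain of trust. Real DNSSEC publishes DNSKEY
# records signed by RRSIG records, with each parent zone holding a DS record
# (essentially a digest of the child zone's key). Here plain SHA-256 digests
# stand in for keys and DS records; zone names are illustrative.

def ds_record(child_key: bytes) -> str:
    """A parent's DS record is, in essence, a digest of the child zone's key."""
    return hashlib.sha256(child_key).hexdigest()

# Each zone holds a key; each parent publishes a digest of its child's key.
root_key = b"root-zone-key"
gov_key = b"gov-zone-key"
agency_key = b"agency.gov-zone-key"

root_zone = {"trust_anchor": root_key, "ds_for_gov": ds_record(gov_key)}
gov_zone = {"key": gov_key, "ds_for_agency": ds_record(agency_key)}
agency_zone = {"key": agency_key}

def validate_chain() -> bool:
    """Walk down from the single trust anchor, checking each delegation."""
    if root_zone["ds_for_gov"] != ds_record(gov_zone["key"]):
        return False
    if gov_zone["ds_for_agency"] != ds_record(agency_zone["key"]):
        return False
    return True

print(validate_chain())  # a resolver holding only the root anchor verifies all
```

The practical payoff is the one Hughes points to: an administrator manages signatures locally, while trust flows down automatically from the signed root.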

If an organization cannot afford to replace its DNS servers, implementing DNSSEC is likely to be stressful and will compete with other work. But Parisi is confident agencies can make the deadline.

“It will be painful, but it will get done,” he said.

Encrypting Data at Rest on Mobile Devices

A number of high-profile data breach incidents prompted OMB to issue memo M-06-16 in June 2006 reminding agencies of basic security practices outlined by the National Institute of Standards and Technology that they already should be following. These included encrypting “all data on mobile computers/devices which carry agency data unless the data is determined to be non-sensitive, in writing, by your Deputy Secretary or an individual he/she may designate in writing.”

Because these were practices agencies already should have been following, a quick 45 days was given for compliance. The Defense Department chimed in the next year, setting a Dec. 31, 2008, deadline for similar encryption. But the Government Accountability Office reported in June 2008 that data was being encrypted on only a minority of devices.

“From July through September 2007, the major agencies collectively reported that they had not yet installed encryption technology to protect sensitive information on about 70 percent of their laptop computers and handheld devices,” GAO reported.

But an intergovernmental Data at Rest Tiger Team, formed originally in 2006 to spur the use of encryption, has helped put in place 12 blanket purchase agreements under two governmentwide contract vehicles. Agencies purchased 1.4 million encryption licenses at steep discounts under these BPAs from July 2007 through July 2008, program manager Hollis said.

A tiger team is a temporary task force brought together to help solve a specific problem, in this case, protecting data at rest in mobile devices. What began in DOD in 2006 now includes 18 civilian agencies as well as NATO and state and local governments.

With the progress being made, “I’m hoping in the next year or two we can put this away,” Hollis said.

The team’s primary achievement has been to reach a consensus on minimum requirements for data encryption products and to award BPAs based on those requirements to 10 vendors for 12 products. The BPAs are co-branded under the DOD Enterprise Software Initiative and the General Services Administration’s SmartBUY program.

“For the most part, DOD is making progress,” Hollis said late in 2008. “There are some areas where we’re not going to make” the Dec. 31 deadline. For some ships at sea, compliance will mean a visit to port for a refitting of tactical systems, and waivers are being issued in these cases. There is also the huge number of devices to be protected, in both military and civilian agencies. “It’s a very large monster to defeat,” he said.

Information Sharing Environment

The Information Sharing Environment was created by the Intelligence Reform and Terrorism Prevention Act of 2004 to help stakeholders that acquire, process and use information about potential threats to the nation use this information more effectively. The act does not specify the technology to be used, describing the environment only as “an approach that facilitates the sharing of terrorism and homeland security information, which may include any method determined necessary and appropriate.”

The program manager issued an implementation plan in November 2006 and has updated it since, most recently in March 2008. But a Government Accountability Office study of ISE progress released in June faulted the plan for not providing specific definitions of the program’s scope, the desired results, and the milestones and metrics for measuring those results. Broad goals have been established and a number of federal, state and local information sharing initiatives have been incorporated into the plan, but the program is behind schedule in completing the 89 action items identified for getting the environment in place by June.

“Our work since 2001 indicates that the federal government has improved the sharing of terrorism-related information but has struggled in the process,” GAO wrote in its report. “In January 2005, we designated information sharing for homeland security a high-risk function because the government had continued to face formidable challenges in analyzing and disseminating key terrorism-related information in a timely, accurate, and useful manner.”

The challenges that make information sharing a high-risk program are complex and more than technical, McNamara said. “The challenge lay in reconciling myriad policy, process and technology differences among multiple organizations tasked to perform a variety of disparate missions.”

McNamara told GAO that ISE is a governmentwide transformational effort and evolutionary process and that there is no roadmap for this kind of work.

“We are pioneering, at least within the Federal Government, in building a true, extensive, government-wide information sharing environment,” he wrote. “No one, to my knowledge, has attempted this before. No one, to my knowledge, knows with certainty the correct path, or sees a clear end state of the ISE. Indeed, there is no end state in the true meaning of that term, only a vision.”

GAO recommended that the program fully define the scope and results to be achieved and develop performance measures for implementation and improvements in information sharing.

“GAO acknowledges that creating such measures is difficult, particularly since the program is still being designed, but until these measures are refined, future attempts to measure and report on progress will be hampered,” GAO wrote in its assessment of the program.

Identity Management

The government has made significant progress in creating a technology platform for identity management with the deployment of Personal Identity Verification Cards in the civilian sector and Common Access Cards in the Defense Department.

But putting the technology into use can be more difficult than creating it. To be effective, it has to be integrated with business processes, applications, and front- and back-end systems for a multitude of missions. Individuals can possess multiple identities or roles within an organization, each with a different set of access privileges; and facilities, information systems and applications have differing levels of risk and security, with differing requirements for identity management. These differing needs are why multiple identity management and access control systems have developed in separate silos, and why each of us possesses so many passwords, PINs, tokens and other credentials to manage.
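
The role-based pattern that paragraph describes, one identity holding several roles and each role carrying its own set of privileges, can be sketched in a few lines. The role and privilege names here are hypothetical:

```python
# Minimal sketch of role-based access control: one identity can hold several
# roles, and its effective privileges are the union over all of them.
# Role and privilege names are invented for illustration.

ROLE_PRIVILEGES = {
    "analyst":      {"read:case-files"},
    "supervisor":   {"read:case-files", "approve:reports"},
    "system-admin": {"read:audit-logs", "manage:accounts"},
}

def privileges_for(roles):
    """Merge the privilege sets of every role an identity holds."""
    granted = set()
    for role in roles:
        granted |= ROLE_PRIVILEGES.get(role, set())
    return granted

def can(roles, privilege):
    """Access check: does any held role grant the requested privilege?"""
    return privilege in privileges_for(roles)

# One person, two roles, two distinct privilege sets merged into one.
user_roles = ["analyst", "system-admin"]
print(can(user_roles, "read:case-files"))   # True
print(can(user_roles, "approve:reports"))   # False: requires the supervisor role
```

Real deployments layer policy, auditing and cross-organization federation on top of this core mapping, which is where the hard problems Gordon describes live.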

The Center for Applied Identity Management Research, a non-profit group headquartered in Washington, was created last year to identify gaps in identity management solutions and drive research to fill them.

“We are trying to determine what the key challenges are,” said executive director Gordon. “Until we have a handle on that, it is difficult to answer the question of why identity management is so hard.”

The center will work on seemingly abstract problems, such as coming up with a common set of definitions for discussing identity management issues, but its goal is to produce quick, practical results. It expects to release a report early this year that will become a blueprint for multidisciplinary applied research and development on solutions over the next year.

CAIMR members include educational institutions such as IU and the University of Texas at Austin, as well as solution developers and providers such as IBM Corp., and end users such as banks, the Secret Service and the U.S. Marshals Service, so that solutions can bridge the gap between academic study and practical results.

Progress in this area will have to move quickly to remain practical because the technology, its uses and the threats are changing so quickly. But Gordon said standardized, interoperable smart ID credentials developed by government for its employees and contractors are a good starting place.

“It gives them a common way of dealing with employees and vendors,” he said. “It becomes a logical way of organizing and developing trusted relationships.”

Putting IPv6 to work

We still are waiting for the killer app that will take advantage of version 6 of the Internet Protocol and make IPv6 a must-have. Some industry observers say we won’t have long to wait.

“What’s going to change this are applications like video,” said Cisco’s Rubal. IPv6 can simplify multicasting and give more people access to the resource. “In an IPv6 setting it is much more dynamic. I believe video will be the application that will shape and accelerate the use of IPv6.”

Mobility applications also are starting to bubble up, Rubal said.

Infoweapons’ Hughes said it will be voice over IP. “VOIP will be one of the big ones,” he said. “SIP on IPv6 can do away with a lot of the complexity and problems associated with VOIP.”

There already is VOIP equipment incorporating IPv6. The eyeBeam softphone from CounterPath Corp. uses the new protocols, as does the Asterisk IP PBX server. Infoweapons plans to release a dual stack IPv4/6 PBX server early this year.

Also coming this year is the RFID3 standard for addressable Radio Frequency ID devices that can easily provide information on location and conditions using IPv6. These can be used for sensors and logistics, and the expanded address space of IPv6 makes it feasible to use the devices in large numbers for public health, weather monitoring, materiel tracking and numerous other applications.

One of the keys to taking advantage of IPv6 is doing away with the restrictions imposed by the limited address space in version 4. This allows deployment of more endpoints and enables easier peer-to-peer connections for dynamic, adaptive networks. IPv4 has worked around its limited address space by using Network Address Translation, which works but imposes some networking limitations.
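
The scale of that expanded address space, and the built-in multicast range that makes applications such as video distribution simpler, can be checked with Python’s standard-library ipaddress module; the numbers are illustrative of the protocol, not of any deployment:

```python
import ipaddress

# The address-space gap between the two protocol versions.
v4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses   # 2**32
v6_total = ipaddress.ip_network("::/0").num_addresses        # 2**128

print(f"IPv4: {v4_total:,} addresses")
print(f"IPv6: about {v6_total:.2e} addresses")
print(f"IPv6's space is {v6_total // v4_total:,} times larger")

# IPv6 reserves ff00::/8 for multicast; ff02::1 is the all-nodes
# link-local group, a standard well-known multicast address.
print(ipaddress.ip_address("ff02::1").is_multicast)  # True
```

The endpoint and peer-to-peer gains the article describes follow directly from that first figure: with addresses no longer scarce, devices do not have to hide behind a NAT to get connectivity.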

“NAT comes at a big cost,” said Hughes. “It turned the Internet into a one-way channel.”

But Rubal said he expects that to begin changing as IPv6 is put into use. “We’re going to see the NAT curtain dropping somewhat in the next 12 months,” moving toward more global visibility, he said.
