Stocking stuffers

Cheap, fast geomapping | Web 2.0 offers interactivity | Beware the botnets | The battle of government search | Virtualization | Government smart cards | Loose data | IT giants embrace open source | Defense software acquisition reform | IPv6 gets legs | Power consumption | 11 trends from 2006 | Technology of the Year: Challenge/Response spam filtering

If you're a chief information officer or a system administrator, pat yourself on the back: you survived one of the most challenging years in information technology so far. Whether you worked to raise system security to new levels, plotted a move to the next Internet Protocol, or found ways to take advantage of new, inexpensive technologies, you likely found yourself in uncharted territory at one time or another.

And down the road, you can probably expect more of the same. Here are 11 of the biggest disrupters, both good and bad, that GCN has seen this year, and how they could affect your operations in the years to come. And if we've only whetted your appetite here, enter 719 in the GCN.com/box at the top of this page for links to more coverage of these issues.



Cheap, fast geomapping

Geographic information systems have been around for well over a decade, but 2006 was the year agencies started to get their hands on cheap geospatial capabilities, thanks to free and open-source offerings from Google Inc., Microsoft Corp., Autodesk Inc. of San Rafael, Calif., and MetaCarta Inc. of Cambridge, Mass.
Google Maps, for instance, offers a free application programming interface that others can link to from their own applications. Developer Adrian Holovaty created the Web site Chicagocrime.org by pulling crime data from the city of Chicago's own Web site and placing the locations of the crimes on a Google map.
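The mashup pattern behind sites like Chicagocrime.org boils down to three steps: pull records from a source, attach coordinates, and pass the results to the mapping API as markers. Here is a minimal Python sketch of the data-preparation half, using made-up crime records; the field names and the `to_markers` helper are illustrative, not Chicagocrime.org's actual code.

```python
# Each record pairs an incident description with a street address
# that has already been geocoded to latitude/longitude.
crimes = [
    {"type": "Theft",    "block": "100 N State St",   "lat": 41.8837, "lng": -87.6278},
    {"type": "Burglary", "block": "200 W Madison St", "lat": 41.8819, "lng": -87.6348},
]

def to_markers(records):
    """Convert raw records into the marker dicts a mapping API expects."""
    return [
        {"position": (r["lat"], r["lng"]), "title": f'{r["type"]} near {r["block"]}'}
        for r in records
    ]

markers = to_markers(crimes)
print(markers[0]["title"])  # Theft near 100 N State St
```

The mapping side is then a matter of handing each marker's position and title to the API's marker-creation call.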

Elsewhere, the Defense Intelligence Agency picked up MetaCarta's GeoTagger for its User Knowledge Environment Information Management System. GeoTagger analyzes documents for geographic references and then creates coordinates for those references that can be placed on maps.
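The idea behind GeoTagger, spotting geographic references in free text and assigning them coordinates, can be approximated with a simple gazetteer lookup. A toy Python sketch follows; the place list and substring matching are deliberately naive stand-ins, not MetaCarta's actual approach, which involves far more sophisticated disambiguation.

```python
# Tiny gazetteer mapping place names to (latitude, longitude).
GAZETTEER = {
    "Baghdad":  (33.31, 44.37),
    "Kandahar": (31.62, 65.71),
}

def geotag(text):
    """Return (place, coordinates) pairs for gazetteer names found in the text."""
    return [(name, coords) for name, coords in GAZETTEER.items() if name in text]

report = "Convoy departed Kandahar at dawn."
print(geotag(report))  # [('Kandahar', (31.62, 65.71))]
```

Each returned coordinate pair can then be dropped onto a map, which is exactly what the DIA system does with tagged documents.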

Other offerings also show promise for creative geomapping. Microsoft upgraded its Virtual Earth 3D online map and data service, which compiles photographic images of cities and terrain that can be used to generate textured, photorealistic 3-D models with engineering-level accuracy. Autodesk integrated several of its mapping programs with Google Earth's geographic information system capabilities.

Web 2.0 offers interactivity

Web 2.0 certainly wins the buzzword of the year award, but behind the hype lie some promising technologies for government agencies. The term is shorthand for a wide and sometimes shifting range of Web technologies, including wikis (collaboration software), Ruby on Rails (rapid Web application deployment software) and Ajax (a scripting technology that makes Web pages more interactive).

In a nutshell, how these technologies make Web 2.0 different from the plain old World Wide Web we all know now is that they all can offer richer online interactions for the user, allowing you to better use agency services or even to communicate with like-minded individuals. "Web 2.0 is a decentralized mode of organizing communities of practices," said Eric Sauve, CEO of Washington-based Tomoye Corp., which makes Web collaboration software used by the Army and other agencies.

Smart government leaders actually caught on to Web 2.0 early, before the buzzword was even coined. Last year, when the Federal CIO Council's Architecture and Infrastructure Committee revised the Federal Enterprise Architecture Data Reference Model, it used two wikis to hash out the specification.

CIM Engineering Inc. of San Mateo, Calif., supplied the wikis through an existing contract with GSA. The CIA has already compiled about 12,000 wiki pages scattered throughout its top-secret network, said D. Calvin Andrus, the chief technology officer for the CIA's Center for Mission Innovation. The open nature of wikis lets analysts augment and update existing material much more quickly. Other agencies could well find similar benefits.

Beware the botnets

Bots, or compromised computers under the remote control of a hacker, have been around for years. But botnets, networks of compromised machines under the control of a single evil overlord, have grown into a significant problem over the past year, as hacking has moved from a vanity hobby to profit-driven organized crime.

Targeted computers typically are infected en masse by self-replicating worms that exploit unpatched vulnerabilities. Once infected, the new bot is directed to contact a server and download malicious code that puts it at the disposal of a controller.

If this is done quietly, a single controller can amass an army of thousands of compromised machines, which can be rented out to the highest bidder for purposes such as extortion through denial-of-service attacks, phishing, distributing spam, hosting malicious or contraband software, and infecting more bots. In addition to malicious activities, botnets also can consume network resources.

Spikes in the number of suspected bot clients were seen in June and have continued to increase through the end of the year. Not coincidentally, spam has been a persistent problem despite the growing use of filters to block it.

Network intrusion prevention systems, from companies such as Cisco Systems Inc., Juniper Networks Inc. and McAfee Inc., are getting better at identifying and blocking this traffic.

False positives, which can wrongly block legitimate traffic, have been the bane of intrusion prevention, but maturing technology has made the tools more effective. Unfortunately, huge botnets can be assembled, used, disposed of and replaced quickly, so the fight continues unabated.

The battle of government search

Government information became a hot commodity this year. In January, the General Services Administration relaunched FirstGov.gov, the official government search site, after hearing endless groans about the older system.

The agency used Vivisimo Inc.'s clustering technology and Microsoft Corp.'s MSN search tool. "When FirstGov got started, we crawled, but as more and more agencies put information on the Web, we had to provide more service, and we had to scale to manage this," said Mary Joy Pizzella, former associate administrator of GSA's Office of Citizen Services and Communications, who left in June for Google Inc.

Coincidentally, Google relaunched its own government-specific search site, Google U.S. Government Search, which carves out all the .mil and .gov sites from its voluminous index of the Web.

In the fall, Google representatives started visiting agencies, asking them to open their database content for indexing so that it, too, would be retrievable through the search engine.
The good news is that while the search giants duke it out over which gets to be the premier gateway to government information, agencies themselves can benefit from the new technology such a battle inevitably brings about.

Vivisimo, for instance, has updated its Velocity 5.0 enterprise search software, based on the work it has done with GSA. It features visualization, document aggregation and the ability to connect different types of documents.

Virtualization

To veteran mainframe systems administrators, virtualization is nothing new, and open-source enthusiasts have been slowly building on the technology over the past few years. This year, however, it broke into mainstream enterprise computing in a major way.

Virtualization enables one complete operating system, plus associated applications, to run within another OS. Because many applications use only a portion of a server's resources, more applications can run on a single server, and thanks to virtualization, they don't all have to run under the same OS.
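That consolidation argument reduces to simple arithmetic: if each application uses only a fraction of one server, several virtual machines fit on a single physical host. A back-of-the-envelope Python sketch, in which the utilization and headroom figures are illustrative assumptions rather than numbers from the article:

```python
import math

def hosts_needed(n_apps, avg_utilization, headroom=0.8):
    """Estimate physical hosts required when each app consumes a fraction
    of one server's capacity and hosts are filled only to `headroom`."""
    vms_per_host = math.floor(headroom / avg_utilization)
    return math.ceil(n_apps / vms_per_host)

# 40 lightly loaded apps at roughly 10% utilization each,
# with hosts allowed to run up to 80% busy:
print(hosts_needed(40, 0.10))  # 5 hosts instead of 40
```

Real capacity planning also has to account for memory, I/O and peak loads, but the basic space-and-hardware savings Clem describes fall out of this kind of ratio.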

Virtualization can happen on multiple levels. Products from XenSource Inc. and VMware Inc., both of Palo Alto, Calif., can virtualize a complete operating system. Desktop virtualization, often linked to thin-client computing, comprises a complete server-hosted environment that's accessed remotely. Companies such as Citrix Systems Inc., Sun Microsystems Inc. and VMware offer variants of this architecture. Single applications may also be virtualized, through the use of products from Altiris Inc. of Lindon, Utah, among others.

Dennis Clem, CIO of the Office of the Secretary of Defense and the Pentagon, deployed virtualization to consolidate a large number of servers within the Pentagon. "It solves the space problem, and it reduces the cost of buying replacement hardware," Clem said.

Government smart cards

The new Personal Identity Verification card mandated by Homeland Security Presidential Directive-12 could usher in an era of public-key-infrastructure-enabled transactions, improved network security and interagency trust models. But it won't happen anytime soon.

Judith Spencer, chair of the Federal ID Credentialing Committee, said the new interoperable smart ID card is "in many ways, just a key." How that key will be used will depend on the agency using it.

The October HSPD-12 deadline was largely a formality. Agencies had to demonstrate minimal ability to issue the cards. Most agencies are a long way from fully implementing a PIV card program. They have until 2008 to actually get the cards into the hands of all employees and contractors, and even that is an ambitious schedule.

Using the cards as anything more than a picture ID will require technical infrastructure and new business processes. In the near term, federal IT administrators probably will have too much on their plates moving their networks to IPv6 to give much attention to enabling PIV cards.

To date, agencies, along with the Office of Management and Budget and the National Institute of Standards and Technology, have focused on the technical standards and issuing process. These are necessary and important pieces of the PIV puzzle, but it will be some time before we have ubiquitous smart-card readers, PKI-enabled applications and reciprocal trust arrangements between agencies.

Loose data

The past year saw a steady parade of security breaches exposing sensitive personal data to possible abuse. One of the biggest was the theft in May of a Veterans Affairs Department notebook PC containing records on more than 28 million individuals.

It is unclear whether the problem of loose data is getting worse or we're just hearing more about it. Data breaches first became an issue in the wake of the 2003 California law requiring public notification of breaches. Since then, 28 states have passed similar laws. But it is undeniable that as data becomes more mobile it becomes more vulnerable to loss and misuse. Notebooks, personal digital assistants, cell phones and tiny USB drives are becoming increasingly powerful and connected.

The problem is too broad and the vulnerabilities too various for any single solution. Data encryption is emerging as a broad canopy for many problems. Last summer, Baltimore-based SafeNet Inc. offered government clients free downloads of its ProtectDrive encryption after the Office of Management and Budget ordered agencies to encrypt all data that seemed in potential danger of being lost or stolen.

Enforcing agencywide encryption policies and managing the technology on a large scale can be difficult. In the end, a broad range of physical security, identity management and access control tools is needed to ensure a reasonable level of security for sensitive data, regardless of where it resides.

IT giants embrace open source

Major IT companies, most notably IBM Corp., have increasingly embraced open source over the past several years. But this year saw unprecedented interest from the IT clan of the Fortune 500.

In the most striking example, Microsoft Corp. signed a partnership deal with Novell Inc. to make Novell's Linux platform work more easily with Microsoft Windows.

Although the deal soon fell into dispute over patent protection issues, the message was clear: Microsoft had to recognize the growing use of open-source software (or at least of Linux as a server platform).

This wasn't the first capitulation by Microsoft. Last summer, the company started to help develop a software tool that would let Microsoft Office users open and save documents in the Extensible Markup Language-based Open Document Format, the format used by the open-source Open Office suite, a competitor to Microsoft Office.

Microsoft is not alone in courting open-source users. Oracle Corp. started reselling, as Oracle Unbreakable Linux, a rebranded Red Hat Enterprise Linux.

And Sun Microsystems Inc. has released the source code to its widely used Java programming language. "Commercial entities that embrace open source may find it leads to a greater market for other things," said Bruce Sunstein, a lawyer who covers open-source issues for Bromberg and Sunstein LLP of Boston.

Defense software acquisition reform

Could 2006 be remembered as the year that the Defense Department finally declared war on its lumbering software development process?

In February, James Finley took the helm as the new deputy undersecretary of Defense for acquisition and technology, and shortly thereafter he started looking for ways to expedite the process of getting software into DOD's systems.

Defense has taken heat for several large IT-related projects of late. The Government Accountability Office and the Congressional Budget Office have criticized the Army over the progress of the Future Combat Systems program. In November, the Senate lambasted the Defense Travel System for cost overruns and poor performance, among other things.

Also that month, the Defense inspector general reported that, in a sampling of procurements, it had found numerous instances of inadequate quality assurance and acquisition oversight. Software is only part of the problem, Finley admitted, but a sizable one.

Finley now is actively soliciting ideas from program managers and contractors on how to better and more quickly engineer software for fighter planes or tanks and other unique military systems. The key, he said, lies in the acquisition process. "If there is one thing that I've heard that has been consistent, it is that the systems aren't broken. The decision-making process is broken," Finley said.

Over the next few years, Finley hopes to introduce changes to the procurement process based on this feedback. Even the most basic assumptions will be questioned and, if found faulty, removed.

"If I'm going to go from A to B, do I need a Cadillac, or a jet fighter or a horse-drawn buggy? Historically [with the Defense Department] it has been meeting requirements at all costs. In this day and age, I think we need to be more mindful of our taxpayers' pocketbooks," he said.

IPv6 gets legs

A year ago, IPv6 was an unfunded mandate: a project offering few short-term benefits and little in the way of motivation except directives to have the new version of the Internet Protocol working on government backbones by 2008.

Today, agencies have begun developing written plans not only for how they will implement IPv6, but how they will integrate it into their core missions. In a recent survey of IT officials, nearly half of civilian agencies see the transition as important to supporting IT goals. The percentage was even higher within the Defense Department, which has had a two-year head start on the transition.

Acceptance of the new networking protocols has not come easily. Agencies have a huge investment in infrastructure, manpower and training in the existing IPv4 protocols on which the current generation of the Internet is based. Despite limitations, the current version is working well enough that many administrators would prefer not to have to move to IPv6, especially because they will also have to maintain IPv4 traffic on their networks for the foreseeable future.
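The scale of the jump from IPv4 to IPv6 is easy to demonstrate with Python's standard ipaddress module, which handles both protocol versions side by side. The prefixes below are the standard private and documentation ranges, chosen purely for illustration:

```python
import ipaddress

v4 = ipaddress.ip_network("10.0.0.0/8")     # a large private IPv4 block
v6 = ipaddress.ip_network("2001:db8::/32")  # the IPv6 documentation prefix

# A /8 is considered huge in IPv4; a routine /32 IPv6 allocation
# dwarfs it by a factor of 2**72.
print(v4.num_addresses)  # 16777216
print(v6.num_addresses)  # 79228162514264337593543950336
```

Dual-stack networks carry both kinds of addresses at once, which is why administrators expect to maintain IPv4 alongside IPv6 for the foreseeable future.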

But repeated efforts, often by the networking vendor community, to educate administrators about the potential benefits of IPv6 have begun to bear fruit. A lot of work remains to be done, and the transition still poses headaches, as well as opportunities, for years to come. But administrators are beginning to think about how to use IPv6 features such as end-to-end performance, peer-to-peer security, autoconfiguration and improved collaboration.

Power consumption

At this year's SC06 supercomputing conference in Tampa, Fla., Top500.org organizer Erich Strohmaier suggested adding a new metric to the ones he uses to evaluate the world's most powerful computers: power efficiency.

For years, data center managers did not have to worry about how much power their servers gobbled up; after all, the electricity bill went straight to accounting. With processors drawing ever more wattage, and electric rates spiking, the issue hit a critical threshold this year.

At the High Performance Computing and Communications Conference in Newport, R.I., Thomas Zacharia, associate lab director at the Energy Department's Oak Ridge National Laboratory, outlined ORNL's plans for the world's first petascale computer, to be built from 24,000 microprocessors. The laboratory will need a 170-megawatt substation to support the project.

Fortunately, the industry has taken notice. Intel Corp. rolled out a new line of Xeon processors, originally codenamed Woodcrest, that the company claims can ultimately boost performance by 80 percent while reducing power consumption by 35 percent. Introduced late last year, the new UltraSPARC T1 from Sun Microsystems Inc. also works at a lower wattage while offering the ability to execute more threads simultaneously.
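Taken together, Intel's two claims imply nearly a threefold gain in performance per watt, since efficiency scales as the performance ratio divided by the power ratio. The arithmetic, using the vendor's own figures:

```python
perf_gain = 1.80   # 80% more performance => 1.80x the old throughput
power_cut = 0.65   # 35% less power      => 0.65x the old wattage

# Performance per watt scales as throughput divided by power draw.
efficiency_ratio = perf_gain / power_cut
print(f"{efficiency_ratio:.2f}x performance per watt")  # 2.77x performance per watt
```

Whether real workloads see the full vendor-claimed gains is another question, but the ratio shows why data center managers watch both numbers, not just clock speed.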

The government is paying attention, too. Perceiving a lack of solid power-to-performance metrics for increasingly electricity-hungry data centers, the Environmental Protection Agency's Energy Star efficiency program has turned its efforts to the server market. This fall, the program released a set of metrics that can be used for measuring server energy efficiency.

Readers can follow these links to GCN's 2006 coverage of a variety of information technologies, and how they could affect operations in the years to come.



1. Cheap, fast geomapping:



More on geomapping from GCN


Agencies face new, 3-D era of geospatial information (11/07/06)



Autodesk meets Google (09/18/06)



Maps: the new application interface (09/25/06)




When X doesn't mark the spot (08/28/06)



Data scraping, Web 2.0 style (04/24/06)




2. Web 2.0:



More on Web 2.0 from GCN


Forecast predicts shift in IT spending (10/30/06)



Ajax-based collaboration (10/23/06)



Ruby won't trump Java (10/30/06)



Web 2.0 business models affecting enterprise systems design (9/26/06)



The amazing Wikis (08/21/06)



The story behind Ajax (08/23/06)



E-Gov meets Web 2.0 (07/17/06)



At your service (04/24/06)




3. Beware the Botnets



More on botnets from GCN


Spam surge bot driven (11/01/06)



Sharing data is crucial to cyberdefense (08/21/06)



Hacker arrested for breaching DOD systems with "botnets" (11/04/05)




4. The battle of government search



More on federal search engine from GCN


Google wants you (11/20/06)



FirstGov.gov's new search engine launched (01/24/06)



Google launches federal search engine (06/15/06)



The search is on (07/03/06)



Vivisimo goes beyond FirstGov (06/05/06)




5. Virtualization



More on virtualization from GCN


The future of virtualization (08/22/06)



Virtual IT helps make do with less (06/26/06)



Microsoft goes virtually ga-ga (06/12/06)



The server that wasn't (05/22/06)



Virtualization for trusted computing? (04/17/06)



What is software virtualization? Try it (03/22/06)




6. PIV/PKI



More on smart cards from GCN


Education hires VeriSign to improve PIV card issuing (11/16/06)



PIV's new deal (11/06/06)



OMB wants copies of new PIV cards (10/27/06)



Ready or not, here come the PIV cards (10/26/06)



EPA signs deals in hopes of making HSPD-12 deadline (10/06/06)



PIV specs come down from NIST (09/25/06)



Agencies enter the home stretch for HSPD-12 (09/25/06)



HSPD-12: It's not all in the cards (08/28/06)



PKI use advancing at DOD (08/14/06)



Surveys: HSPD-12 plans lag (07/10/06)



7. Loose Data:



More data security from GCN.com

IP address exposed anonymous mudslinger (11/01/06)



Data held by feds, vendors at risk (10/13/06)



Free sells. Who knew? (10/06/06)



Agencies lag on reporting data breaches (08/17/06)



Hacker breaks into USDA system; data may be stolen (06/26/06)



When data walks (06/05/06)



VA not alone in letting data walk out the door (05/31/06)



VA data files on millions of veterans stolen (05/22/06)



NSA urges use of better redaction methods (02/20/06)



Without a trace (02/20/06)




8. Corporate open source



More on open-source from GCN


Stormy weather hits Microsoft/Novell parade (11/22/06)



Microsoft and Novell to play nice (11/20/06)



Sun opens Java (11/13/06)



Oracle serves Red Hat (10/27/06)



Microsoft relents on open documents (07/17/06)




9. Defense Software Acquisition Reform



More on Defense acquisition from GCN


DOD IG blames GSA, Defense for procurement problems (11/06/06)



Senators to DOD: Pull the plug on DTS (11/17/06)



On the defensive (10/09/06)



Field It Faster: Our Warriors Can't Wait (01/06)



10. IPv6 gets legs



More on IPv6 from GCN


IPv6: It's a configuration management issue (11/20/06)



IPv6: The future is now (08/14/06)



Agency planning for move to IPv6 needs improvement, GAO says (07/31/06)



CIO Council offers best practices on IPv6 transition (05/31/06)



An attempt to define 'IPv6-capable' (05/15/06)



Agencies find there's no single path to IPv6 (04/03/06)



How exactly will you get your IPv6 addresses? (04/03/06)



Lost in Transition (04/03/06)



11. Power consumption



More on data centers from GCN


Senate calls for studying data center power consumption (07/31/06)



When data centers lose their cool (05/15/06)



Energy lab to run petascale computer (03/29/06)



EPA Energy Star program to tackle server market (02/08/06)

Technology of the Year: Challenge/Response spam filtering

For years, the GCN Lab has reviewed filtering technology to see how best to protect e-mail from the increasingly devastating surge of spam. But today, spam has left the realm of the annoying and pushed into territory where it actually hurts business, making employees spend a lot of time deleting it, clogging mail servers and depleting needed bandwidth.

With this in mind, the GCN Lab was thankful to find a new appliance that beats back the tide of spam far more successfully than any other approach we've seen, the I.C.E. Box from Sendio Inc. of Newport Beach, Calif. We voted the I.C.E. Box the best product of the year in our yearly wrap-up of best new products.

Overall, when the GCN Lab tested several filtering appliances this year, we found good results. For spam, devices were able to remove 95 percent or more of the junk from the stream. That's pretty good, unless your volume of spam is extremely high, which was the problem the GCN Lab test network was experiencing.

In any given month, more than 500,000 spam e-mails were coming in, overloading the filtering devices and letting a big load of spam through to the mail server as approved. Tightening the spam filtering controls helped, but we began generating false positives, losing some of the good mail along with the bad.
The answer for us, and for an increasing number of agencies and businesses, is a challenge/response appliance that really does no spam filtering at all. Each e-mail that comes into the network goes to the appliance, which triggers an automatic challenge e-mail back to each new sender. If the sender spoofed the return address, the challenge will never reach them. If the e-mail is addressed to a user who is not on the network, the mail is dropped without a challenge being issued. If the mail comes from a spammer, the challenge likely goes to a distribution server that can't respond.

Valid users simply reply to the challenge and are validated by the system and added to the approved list. The velvet rope is always pulled back for them in the future without a challenge being issued. The box still scans for viruses, but never for spam after the sender is verified.
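The decision logic described above fits in a few lines. Here is a simplified sketch of how such a gateway might classify each inbound message; the function and list names are illustrative, not Sendio's actual implementation.

```python
def handle_message(sender, recipient, valid_recipients, approved_senders, pending):
    """Decide what a challenge/response gateway does with one inbound message."""
    if recipient not in valid_recipients:
        return "drop"        # unknown recipient: discard silently, no challenge
    if sender in approved_senders:
        return "deliver"     # previously validated sender: straight through
    if sender in pending:
        return "hold"        # challenge already sent, awaiting a reply
    pending.add(sender)
    return "challenge"       # first contact: send the challenge e-mail

pending = set()
approved = {"alice@example.gov"}
users = {"bob@agency.gov"}

print(handle_message("alice@example.gov", "bob@agency.gov", users, approved, pending))  # deliver
print(handle_message("spam@bogus.test", "bob@agency.gov", users, approved, pending))    # challenge
print(handle_message("spam@bogus.test", "bob@agency.gov", users, approved, pending))    # hold
```

A spoofed or unattended sender address never answers the challenge, so its mail stays in the hold queue and never reaches the user.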

Since we installed this challenge/response appliance on our own network, there have been almost zero incidents of spam coming through. I say almost because just the other day one got through, a noticeable chink in the challenge/response armor. That's no big deal considering one got through and more than two billion did not, but it raises an interesting scenario: spammers might start to take notice of challenge/response systems and try to defeat them.

It's possible to circumvent the technology, though it would be difficult to do on a large scale. A spammer would need to set up an automatic mail distribution server and then an automatic response server that simply answers challenges. At the very least, the response server would need to be public and would expose the spammer to the long arm of the law, but if it were an expendable server sitting on an island somewhere, it might work. This, however, demands a level of expertise and expense from spammers that is not required right now. And the challenge/response companies could counter with graphical files representing numbers in the challenge that a machine can't read but a human could decipher.

But this game of one-upmanship is not yet being played, because there are not enough challenge/response appliances out there. With almost 100 percent effectiveness and no way to generate a false positive, though, it's only a matter of time. Challenge/response appliances simply work better than filtering ones for killing spam, and moving forward we believe this new technology will begin to encompass and eventually overtake standard filtering.
