Imaging

The term imaging can apply to anything from document management to computer-aided
design to scientific visualization. In its most classic sense, imaging—the digital
collection, manipulation and output of high-quality pictures—has long been considered
the province of graphic artists and game programmers.


No longer. Forces have combined to bring imaging technology down from the clouds:


Data mining and visualization tools are one such force; in the last year or so,
developers have introduced visualization tools that map business or management processes.
Computer Associates International Inc.’s CA-Unicenter TNG network management package,
for example, lets administrators visually check patterns and interruptions in smooth data
flow. And SeeIT, a visualization package from Visible Decisions Inc. of Toronto, offers
3-D maps of spreadsheet data.


But as these applications make imaging technologies a common desktop component, federal
managers must learn to cope with massive new equipment, software and support requirements.


Bandwidth that seemed adequate for largely text-driven applications stretches thin when
100M image files bounce around the network, and the standard commodity desktop PC may no
longer be adequate for the job. What appears to be a simple task—adding imaging
capabilities to an office—can quickly become an information technology headache if
administrators lack a solid knowledge of imaging capabilities and requirements.


Government systems managers confronting imaging requirements need to learn some new
languages and think about networks in new ways. IT truisms become meaningless. Imaging is
fraught with new standards, major software revisions and platform biases. Your imaging
requirements will drive the software choices you make.


In the pages that follow, I’ll guide you through the ins and outs of using imaging
on government networks and shed some light on what heretofore has been the province of
imaging specialists.


It’s certainly not all bad news. Low- to midlevel imaging tools and software offer
surprisingly sophisticated capabilities for desktop-level prices. Prices of color
printers, scanners, digital cameras and other devices have dropped dramatically over the
last two years, while features have continued to improve. Standards have finally jelled in
many areas, and manufacturers have flattened the learning curve to capture larger
audiences. So, many of these devices are far easier to use than their older counterparts.


Although it’s easy to produce a mediocre image, getting high-quality images and
the equipment needed to produce them can challenge a systems administrator. As the
sophistication—and price—climbs, imaging tools become increasingly difficult to
install, use and support in the typical office network. High-end imaging tools tend to be
proprietary, often introducing new operating systems and hardware platforms that will
require added training and support. User interfaces deviate from the look and feel of
standard graphical user interfaces and can be difficult to learn. And computer resource
requirements increase almost geometrically.


Sophisticated imaging software often incorporates features that the rest of the desktop
PC world left behind years ago, such as device-oriented copy protection, which many
support techs are not used to managing. The leading packages for several image-intensive
professions, such as computer graphics, visualization and CAD, routinely use parallel port
dongles and other restrictive protection schemes that can complicate user support.


A dongle, a copy protection device that generally attaches to the parallel port of the
computer, supplies the final key— after passwords, serial numbers and other software
protection mechanisms—needed to launch the application. Standard imaging applications
such as Adobe Systems Inc.’s Photoshop and 3D StudioMax from Autodesk Inc. of San
Rafael, Calif., are routinely protected this way. Dongles generally offer pass-through
capabilities that keep the parallel port free for other devices; for example, the first
dongle plugs into the port, the second dongle plugs into the first and a parallel printer
cable might plug into that. Although these devices rarely interfere with other
applications or printing devices, it can happen. And dongles can complicate user support.


Commercial off-the-shelf software has become a driving force on virtually all other
networked clients, but it is still merely a nice idea in imaging, particularly at the
highest levels. Even so, there are pitfalls within the standards. The Microsoft Windows NT-Digital
Alpha processor combination, for example, has historically been a great choice for CAD.
But Autodesk, maker of category leader AutoCAD for Windows NT, stopped supporting Alpha
two versions ago.


From an OS point of view, Microsoft has positioned Windows NT as the platform of choice
for high-performance desktop PCs. And in many cases, workstations running NT can meet your
needs. However, it would be a mistake to automatically assume that the imaging tasks you
need to do can be done under Windows NT.


Although Windows has become the dominant platform among office desktop PCs, it’s
sometimes a poor second to other platforms in the imaging world. Windows electronic
publishing lacks many of the collaborative tools and prepress fine-tuning in Apple
Macintosh tools, for example, and high-speed visualization workstations may depend on
HP-UX, Ultrix or other Unix flavors.


Nowhere is OS diversity more evident than in imaging for paper publishing. The Windows
version of Quark Inc.’s QuarkXPress, the dominant application in this field, suffers
from serious limitations. Although the Denver company’s application is adequate for
many tasks, vital collaboration tools such as Quark’s QPS remain Mac-centric. Quark
has promised fully capable Windows versions of QPS for years but has, so far, failed to
deliver. Workgroup add-ins for other publishing software, such as Adobe’s PageMaker,
can require customization and are usually proprietary.


The choice of hardware is driven largely by de facto standards in software. The good
news is that it often doesn’t take much to render a new desktop PC fit for demanding
imaging apps. Advanced, high-end graphics workstations have simply previewed the features
that would appear in future desktop PCs. Few Windows office apps make good use of
processors faster than about 350 MHz, dual-processor options or 100-Mbps-plus network
interface cards. Such features— basic requirements for imaging apps—are
increasingly common on premium desktop systems.


Although many imaging apps will run on off-the-shelf products, some fine-tuning of
system requirements is usually necessary for optimal daily performance. And the decision
to add imaging to the network or workgroup may have repercussions down the line for other
IT functions:


Storage. All those images your users will create have to go somewhere, and they’ll
quickly fill up local hard drives unless you’re careful. Keeping a few images on a
hard drive is no problem, but you’re liable to find your network’s available
file space shrinking geometrically as new imaging systems go online. Storing extra data
centrally may make sense, but you’ll also need to cope with hefty network traffic
increases as 100M files sail between server and client.
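The storage and traffic pressures described above are easy to estimate before they hit. The sketch below uses illustrative assumptions only (user counts, image sizes and working days are hypothetical, not figures from this article):

```python
# Back-of-the-envelope sizing for an imaging workgroup.
# All input figures are illustrative assumptions, not measurements.

users = 10                 # imaging users on the workgroup
images_per_day = 5         # new images each user saves daily
avg_size_mb = 40           # average image size, in MB

daily_growth_mb = users * images_per_day * avg_size_mb
yearly_growth_gb = daily_growth_mb * 250 / 1024   # ~250 working days

print(f"Daily server growth:  {daily_growth_mb} MB")
print(f"Yearly server growth: {yearly_growth_gb:.0f} GB")

# Time to move one 100 MB file across a 10-Mbps Ethernet LAN,
# ignoring protocol overhead (real-world throughput is worse):
file_bits = 100 * 8 * 1024 * 1024
seconds = file_bits / (10 * 1_000_000)
print(f"One 100 MB file over 10-Mbps Ethernet: ~{seconds:.0f} seconds")
```

Even these optimistic numbers show why a handful of imaging seats can swamp a shared LAN segment and a general-purpose file server.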


It’s a good idea to incorporate a multitiered approach to image storage on these
systems. First, partition hard drive storage to separate applications and data, making
sure you give as wide a berth as possible to the boot system drive (usually c:) to prevent
potential crashes from overfull drives. If possible, opt for a non-FAT (file allocation
table) file storage system such as NTFS (the native Windows NT file system) on
image-centric computers. NTFS offers more efficient operation and, for volumes under 4G,
built-in compression.


Second, include a local backup drive to handle image files that can easily exceed 100M
in size. Your users may need to transfer files to other systems off the network, either to
work off-site or to deliver to a service bureau for output. Offloading those files to a
transportable cartridge can save enormous amounts of bandwidth as well as wear and tear on
your support staff.


Check to make sure your drive uses media that can be read by the service bureau. If
you’re likely to send files to multiple outside locations, you might want to invest
in a CD-recordable device that can store images on easily read CD-ROM disks. Remember,
however, that CD-R drives are typically very slow.


To use the most common removable formats, you’ll likely want Zip or Jaz drives
from Iomega Corp. of Roy, Utah. Choose SCSI rather than parallel-port models to get the
best performance. Parallel-port Zip drives can interfere with advanced color printers,
and they have compatibility problems with Windows NT.


Third, rethink your file server storage. Although you’ll almost certainly add
storage, consider reallocating network storage for optimal performance, and investigate
new disk farms and storage area networks under development. These methods provide
extremely fast and efficient data storage and retrieval that won’t have a negative
impact on other network functions.


Network traffic. Although the occasional 40M graphics file shipped to a colleague has
only a momentary impact on overall LAN performance, expect traffic to grow as users become
accustomed to the new system. You may need to rethink your network design to isolate areas
of heavy traffic and concentrate expensive, high-performance equipment where it’s
needed. You may need to link graphics systems as a workgroup with their own server,
upgrade NICs to at least 100-Mbps capacity, increase file storage and use high-capacity
drives.


Interoperability. Many new imaging technologies don’t automatically interoperate
with conventional office networks. Fibre Channel links, for example, don’t talk to
Ethernet networks. And users on non-Windows clients can pose headaches, especially if they
also require standard, compatible office apps such as e-mail, word processing and
spreadsheets. Expect to make bridging the interoperability gaps between systems an
important part of your support.


Contrary to advertised notions, the most commonly used non-Windows imaging platform in
government circles—the Macintosh—requires specialized tech support. Apple
Computer Inc.’s you-don’t-have-to-know-that-so-we-won’t-tell-you philosophy
can sometimes make for an interesting troubleshooting experience. Count on either hiring a
Mac support tech or providing extra training for your staff.


Device compatibility. The proliferation of Windows might lead you to believe that any
Win9x imaging device will automatically work with Windows NT. It’s more likely to be
compatible with Macintosh than NT. Although they’re rapidly catching up, NT device
drivers lag behind those of Win9x. Not surprisingly, NT-capable devices tend to stay in
the higher, more expensive end of many product categories. Worse, some peripherals,
digitizing cameras and very high-end scanners, for instance, may be machine or
graphics-card specific.


A check of Microsoft’s NT Hardware Compatibility list, at
www.microsoft.com/hwtest/hcl/, lists few imaging peripherals as fully NT 4.0-compatible.
That can be misleading, as makers often ship devices with Win9x drivers in the box and
offer Windows NT support on their Web site. Before buying any imaging device, check the
vendor’s Web site for instructions on its use with NT.


As a rule of thumb, if you can attach it to a SCSI device, it’s more likely to be
NT-compatible than if it has an alternate interface, such as IDE, EIDE, parallel or
serial. Universal Serial Bus devices are likely to be Windows NT headaches, as will
FireWire devices, at least for the immediate future.


The best advice: Buy the software first, buy the computer second, and add all
peripherals—internal and external—according to the software developer’s
specifications. Usually, you’re best off buying the fastest established technology
and interface available. That’ll keep you from dead-end migration paths after an
unstable standard is upgraded.


You can group imaging devices into two functional categories: those that bring the
image into the computer and those that send it to its final destination, whether paper,
monitor, film or the Web.


Digital cameras. These devices are the imaging technology most likely to cause remote
support headaches. To users, they’re simply bulkier, costlier counterparts to film
cameras. Unfortunately, the ultra-high-resolution images even the cheapest film camera
offers are difficult to achieve in sub-$20,000 digital cameras.


Film cameras store images instantaneously as a continuous tone on film. Digital cameras
must break the picture into discrete pixels and write color and other information about
each pixel on some sort of storage medium. Digital cameras, unlike most imaging tools, do
most of their work untethered to a computer or network, and so must carry their own data
storage, power and programming.


Current limits on storage capacity constrain the resolution, color depth and size of
images that can be shot digitally; the speed with which image data can be stored limits
performance. Film cameras remain the best devices for rapid-sequence action shots, such as
those taken with motor drives. And the degradation in picture size or resolution makes
extremely wide-angle shots impractical with digital cameras.


Fortunately, in the most common applications for digital photography—Web and print
publishing—a film camera’s high resolution is overkill. Typical magazine photos
print at 300 dots per inch, while bandwidth-conscious Web images rarely exceed a mere 72
pixels per inch. A megapixel image is more than adequate for those applications. New,
2-megapixel cameras are on the market for less than $1,000, and the price of megapixel
models is dropping to well under $500.
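The claim that a megapixel is plenty for print and Web work is simple arithmetic. This sketch (the page sizes are hypothetical examples) converts output targets into required pixel counts:

```python
# How many pixels does an output target actually need?
# Page sizes below are illustrative examples.

def pixels_needed(width_in, height_in, ppi):
    """Pixel dimensions required to output at a given resolution."""
    return round(width_in * ppi), round(height_in * ppi)

# A 4- by 3-inch magazine photo at the 300-ppi print ceiling:
w, h = pixels_needed(4, 3, 300)
print(f"4x3 in. at 300 ppi: {w}x{h} = {w * h / 1e6:.2f} megapixels")

# A large Web image at the screen's nominal 72 ppi:
w, h = pixels_needed(8, 6, 72)
print(f"8x6 in. at 72 ppi:  {w}x{h} = {w * h / 1e6:.2f} megapixels")
```

A 4- by 3-inch print needs about 1.1 megapixels and a screen-filling Web image far less, which is why sub-$1,000 cameras now cover both jobs.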


Most digital cameras support Win9x and Mac OS, a few offer Windows NT support, and even
fewer support Unix. The camera may communicate adequately with the OS, but the bundled
software controlling it may be Win9x-specific. Professional graphics packages sometimes
can make up for any deficiencies.


Image transfer remains a problem; low-priced cameras for PCs use a slow serial port
interface to get pictures out of the camera and into the computer. Mac users are luckier;
most cameras offer the much faster SCSI interface.


All digital cameras use some form of digital film, i.e., removable flash memory cards
such as CompactFlash or SmartMedia, to store images. Some card drives can
interface directly with a printer. Lexmark International Inc.’s 1,200-dpi Photo
JetPrinter 5770, for example, costs less than $400 and can accept the cards for direct
printing. But as resolutions increase, the size of the card or the card reader’s
interface speed will become limiting factors.


Sony Electronics Inc.’s Mavica line eliminates the serial cable and stores images
on high-density floppy diskettes. Although this is an easy way to transfer images, the
disk’s 1.44M capacity limits the Mavica to low-res images. More expensive Mavicas use
a FireWire interface—called iLink—for fast image transfer. As resolutions
increase, a USB or FireWire transfer mechanism will become the only practical alternative.


Scanners. With scanning, all you need to do is pop in a picture, press a button and
wait for the image to appear. But getting a good image to appear is more difficult.
Because scanners often yield disappointing results when used straight out of the box,
training has become key to successfully introducing a scanner into an office. Make sure
the primary user of the scanner knows how to optimize resolution and color palette with
the software tools provided.


Most offices have access to at least one film camera. If users don’t require
immediate digitization of their images—and if they’re trained to use the
image-processing software correctly—the film camera-scanner combination can be an
inexpensive way to digitize high-quality images.


The cheapest scanners can be had for less than $75; for that price you’ll get a
monochrome or grayscale scanner with the slowest possible interface, limited software and
low, interpolated resolutions in the 300-ppi range. For $150 to $300, you can expect
resolutions of up to 9,600 ppi, full-color support and a software bundle with high-quality
photo-retouch, optical character recognition and Web image optimization programs.


A scanner that supports 600- by 1,200-ppi resolution in 24-bit (16.7 million) color is
fine for most apps your users will encounter; consider paying extra for a transparency
accessory if your scanner doesn’t include one.
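Scan resolution drives file size faster than users expect. This illustrative calculation shows how a single letter-size page at a scanner's 600-ppi optical resolution approaches the 100M files mentioned earlier:

```python
# Uncompressed size of a flatbed scan, to show why 100M image
# files appear so quickly. Illustrative arithmetic only.

def raw_scan_mb(width_in, height_in, ppi, bits_per_pixel=24):
    """Raw (uncompressed) size of a scan, in megabytes."""
    pixels = (width_in * ppi) * (height_in * ppi)
    return pixels * bits_per_pixel / 8 / (1024 * 1024)

# A letter-size page at 600-ppi optical resolution, 24-bit color:
size = raw_scan_mb(8.5, 11, 600)
print(f"8.5x11 in. page, 600 ppi, 24-bit: {size:.0f} MB uncompressed")
```

One full-page color scan at optical resolution consumes nearly 100 MB before compression, so teaching users to scan at the resolution the job requires pays off immediately.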


A SCSI interface, or USB if the PC supports it, gives the best performance. You’ll
save on network bandwidth if you install the scanner on the primary user’s desktop PC
rather than across the network.


Most scanners support Mac OS and Win9x equally well, although you may need to buy
interface kits for the secondary platform. Fewer fully support Windows NT; many of those
require a SCSI interface. Even fewer support Unix directly, although support is common at
the high end. Scanners from Umax Technologies Inc. have third-party drivers for many
common flavors of Unix.


Displays. Prices last year tumbled for everything from 21-inch monitors to flat-panel
displays, making the price of a high-quality display much less staggering.


Flat panels, the darlings of Wall Street executives, cramped desks and cool kiosks, are
not very important in the imaging world. Here, the emphasis is on screen real estate, not
skinny profiles. And flat-panel displays are notoriously fickle when it comes to video
card support, especially at the high end.


Except at the high end, CRTs usually require no special video cards and can work with
any platform. Today, 19-inch monitors are becoming common and, for imaging work, 20-inch
displays are the norm. These large-format displays have dramatically dropped in price,
though the color management and fine resolution built into the best of them can double or
triple their price.


The choice of display is highly subjective and largely dependent on the user’s
environment; our eyes generally have trouble detecting the differences in instrument
readings often used to rate monitors in tests. Before buying in quantity, set units up in
their new environment and select the one that, for a reasonable price, provides the
sharpest, brightest image and best use of color.


Don’t worry about dot pitch—the distance between like-colored pixels on the
screen—except as an indicator of quality between one company’s different CRTs.


The methods used to measure dot pitch vary enough to make the measurement almost
meaningless, and new technologies vary the size and placement of dots according to their
position on the screen. But the term has become such an accepted part of sales jargon that
dot pitch measurements are even given for aperture-grille monitors, such as Trinitron and
Vivitron, which display images in stripes, not dots, and, properly speaking, don’t
have a dot pitch at all.


Monitors require little support other than initial installation, but may require an
assist from facilities planners to cope with the huge footprint. A 21-inch display can
reach 24 inches or more in depth and is too heavy to place on a swing-arm monitor stand or
CPU. Newer, short-necked units with reduced depths may solve at least some of those
problems.


Printers. Once the exclusive province of graphic arts departments, color printers have
moved into the office at both desktop PC and workgroup levels. Ink-jet printers that often
sell for less than $300 yield remarkable color output; if your users’ color needs are
relatively minor, these devices can be cost-effective.


If the printer will be used to print overhead transparencies or viewgraphs, be sure to
test its film-printing capabilities before you buy. Varying ink formulations, differences
in transparency surfaces and the way the printer lays down color can make substantial
differences in quality. Ink-jet viewgraphs need drying time before being handled or
stacked to prevent smearing; some printers automatically rest between prints, making it
easier to print multiple copies unattended.


As with most color printers, ink cartridges aren’t usually interchangeable, so it
pays to standardize on one or two printers and buy cartridges in bulk. The best ink-jet
transparency films—never use standard transparency film in an ink-jet
printer—are sometimes hard to find and may not be the manufacturer’s own,
usually expensive brand. You can often save a great deal of time and money by
experimenting with different ink-jet films and buying both ink and film in bulk from
mail-order suppliers.


You can share color ink-jet printers between multiple users, but slow print times and
relatively small paper trays make sharing the devices impractical on a large scale. If you
have more than a few users requiring color, consider a workgroup or departmental color
printer.


The most cost-effective of such printers use solid, waxy blocks of color that are
melted and sprayed onto the paper for print quality that approaches the photographic.
Solid ink printers are fast, don’t require the special film and paper that inkjets
require to avoid smearing, offer optional built-in networking and may have impressive
network management tools. They start at less than $2,000 and can cost more than $10,000,
depending on the output size, print speed and options required; expect to pay around
$3,500 for a typical office configuration. Ink supplies can be expensive for the machines,
so again it pays to shop around for the best deals and buy in bulk.


Color laser printers are the darlings of the networked color printer crowd;
they’re fast, offer good quality output and operate in much the same way as standard
networked laser printers, once installed. Prices are dropping rapidly, to as little as
$3,000, although networkability and high performance can add considerably to the price.
Expect to pay about $4,000 for a typical color laser printer.


For that money, you’ll receive the ability to print on virtually any media,
including thin cardboard, at speeds not much slower than a low-end monochrome laser
printer. After installation, the machines are virtually maintenance-free, except for
replenishment of supplies.


The units tend to be much larger than other color printers, can produce a good deal of
waste heat and tend to be noisy, so choose their environments carefully.


Color laser printers can be one of the bigger headaches to set up, however. Unlike
plug-and-play monochrome lasers, color laser printers can require precise assembly and
configuration of as many as 40 separate elements, including four colors (cyan, magenta,
yellow and black) of toner. Although printer makers are working to reduce the complexity,
for now it’s wise to read installation instructions carefully and follow them to the
letter. Toner can be costly and difficult to find locally; the manufacturer may even be
the sole source of supply.


Until recently, large-format printers, capable of outputting documents 12 or more
inches wide, were most often found in drafting shops and photo-imaging studios. Now,
they’re making their way into federal offices as cost-effective in-house proofing and
signage systems. Prices for the devices generally range between $4,000 and $400,000.


The units are classified by the width of paper they’ll accept. Large format
technically means printers that can print from 12 to 36 inches wide; printers between 36
and 73 inches are considered wide format. Anything larger than that is grand format. They
can print on anything from paper and vinyl to canvas, glass, metal and even wood. Special
ultraviolet light-resistant inks prevent fading in outdoor signs.


If your office decides to buy one of these printers, check carefully into the
specifications. Many require a raster image processor (RIP), a dedicated computer that
pre-processes image files before sending them to the printer and greatly improves output
performance. A RIP may or may not be included in the base unit and can add considerably to
the cost of the machine if bought separately.


You’ll need to dedicate a large space to the printer and storage for the rolls or
sheets of paper and cases of ink it will consume. Buy the largest ink and paper capacity
you can justify. Too little can mean the printer must be constantly monitored and refilled
during large print jobs.


Slide imagers. Also known as digital film recorders, slide makers or slide printers,
slide imagers are a great way to get true photo-quality pictures from desktop PC imaging,
particularly if the final destination is a presentation. The devices draw with colored
light beams, reproducing the image on standard photographic transparency film. The result
is a continuous-tone picture with a 4,000- or 5,000-line screen resolution. The printing
industry measures resolution in lines; a 300-dpi image has a 150-line screen resolution.
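The dpi-to-line-screen relationship above follows a simple rule of thumb. This sketch applies it in both directions (the factor of 2 is a common convention, not an exact standard):

```python
# Rough conversion between image resolution in dots per inch and
# the printing industry's halftone line screen. Dividing by 2 is
# a common rule of thumb, not an exact standard.

def line_screen(dpi, quality_factor=2):
    """Approximate line-screen equivalent of a given dpi."""
    return dpi / quality_factor

print(f"300 dpi  ->  ~{line_screen(300):.0f}-line screen")

# Working backward, a slide imager's 4,000-line output corresponds
# to an image resolution of roughly 8,000 dpi:
print(f"4,000-line screen  ->  ~{4000 * 2} dpi equivalent")
```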


Film recorders can cost from a few hundred to tens of thousands of dollars, depending
on the size and quality of the transparency and the speed with which it’s printed. At
the desktop PC level, units cost between $3,500 and $8,000.


Film recorders give an office the ability to produce slides in-house, resulting in more
immediate response and the ability to quickly correct and re-record problem slides. In
some instances, agencies use them for proofing files destined for the printer.


But these devices have drawbacks. Cheaper units lack the sophisticated RIP needed for
reasonable performance; a roll of 24 Microsoft PowerPoint slides can take many hours to
produce on a low-end recorder. Unless you also buy a developing unit, you’ll still
need to send slide film out for processing, which can take hours or days and reduce the
immediate-response appeal.


Instant-film attachments are good for proofing but not much use for final output, as
they produce slides that are grainy and have color shifts.


Unless your office needs more than a few hundred slides per year, it’s difficult
to justify the cost of a slide imager. You’ll probably save money using a service
bureau, which can deliver 35-mm slides in a few hours for $5 to $10 apiece. If you opt for
one, consider putting it on a separate workstation.


If sharp, accurate color reproduction drives the print side of output, bandwidth
controls its online form. The goal of every Web imaging publisher is to produce the
highest possible image quality with the lowest possible image size. Webmasters typically
do this by reducing image size, grouping similar colors into a single color to reduce the
overall number of colors in the image, and removing backgrounds and other nonessential
elements.
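The color-grouping step can be illustrated with a toy example. Real tools such as Photoshop or GIF exporters build smarter adaptive palettes; this standard-library sketch simply snaps each color channel to a coarser grid so that near-identical shades collapse into one palette entry:

```python
# A toy version of "grouping similar colors into a single color":
# quantize each RGB channel to a coarser grid, shrinking the
# palette. Production tools use adaptive palettes; this sketch
# only illustrates the idea.

def quantize(pixel, levels=4):
    """Snap each 0-255 channel to one of `levels` values."""
    step = 256 // levels
    return tuple((c // step) * step for c in pixel)

# A tiny "image": four near-identical shades of blue plus one red.
image = [(10, 20, 200), (12, 22, 205), (9, 18, 198),
         (11, 21, 202), (240, 10, 10)]

palette = {quantize(p) for p in image}
print(f"{len(set(image))} colors in, {len(palette)} colors out")
```

Fewer distinct colors means a smaller palette and better compression in palette-based formats such as GIF, at the cost of subtle shading.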


Your Web imagers will require little special equipment, aside from basic imaging
capabilities, to perform these tasks. What they will need, however, is training. Web
optimization of images can be a hit-or-miss, productivity-degrading operation unless users
innately understand how the Web displays an image and what effect new standards such as
Dynamic Hypertext Markup Language, Extensible Markup Language, Java and the Document
Object Model will have on graphics.


That’s a skill that until recently few art schools taught, so unless your artists
have kept up to date they’ll likely need to go back to school to learn to do Web
graphics well.


In imaging, bandwidth squeeze isn’t confined to the size of the pipe connecting
one computer to another. The massive amounts of data passing between the computer and
input, storage and output devices are equally critical.


And standard desktop interfaces such as serial ports, 10Base-T Ethernet, parallel
ports—even the 2-Mbps Enhanced Parallel Port interface—and IDE often have
trouble keeping up with the demands of an imaging application.
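Those bottlenecks are easy to quantify. The sketch below compares nominal signaling rates for a single 100 MB image file; real-world throughput is lower once protocol overhead is counted:

```python
# Time to move a 100 MB image file over various interfaces at
# their nominal signaling rates, ignoring protocol overhead.
# Real-world throughput is lower.

RATES_MBPS = {
    "Serial port (19.2 Kbps)": 0.0192,
    "Enhanced Parallel Port":  2,
    "10Base-T Ethernet":       10,
    "USB 1.1":                 12,
    "100Base-T Ethernet":      100,
    "FireWire":                400,
}

FILE_MB = 100
for name, mbps in RATES_MBPS.items():
    seconds = FILE_MB * 8 * 1024 * 1024 / (mbps * 1_000_000)
    print(f"{name:26s} {seconds:10.1f} s")
```

The spread is stark: the same file that takes about two seconds over FireWire needs roughly 12 hours over a serial port.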


Here’s a quick rundown on the interfaces you might encounter when adding imaging
capabilities:


USB operates at 1.5 Mbps for keyboards, mice and other low-end devices, and at 12 Mbps
for digital cameras, monitors, scanners and the like. USB also supplies a 5-volt power
line to attached devices—as many as 127 daisychained to a single USB hub, although
practical limits are probably much lower—so an additional power line isn’t
required.


USB’s biggest drawbacks come from desktop legacies. Microsoft Windows 95, Windows
NT 4.0 and many flavors of Unix don’t support it natively, although you can, with
varying degrees of success, add third-party USB drivers. Although Windows 98 offers native
USB support, USB devices don’t work in Win98’s troubleshooting safe mode. And
USB devices are often more expensive than their older counterparts.


Although Fibre Channel doesn’t perform the myriad error checks required of
Ethernet, it offers lower latency rates. It’s also good at distributing available
bandwidth over multiple resources, and it transmits data in very large bundles, or
packets, so there are fewer packets on the network at any time. In addition, proposed new
standards would increase Fibre Channel transmission rates to as much as 4 Gbps.


Fibre Channel network equipment can be used to build tight clusters of extremely
high-performance server-storage combinations. And because Fibre Channel nodes can be
placed up to six miles apart, it shows great promise for centralizing campuswide
high-performance imaging. Such implementations are expensive, however, requiring sometimes
proprietary components. They aren’t easily bridged to existing Ethernet networks.


Less ambitious storage-specific connections using a Fibre Channel subset, Arbitrated
Loop, can provide many of the same benefits—fast data transfer between storage device
and server, even at long distances—for much lower costs. Most storage experts think
it eventually will replace SCSI as the storage interconnect of choice.


FireWire, at 400 Mbps now and potentially 800 Mbps in the future, is definitely the
device interface to beat. Much cheaper than Fibre Channel, it transmits data in
near-continuous streams—perfect for broadcast-quality video, for example.


Today’s digital cameras typically connect to the desktop PC via a 19.2-Kbps serial
port, for tortuously slow image transfers that will only get worse as resolutions—and
resulting file sizes—increase. FireWire promises near-instantaneous transfer of
images from camera to computer, and it automatically provides for sharing of a single
device among several computers. The standard digital video interface in much of the United
States and Europe, FireWire eventually will become the high-end counterpart to USB device
interconnections.


In the office automation world, a file format usually describes how a data file has
been encoded by a particular application.


Any space-saving compression is applied to such files by an external agent such as
winzip.exe. Image files are different: because of their massive size, various compression
methods are built directly into many image file formats.

Image file formats also serve as interchange agents; they provide a common format to
move images from one device or application to another.


Most competent imaging programs can save files in a variety of formats, so users can
pick the one that offers the best trade-off between delivering image quality and saving
data storage space:


Bitmaps are great for displaying rich, accurate images, but what you see is all
you’re going to get; enlarge the image and you simply add space to existing dots of
color. Many bitmapped file formats exist—.bmp format files were designed for
Microsoft Windows PC applications—but they’re all characterized by very large
file sizes and an inability to scale gracefully.


Vector formats, which store an image as drawing instructions rather than as dots, scale
gracefully but have drawbacks of their own. Extremely complex image files can be larger
than their bitmapped counterparts. And because the file isn’t a dot-by-dot set of print
instructions but a set of instructions that must be compiled into a bitmapped image before
output can begin, the output device needs plenty of processing power and memory.


JPEG and GIF are currently the most common image file formats used on the Web.
Webmasters generally choose JPEG for photos and complex, continuous-tone images; GIF is a
better choice for text and line art. But both have the same drawback: The result is a
low-resolution bitmap.


Publishing on the Web has created a need for new file formats. New image formats reduce
the size of a Web image but also retain the information needed to restore it for
high-resolution output.


Cynthia Morgan, of Newton, Mass., reviews software and hardware, and writes about
information technology.




