In a changing world, computer literacy demands skepticism
OK, I know next year isn't the new millennium, because there was no year zero. But the confusion means that columnists not only get to pontificate this year, they also get to do so at the end of 2000, when millennial hype will once again be ignited by marketeers who didn't sell all the software they're promoting this year.
While our computers are all still working, leap with me into the past and future.
The wristwatch, circa 1903, did as much to alter the way people lived in the first half of this century as the automobile, which was invented in the 19th century. The car made a big difference, but its importance has faded, at least for urban dwellers whose lives would change little without it. The wristwatch put us all on the clock in ways that had not been possible before. Pulling out a pocket timepiece had been a ritual for businessmen and railway workers, but when demon time was strapped on everyone's wrist, it changed everything.
As for medicine, improved sanitation probably has extended the average life span more than antibiotics.

Media influences
Two other seminal inventions, the computer and the television, arrived in a big way around 1950. Commercial radio came earlier but did far less to alter people's lives. Movies, which came on the scene before 1900, brought people together in a common frame of reference. Television, on the other hand, tends to isolate, not only because it's a solitary event but also because 500 neighbors can each watch a different channel.
Computers started managing our lives a little, and they will end the century by isolating us even more than television. How often do you send e-mail to someone in the next office instead of just walking over? As automobiles gobble up too much of our resources, more people will telecommute, not only to work but to school.
The wristwatch made people aware of passing minutes, but the computer makes us fret about lost seconds. In 1800, people were lucky to know what hour it was. In 1920, they could know the time to within a few minutes. By 2020, people's lives and work will be managed to the second if they let it happen. Why? Because it's possible, and probably for no better reason.
Now that the PC has reportedly infiltrated 60 percent of U.S. households, it is poised to fundamentally change not only our lives but also the way change occurs.
My ranch is a workplace, just as your office is. It had one surveillance camera when I moved in. I am now installing two low-light cameras and a color camera in a barn 200 feet from the house, using Category 5 cabling and inexpensive adapters. I can monitor the cattle day and night without ever leaving the office. The cows don't mind being watched, but will your kids? Your parents? Will you?
Computers can automatically monitor what happens on multiple cameras by detecting image changes. Cameras cost only $100, and biometric software can recognize individual faces. Cities in England already have computer-monitored video cameras covering much of their populations.

Instant analysis
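The change-detection trick behind such automated monitoring is simple frame differencing: compare each new frame with the last one and flag the camera when enough pixels have changed. Here is a minimal sketch of the idea (my own illustration using NumPy and synthetic frames, not any vendor's algorithm; the threshold values are arbitrary assumptions):

```python
import numpy as np

def motion_detected(prev, curr, pixel_threshold=30, min_changed_fraction=0.01):
    """Flag motion when enough pixels differ between two grayscale frames."""
    # Widen to int before subtracting so 8-bit values can't wrap around.
    diff = np.abs(curr.astype(int) - prev.astype(int))
    changed_fraction = (diff > pixel_threshold).mean()
    return changed_fraction > min_changed_fraction

# Two synthetic 8-bit grayscale frames: a static scene, then an object appears.
background = np.zeros((120, 160), dtype=np.uint8)
frame = background.copy()
frame[40:80, 60:100] = 200  # bright region simulates something entering view

print(motion_detected(background, background))  # False: nothing changed
print(motion_detected(background, frame))       # True: a region changed
```

A real system would grab frames from each camera in turn and only bother a human, or the face-recognition software, when this cheap test fires.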
Will such surveillance make you feel spied upon, or safer? How will criminal justice change when police and security personnel can carry handheld DNA analyzers? If you think I'm kidding, point your browser to www.llnl.gov/sensor_technology/STR59.html, a Lawrence Livermore National Laboratory Web site. The technical part is easy compared with predicting how people will react to the technology.
I used to look forward to seeing a new, computer-literate generation replace the 1980s-era managers who refused to learn how to use a keyboard because it was a so-called secretarial skill.
Now I have gotten what I asked for. Unfortunately, the people who grew up with computers tend to believe them implicitly. It wouldn't matter as much if we used PCs only to balance checkbooks or do taxes, even though my accountant has pointed out depreciation errors made by several leading tax software packages.
What worries me is that the PC generation seems to believe whatever is on a computer screen, sometimes contrary to common sense. They take as gospel almost anything on the Internet.
I cannot count how many times I have given my postal address to vendors via e-mail and then had them question whether I know my own address. Many insist on using an old address dredged up from an ancient online database. Apparently they cannot believe that real-time information from a person in a position to know beats an old computer file.
Why should this concern you? Your name and other vital statistics will live on in thousands of databases. As time passes, it will get harder to convince the PC generation that what computers say about you might be wrong.
Government agencies still tend to trust live people over old records when it comes to simple things such as telephone numbers and addresses. But as your agency does more of its purchasing online, will vendors continue to expect you to authorize their transactions long after you have moved on or retired? Will they believe you when you tell them not to deliver shipments to the old offices your agency rented five years ago?
Computer literacy is more than knowing how to open a window. It's increasingly a matter of knowing when not to believe the computer.
As if we weren't connected enough already, there is now a single-chip Internet access device. Remember the 1970s-era Z80 CPU from Zilog Inc. of Campbell, Calif.? Far from being a relic, the Z80 lives on: Zilog has just announced the eZ80, which puts a TCP/IP stack, a microprocessor and digital signal processing on one chip. The eZ80 will appear in some devices by the real start of the millennium in 2001, adding only about $10 to the cost of virtually anything people think ought to be connected to the Internet.
If we go down this branch of the road too far, can we ever come back? Will we want to?
Perhaps we need to pay more attention to the ergonomics of the mind in order to stop harming the way people think.

John McCormick, a free-lance writer and computer consultant, has been working with computers since the early 1960s. E-mail him at email@example.com.