Who's got the time?
As computer systems become increasingly precise, a conflict emerges between human and machine measurements
- By Joab Jackson
- Jun 20, 2008
At the JavaOne conference in May, Stephen Colebourne, an architect at European integrator Sita, asked how many developers in the audience worked with dates before 1900. Only a few people raised their hands. Colebourne looked relieved. Keeping track of historical dates under calendar systems that predate 1900 is just another way humans manipulate time, he said.
Colebourne is one of the chief developers updating how the Java programming language handles dates and time. As a result, he must somehow accommodate all the ways computer programs manipulate time.
At first glance, computerized timekeeping might seem like a cinch. A computer sees time as one thing: a uniform accumulation of discrete moments. The more accurately such moments are captured, the more precise the resulting time.
Humans, on the other hand, have all sorts of quirky notions about how to measure time. We chop it up into weeks, months, days. We stagger it across different time zones. We box it into calendars that correspond only roughly with the orbit of the Earth around the sun. We inject quasi-arbitrary shifts in the form of daylight saving time. And every few millennia, we do away with our timekeeping systems entirely.
As a result, we complicate computerized timekeeping, just as the thirst for ever more precise machine timekeeping flummoxes our notions of keeping time in the cosmos.
"It's not simple," said Steve Allen, a researcher at the University of California's Lick Observatory, about the job of reconciling human and machine time. "It's not simple because of the physics, and it's not simple because of the political jurisdictions on Earth."
Accuracy is essential for keeping the Internet and computer networks running, said Dave West, director of field operations at Cisco Systems. The company's networking equipment is geared to synchronize within a few milliseconds. If servers and routers weren't able to agree on what time it was, they could not communicate.
"Much of the data that flows over the network is very synchronous, especially real-time data," West said.
"We are getting to the point where time is a global phenomenon, and that is the way it needs to be measured," said Michael Daconta, a former metadata program manager at the Homeland Security Department and author of "Information as Product: How to Deliver the Right Information to the Right Person at the Right Time." E-discovery, automatic surveillance and thousands of new computer-related activities need reliable, globally understandable time stamps.
"Time is a perfect example of something that needs to be taken out of the realm of human interaction because we don't do it well, and machines do it well," said Daconta, who is also a GCN columnist.
When machines work with time, it is usually in one of two ways, Colebourne said. Machines can mark a period of time or gauge an interval between events.
For Posix-based computers such as Unix or Linux systems, time began at midnight on Jan. 1, 1970. The time according to those machines is the number of seconds that have accumulated since. For instance, 8:22 p.m. Greenwich Mean Time on Tuesday, June 10, 2008, translates to 1213129345 in Posix-speak.
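That conversion is easy to check with a few lines of Python, using the standard library's datetime module (the display formatting below is just one choice among many):

```python
import datetime

# Posix time counts seconds elapsed since the epoch:
# midnight, Jan. 1, 1970, UTC (leap seconds excluded).
posix_seconds = 1213129345
moment = datetime.datetime.fromtimestamp(posix_seconds, tz=datetime.timezone.utc)
print(moment.strftime("%A, %B %d, %Y, %I:%M %p UTC"))
# Tuesday, June 10, 2008, 08:22 PM UTC
```

Running the conversion in reverse, the same moment's `timestamp()` yields 1213129345 again, since the count is just simple arithmetic on top of the epoch.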
Interestingly, the more networks and computers rely on time, the greater the demand for ever-finer levels of accuracy.
At the SIGAda 2007 conference in Fairfax, Va., one developer bemoaned Linux's measurement of time only in microseconds, or millionths of a second. Microseconds are fine for most purposes, but the developer had written a program to test applications for bugs and wanted to divide time into ever-finer slices. He wanted Linux to cycle in nanoseconds, or billionths of a second.
If processors can divide work into gigahertz cycles (a billion cycles or more per second), why is the operating system still running orders of magnitude slower?

Fine time
Networks seem to be requiring ever-finer chunks of network time, too. Internet Engineering Task Force (IETF) engineers are sharpening the granularity of time measurements in the Network Time Protocol (NTP), the venerable time-synchronization standard used by all computers on the Internet to stay on the same page, time-wise. NTP lets machines query regional time servers that get the Universal Coordinated Time via the Internet from highly accurate reference clocks.
Using algorithms to estimate the offset caused by transmission times, the current version of NTP can synchronize local time with a reference clock to within a few hundred milliseconds, an accuracy that can be maintained by checking the time server every 1,024 seconds. The NTP update would bring the accuracy to within tens of milliseconds and allow as much as 36 hours between checks with the time server.
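The offset estimation mentioned above rests on a simple piece of arithmetic over four timestamps, defined in the NTP specification (RFC 5905). A minimal sketch, with illustrative numbers of my own choosing:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Standard NTP on-wire calculation (RFC 5905):
    t1 = client send, t2 = server receive,
    t3 = server send, t4 = client receive."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0  # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)           # round-trip network delay
    return offset, delay

# A client running 0.050 s behind the server, with 0.020 s of
# symmetric network delay in each direction (made-up values):
offset, delay = ntp_offset_delay(100.000, 100.070, 100.071, 100.041)
print(round(offset, 3), round(delay, 3))  # 0.05 0.04
```

The averaging cancels the transmission time only if the delay is the same in both directions; asymmetric network paths are one reason technologists such as Dowd find high-accuracy transfer over NTP so difficult.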
Despite the improved accuracy, many technologists still say too much ambiguity remains in NTP.
"There is a lot of ambiguity about how you time stamp [an Internet] packet," said Symmetricom technologist Greg Dowd, speaking at a March IETF meeting in Philadelphia. "I have a tremendously difficult time trying to do high-accuracy, high-stability time transfer" using the NTP protocol, he said.
Even the reference clocks the Internet derives its time from are getting more specific. For the past four decades, the U.S. Naval Observatory has set the official U.S. time using atomic clocks, which count the vibrations of atoms, traditionally neutral cesium atoms.
This might sound precise, but the National Institute of Standards and Technology is planning higher accuracy.
NIST physicist Till Rosenband is working on an atomic clock based on a pair of ions, one aluminum and the other beryllium. This clock is at least 10 times more accurate than cesium-based clocks, NIST concluded after a year of measurements. The aluminum ion emits a steady vibration, which is amplified by the beryllium ion. Femtosecond pulses of light emitted by a laser record the vibration. A femtosecond, if you're keeping track, is a quadrillionth of a second.
"The aluminum clock is very accurate because it is insensitive to background magnetic and electric fields, and also to temperature," Rosenband said. "Accuracy is measured by how well you reproduce the unperturbed frequency of this atom without any background magnetic or electric fields."
Rosenband said standards labs worldwide are in a race to build the next-generation atomic clock.
The new generation of atomic clocks would neither gain nor lose a second in more than 1 billion years, if they could run that long. Such clocks change no more than 1.6 quadrillionths of 1 percent per year. By comparison, the cesium clock can run without gaining or losing a second for only about 80 million years.

Human time
The time we know is solar time, based on the rotation of the Earth. Every 24 hours, the planet makes a complete turn relative to the sun, and a new day begins. And that day has a new date.
To capture dates uniformly, the International Organization for Standardization offers a standard format, ISO 8601. In this standard, dates are represented in the yyyy-mm-dd format: June 23, 2008, is represented as 2008-06-23.
ISO 8601 also offers a notation for specifying the offset of a local time from Universal Coordinated Time. UTC is the same worldwide, and the offset represents the particular time zone in which a timekeeping device resides.
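Python's standard library emits both forms directly. The example below uses a fixed UTC-4 offset, corresponding to U.S. Eastern Daylight Time, chosen purely for illustration:

```python
import datetime

# ISO 8601 date: yyyy-mm-dd
d = datetime.date(2008, 6, 23)
print(d.isoformat())  # 2008-06-23

# ISO 8601 timestamp with a UTC offset appended; the -04:00 suffix
# pins the local reading to a single worldwide moment.
edt = datetime.timezone(datetime.timedelta(hours=-4))
t = datetime.datetime(2008, 6, 23, 14, 30, tzinfo=edt)
print(t.isoformat())  # 2008-06-23T14:30:00-04:00
```

Because the fields run from most significant (year) to least significant (second), ISO 8601 strings also sort correctly as plain text, which is part of what makes them machine-friendly.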
The advantage of ISO 8601 is that its standardized format lets computers make calculations against dates. One of the shortcomings of the original Java date and time classes was that they offered little consistency, especially with international dates. "The Java date and time classes need to be totally redone," Daconta said.
Taking advantage of ISO 8601, Java Specification Request 310 will include features to help resolve some of the trickier problems programmers face when devising calculations involving dates and time. When given an incorrect date, such as Feb. 30, the new API can adjust it to March 1 or March 2, depending on whether it is a leap year.
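The adjustment amounts to rolling the excess days into the next month. Here is a simplified sketch of that resolution rule in Python; it mimics the lenient behavior described above but is not the actual JSR 310 API:

```python
import datetime

def lenient_date(year, month, day):
    """Resolve an out-of-range day-of-month by carrying the excess
    into the following month (a sketch of lenient resolution)."""
    # timedelta arithmetic handles the month rollover for us.
    return datetime.date(year, month, 1) + datetime.timedelta(days=day - 1)

print(lenient_date(2008, 2, 30))  # 2008-03-01 (2008 is a leap year)
print(lenient_date(2007, 2, 30))  # 2007-03-02 (2007 is not)
```

Feb. 30 lands one day past the end of a 29-day February but two days past a 28-day one, which is exactly why the corrected date depends on the leap year.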
What happens when machine time conflicts with solar time?
Reconciling the two is a challenge the Naval Observatory grapples with as it maintains UTC for the United States. According to the Observatory's Web page, UTC is based on atomic time, though if it differs from the Earth's rotational time by more than 0.9 seconds, it is adjusted.
Because the solar system is not as precise as atomic clocks, our time must be adjusted every few years.
Traditionally, one second was defined as 1/86,400th of a solar day, a complete rotation of the Earth. Translating that into atomic time, the second was redefined under the International System of Units as 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium atom.

Longer days
The problem is that the two measurements aren't entirely equivalent because the day is slowly lengthening. The gravity of the moon, expressed through the daily tides, lengthens the Earth's rotation by roughly 1.4 milliseconds (thousandths of a second) per day per century. Since 1820, what we think of as a 24-hour period has gotten 2 milliseconds longer.
As a result, atomic time differs from solar time by one second about every 500 days. To adjust, 23 leap seconds have been added to the clock during the past 34 years, usually on Dec. 31 or June 30.
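The arithmetic behind those figures is simple enough to check, using only the numbers already given in the article:

```python
# The solar day has stretched about 2 milliseconds beyond the 86,400
# SI seconds it nominally contains, so the excess accumulates daily:
excess_per_day = 0.002                       # seconds of drift per day
days_per_leap_second = 1.0 / excess_per_day
print(days_per_leap_second)                  # 500.0 days to drift one second

# Roughly consistent with the record cited above:
# 23 leap seconds inserted over 34 years.
print(round(34 * 365.25 / 23))               # about 540 days per leap second
```

The match is only approximate because the Earth's rotation rate itself wobbles from year to year, which is why leap seconds are announced as needed rather than on a fixed schedule.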
Computer systems' increasing reliance on accurate time measurement could pose a problem, observers say. Can routers handle the extra second correctly? How about simple handheld Global Positioning System units, which rely strictly on UTC?
Some researchers have suggested eliminating the leap second and sticking with just atomic time, which would eventually put the sun in the sky at midnight and darkness at noon, in about 43,000 years.
Allen suggested relying on the time-zone offsets already used by all operating systems. Each has a file of the world's time zones, along with local daylight saving time rules (GCN.com/1115). When the user picks a time zone, the computer offsets UTC so the result looks like the local time. Allen said the Posix standard can also accommodate additional offset seconds.
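The offset mechanism Allen describes can be seen directly on a Posix system, where local time is derived from UTC plus whatever the TZ setting specifies. A small demonstration (Unix-only, since `time.tzset()` is not available on Windows; the TZ string is a standard Posix rule for U.S. Eastern time):

```python
import os
import time

# Posix machines keep the clock in UTC and apply a per-zone offset
# plus daylight saving rules at display time.
posix_seconds = 1213129345  # 8:22 p.m. UTC, June 10, 2008

os.environ["TZ"] = "UTC"
time.tzset()
print(time.strftime("%H:%M", time.localtime(posix_seconds)))  # 20:22

os.environ["TZ"] = "EST5EDT,M3.2.0,M11.1.0"  # U.S. Eastern, with DST rules
time.tzset()
print(time.strftime("%H:%M", time.localtime(posix_seconds)))  # 16:22
```

The underlying count of seconds never changes; only the presentation shifts, which is what lets UTC stay perfectly uniform while local clocks track daylight.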
UTC would continue to march on in a perfectly uniform time scale. And we would still associate daylight with daytime.
"When the sun is in the sky, you want to have soccer practice," Allen said. "You want to pay attention to things like that. You want to have sundials keep working because they are beautiful works of art."

GCN contributing writer Kathleen Hickey contributed to this report.
Sidebar: Avoiding Outlook's twilight zone

Has this ever happened to you? You travel to another time zone, say, from Washington, D.C., to San Francisco. Once in California, you reset the clock on your Microsoft Windows laptop PC to Pacific Standard Time. But when you scan your appointments in San Francisco, you find they've all been moved up by three hours. And any appointments you make while in San Francisco for after your return will be three hours off when you get back. Craziness!
Outlook is utterly dependent on the operating system for knowing what time it is. Microsoft Windows gets the Universal Coordinated Time from the Internet and offsets that time to the time zone the user sets, while also adjusting for any daylight saving time if necessary (GCN.com/1118).
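That design is what produces the three-hour shift described above: the appointment is pinned to a single UTC moment, and only its local rendering changes with the machine's zone setting. A sketch with fixed offsets (Outlook itself follows whatever zone Windows is set to):

```python
import datetime

# A 1 p.m. Eastern meeting, stored internally as its UTC equivalent.
meeting_utc = datetime.datetime(2008, 6, 20, 17, 0,
                                tzinfo=datetime.timezone.utc)

eastern = datetime.timezone(datetime.timedelta(hours=-4))  # EDT
pacific = datetime.timezone(datetime.timedelta(hours=-7))  # PDT

print(meeting_utc.astimezone(eastern).strftime("%H:%M"))  # 13:00
print(meeting_utc.astimezone(pacific).strftime("%H:%M"))  # 10:00, three hours earlier
```

The stored moment is identical in both cases; a traveler who resets the laptop's zone sees every appointment re-rendered against the new offset.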
This is helpful if you have to keep track of virtual meetings in other time zones. But if you don't watch it closely, appointments can be moved to incorrect times, and other kinds of scheduling hilarity might ensue, too. The good news is that Microsoft has streamlined multi-time-zone scheduling.
In Outlook 2007, you can specify the time zone for appointments. Microsoft has also released a plug-in for Office for anyone permanently moving to another time zone (GCN.com/1119). It moves all appointments on the calendar to the new times.