[comp.sys.amiga.tech] Timestamps have to be UCT

jms@tardis.Tymnet.COM (Joe Smith) (11/14/89)

In article <3824@nigel.udel.EDU> new@udel.edu (Darren New) writes:
>  At the file server end, the server would do an
>Examine(), use local preferences to translate the DateStamp to UTC,
>send the packet.  The client would get the packet, convert back from
>UTC to local time using the client's preferences, and all is well.
>I still think things should all be stored in local time.

This won't work.  I currently have on my floppy two files that were
created 60 minutes apart, but they both correctly say 29-Oct-89 at 1:30am.

If you have an OS that understands both GMT (UTC) and local time, there
is no way to store the correct date/time unless it is in GMT.  If you store
only local time, anything between 1:00am and 2:00am on the last Sunday
in October can be interpreted as either Standard time or Daylight Saving
time.  And on the first Sunday in April, any local time between 2:00am
and 3:00am will be illegal.

We want a format that stores the date/time as a monotonically increasing,
non-discontinuous value.  Local time just doesn't cut it.
-- 
Joe Smith (408)922-6220 | SMTP: JMS@F74.TYMNET.COM or jms@gemini.tymnet.com
McDonnell Douglas FSCO  | UUCP: ...!{ames,pyramid}!oliveb!tymix!tardis!jms
PO Box 49019, MS-D21    | PDP-10 support: My car's license plate is "POPJ P,"
San Jose, CA 95161-9019 | narrator.device: "I didn't say that, my Amiga did!"

kaye@fscore.dec.com (Mark Kaye) (11/15/89)

Timestamps
I just wrote some time routines for use on a large project that will be run
around the world.  We have the problem that the operating system (VMS) doesn't
know anything but local_time, and some of the layered products only understand
local_time too.  To get around this I designed a time structure as follows:
utc_time (in VMS time format - just use the time structure of current system)
local_time (VMS time format)
Timezone Code (1 character)
Daylight Indicator (1 character)

As you can see, both times are stored. When a program wants local_time, it
uses the correct field, while programs wanting absolute times (usually for 
sorting) use the utc_time. With the Timezone Code and Daylight Indicator it
is easy to write a few routines to convert from 1 timezone to another when
required.

The system clock can run on either local_time or utc_time (you just have
to know which), and you calculate the unknown one once, when the time is
stored.
You also need a table to convert the timezone to some sort of numeric
offset to add/subtract to the time. It isn't easy! The world isn't
standardized on the values for Timezone codes, and some countries
still run on solar time (changes daily). I'm sure that CBM could
create a table (in preferences) that could assign a letter to each
timezone and we could live with that.

You also have to be careful when the Daylight Indicator changes.  The
best way to handle this is to either speed up or slow down the system
clock for a short period of time to gain/lose an hour.  This guarantees
that all times are still in correct order (the actual time won't be
accurate for the slow/fast period), but that usually isn't the concern.

In the current software, we store the utc_time, the timezone, and the
daylight indicator, and convert to local_time every time we want it.  This
causes problems for those layered products that want a local_time in the
native (VMS) format, which this scheme can't supply directly.

Mark Kaye		|		|				|
Box 172, Munster Hamlet	| 613-838-3580  | kaye@fscore.dec.com		|
Ontario, Canada K0A 3P0	|		| DEC fscore::kaye or kaye @kao	|