[comp.lang.c] Timekeeping in ANSI C

karl@haddock.ISC.COM (Karl Heuer) (02/10/88)

In article <7216@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>[Why time_t and clock_t might be floating-point types]
>I seem to recall that somebody pointed out that an integer would
>overflow in an unduly small amount of real time on a system with
>a high-resolution system clock.

Yes, that's a problem.  Given an environment with only 32 bits in a long int,
and a need for much finer resolution than one second, one can't use an
integral type for timekeeping.  This applies to both clock_t and time_t.

I don't think floating-point is the answer, though; at best it seems like a
stopgap.  A more realistic solution would be to use an aggregate type.  (Egad,
the ancient PDP-11 solution may have been on the mark after all!)  The
encoding of time_t is already unspecified; as far as I can see, the only
reason it has to be an arithmetic type is so that -1 can be used as an error
return.  The clock_t type is currently guaranteed to be a simple counter, but
it also should probably be explicitly unspecified.

So, let's allow either/both of them to be a struct if necessary, and provide a
call-by-reference function for each.  Then the function return value can be 0
for success, -1 for failure.
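Karl's suggestion amounts to something like the following sketch (all names here are mine, not anything proposed to the committee):

```c
#include <stddef.h>

/* Hypothetical time structure; the encoding stays unspecified. */
typedef struct {
    long seconds;        /* whole seconds since some epoch */
    long nanoseconds;    /* sub-second part, 0..999999999 */
} xtime_t;

/* Call-by-reference: the return value carries success or failure,
   so no in-band -1 error value is needed in the time type itself. */
int xtime_get(xtime_t *t)
{
    if (t == NULL)
        return -1;
    t->seconds = 0;      /* a real implementation would read the clock */
    t->nanoseconds = 0;
    return 0;
}
```

A caller would write `xtime_t now; if (xtime_get(&now) == -1) ...`, and the struct could later grow fields without breaking the interface.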

Karl W. Z. Heuer (ima!haddock!karl or karl@haddock.isc.com), The Walking Lint

rbbb@acornrc.UUCP (David Chase) (02/11/88)

I'm in the middle of porting a C library from one machine to another.  In this
library, time is represented as a structure containing two unsigned long
quantities (hi and lo).  The time is encoded as:

	hi contains (centi-seconds since 1900 began) / 65536.
		    The high-order bit, when set, indicates a bogus time.

	lo contains ((centi-seconds since 1900 began) % 65536) * 65536 +
		    left-over microseconds.

OR	cccccccc ccccuuuu   (hex digits: c = centi-seconds, u = microseconds)
	hi	 lo

This structure can encode dates from 1900 AD to something like 47000 AD.
It is a pill to work with, of course, which is why you write a bunch of
library routines to work with the time structures (add, subtract, convert
to and from a date+time form, etc).
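In C with a 64-bit integer type (not yet standard when this was written), the encoding and its inverse look something like this; the function names are mine:

```c
struct hilo { unsigned long hi, lo; };

/* Pack centi-seconds since 1900 plus left-over microseconds (0..9999).
   The left-over fits in the low 16 bits since 9999 < 65536. */
struct hilo pack(unsigned long long centisec, unsigned micros)
{
    struct hilo t;
    t.hi = (unsigned long)(centisec / 65536);
    t.lo = (unsigned long)((centisec % 65536) * 65536 + micros);
    return t;
}

/* Recover the centi-second count from the two halves. */
unsigned long long centisecs(struct hilo t)
{
    return (unsigned long long)t.hi * 65536 + t.lo / 65536;
}
```

The microseconds come back out as `t.lo % 65536`, which is what makes the format a pill: every arithmetic operation has to respect the split.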

But don't listen to me.  Go ahead, use the Unix format.  I should still be
alive on 'Tue Jan 19 03:14:07 2038 (GMT)' when all your code dies.  I'll gloat.
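The wrap date can be checked with the standard library rather than by hand (this assumes the implementation counts seconds from the 1970 epoch, as Unix does):

```c
#include <time.h>

/* The calendar year in which a signed 32-bit seconds-since-1970
   counter tops out. */
int wrap_year(void)
{
    time_t t = 2147483647;       /* 2^31 - 1 seconds past the epoch */
    struct tm *tm = gmtime(&t);
    return tm ? tm->tm_year + 1900 : -1;
}
```

For that value gmtime reports Tue Jan 19 03:14:07 2038 (UTC); one second later, a signed 32-bit time_t wraps negative.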

David Chase
Olivetti Research Center, Menlo Park

csm@garnet.berkeley.edu (02/11/88)

In article <594@acornrc.UUCP> rbbb@acornrc.UUCP (David Chase) writes:
>   ...   time is represented as a structure containing two unsigned long
>quantities (hi and lo).  The time is encoded as  ...
>This structure can encode dates from 1900 AD to something like 47000 AD.
>It is a pill to work with, of course  ...
>David Chase

 I agree with David that the current UNIXY time functions leave a lot
 to be desired. (BTW 1900 AD is not nearly far enough back - e.g. some
 birthdates will be in the 19th century for a number of years to come.) 
 I for one am tired of writing software that I know will break.  My 
 current workaround is to check the system year and complain and exit
 if the software isn't going to work correctly.
 This group would seem to be the obvious place to establish some sort of
 consensus on better time functions.  Anybody out there proud enough of
 his/her own efforts to show us how it should be done?
             -Brad Sherman
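The workaround Brad describes (check the system year at startup and bail out) can be sketched with standard C library calls; the function name and cutoff are mine:

```c
#include <time.h>

/* Return nonzero iff the current system year is one the program
   is known to handle; the caller picks the cutoff. */
int year_is_ok(int max_year)
{
    time_t now = time(NULL);
    struct tm *tm = localtime(&now);
    return tm != NULL && tm->tm_year + 1900 <= max_year;
}
```

At startup one would write something like `if (!year_is_ok(1999)) { fprintf(stderr, "date out of supported range\n"); exit(1); }` -- crude, but better than silently computing garbage.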

dhesi@bsu-cs.UUCP (Rahul Dhesi) (02/11/88)

In article <594@acornrc.UUCP> rbbb@acornrc.UUCP (David Chase) describes a
structure that holds a time value and continues:
>This structure can encode dates from 1900 AD to something like 47000 AD.

Sooner than you can say "UNIX is a Trademark of ...", 47000 AD will be
here.  The greatest mistake a designer can make is to assume that a
certain date and time will never come.  Such short-sightedness has
caused problems over and over again, yet we see it again and again.
Just recently we heard about chaos in the DEC-20 world because they had
to extend their time structure and use up some reserved fields that
people had already begun to use for other things.

The right way to do it is as follows:

     typedef struct {
        long hi_time;
        long lo_time;
        time_t *next_val;
     } time_t;

The next_val field will be NULL until 47000 AD, at which time the
routine returning the time value will malloc() space for another struct
to hold the higher 64 bits.  Then, another epoch later, the time_t
structure will become a linked list of three structs, and so on, to the
end of time (or end of free memory).

The only problem I can see with the above typedef is that it won't
compile because of the forward reference.  But at least it will be
valid far longer than anything that compiles that has been proposed so
far.
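For the record, the forward reference compiles once the struct is given a tag; the tag name is mine, the scheme is still Dhesi's joke:

```c
#include <stddef.h>

typedef struct epoch_time {
    long hi_time;
    long lo_time;
    struct epoch_time *next_val;  /* NULL until the current epoch overflows */
} epoch_time;

/* How many 64-bit chunks a given time value has accumulated so far. */
int epochs(const epoch_time *t)
{
    int n = 0;
    while (t != NULL) {
        n++;
        t = t->next_val;
    }
    return n;
}
```

Every routine that touches the time then has to walk the list, which is presumably part of the joke.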
-- 
Rahul Dhesi         UUCP:  <backbones>!{iuvax,pur-ee,uunet}!bsu-cs!dhesi

rk9005@cca.ucsf.edu (Roland McGrath) (02/12/88)

["Timekeeping in ANSI C"] - dhesi@bsu-cs.UUCP (Rahul Dhesi):
} Sooner than you can say "UNIX is a Trademark of ...", 47000 AD will be
} here.  The greatest mistake a designer can make is to assume that a
} certain date and time will never come.  Such short-sightedness has
} caused problems over and over again, yet we see it again and again.
} Just recently we heard about chaos in the DEC-20 world because they had
} to extend their time structure and use up some reserved fields that
} people had already begun to use for other things.

If we're still using Unix (or POSIX, or GNU) in AD 47000,
or even 2038, I will do repeated belly-flips in my grave (or
perhaps in my armchair in the case of the latter).
-- 
		Roland McGrath
UUCP: 			...!ucbvax!lbl-rtsg.arpa!roland
ARPA:	 		roland@rtsg.lbl.gov

pardo@june.cs.washington.edu (David Keppel) (02/13/88)

In article <1152@ucsfcca.ucsf.edu> roland@rtsg.lbl.gov (Roland McGrath) writes:
>["Timekeeping in ANSI C"] - dhesi@bsu-cs.UUCP (Rahul Dhesi):
>} Sooner than you can say "UNIX is a Trademark of ...", 47000 AD will be
>} here.  The greatest mistake a designer can make is to assume that a
>
>If we're still using Unix (or POSIX, or GNU) in AD 47000,
>or even 2038, I will do repeated belly-flips in my grave (or
>perhaps in my armchair in the case of the latter).

Just a comment.

Mutual of Omaha Insurance does most of their munching on a *huge* IBM
mainframe that is recent technology -- but the processing is all batch
because

(a) there is a lower overhead for batch processing (they are CPU bound)
(b) that was what was available when they made their software investment
(c) it still does their job about as well as anything (see (a))

I can't say which is a bigger factor, but they're all relevant.  The
greater principle here has been espoused by lots of people, and I heard
it most recently from Richard Stallman, something along the lines of
(this is not a quote)  "Don't make any assumptions about how big the
input is going to be".  If the machine will let us allocate 1Mb and the
user asks us to allocate 1Mb, then by all means do so.  But don't
compile in any limits; in five years somebody will want 1Tb (a terabyte)
and have a machine that can do it.

Consider: if the "timeval" structure is sufficiently general, then it need
not be used just for system time; it can be used by applications to
hold interesting things like the birthdate of Charles Babbage, and the
applications don't have to invent their own storage format and (possibly
buggy, probably incompatible) manipulation/printing routines.

As far as I can tell, this doesn't have anything to do with comp.lang.c
anymore.

 ;-D on ("$128,000 Pyramid" started out as "The $64 Question"--on radio) Pardo

gregory@ritcsh.UUCP (Gregory Conway) (02/14/88)

In article <1152@ucsfcca.ucsf.edu>, rk9005@cca.ucsf.edu (Roland McGrath) writes:
> ["Timekeeping in ANSI C"] - dhesi@bsu-cs.UUCP (Rahul Dhesi):
> } Sooner than you can say "UNIX is a Trademark of ...", 47000 AD will be
> } here.  The greatest mistake a designer can make is to assume that a
> } certain date and time will never come.  Such short-sightedness has


You gotta be kidding us!  The year 47000 A.D.????  Assuming the human race
is still here, by that time we'll be talking interactively with computers
with "operating systems" that will make Un*x look like a small monitor.
Should we assume that was a typo?   :-)



-- 
================================================================================
Gregory Conway@Computer Science House    UUCP: ...rochester!ritcv!ritcsh!gregory
Rochester Institute of Technology, Rochester, NY
    "I got an allergy to Perrier, daylight, and responsibility", Marillion

cabo@tub.UUCP (Carsten Bormann) (02/15/88)

In article <594@acornrc.UUCP> rbbb@acornrc.UUCP (David Chase) writes:
() 
() But don't listen to me.  Go ahead, use the Unix format.  I should still be
() alive on 'Tue Jan 19 03:14:07 2038 (GMT)' when all your code dies.

If I'm alive on January 18th, 2038, I will very likely, just like
everybody else, run my UNIX code on a 64-bit machine, and my code will
happily live on until the sun turns into a supernova.
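The margin bears him out: a signed 64-bit seconds counter lasts roughly 292 billion years, comfortably past the sun's lifetime. A back-of-the-envelope check in C:

```c
/* Approximate lifetime, in years, of a signed 64-bit seconds counter. */
double counter_years(void)
{
    double max = 9223372036854775807.0;     /* 2^63 - 1 */
    return max / (365.25 * 24.0 * 3600.0);  /* seconds per Julian year */
}
```

That is on the order of 2.9e11 years, so the wrap date stops being an engineering concern.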
-- 
Carsten Bormann, <cabo@tub.UUCP> <cabo@db0tui6.BITNET> <cabo@tub.BITNET>
Communications and Operating Systems Research Group
Technical University of Berlin (West, of course...)
Path: ...!pyramid!tub!cabo from the world, ...!unido!tub!cabo from Europe only.

franka@mmintl.UUCP (Frank Adams) (02/17/88)

In article <2079@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>In article <594@acornrc.UUCP> rbbb@acornrc.UUCP (David Chase) describes a
>structure that holds a time value and continues:
>>This structure can encode dates from 1900 AD to something like 47000 AD.
>Sooner than you can say "UNIX is a Trademark of ...", 47000 AD will be
>here.  The greatest mistake a designer can make is to assume that a
>certain date and time will never come.

This is a joke, right?

The correct rule is that any built-in limitations should be ridiculously
large.  2050, for example, is not ridiculously large; even 2500 isn't
really.  But 47000 *is* ridiculously large, and thus this scheme meets the
criterion.

I *do* have a problem with it, however: it doesn't go back far enough.
Sure, nobody is going to need to represent current times before 1900 on
their computer, but a good system for representing dates should be able to
deal with history, as well.  To be on the safe side, I would go back to
10000 BC; this still gets us to something like 35000 AD.

(Of course, now you get into the issue of the Julian calendar.)
-- 

Frank Adams                           ihnp4!philabs!pwa-b!mmintl!franka
Ashton-Tate          52 Oakland Ave North         E. Hartford, CT 06108

pardo@june.cs.washington.edu (David Keppel) (02/18/88)

In article <350@tub.UUCP> cabo@tub.UUCP (Carsten Bormann) writes:
>If I'm alive on January 18th, 2038, I will very likely, just as
>everybody else, run my UNIX code on a 64 bit machine, and my code will
>happily live on until the sun turns into a supernova.

Of course, one of the places where I used to go to school *still* has
some of its IMSAI S-100 8080-based machines in regular use.
The figure I heard was that over Thanksgiving break '86 the
machines were in use an average of 22 hours/day.

The IMSAIs were among the first successful microcomputers.  True,
they're only just over ten years old, but the world of microcomputers
is only just over fifteen, and 2038 is only fifty years away.

	;-D on  (Silicon crystal ball)  Pardo

gsarff@argus.UUCP (gary sarff) (02/20/88)

I agree that the system and software should be as flexible as possible, pretty
much regardless of how much trouble it will be for the implementors.  (Up to
some point, anyway.)  The naivete of some programmers regarding date/time
information is surprising.  I was using a database program on a mini at
work that had a date field for personnel birthdates.  Unfortunately
it was only two digits long.  The programmer tried to catch errors in
entry, but for a reason that became apparent after a moment's thought we
could not enter the birthdate of someone born in 1919: the error-checking
code assumed that if the first two digits were 19, the operator had ignored
the instructions and typed in something like 1945, not considering that
people born in 1919 are not yet even 70 years old and may be around for
some time to come.  It took months to get the company to fix this; they
thought it was a "feature".  Thank goodness it was only application code
and not an OS.  The time format on the OS I use at work (a proprietary OS
called WMCS from WICAT Systems, based on VMS) uses two bytes for the year,
two bytes for the day in the year, and a byte each for hours, minutes,
seconds, and 100ths of a second, for both absolute time and time since boot
of the system: two tick clocks.
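For concreteness, the layout Gary describes maps onto a struct like this; the field names and types are my guesses, not WICAT's actual declaration:

```c
/* Roughly the WMCS-style layout described above: 8 bytes per timestamp. */
struct wmcs_time {
    unsigned short year;      /* full year, e.g. 1988 -- no 2-digit trap */
    unsigned short day;       /* day within the year, 1..366 */
    unsigned char  hour;      /* 0..23 */
    unsigned char  minute;    /* 0..59 */
    unsigned char  second;    /* 0..59 */
    unsigned char  centisec;  /* 0..99 */
};
```

Storing the full year in two bytes sidesteps exactly the 1919-style bug above, and a two-byte year is good until 65535 AD.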

-- 
Gary Sarff           {uunet|ihnp4|philabs}!spies!argus!gsarff
To program is human, to debug is something best left to the gods.
"Spitbol?? You program in a language called Spitbol?"
  The reason computer chips are so small is that computers don't eat much.