[net.lang.c] calculating leap years

devine@vianet.UUCP (Bob Devine) (10/07/86)

  First I need to correct myself.  I wrote that a test for divisibility
of the year by 4000 should be used to accommodate the difference between
the Gregorian calendar and the true tropical year.  Wrong.  There is
currently no accepted leap-year correction beyond the 400-year rule,
even though a straight-line extrapolation of the Gregorian year minus
the tropical year leads to an extra leap day in about 3000 years.  At
this level of difference (365.2425 - (approx.) 365.2422), there is too
much "noise" caused by the Earth's varying rotation.  Thanks to Henry
Spencer for making me double check.

  That doesn't mean it wouldn't be better to use the rule of tossing
out one leap year every 128 years (an average year of 365.24219 days)
instead of three every four hundred years....
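
  For anyone who wants to check the arithmetic, here is a throwaway C
fragment (added here purely as an illustration; the 97 and 31 are just
the number of leap days per 400-year and per 128-year cycle):

#include <stdio.h>

int main()
{
    /* average year length implied by each leap-year rule,
       compared with the (approximate) tropical year */
    double gregorian = 365.0 + 97.0 / 400.0;   /* 97 leap days per 400 years */
    double rule128   = 365.0 + 31.0 / 128.0;   /* 31 leap days per 128 years */
    double tropical  = 365.2422;

    printf("Gregorian: %.5f  (off by %+.5f days/year)\n",
           gregorian, gregorian - tropical);
    printf("128-year : %.5f  (off by %+.5f days/year)\n",
           rule128, rule128 - tropical);
    return 0;
}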


  In reply to fgd3@jc3b21.UUCP (Fabbian G. Dufoe) who writes:
> It is particularly galling to see a correct algorithm criticized as
> overkill when it is as simple and short as the above code segment.  There
> may be a justification for writing code that only works part of the time if
> the fix is costly and difficult.  However, it should be a general rule that
> an algorithm which works in all cases is preferred over one that only works
> in most cases.
> Instead of panning someone's code because he has written it more correctly,
> one should adopt the improved algorithm with gratitude.

  You are missing the point: I wrote that doing the simple check for
divisibility by four is sufficient for most programs.  I would bet that
99% of programs don't need to handle dates with precision outside of
+/-20 years from the current date.  Genealogy programs might want more.
Astronomical programs almost certainly do.  In the same sense, not all
arithmetic calculations need be done in double precision.
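
  To make the tradeoff concrete, this little fragment (mine, added only
as an illustration) prints the years where the plain divisible-by-four
test and the full Gregorian rule disagree; between 1800 and 2200 those
are just 1800, 1900, 2100, and 2200, all far outside a +/-20 year
window around today:

#include <stdio.h>

int main()
{
    int year;

    for (year = 1800; year <= 2200; year++) {
        int simple    = (year % 4 == 0);
        int gregorian = (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;

        if (simple != gregorian)   /* only century years not divisible by 400 */
            printf("%d\n", year);
    }
    return 0;
}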

  That algorithm for the Gregorian leap years DOES NOT work in all cases.
It works only after a country has adopted that calendar.  Japan, for
instance, didn't adopt it until 1873.  The USSR did so in 1918.  [I wonder
if that means its "October Revolution" really falls in the Gregorian
November because of the 13 days its calendar 'lost'?  I'll check.]

  I'll post to mod.sources a program to correctly handle leap years for
a large range of years.  It will handle the problem of different countries
switching calendars (somewhat).  Expect it in a coupl'a weeks.

Bob Devine

mat@mtx5a.UUCP (m.terribile) (10/10/86)

> > It is particularly galling to see a correct algorithm criticized as
> > overkill when it is as simple and short as the above code segment.  There
> > may be a justification for writing code that only works part of the time if
> > the fix is costly and difficult.  However, it should be a general rule that
> > an algorithm which works in all cases is preferred over one that only works
> > in most cases.
> > Instead of panning someone's code because he has written it more correctly,
> > one should adopt the improved algorithm with gratitude.
> 
>   You are missing the point: I wrote that doing the simple check for
> divisibility by four is sufficient for most programs.  I would bet that
> 99% of programs don't need to handle dates with precision outside of
> +/-20 years from the current date.  Genealogy programs might want more.
> Astronomical programs almost certainly do.  In the same sense, not all
> arithmetic calculations need be done in double precision.

``There is never time to do it right, but there is always time to do it
over.''  If you always try to get away with the stuff that you can probably
get away with, you *will* get burnt.  More important:  innocent users will
get burnt.  People who never met you, people who trusted the stuff that
``always worked'' will get burnt needlessly.

There are always engineering tradeoffs, but when it's cheap to be safe,
fer pete's sake, be safe.

There is mounting concern in the commercial (read COBOL) world right now
because old code with old date handling is beginning to break.  Nobody
expected the old code to be running for 20+ years; nobody expected that
people would take old code segments that they couldn't make sense of and
re-use them blindly, but people did because managers said ``use what's
already working.  Evolve, rather than destroy.''

Meet my pet, Peeve ...
-- 

	from Mole End			Mark Terribile
		(scrape .. dig )	mtx5b!mat
					(Please mail to mtx5b!mat, NOT mtx5a!
						mat, or to mtx5a!mtx5b!mat)
					(mtx5b!mole-end!mat will also reach me)
    ,..      .,,       ,,,   ..,***_*.

ggw@ethos.UUCP (Gregory Woodbury) (10/11/86)

<where is the line-eater hiding - is it a wumpus? (or a boojum)>

In article <40@vianet.UUCP> devine@vianet (Bob Devine) writes:
>
>  You are missing the point: I wrote that doing the simple check for
>divisibility by four is sufficient for most programs.  I would bet that
>99% of programs don't need to handle dates with precision outside of
>+/-20 years from the current date.  Genealogy programs might want more.
>Astronomical programs almost certainly do.  In the same sense, not all
>arithmetic calculations need be done in double precision.

The programs may not "need" to handle the dates beyond that 20 year interval
but when the programs that are being written now hit the end of the century
there are going to be a lot of installations and programs that are going
to be surprised come "March 1" to see the computer say "Feb 28, 2000".

In at least one industrial control system, there is going to be a discrepancy
between the VAXen and their comm controllers.  I had modified the comm system
date routines to handle the 2000 non-leap year, and was told to take it back
out "because the system won't be in use that long."  That's the worst excuse
I have ever heard for managemental incompentence - how many systems (especially
commercial ones) are still running programs that were written more than 20 years
ago [more than you might think] just because they never can be bothered to
re-implement (when emulation is available).
------------------------------------------
Gregory G. Woodbury				The usual disclaimers apply
Red Wolfe Software and Services, Durham, NC
{duke|mcnc|rti-sel}!ethos!ggw

ron@brl-sem.ARPA (Ron Natalie <ron>) (10/12/86)

In article <813@ethos.UUCP>, ggw@ethos.UUCP (Gregory Woodbury) writes:
> The programs may not "need" to handle the dates beyond that 20 year interval
> but when the programs that are being written now hit the end of the century
> there are going to be a lot of installations and programs that are going
> to be surprised come "March 1" to see the computer say "Feb 28, 2000".
Eh?  You mean Feb 29?  Any computer that doesn't get the 28th right is really
ill.
> I had modified the comm system
> date routines to handle the 2000 non-leap year.
The year 2000 is a leap year.  Your code was more correct when it wasn't
modified.

-Ron

byron@gitpyr.gatech.EDU (Byron A Jeff) (10/14/86)

In article <447@brl-sem.ARPA> ron@brl-sem.ARPA (Ron Natalie <ron>) writes:
>In article <813@ethos.UUCP>, ggw@ethos.UUCP (Gregory Woodbury) writes:
>> The programs may not "need" to handle the dates beyond that 20 year interval
>> but when the programs that are being written now hit the end of the century
>> there are going to be a lot of installations and programs that are going
>> to be surprised come "March 1" to see the computer say "Feb 28, 2000".
>Eh?  You mean Feb 29?  Any computer that doesn't get the 28th right is really
>ill.
>> I had modified the comm system
>> date routines to handle the 2000 non-leap year.
>The year 2000 is a leap year.  Your code was more correct when it wasn't
>modified.
>
>-Ron

This is absolutely correct.  The introductory Pascal classes at Georgia Tech
have been doing leap year computations as a programming assignment forever
(or at least the 3 years I've been here).  We use the following definition.

"A leap year is any year divisible by 400 or any non-century year that is
divisible by 4".

By this definition, the following fragment should correctly decide whether a
year is a leap year.

int leapyear(year)
int year;
{
   /* leap year: divisible by 400, or by 4 but not by 100 */
   return (!(year % 400) || (!(year % 4) && (year % 100)));
}

(I think this is correct - I'm writing off the top of my head. All corrections
welcome.)
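
A quick way to check it is to feed it a few interesting years; this little
test harness (my addition, repeating the function so it stands alone) should
print 0 for 1900 and 2100 and 1 for 1996 and 2000:

#include <stdio.h>

int leapyear(year)
int year;
{
   return (!(year % 400) || (!(year % 4) && (year % 100)));
}

int main()
{
    static int years[] = { 1900, 1996, 2000, 2100 };
    int i;

    for (i = 0; i < 4; i++)
        printf("%d -> %d\n", years[i], leapyear(years[i]));
    return 0;
}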

fgd3@jc3b21.UUCP (10/15/86)

In article <813@ethos.UUCP>, ggw@ethos.UUCP (Gregory Woodbury) writes:
> In at least one industrial control system, there is going to be a discrepancy
> between the VAXen and their comm controllers.  I had modified the comm system
> date routines to handle the 2000 non-leap year...
> ------------------------------------------
> Gregory G. Woodbury				The usual disclaimers apply
> Red Wolfe Software and Services, Durham, NC
> {duke|mcnc|rti-sel}!ethos!ggw

     Let's get this straight.  The year 2000 will be a leap year.  The
years 1900 and 2100 are not leap years.  The rule is: a year is a leap year
if it is evenly divisible by 4 and is not divisible by 100, EXCEPT FOR THOSE
CENTURY YEARS WHICH ARE DIVISIBLE BY 400.

     One of the reasons it is important to write date handling functions
correctly is that there aren't enough people who understand the rule.  Let's
give them a correct function and let them copy it.  There is a very
readable date conversion function in K&R, first discussed on pages 103-105,
then again on pages 119-120.  The part that determines whether a year is a
leap year is

{
     int leap, year;

     /* year is assumed to have been set elsewhere */
     leap = year%4 == 0 && year%100 != 0 || year%400 == 0;
}

     I hope that will save someone from getting bitten by a wrong
calculation.
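
     For completeness, here is a small day-of-year function in the spirit of
that K&R example (a sketch written from memory, not the book's code verbatim;
the two-row month table follows the book's approach of one row for ordinary
years and one for leap years):

static char daytab[2][13] = {
    {0, 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31},
    {0, 31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31}
};

/* day_of_year: return the day of the year for year/month/day */
int day_of_year(year, month, day)
int year, month, day;
{
    int i, leap;

    leap = year%4 == 0 && year%100 != 0 || year%400 == 0;
    for (i = 1; i < month; i++)
        day += daytab[leap][i];
    return day;
}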

Fabbian Dufoe
  350 Ling-A-Mor Terrace South
  St. Petersburg, Florida  33705
  813-823-2350

UUCP: ...akgua!akguc!codas!peora!ucf-cs!usfvax2!jc3b21!fgd3 

dave@murphy.UUCP (Lerxt) (10/17/86)

Summary: does anyone remember January 5, 1975?
Line eater: enabled

In article <813@ethos.UUCP>, ggw@ethos.UUCP (Gregory Woodbury) types:

>The programs may not "need" to handle the dates beyond that 20 year interval
>but when the programs that are being written now hit the end of the century
>there are going to be a lot of installations and programs that are going
>to be surprised come "March 1" to see the computer say "Feb 28, 2000".

>In at least one industrial control system, there is going to be a discrepancy
>between the VAXen and their comm controllers.  I had modified the comm system
>date routines to handle the 2000 non-leap year, and was told to take it back
>out "because the system won't be in use that long."  That's the worst excuse
>I have ever heard for managerial incompetence - how many systems (especially
>commercial ones) are still running programs that were written more than 20 years
>ago [more than you might think] just because they never can be bothered to
>re-implement (when emulation is available)?

I know of one case where something similar has already happened, and it
was a major disaster for a lot of people.  DEC had the misfortune of
having every TOPS-10 system in the world come to a screeching halt on
January 5, 1975.  Reason: on that date, a 13-bit date counter field in
the kernel overflowed.  Apparently the author didn't think TOPS-10 would
be around long enough to need a bigger field. 
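
Just to show how quickly a small fixed-width date field runs out, here is a
back-of-the-envelope fragment (the field widths and the days-per-year figure
are mine, for illustration only; I am not claiming this is the actual TOPS-10
layout):

#include <stdio.h>

int main()
{
    int bits;

    /* roughly how long an n-bit counter of days lasts */
    for (bits = 12; bits <= 16; bits++) {
        long days = 1L << bits;
        printf("%2d bits: %6ld days, about %5.1f years\n",
               bits, days, days / 365.2425);
    }
    return 0;
}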

---
It's been said by many a wise philosopher that when you die and your soul
goes to its final resting place, it has to make a connection in Atlanta.

Dave Cornutt, Gould Computer Systems, Ft. Lauderdale, FL
UUCP:  ...{sun,pur-ee,brl-bmd}!gould!dcornutt
 or ...!ucf-cs!novavax!houligan!dcornutt
ARPA: wait a minute, I've almost got it...

"The opinions expressed herein are not necessarily those of my employer,
not necessarily mine, and probably not necessary."

ron@brl-sem.ARPA (Ron Natalie <ron>) (10/22/86)

In article <1603@mtx5a.UUCP>, mat@mtx5a.UUCP (m.terribile) writes:
 > >   You are missing the point: I wrote that doing the simple check for
 > > divisibility by four is sufficient for most programs.  
 > 
 > There is mounting concern in the commercial (read COBOL) world right now
 > because old code with old date handling is beginning to break.  Nobody
 > expected the old code to be running for 20+ years; nobody expected that
 > people would take old code segments that they couldn't make sense of and
 > re-use them blindly, but people did because managers said ``use what's
 > already working.  Evolve, rather than destroy.''
 > 
Well, you still have 114 years before things break.  Perhaps people will
stop using COBOL by then.

Note that most people's internal time formats fall apart before then.
UNIX is only good until around 2038, when a signed 32-bit count of seconds
since 1970 runs out.
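
For the record, the arithmetic behind that date (a sketch; it assumes the
usual signed 32-bit count of seconds since Jan 1, 1970):

#include <stdio.h>

int main()
{
    double max_seconds = 2147483647.0;           /* 2^31 - 1 */
    double secs_per_year = 365.2425 * 86400.0;   /* average Gregorian year */
    double years = max_seconds / secs_per_year;

    printf("about %.1f years after 1970, i.e. sometime in %d\n",
           years, 1970 + (int)years);
    return 0;
}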

-Ron