[comp.lang.ada] of a year, years, and half a year

karl@grebyn.com (Karl Nyberg) (09/04/89)

In article <2007@munnari.oz.au> ok@cs.mu.oz.au (Richard O'Keefe) writes:
>In article <8909022213.AA06286@ajpo.sei.cmu.edu>, cdonalds@AJPO.SEI.CMU.EDU writes:
>:           The year 1990 marks the start of a new decade and the end of
>:           one in which the Ada language came of age, in terms of use
>:           and acceptability.  What better time to reflect on the
>:           experiences of Ada users throughout that time and to look to
>:           the future in the light of that experience.
>
>Ahem.  1990 is the LAST year of the decade 1981-1990.
>Just as 2000 will be the LAST year of this century and millennium,
>not the first year of the next.
>
>If ADA.people get such blatant off-by-one errors in their specs,
>can their code be any better?  [ponderous humour]

Ada is spelled thus.  What else don't you know? [imponderous humor]

According to Webster's New Collegiate, 8th Edition:

	decade : 1 : a group or set of 10 ; 2 : a period of 10 years ; ...
	millennium : 1 : a period of 1000 years ; ...

No indication on the lower bound.  I'm surprised that C.hackers (of all
people) wouldn't start with 0-based counting (like their poor arrays, K&R,
1.6, p 20), making the current millennium run from 1000 .. 1999, and the
current decade of the 1980s from 1980 .. 1989.  Then the year 1990 would
indeed MARK the end of the current decade (known as the eighties, as all
the years are of the form 198[0-9]) and be the first year of the nineties.

I can't fathom what DECADE we would call the years 1981 - 1990 (inclusive),
but there's probably some name for it...
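
(A throwaway sketch in Ada of the two conventions; the procedure and the
identifiers are mine, purely illustrative, and year 0 is cheerfully
ignored for now:)

	with Text_IO; use Text_IO;
	procedure Decades is
	   Year : constant Integer := 1990;
	   --  0-based labelling: "the 1980s" start at (Year / 10) * 10
	   Label   : constant Integer := (Year / 10) * 10;  -- 1990 -> 1990
	   --  ordinal counting: years 1 .. 10 are decade 1, so
	   --  1981 .. 1990 is decade (Year + 9) / 10 = 199
	   Ordinal : constant Integer := (Year + 9) / 10;   -- 1990 -> 199
	begin
	   Put_Line ("Decade label starts in:" & Integer'Image (Label));
	   Put_Line ("Ordinal decade number: " & Integer'Image (Ordinal));
	end Decades;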

Of course, all this wonderful analysis gets blown away by the fact that the
year 1 BC (or BCE, or whatever your religious conviction allows you to say)
is followed immediately by the year 1 AD (or CE).  Suffer the
discontinuities of this concept of time (running from negative infinity to
-1, followed by 1 to plus infinity, again depending upon your religious
conviction), make the first decade of the current era have only 9 years,
change the common meaning of things like "the eighties" and "the nineteenth
century"; there are dozens of options.  And throw in the missing days
during the calendar change, leap years, leap seconds, add various timezones
for the folks in Australia, ...

But I digress.  Maybe somebody can make this another Ada 9X language issue.
I just love having more language issues for 9X.  It'll give me something to
do this winter and spring!  :-)

-- Karl --

Disclaimer: Opinions (such as they might be!) expressed herein are mine, all
mine, and are not intended to be the official position of the Ada 9X
program, the Requirements Team (of which I am a member), or any other person
or organization, now living or dead.  Any similarities to events, current or
past are unintentional.

ok@cs.mu.oz.au (Richard O'Keefe) (09/05/89)

In article <13711@grebyn.com>, karl@grebyn.com (Karl Nyberg) writes:
> In article <2007@munnari.oz.au> ok@cs.mu.oz.au (Richard O'Keefe) writes:
> >If ADA.people get such blatant off-by-one errors in their specs,
> >can their code be any better?  [ponderous humour]

> Ada is spelled thus.  What else don't you know? [imponderous humor]

I know how Ada is spelled.  But ADA.people is my word, and I spell it
how I please.  (ADA is an acronym standing for "Ada Discussion and
Admiration"...)

> According to Webster's New Collegiate, 8th Edition:

> 	decade : 1 : a group or set of 10 ; 2 : a period of 10 years ; ...
> 	millennium : 1 : a period of 1000 years ; ...

> No indication on the lower bound.  I'm surprised that C.hackers (of all
> people) wouldn't start with 0-based counting.

If that C.hackers bit is aimed at me, I offer you this quotation for
your .signature:
	The longer I program in C the better I like Ada.  (me)

I would point out that Webster's dictionary does not contain ALL knowledge...
It's not a matter of where *I* start counting.  It's a matter of how our
calendar works.  The Japanese recently started a new calendar:  the first
year in it is Year 1 of whoever.  Lots of calendars work the same way.
The first decade of the present system would have been 1 to 10.

The complexities in human calendar systems are precisely the reason why
computer systems count days or seconds or whatever from an arbitrary
epoch.
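
(Ada's own predefined CALENDAR package takes exactly this line, for what
it's worth: TIME is a private type, so whatever epoch an implementation
counts from stays hidden, and the messy human calendar only appears through
SPLIT and the selector functions.  A trivial example; the procedure name
is mine:)

	with Calendar; with Text_IO; use Text_IO;
	procedure Epoch_Demo is
	   Now : constant Calendar.Time := Calendar.Clock;
	begin
	   --  Only SPLIT and the selectors expose year/month/day/seconds.
	   Put_Line ("Year:" & Integer'Image (Calendar.Year (Now)));
	   Put_Line ("Day: " & Integer'Image (Calendar.Day (Now)));
	end Epoch_Demo;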

mitchell@chance.uucp (George Mitchell) (09/05/89)

Please omit these petty items from the INFO-ADA Digest.  I have no idea why
the flames ever got started.  It seems rather obvious that the twentieth
century will indeed end 31 December 2000 and that the century of the 1900's
(and the decade of the 1990's) will end 31 December 1999.

Is there REALLY any problem?

/s/ George   vmail: 703/883-6029  [alt.email: gmitchel@mitre.arpa]
GB Mitchell, MITRE, MS Z676, 7525 Colshire Dr, McLean, VA 22102

diamond@csl.sony.co.jp (Norman Diamond) (09/06/89)

In article <2020@munnari.oz.au> ok@cs.mu.oz.au (Richard O'Keefe) writes:

>The Japanese recently started a new calendar:  the first
>year in it is Year 1 of whoever.

True, but hmm, Heisei year 1 is only 51 weeks long.  The first year is
from year 1 through the 7th day of year 2.  And the first decade is
from year 1 through the 7th day of year 11.

(Showa year 64 was 1 week long.  Coins dated Showa 64 are selling for a
hefty premium.)

>Lots of calendars work the same way.
>The first decade of the present system would have been 1 to 10.

True.  And the 175th decade was 1741 through 1750.  But when was the
176th decade?  What was the ending date of the 1752nd year?  How long
is a year?

(If you don't understand this question, type "!cal 09 1752")
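
(For anyone without a Unix system handy, cal prints something like:

	   September 1752
	Su Mo Tu We Th Fr Sa
	       1  2 14 15 16
	17 18 19 20 21 22 23
	24 25 26 27 28 29 30

The 3rd through the 13th simply never happened.)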

-- 
Norman Diamond, Sony Corporation (diamond@ws.sony.junet)
  The above opinions are inherited by your machine's init process (pid 1),
  after being disowned and orphaned.  However, if you see this at Waterloo or
  Anterior, then their administrators must have approved of these opinions.

cet1@cl.cam.ac.uk (C.E. Thompson) (09/06/89)

In article <10804@riks.csl.sony.co.jp> diamond@riks. (Norman Diamond) writes:
>True.  And the 175th decade was 1741 through 1750.  But when was the
>176th decade?  What was the ending date of the 1752nd year?  How long
>is a year?
>
>(If you don't understand this question, type "!cal 09 1752")

This example shows a rather parochial attitude. It was only the
English who changed from the Julian to the Gregorian calendar in
September 1752. This was late even by the standards of the rest
of Protestant Europe. (The Catholics had converted in 1582-1584.)

Chris Thompson (cet1@uk.ac.cam.phx)

lindsay@comp.vuw.ac.nz (Lindsay Groves) (09/07/89)

In article <13711@grebyn.com> karl@grebyn.com (Karl Nyberg) writes:
> ...
> ...                                 I'm surprised that C.hackers (of all
>people) wouldn't start with 0-based counting (like their poor arrays, K&R,
>1.6, p 20), making the current millennium run from 1000 .. 1999, and the
>current decade of the 1980s from 1980 .. 1989.  Then the year 1990 would
>indeed MARK the end of the current decade (known as the eighties, as all
>the years are of the form 198[0-9]) and be the first year of the nineties.
>
>I can't fathom what DECADE we would call the years 1981 - 1990 (inclusive),
>but there's probably some name for it...

Why not call it the 199th decade?  This terminology is clearly superior,
since it doesn't suffer from the ambiguity inherent in "the eighties",
"the nineties", etc., namely that they don't specify the century to which
they belong; you have to somehow infer what century the speaker/writer is
referring to, which may or may not be the century in which the statement
was made.

>Of course, all this wonderful analysis gets blown away by the fact that the
>year 1 BC (or BCE, or whatever your religious conviction allows you to say)
>is followed immediately by the year 1 AD (or CE).  Suffer the
>discontinuities of this concept of time (running from negative infinity to
>-1, followed by 1 to plus infinity, again depending upon your religious
>conviction), make the first decade of the current era have only 9 years,
>change the common meaning of things like "the eighties" and "the nineteenth
>century"; there are dozens of options.  And throw in the missing days
>during the calendar change, leap years, leap seconds, add various timezones
>for the folks in Australia, ...
>
>But I digress.  Maybe somebody can make this another Ada 9X language issue.
>I just love having more language issues for 9X.  It'll give me something to
>do this winter and spring!  :-)

It seems to me that the issue for Ada 9X (or any other language whose
designers care to address such issues) is how you then define YEAR as
a data type, which should clearly be available in a standard predefined
package.  First, we need an unbounded integer type.  Then we need to define
YEAR as a subtype of this type, which does not include 0.  This could be
done either by taking the union of two semibounded integer types
(range -INFINITY .. -1 and range 1 .. INFINITY) or by some form of type
subtraction (e.g. (range -INFINITY .. INFINITY) - 0).  Since Ada does not 
currently have either type union or type subtraction, it will be necessary
to consider these features carefully to determine how general they need to
be, devise formal semantics, consider implementation problems, etc., so the
alternatives can be evaluated and a final proposal decided upon.

One issue that needs to be addressed is how to define operators on such types.
Should SUCC(-1) give 1, or should this raise an exception?  Perhaps any
program evaluating this expression should cause a picture of a bright star to
be displayed on the user's terminal!
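
(A minimal sketch, assuming for the moment a bounded stand-in for the
unbounded integer type; the package and all its names are invented here,
not part of any actual proposal:)

	package Years is
	   type Year_Number is range -5_000 .. 5_000;  -- placeholder bounds
	   No_Year_Zero : exception;
	   function Succ (Y : Year_Number) return Year_Number;
	end Years;

	package body Years is
	   function Succ (Y : Year_Number) return Year_Number is
	   begin
	      if Y = 0 then
	         raise No_Year_Zero;  -- there is no year 0 to succeed
	      elsif Y = -1 then
	         return 1;            -- 1 BC is followed directly by 1 AD
	      else
	         return Y + 1;
	      end if;
	   end Succ;
	end Years;

The bright star, regrettably, is left as an implementation-defined effect.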

I think I'll leave the rest of the issues to be debated by the appropriate
bodies and get back to some work.

	Lindsay

conor@inmos.co.uk (Conor O'Neill) (09/07/89)

In article <10804@riks.csl.sony.co.jp> diamond@riks. (Norman Diamond) writes:
>In article <2020@munnari.oz.au> ok@cs.mu.oz.au (Richard O'Keefe) writes:
>>Lots of calendars work the same way.
>>The first decade of the present system would have been 1 to 10.
>
>True.  And the 175th decade was 1741 through 1750.  But when was the
>176th decade?  What was the ending date of the 1752nd year?  How long
>is a year?

Surely the reason for the change was to ensure that the 1752nd year
actually ended on the right day.
It was the 1751st year which was wrong.  The adjustment corrected the
error, so that the 176th decade correctly finished on December 31st 1760,
but it didn't start on January 1st 1751; it started 11 days out (earlier?).
(All dates refer to England, as has been pointed out by someone else.)

-- 
Conor O'Neill, Software Group, INMOS Ltd. UK: conor@inmos.co.uk
Disclaimer: All views are my own,         US: @col.hp.com:conor@inmos-c
            not those of INMOS.
"It's state-of-the-art" "But it doesn't work!" "That is the state-of-the-art".