[comp.lang.fortran] Floating-Point Indoctrination: Final Lecture

dgh%dgh@Sun.COM (David Hough) (07/15/88)

     During May-July 1988, Prof. W. Kahan of the University
of California presented a lecture course on Computer System
Support for Scientific and Engineering Computation at Sun
Microsystems in Mountain View, CA.  To summarize this
course, Prof. Kahan will present a final lecture, at 7:30 PM
on Thursday, 28 July 1988, at Apple Computer's DeAnza-3
Building, 10500 N DeAnza Boulevard, Cupertino, CA.  Enter
from the south side.

     This final lecture is free to the public.  Please
publicize to interested colleagues.

                          ABSTRACT

Most scientific and engineering computer users consider
irrelevant the details of floating-point hardware
implementation, compiler code generation, and system exception
handling, until some anomalous behavior confronts them and
prevents the satisfactory completion of a computational task.
Some of these confrontations are inevitable consequences of the
use of finite-precision floating-point arithmetic; others are
gratuitous results of hardware and software designs diminished
by the designers' well-intentioned corner-cutting.
Distinguishing the intrinsic from the gratuitous is no simple
matter; such chastened computer users are not sure what they
might reasonably demand of computer system purveyors.

The novice's impression that there is neither rhyme nor reason
to the dark mysteries of floating-point computation is sometimes
superseded by a euphoric discovery that a good deal can be
axiomatized and proven about floating point; later experience
may temper that discovery by indicating that not everything that
can be axiomatized or proven is worth the trouble.  Furthermore,
what would be worth knowing is often surprisingly difficult to
encapsulate and refractory to prove; even when each subproblem
of a realistic application permits a satisfactory error
analysis, the overall problem may admit no such analysis.  The
proofs of simple statements about algorithms or programs often
require machinery from other parts of mathematics far more
elaborate than expected.  Thus some of the mathematically
inclined who become involved in these studies out of external
necessity then become permanently sidetracked by intricate
mathematical issues.  To remain relevant, such studies must be
guided by a sense of engineering economy, in order to
distinguish the things that are worth doing, and therefore worth
doing well, from those that aren't.

Over the nearly twenty years since this lecture course was first
presented, the software environment has gradually deteriorated
even as hardware has improved.  The software deterioration may
be attributable to the establishment of Computer Science as a
separate academic discipline, whose graduates need have little
acquaintance with scientific computation.  The hardware
improvement can be principally attributed to the advent and
acceptance, for most microcomputers, of the ANSI/IEEE Standards
754 and 854 for floating-point arithmetic.  But some of the
potential benefits of those standards are lost because so much
software was and is written to exploit only those few worthwhile
features common to almost all commercially significant existing
systems.  In fact, much portable mathematical software, created
with funding directly or indirectly from American taxpayers, is
crippled by a misguided quest for performance on the fastest
existing supercomputers, regardless of the detriment to far more
numerous mini- and microcomputers.

Well-intentioned attempts by language architects and
standardizing bodies to ameliorate some of the difficulties
encountered in floating-point computation have too often
exacerbated them and, in some instances, spread over them a fog
caused by ostensibly insignificant variations in the definitions
of words with otherwise familiar connotations.  What we need now
is a measure of consensus on language-independent definitions of
needed functionality, even if we must sacrifice some
compatibility with past practice to achieve intellectual economy
in the future.  Alas, few professionals will pay the present
costs of incompatibility with past errors to achieve gains
promised for an indeterminate future.  The computing world has
too many broken promises rusting in its basement.

One of the anticipated outcomes of this course is that lecture
notes will eventually be published reflecting current thinking
on some of these issues.  In addition, a group of students has
undertaken to improve the implementation of certain elementary
transcendental functions to a better standard than has been
customary.


David Hough

dhough@sun.com   
na.hough@na-net.stanford.edu
{ucbvax,decvax,decwrl,seismo}!sun!dhough

bzs@bu-cs.BU.EDU (Barry Shein) (07/15/88)

From: dgh%dgh@Sun.COM (David Hough)
>Over the nearly twenty years since this lecture course was
>first presented, the software environment has gradually
>deteriorated even as hardware has improved. The software
>deterioration may be attributable to the establishment of
>Computer Science as a separate academic discipline, whose
>graduates need have little acquaintance with scientific
>computation.

(oh c'mon, suffer me one comment)

I think this is misguided. Having spent many years in the
scientific computation biz, I can assure you that it is not a
place where the problem is absent. In fact, in my experience most
natural scientists seemed bored and/or suspicious when these
problems were broached in conversation. The reaction was often
"oh, don't be absurd, of course the machine [language, whatever]
takes care of that! I have work to do and no time for this
twaddle (i.e., the ranting of a computer scientist)."

What is generally absent from places where natural scientists
compute is computer scientists. The reasons are several. Among
them: scientists don't like to be sneered at for using Fortran or
whatever isn't in vogue at the moment. (On that count they are
often, but not always, correct. They're correct when it doesn't
*really* make much difference, which is often; incorrect when
they fail to see that their software problems, e.g. trying to
manage records, are due to their insistence on doing that sort of
thing in Fortran, with dozens of overlapping named commons and an
I/O model never designed for it.) There is also an all too common
arrogance that programming is just busy-work, almost clerical in
nature, and is just as well done by a young grad student on
stipend as by someone paid a real salary to concern themselves
with the issues. And finally there is a simple and real
frustration with a language gap, no argument: a grad student at
least can talk physics (say) with the "customer", a skill the CS
person usually lacks completely.

Unfortunately, this only apportions the CS blame; it does not
exonerate it. I would agree, as one involved in CS education,
that the curriculum does not adequately cover such issues as
precision and accuracy. A graduate may very well leave with a
respect for the issues but probably has little actual knowledge
of them. What is this due to?

Several things. As David pointed out, the separation of fields
may have something to do with it. It has led to departments often
populated with logicians and other people of such a theoretical
bent that they have no real ability (and less interest) in
teaching something as mundane as floating-point precision. Some
of that can be traced back to the relative salary levels in
academia vs. industry: there's little to attract someone who can
actually do something with a computer into teaching. Also, let's
face it, in this fast-paced world such mundane issues are boring.
Who wants to be unpopular by teaching a course in numerical
analysis when they could be teaching networking or window systems
or something like that, something the kids have actually heard of
and have a hunger for? The rarity of applied computer scientists
exacerbates that (they can usually teach what they want). Few
universities judge the viability of a course by its relevance to
the subject at hand; head-counts seem so much more objective and
reflect tuition dollars so much better...

Well, that's my cynical 2c.

	-Barry Shein, Boston University