[comp.arch] Third public review of X3J11 C

cik@l.cc.purdue.edu (Herman Rubin) (08/23/88)

In article <8365@smoke.ARPA>, gwyn@smoke.ARPA (Doug Gwyn ) writes:
> In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
> >The draft may not be 'badly broken' but is missing out on the opportunity
> >to make C a convenient language for numerical computing as well.
> 
> I happen to use C for numerical programming, despite occasional flaws
> such as those you mention, primarily because it offers much better
> support for data structures than do other alternatives such as FORTRAN.
> I agree that there are some changes that could make C more convenient
> for such applications.  Hough's suggestions are for the most part good
> ones, but they haven't been receiving sufficient committee support.
> 
> The fundamental problem is that IT IS MUCH TOO LATE to be making
> significant changes to the proposed standard.  Look at all the trouble
> the last-minute addition of "noalias" caused.  The public review
> period is intended as a REVIEW of work done by the committee, not as
> an opportunity for language design.  Where were all these scientific
> users of C when the design work was being done?  By leaving that up
> to people who didn't think the flaws you perceive were significant,
> you did not get those flaws addressed in the proposed C standard.
> It's easy to complain about other people's work; much easier than
> helping with the work.  I suggest that you GET INVOLVED in drafting
> the NEXT (revised) standard.
> 
> Obviously I am not speaking for X3J11 officially here.

I use C for numerical programming, and then have to edit the resulting .s
file.  All of the languages, including C, are woefully deficient in letting
the user use the capacities of the machines.  If C is to be a good flexible
language, the committee should widely advertise for complaints about the
deficiencies of the language before starting out.

I would have no trouble coming up with pages of these items.  But the last
time I did something like this, in reply to the open invitation to attend
the meeting on the IEEE floating-point convention, the result was an
invitation to attend!  I do not have the time to attend meetings on software.

Another problem is that the language gurus are unsympathetic to ideas which
run counter to their perception of computing needs.  They see integer 
arithmetic as primarily for addressing and looping; I see integer arithmetic
as important for number-crunching.  What about fixed-point (_not_ integer)
arithmetic?  What about the use of overflow?  What about division with
simultaneous quotient and remainder?  What about an operation or function
returning a string of values?  What about table-driven branches?  What 
about inserting new operators, using the preprocessor syntax to specify the
argument structure of these operators?  In fact, what about using the 
easy-to-use hardware operators on most machines?  A good example is &~,
which is more useful than &, and is hardware on many machines, including
the ones for which C was initially written.  Many of those machines do not
even have a hardware &.
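
To make just one of these concrete: the quotient-remainder case must today
be written as two expressions, with a hope that the compiler notices that a
single divide instruction serves for both.  A minimal sketch:

    #include <stdio.h>

    int main(void)
    {
        long n = 1000003L, d = 97L;
        long q = n / d;     /* quotient */
        long r = n % d;     /* remainder -- the hardware produces both
                               in the same divide, but no C expression
                               yields the pair at once */
        printf("%ld = %ld*%ld + %ld\n", n, q, d, r);
        return 0;
    }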

How many useful instructions have disappeared from hardware because they
do not occur in the HLLs?  Multiprecision arithmetic needs unsigned
multiplication and division, not floating-point arithmetic, to be
efficient.  The presence of a single hardware instruction can be
essential to an algorithm being worthwhile; and if an operation is
actually exercised in software, it is more likely to appear in hardware.
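
By way of illustration, here is roughly what one must write today to get
the 64-bit product of two 32-bit unsigned numbers -- a sketch assuming
32-bit unsigned longs, a dozen operations standing in for the single
widening multiply most machines already have:

    /* 64-bit product of a and b, as high and low 32-bit halves,
     * built from 16-bit pieces because the hardware's widening
     * multiply is not reachable from C. */
    void mul32(unsigned long a, unsigned long b,
               unsigned long *hi, unsigned long *lo)
    {
        unsigned long al = a & 0xFFFFUL, ah = a >> 16;
        unsigned long bl = b & 0xFFFFUL, bh = b >> 16;
        unsigned long p0 = al * bl;
        unsigned long p1 = al * bh;
        unsigned long p2 = ah * bl;
        unsigned long mid = (p0 >> 16) + (p1 & 0xFFFFUL)
                                       + (p2 & 0xFFFFUL);

        *lo = (p0 & 0xFFFFUL) | (mid << 16);
        *hi = ah * bh + (p1 >> 16) + (p2 >> 16) + (mid >> 16);
    }
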
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)) (08/23/88)

In article <887@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes...
 
>I use C for numerical programming, and then have to edit the resulting .s
>file.  All of the languages, including C, are woefully deficient in letting
>the user use the capacities of the machines.  If C is to be a good flexible
>language, the committee should widely advertise for complaints about the
>deficiencies of the language before starting out.

On the contrary:  C is NOT woefully deficient for the vast majority of
applications to which the vast majority of "paying users" are interested in
applying it.  As later comments make clear, the kinds of users Mr. Rubin has
in mind are rather different.  The fact of the matter is, hardly anyone thinks
that "fixed-point arithmetic" (as opposed to integer) is important.  It just
does not come up in the vast majority of uses to which computers are put.

Developing software is an expensive proposition.  Everything added to a
language has to be implemented somewhere, by someone.  Then it has to be
debugged, supported, and maintained.  There are only two ways this will
happen:  If someone is willing to pay for it; or when someone is willing
to do it out of their own love for the subject.

>I would have no trouble coming up with pages of these items.  But the last
>time I did something like this, in reply to the open invitation to attend
>the meeting on the IEEE floating-point convention, the result was an
>invitation to attend!  I do not have the time to attend meetings on software.

Ah, so Mr. Rubin is willing to COMPLAIN, but he is NOT willing to do the work
out of his own love for the subject.  He certainly gives no indication that
he is willing (or able) to pay to have it done either.

>Another problem is that the language gurus are unsympathetic to ideas which
>run counter to their perception of computing needs.

I am a "language guru", though my interests happen to be in parallel program-
ming languages.  Again, why should I care what Mr. Rubin thinks "computing
needs" are when he can't provide money, isn't willing to invest his own time,
and can only provide the most specialized examples of what such features might
be used for?

>						      They see integer 
>arithmetic as primarily for addressing and looping; I see integer arithmetic
>as important for number-crunching.  What about fixed-point (_not_ integer)
>arithmetic?  What about the use of overflow?  What about division with
>simultaneous quotient and remainder?  What about an operation or function
>returning a string of values?  What about table-driven branches?  What 
>about inserting new operators, using the preprocessor syntax to specify the
>argument structure of these operators?  In fact, what about using the 
>easy-to-use hardware operators on most machines?  A good example is &~,
>which is more useful than &, and is hardware on many machines, including
>the ones for which C was initially written.  Many of those machines do not
>even have a hardware &.

What about all these things?  Being absolutely brutal about it:  Why should
I (or other readers) care?  What will it gain us to worry about this?

>How many useful instructions have disappeared from hardware because they
>do not occur in the HLLs?

Along the same brutal lines, my answer is:  No USEFUL instructions have
disappeared at all.  What has disappeared are a lot of non-essential ideas
that were tossed in back in the days when computer architecture was a new
field, with a large research component.  No one really knew what would turn
out to be "useful".

Well, for better or for worse, computer architecture isn't like that any more.
Computer design is a multi-billion dollar industry.  It is driven, not by what
people might WANT in some abstract sense, but by what they are willing and
able to pay for.  THAT is the only workable definition of "useful", and on
that scale the things Mr. Rubin wants have long ago fallen to the bottom of
the list.

>			    Multiprecision arithmetic needs unsigned
>multiplication and division, not floating-point arithmetic, to be
>efficient.  The presence of a single hardware instruction can be
>essential to an algorithm being worthwhile; and if an operation is
>actually exercised in software, it is more likely to appear in hardware.
>-- 
>Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
>Phone: (317)494-6054
>hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

It's painful to see economics dominating a field one loves and pushing it in
directions one is not inclined to go.  I'm not unsympathetic to Mr. Rubin's
position; my own background, way back when, is in mathematics (complex
analysis and a bit of analytic number theory).  Even the work I do now is
beyond the current "commercial" leading edge, and I am sometimes frustrated
by the way hardware manufacturers put roadblocks in the way of doing
"obviously useful" things, because they are too busy heading in other
directions.  But that's life.

The USEFUL thing for Mr. Rubin to do, if he really thinks these issues are
important, is to work at convincing others of it.  Not by complaining in this
and other newsgroups about how he is being ignored.  But exactly by spending
some time with those committees, by offering some real alternatives, by
showing how what he proposes is useful to people other than himself.  Frankly,
I doubt anything he can do will ever get major commercial ventures interested.
But that doesn't mean he can't get other researchers interested.  Many people
are able to design and build special-purpose hardware and software today; if
Mr. Rubin talked to some of them, he might discover that many good research
hardware hackers have the tools, but are lacking interesting problems.  I
will say, however, that his chances of getting people interested would improve
markedly if he stopped complaining about how he didn't "have the time to
attend meetings on software".  Very few computer scientists have the time to
attend meetings on statistics either.
							-- Jerry

dhesi@bsu-cs.UUCP (Rahul Dhesi) (08/24/88)

In article <887@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
[wish list for HLLs]

I agree that many of the features wished for ought to be in higher-level
languages.  But to put all of them in C would no longer leave the
relatively small, simple, low-level language that C was designed to be.

Nearly all of the features that Herman Rubin wishes to see *are*
already in HLLs, only not all are in each HLL.  C++ has some.  Ada has
*many* of them, especially fixed point arithmetic and functions
returning structured values.

The real problem is not with C designers.  The real problem is with
Fortran designers, who have always had an explicit mandate to design a
language for scientific computing, and have continued to fail miserably
to achieve this.  In a way the C users who do numerical computing want
to put on C the burden that Fortran was supposedly designed to carry.

The trouble with doing so is that other users will lose.  Each new
feature added to a language increases the complexity of the language
translator, and *all* users, even those who don't need to use these
features, will pay in money, disk space, and CPU time.
-- 
Rahul Dhesi         UUCP:  <backbones>!{iuvax,pur-ee,uunet}!bsu-cs!dhesi

chasm@killer.DALLAS.TX.US (Charles Marslett) (08/24/88)

In article <36243@yale-celray.yale.UUCP>, leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu (Jerr writes:
> On the contrary:  C is NOT woefully deficient for the vast majority of
> applications to which the vast majority of "paying users" are interested in
> applying it.

I find this comment and the attitude of the author woefully parochial -- I do
not program in COBOL and I might not even recognize either a data entry
language or a data base language if it hit me in the face, but I do know that
more money (real dollars, payroll hours, or however you want to look at it) is
spent on programs that are much more difficult to write in C than in the
language they are written in (and in some cases -- heresy -- that language
is even 8086 assembly language!).  I am quite certain that spreadsheets
garner more user dollars than C compilers for any computers other than
Crays and Suns (and Fortran compilers are probably ahead of C compilers on
at least the Crays).

C is rapidly catching up with Pascal as the second most well-known language,
but it has a long way to go before it becomes as well known (and perhaps as
useful) as BASIC (more heresy?).

For my purposes, C is the language of choice most of the time (by a fair
margin -- I have no second choice, except maybe Modula were C to vanish
from the face of the earth).  But C is not a universal language, and she
does not appear to be expanding into other areas of applicability any more
rapidly than her elder brother and sister, FORTRAN and LISP.  And I think
this is both A GOOD THING, and the reason that it is unlikely to be a major
language 20 years from now.  I'll have plenty of spare time over the next
20 years to learn several new small languages, and I have no real need to
program in Ada or PL/I.

(How do you like my personification of a programming language?  Shall we
create a few mythic tales to describe her birth?)

Charles Marslett
chasm@killer.dallas.tx.us

chris@mimsy.UUCP (Chris Torek) (08/25/88)

In article <5282@killer.DALLAS.TX.US> chasm@killer.DALLAS.TX.US
(Charles Marslett) writes:

>In article <36243@yale-celray.yale.UUCP> leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerr writes:

[A rather unusual name :-) .]

>>On the contrary:  C is NOT woefully deficient for the vast majority of
>>applications to which the vast majority of "paying users" are interested in
>>applying it.

[back to chasm@killer:]

>I find this comment and the attitude of the author woefully parochial
>-- I do not program in COBOL and I might not even recognize either a
>data entry language or a data base language if it hit me in the face,
>but I do know that more money ... is spent on programs that are [done
>in other languages] ....  C is not a universal language and she does
>not appear to be expanding into other areas of applicability any more
>rapidly than her elder brother and sister, FORTRAN and LISP.  And I
>think this is both A GOOD THING, and the reason that it is unlikely to
>be a major language 20 years from now.

This is curious, because I see Jerry Leichter and Charles Marslett as
basically in agreement---so why should this attitude be `woefully
parochial'?  That C does not make a good functional programming
language is no surprise; that people who pay for programs written in C
are not paying for such code should also be no surprise; and hence that
there is no great push for C to be augmented with everything out of
Miranda and FP combined should likewise be no surprise.

To return somewhat to the original subject:  If you believe that, with
a few tweaks that would either improve, or at least not damage, the
language, C could become an ideal language for numerical software, it
is then your job to demonstrate it.  Make the changes---write yourself
a compiler, or have someone else write it---and show that the new
language is better than the old.  If it is sufficiently better,
programmers will beat a path to your mailbox, and the new language will
become popular in the same way that C became popular.  And if *you* are
not willing to put in the effort, why then should *we* be?
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris

hankd@pur-ee.UUCP (Hank Dietz) (08/25/88)

	I've been using C for most programs since 1978.  I've taught and am
currently teaching a C programming course at Purdue University.  However, C
isn't supposed to be all things to everyone:  it is a systems programming
language and has little real competition as such (Ada? Modula 2?).

	Making C a numerical applications language has never been a priority,
nor should it be.  For example, fixed-point arithmetic would never be used
by most of the originally-intended C user community; it would simply clutter
the language definition and impede the development of good quality compilers.
I personally feel that X3J11 has done an outstanding job of resisting the
"kitchen sink" syndrome, keeping the language reasonably clean and
implementable, while resolving more than a few ambiguous/omitted details.
Propose a new language if you're not happy with any existing one.

	As for the language standardization process, if you're not willing
to attend the meetings or to correspond in a reasonably formal way, I don't
think you've got much of a reason to complain.  Now, I'm a bit unhappy in
that I wasn't invited to be on X3J11 and would like to have had more input,
but even so I have had no trouble in getting X3J11 folk to listen to me.  My
number one remaining beef with X3J11 is that they changed the function
declaration syntax in an incompatible way without simultaneously providing
public-domain software to automatically convert old C programs to the new
notation...  but this is a problem I personally intend to remedy.
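
	For the curious, the conversion in question is mechanical; the
function below is illustrative, and the two forms are alternatives (only
one may appear in a given source file):

    /* Old-style (pre-ANSI) definition: */
    double scale(x, n)
    double x;
    int n;
    {
        return x * n;
    }

    /* The same function in the new prototype notation: */
    double scale(double x, int n)
    {
        return x * n;
    }

The subtlety that makes an automatic converter worth having: under the old
rules a float parameter is quietly promoted to double, while under a
prototype it is not, so a purely textual rewrite can change behavior.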

	So, let's not flame on about X3J11.  It isn't perfect, but it is C
and it is a better definition than we had before.  Enough said.

							-hankd

smryan@garth.UUCP (Steven Ryan) (08/25/88)

Sounds like somebody wants an extensible C.

Are you crazy?

Extensibility implies the gods are mortal and a rational mode system exists.

Shame for mentioning this in comp.lang.c.

tnvscs@eutrc3.UUCP (c.severijns) (08/25/88)

We have been using C for scientific computing for some time now, and so far
we feel the need for only a very few changes to the language (we use a
non-ANSI C compiler).  One of these changes has already been made in the
ANSI standard: the ability to pass a float as an argument to a function.
The second change we would like to see is the ability to compile C with
"intrinsic" functions, so that a floating-point processor like the MC68881
can be used more efficiently; this requires only an extra option for the
compiler (see the sketch below).
For the rest we consider C a good language for scientific computing: it
generates code that is not much slower than FORTRAN's, and it has the
advantage of structures.  In one case where we needed complex data
structures, our C version turned out to be even more than twice as fast as
similar code in FORTRAN.
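
A sketch of the intrinsic-function wish (the compiler option named in the
comment is hypothetical; today's compilers emit a subroutine call here):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double x = 1.0;
        /* With a hypothetical "intrinsic" compiler option, this call
         * would compile to a single in-line MC68881 FSIN instruction
         * instead of a jump to the library sin() routine. */
        printf("sin(%g) = %g\n", x, sin(x));
        return 0;
    }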

Camiel Severijns				UUCP: mcvax!eutrc3!eutnv1!camiel
Surface Physics Group, Dept. of Physics
Eindhoven University of Technology
The Netherlands

henry@utzoo.uucp (Henry Spencer) (08/25/88)

In article <887@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>...  I do not have the time to attend meetings on software.

In other words, you want it fixed, but you can't be bothered investing your
own time and effort in getting it fixed?  Don't expect much sympathy.
Standards are hard work; if you can't be bothered helping with it, those
who do put in long hours on them are likely to feel that you don't really
care all that much.

> What about fixed-point (_not_ integer) arithmetic?

What about it?  Last time I did something along those lines, there wasn't
any formidable difficulty in implementing it on top of integer arithmetic.
That was a long time ago, and the stuff I was doing was specialized and
simple, mind you.
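
For the record, the simple case looks something like this -- a sketch of
an 8.8 format, assuming 16-bit shorts, 32-bit longs, and an arithmetic
right shift on signed values:

    /* 8.8 signed fixed point: 8 integer bits, 8 fraction bits.
     * Addition and subtraction are the ordinary + and -. */
    typedef short fix8;
    #define FIX_ONE 256            /* 1.0 in 8.8 */

    fix8 fmul(fix8 a, fix8 b)
    {
        return (fix8)(((long)a * (long)b) >> 8);
    }

    fix8 fdiv(fix8 a, fix8 b)
    {
        return (fix8)(((long)a << 8) / b);
    }

So 1.5 times 2.25 is fmul(384, 576) == 864, which is 3.375 in 8.8.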

> What about the use of overflow?

A nice idea, but it's hard to make it portable.
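
The portable corner is unsigned arithmetic, where wraparound is defined
and overflow can at least be detected after the fact (a sketch):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        unsigned long a = ULONG_MAX - 15, b = 32;
        unsigned long sum = a + b;  /* defined: wraps modulo 2^n */

        if (sum < a)                /* wrapped, so the true sum overflowed */
            printf("overflow: sum wrapped to %lu\n", sum);
        return 0;
    }

Signed overflow, by contrast, is not defined at all, which is exactly the
portability problem.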

> What about division with simultaneous quotient and remainder?

Already in X3J11 C; see div() and ldiv() in section 4.10.6.  If your
compiler supplier doesn't implement them or implements them inefficiently,
complain to him, not to X3J11 or to the net.
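
For example:

    #include <stdlib.h>
    #include <stdio.h>

    int main(void)
    {
        div_t qr = div(1000003, 97);  /* quotient and remainder together */

        printf("quot = %d, rem = %d\n", qr.quot, qr.rem);
        return 0;
    }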

>What about an operation or function returning a string of values?

What about it?  Can be done right now, although a bit clumsily, using
pointers; see scanf for an example.  It's not at all clear that adding it
as an explicit construct would improve efficiency; in fact it could well
reduce it.
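
The idiom, for the record (the function and its names are mine):

    #include <math.h>
    #include <stdio.h>

    /* "Returns" two values, scanf-style, through pointer arguments. */
    void polar(double x, double y, double *r, double *theta)
    {
        *r = sqrt(x * x + y * y);
        *theta = atan2(y, x);
    }

    int main(void)
    {
        double r, th;

        polar(3.0, 4.0, &r, &th);
        printf("r = %g, theta = %g\n", r, th);
        return 0;
    }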

>What about table-driven branches?

See the "switch" construct, which has been in C all along.  If your
compiler doesn't do this well, again, complain to the supplier.
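
A dense switch is exactly a table-driven branch; any decent compiler turns
something like this into an indexed jump through a table:

    int dispatch(int op, int a, int b)
    {
        switch (op) {          /* dense cases: one indexed jump */
        case 0:  return a + b;
        case 1:  return a - b;
        case 2:  return a * b;
        case 3:  return b ? a / b : 0;
        default: return 0;
        }
    }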

>What 
>about inserting new operators, using the processor syntax to specify the
>argument structure of these operators?

Again, perfectly possible now if you're willing to live with distasteful
syntax (function calls).  The past experiments with user control of syntax
have mostly been limited successes at best.

>In fact, what about using the 
>easy-to-use hardware operators on most machines?  A good example is &~,
>which is more useful than &, and is hardware on many machines, including
>the ones for which C was initially written...

And which any sensible compiler on those machines will use if you write
x & ~y, just as you'd expect.  See above comments on compiler defects.
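
That is (a tiny example):

    #include <stdio.h>

    int main(void)
    {
        unsigned flags = 0x0FF0;
        unsigned mask  = 0x00F0;

        flags = flags & ~mask;    /* and-not: one BIC instruction on the
                                     PDP-11, no separate complement */
        printf("%04X\n", flags);  /* prints 0F00 */
        return 0;
    }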

>How many useful instructions have disappeared from hardware because they
>do not occur in the HLLs?

How many useless instructions have appeared in hardware because some clot
had the mistaken idea that they could be useful to HLLs?  Exacting a speed
and cost penalty from the customers as a result of the extra complexity,
too.  Such things are always compromises.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

bill@proxftl.UUCP (T. William Wells) (08/27/88)

In article <13180@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
:                                      Make the changes---write yourself
: a compiler, or have someone else write it---and show that the new
: language is better than the old.

Anticipating at least one possible complaint: compiler writing is
*hard* work.  Agreed.  But you don't have to write the whole
thing.  If you are going to make what are essentially minor
changes, you can make them in available compilers: for example,
the GNU compiler, which is more-or-less ANSI compatible and which
does not cost money (this is not an endorsement of Stallman et
al., just a recognition that they exist); the Minix C compiler,
which does cost money (but only ~$100); or the Amsterdam Compiler
Kit (which costs a whopping $10,000).  No doubt there are others
as well.

However, I suspect that the essential work would have to be done
in the libraries, but, given that the existing libraries are not
adequate (mostly the point of the complaints, I think), and that
numerical computing is your field, that should be, rather than a
problem, the heart of your activity.  (Urk!  The structure of
that sentence!)


---
Bill
novavax!proxftl!bill

karl@haddock.ima.isc.com (Karl Heuer) (08/29/88)

In article <309@eutrc3.UUCP> tnvscs@eutrc3.UUCP (c.severijns) writes:
>We have been using C for scientific computing for some time now, and so far
>we feel the need for only a very few changes to the language.  [One is
>passing float by value, which is already in ANSI C.]  The second change we
>would like to see is the ability to compile C with "intrinsic" functions

This also is already in ANSI C.

Karl W. Z. Heuer (ima!haddock!karl or karl@haddock.isc.com), The Walking Lint