[comp.lang.c] Third public review of X3J11 C

joseph@chromo.ucsc.edu (Joseph Reger) (08/20/88)

In article <8358@smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>I don't mean to discourage comments on the draft; however, you should be
>advised that you'll need some extremely strong arguments for making any
>substantive changes.  Examples showing that the current draft is badly
>broken would help.

The draft may not be 'badly broken' but is missing out on the opportunity
to make C a convenient language for numerical computing as well.  It is a
pity that many of the 'real programmers' feel that any change that would
allow C to be the language of choice for 'non-real programmers'
(scientists) would somehow hurt their feelings/interests. I did not
participate in the debates about the power operator, noalias, conformant
arrays, etc., because I was scared off by the vehemence of some of the
'defenders of the faith'.  It is sad that there never seemed to be enough
time to discuss some recommendations in detail. There are many scientists
I know (mostly younger people) who have really come to like C, and we are
using it despite its problems and deficiencies as far as numerical
computing is concerned.

I strongly feel that it is an unacceptable situation that many of us have
to program around these problems, although some of them could be easily
fixed. Much of today's (computational) science is done in a workstation
environment, mostly under Unix. In the future this is going to be even
more so, especially now that the supercomputer manufacturers are adopting
Unix, too. The best compilers in these environments are the C compilers,
period. Since the manufacturer often uses the same compilers for his own
development, the user can be fairly confident that most of the bugs have
already been eliminated. So there will be ever more scientists who program
in C. Why is it such a good idea to have a growing body of code around
that contains ugly, difficult-to-understand "fixes"?

The power operator is a small issue, I agree. Noalias (no flames please, I
am afraid of you) is definitely going to come, since the vector machines
need it. Only it will come in many (vendor-specific) colors and
flavors. Conformant arrays?  We (scientists) need them very much, and I do
not see how they would mean any grand problem for C --and the end of
western civilization-- in the simple version proposed by David Hough (see
his "Comments on Proposed ANSI C Standard").

All these problems could be solved, of course, by the inclusion of the
following statement into the Draft:

"Scientists and other non-real programmers are not allowed to use the
programming language C".

(The funny thing is that some scientists would actually like to see this
statement, not only in the Draft, but everywhere).

Joseph D. Reger,	joseph@chromo.ucsc.edu

gwyn@smoke.ARPA (Doug Gwyn ) (08/21/88)

In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>The draft may not be 'badly broken' but is missing out on the opportunity
>to make C a convenient language for numerical computing as well.

I happen to use C for numerical programming, despite occasional flaws
such as those you mention, primarily because it offers much better
support for data structures than do other alternatives such as FORTRAN.
I agree that there are some changes that could make C more convenient
for such applications.  Hough's suggestions are for the most part good
ones, but they haven't been receiving sufficient committee support.

The fundamental problem is that IT IS MUCH TOO LATE to be making
significant changes to the proposed standard.  Look at all the trouble
the last-minute addition of "noalias" caused.  The public review
period is intended as a REVIEW of work done by the committee, not as
an opportunity for language design.  Where were all these scientific
users of C when the design work was being done?  By leaving that up
to people who didn't think the flaws you perceive were significant,
you did not get those flaws addressed in the proposed C standard.
It's easy to complain about other people's work; much easier than
helping with the work.  I suggest that you GET INVOLVED in drafting
the NEXT (revised) standard.

Obviously I am not speaking for X3J11 officially here.

joseph@chromo.ucsc.edu (Joseph Reger) (08/23/88)

In article <8365@smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>>The draft may not be 'badly broken' but is missing out on the opportunity
>>to make C a convenient language for numerical computing as well.
>
>
>The fundamental problem is that IT IS MUCH TOO LATE to be making
>significant changes to the proposed standard.

It seemed to me - and I admittedly did not follow it from the very beginning -
that it was always MUCH TOO LATE.

>It's easy to complain about other people's work; much easier than
>helping with the work.  I suggest that you GET INVOLVED in drafting
>the NEXT (revised) standard.

I certainly will.

Joseph Reger, 	joseph@chromo.ucsc.edu

cik@l.cc.purdue.edu (Herman Rubin) (08/23/88)

In article <8365@smoke.ARPA>, gwyn@smoke.ARPA (Doug Gwyn ) writes:
> In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
> >The draft may not be 'badly broken' but is missing out on the opportunity
> >to make C a convenient language for numerical computing as well.
> 
> I happen to use C for numerical programming, despite occasional flaws
> such as those you mention, primarily because it offers much better
> support for data structures than do other alternatives such as FORTRAN.
> I agree that there are some changes that could make C more convenient
> for such applications.  Hough's suggestions are for the most part good
> ones, but they haven't been receiving sufficient committee support.
> 
> The fundamental problem is that IT IS MUCH TOO LATE to be making
> significant changes to the proposed standard.  Look at all the trouble
> the last-minute addition of "noalias" caused.  The public review
> period is intended as a REVIEW of work done by the committee, not as
> an opportunity for language design.  Where were all these scientific
> users of C when the design work was being done?  By leaving that up
> to people who didn't think the flaws you perceive were significant,
> you did not get those flaws addressed in the proposed C standard.
> It's easy to complain about other people's work; much easier than
> helping with the work.  I suggest that you GET INVOLVED in drafting
> the NEXT (revised) standard.
> 
> Obviously I am not speaking for X3J11 officially here..

I use C for numerical programming, and then have to edit the resulting .s
file.  All of the languages, including C, are woefully deficient in letting
the user use the capabilities of the machines.  If C is to be a good, flexible
language, the committee should widely advertise for complaints about the
deficiencies of the language before starting out.

I would have no trouble coming up with pages of these items.  But the last
time I did something like this, in reply to the open invitation to attend
the meeting on the IEEE floating-point convention, the result was to receive
an invitation to attend!  I do not have the time to attend meetings on software.

Another problem is that the language gurus are unsympathetic to ideas which
run counter to their perception of computing needs.  They see integer 
arithmetic as primarily for addressing and looping; I see integer arithmetic
as important for number-crunching.  What about fixed-point (_not_ integer)
arithmetic?  What about the use of overflow?  What about division with
simultaneous quotient and remainder?  What about an operation or function
returning a string of values?  What about table-driven branches?  What 
about inserting new operators, using the processor syntax to specify the
argument structure of these operators?  In fact, what about using the 
easy-to-use hardware operators on most machines?  A good example is &~,
which is more useful than &, and is hardware on many machines, including
the ones for which C was initially written.  Many of those machines do not
even have a hardware &.

How many useful instructions have disappeared from hardware because they
do not occur in the HLLs?  Multiprecision arithmetic needs unsigned
multiplication and division to be efficient, and not floating point
arithmetic.  The presence of a single hardware instruction can be
essential to an algorithm being worthwhile; if the instruction is in
software, it is more likely to appear in hardware.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

mcdonald@uxe.cso.uiuc.edu (08/23/88)

>The fundamental problem is that IT IS MUCH TOO LATE to be making
>significant changes to the proposed standard.  Look at all the trouble

>I suggest that you GET INVOLVED in drafting
>the NEXT (revised) standard.

The problem is, how does one do this? If you are a regular reader of this
august information dispersal system, you might hear about some such
effort not too long after it gets started. But, in the absence of that,
you are going to know about it only AFTER the standard gets approved,
when the next version of your compiler comes out and your programs
stop compiling. I never heard about Fortran 77 until my programs
refused to run because a new compiler didn't support Hollerith fields.

henry@utzoo.uucp (Henry Spencer) (08/23/88)

In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>The draft may not be 'badly broken' but is missing out on the opportunity
>to make C a convenient language for numerical computing as well...

Well, remember two things.  First, that there was opportunity for input
along these lines earlier, and little was received; it is now much too late
for major changes.  Second, that X3J11's mission was to standardize an
existing language, not to invent a new one; they did make some small steps
toward making C friendlier for numerical work, and that is probably about
all one should expect from a standards committee.

If you really want to see C improved as a language for numerical computing,
the first thing to do is to scream at your compiler supplier until he/she/it
does some of the things you want.  Then, when the time rolls around for the
next revision of the C standard, you can propose changes based on *actual
experience*.  This will carry a lot more weight than untried inventions.
Given the time lags involved in all this, if you are serious about it, the
time to start haranguing your supplier is *now*.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)) (08/23/88)

In article <887@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes...
 
>I use C for numerical programming, and then have to edit the resulting .s
>file.  All of the languages, including C, are woefully deficient is letting
>the user use the capacities of the machines.  If C is to be a good flexible
>language, the committee should widely advertise for complaints about the
>deficiencies of the language before starting out.

On the contrary:  C is NOT woefully deficient for the vast majority of
applications to which the vast majority of "paying users" are interested in
applying it.  As later comments make clear, the kinds of users Mr. Rubin has
in mind are rather different.  The fact of the matter is, hardly anyone thinks
that "fixed-point arithmetic" (as opposed to integer) is important.  It just
does not come up in the vast majority of uses to which computers are put.

Developing software is an expensive proposition.  Everything added to a
language has to be implemented somewhere, by someone.  Then it has to be
debugged, supported, and maintained.  There are only two ways this will
happen:  If someone is willing to pay for it; or when someone is willing
to do it out of their own love for the subject.

>I would have no trouble coming up with pages of these items.  But the last
>time I did something like this, in reply to the open invitation to attend
>the meeting on the IEEE floating-point convention, was to receive an invi-
>tation to attend!  I do not have the time to attend meetings on software.

Ah, so Mr. Rubin is willing to COMPLAIN, but he is NOT willing to do the work
out of his own love for the subject.  He certainly gives no indication that
he is willing (or able) to pay to have it done either.

>Another problem is that the language gurus are unsympathetic to ideas which
>run counter to their perception of computing needs.

I am a "language guru", though my interests happen to be in parallel program-
ming languages.  Again, why should I care what Mr. Rubin thinks "computing
needs" are when he can't provide money, isn't willing to invest his own time,
and can only provide the most specialized examples of what such features might
be used for?

>						      They see integer 
>arithmetic as primarily for addressing and looping; I see integer arithmetic
>as important for number-crunching.  What about fixed-point (_not_ integer)
>arithmetic?  What about the use of overflow?  What about division with
>simultaneous quotient and remainder?  What about an operation or function
>returning a string of values?  What about table-driven branches?  What 
>about inserting new operators, using the processor syntax to specify the
>argument structure of these operators?  In fact, what about using the 
>easy-to-use hardware operators on most machines?  A good example is &~,
>which is more useful than &, and is hardware on many machines, including
>the ones for which C was initially written.  Many of those machines do not
>even have a hardware &.

What about all these things?  Being absolutely brutal about it:  Why should
I (or other readers) care?  What will it gain us to worry about this?

>How many useful instructions have disappeared from hardware because they
>do not occur in the HLLs?

Along the same brutal lines, my answer is:  No USEFUL instructions have
disappeared at all.  What has disappeared are a lot of non-essential ideas
that were tossed in back in the days when computer architecture was a new
field, with a large research component.  No one really knew what would turn
out to be "useful".

Well, for better or for worse, computer architecture isn't like that any more.
Computer design is a multi-billion dollar industry.  It is driven, not by what
people might WANT in some abstract sense, but by what they are willing and
able to pay for.  THAT is the only workable definition of "useful", and on
that scale the things Mr. Rubin wants have long ago fallen to the bottom of
the list.

>			    Multiprecision arithmetic needs unsigned
>multiplication and division to be efficient, and not floating point
>arithmetic.  The presence of a single hardware instruction can be
>essential to an algorithm being worthwhile; if the instruction is in
>software, it is more likely to appear in hardware.
>-- 
>Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
>Phone: (317)494-6054
>hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

It's painful to see economics dominating a field one loves and pushing it in
directions one is not inclined to go.  I'm not unsympathetic to Mr. Rubin's
position; my own background, way back when, is in mathematics (complex
analysis and a bit of analytic number theory).  Even the work I do now is
beyond the current "commercial" leading edge, and I am sometimes frustrated
by the way hardware manufacturers put roadblocks in the way of doing
"obviously useful" things, because they are too busy heading in other
directions.  But that's life.

The USEFUL thing for Mr. Rubin to do, if he really thinks these issues are
important, is to work at convincing others of it.  Not by complaining in this
and other newsgroups about how he is being ignored.  But exactly by spending
some time with those committees, by offering some real alternatives, by
showing how what he proposes is useful to people other than himself.  Frankly,
I doubt anything he can do will ever get major commercial ventures interested.
But that doesn't mean he can't get other researchers interested.  Many people
are able to design and build special-purpose hardware and software today; if
Mr. Rubin talked to some of them, he might discover that many good research
hardware hackers have the tools, but are lacking interesting problems.  I
will say, however, that his chances of getting people interested would improve
markedly if he stopped complaining about how he didn't "have the time to
attend meetings on software".  Very few computer scientists have the time to
attend meetings on statistics either.
							-- Jerry

dhesi@bsu-cs.UUCP (Rahul Dhesi) (08/24/88)

In article <887@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
[wish list for HLLs]

I agree that many of the features wished for ought to be in higher-
level languages.  But to put all of them in C would no longer leave
the relatively small, simple, low-level language that C was designed
to be.

Nearly all of the features that Herman Rubin wishes to see *are*
already in HLLs, only not all are in each HLL.  C++ has some.  Ada has
*many* of them, especially fixed point arithmetic and functions
returning structured values.

The real problem is not with C designers.  The real problem is with
Fortran designers, who have always had an explicit mandate to design a
language for scientific computing, and have continued to fail miserably
to achieve this.  In a way the C users who do numerical computing want
to put on C the burden that Fortran was supposedly designed to carry.

The trouble with doing so is that other users will lose.  Each new
feature added to a language increases the complexity of the language
translator, and *all* users, even those who don't need to use these
features, will pay in money, disk space, and CPU time.
-- 
Rahul Dhesi         UUCP:  <backbones>!{iuvax,pur-ee,uunet}!bsu-cs!dhesi

prh@actnyc.UUCP (Paul R. Haas) (08/24/88)

Given:
1. ANSI C (as proposed) does not support numerical computing adequately.
2. There is not enough time to fix it for the current standard.

There are several ways of coping:
1. Use Fortran.
2. Hope your compiler vendor comes up with reasonable extensions.
3. Do something to encourage your compiler vendor to come up with
   something reasonable.

I would favor writing a standard for "correct" math extensions to the C
language.  If the standard is adopted by at least one compiler vendor
(or producer, so as to include the FSF), then you can show prior art for
the next round of X3J11.  If the "correct" math extensions are simple
enough to implement, then many manufacturers will put them into their
compilers.

A proposal for such a standard could be produced by an individual, an
independent committee, or a committee from one of the user groups (ACM,
IEEE, /usr/group, Usenix, etc.).  A committee could meet in person
or electronically, etc.

Unfortunately, I lack the skills to produce such a document.
----
Paul Haas,  uunet!actnyc!prh (if that doesn't work: haas@msudoc.egr.msu.edu)

gwyn@smoke.ARPA (Doug Gwyn ) (08/24/88)

In article <225800053@uxe.cso.uiuc.edu> mcdonald@uxe.cso.uiuc.edu writes:
>you are going to know about it only AFTER the standard gets approved,

I'm pretty sure the formation of X3J11 was announced in CACM, and it
has been well known in the C community for years (e.g. "The C Advisor"
and other regular columns).  I don't think it made the TV network news.

chasm@killer.DALLAS.TX.US (Charles Marslett) (08/24/88)

In article <36243@yale-celray.yale.UUCP>, leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu (Jerr writes:
> On the contrary:  C is NOT woefully deficient for the vast majority of
> applications to which the vast majority of "paying users" are interested in
> applying it.

I find this comment and the attitude of the author woefully parochial -- I do
not program in COBOL and I might not even recognize either a data entry
language or a data base language if it hit me in the face, but I do know that
more money (real dollars, payroll hours, or however you want to look at it) is
spent on programs that are much more difficult to write in C than in the
language they are written in (and in some cases -- heresy -- that language
is even 8086 assembly language!).  I am quite certain that spreadsheets
garner more user dollars than C compilers for any computers other than
Crays and Suns (and Fortran compilers are probably ahead of C compilers on
at least the Crays).

C is rapidly catching up with Pascal as the second most well-known language,
but it has a long way to go before it becomes as well known (and perhaps as
useful) as BASIC (more heresy?).

For my purposes, C is the language of choice most of the time (by a fair
margin -- I have no second choice, except maybe Modula were C to vanish
from the face of the earth).  But C is not a universal language, and she
does not appear to be expanding into other areas of applicability any more
rapidly than her elder brother and sister, FORTRAN and LISP.  And I think this
is both A GOOD THING, and the reason that it is unlikely to be a major language
20 years from now.  I will have plenty of spare time in the next 20 years to
learn several new small languages, and I have no real need to program in Ada
or PL/I.

(How do you like my personification of a programming language? Shall we create
a few mythic tales to describe her birth?)

Charles Marslett
chasm@killer.dallas.tx.us

henry@utzoo.uucp (Henry Spencer) (08/24/88)

In article <4581@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>It seemed to me - and I admittedly did not follow it from the very beginning -
>that it was always MUCH TOO LATE.

X3J11 has been trying to get the %#@%$% thing out the door for quite some
time now.  The combination of lengthy public-review cycles and official
meetings held only quarterly means that a standard which needs *three*
public-review cycles will be in "almost finished, no substantive changes
without a damn good reason" status for a long time.  Sounds like you came
on the scene after that phase started.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

ian@argosy.UUCP (Ian L. Kaplan) (08/25/88)

In article <3732@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>
>The real problem is not with C designers.  The real problem is with
>Fortran designers, who have always had an explicit mandate to design a
>language for scientific computing, and have continued to fail miserably
>to achieve this.  In a way the C users who do numerical computing want
>to put on C the burden that Fortran was supposedly designed to carry.

  The Fortran 8x committee has its problems, but lack of features is
not one of them.  The April '87 Fortran draft standard includes a
number of "modern programming language" features, including something
like structures (referred to as derived types) and modules, with
imports and exports.  The real problem with the Fortran
standardization process is the inability of the Fortran community to
arrive at a standard.  The decade is almost over.  Soon it will be
Fortran 9x.

                           Ian Kaplan

"I don't know what the most popular numeric programming language will
 look like in the year 2000, but it will be named Fortran."

           These opinions are my own.

chris@mimsy.UUCP (Chris Torek) (08/25/88)

In article <5282@killer.DALLAS.TX.US> chasm@killer.DALLAS.TX.US
(Charles Marslett) writes:

>In article <36243@yale-celray.yale.UUCP> leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerr writes:

[A rather unusual name :-) .]

>>On the contrary:  C is NOT woefully deficient for the vast majority of
>>applications to which the vast majority of "paying users" are interested in
>>applying it.

[back to chasm@killer:]

>I find this comment and the attitude of the author woefully parochial
>-- I do not program in COBOL and I might not even recognize a either a
>data entry language or a data base language if it hit me in the face,
>but I do know that more money ... is spent on programs that are [done
>in other languages] ....  C is not a universal language and she does
>not appear to be expanding into other areas of applicability any more
>rapidly than her elder brother and sister, FORTRAN and LISP.  And I
>think this is both A GOOD THING, and the reason that it is unlikely to
>be a major language 20 years from now.

This is curious, because I see Jerry Leichter and Charles Marslett as
basically in agreement---so why should this attitude be `woefully
parochial'?  That C does not make a good functional programming
language is no surprise; that people who pay for programs written in C
are not paying for such code should also be no surprise; and hence that
there is no great push for C to be augmented with everything out of
Miranda and FP combined should likewise be no surprise.

To return somewhat to the original subject:  If you believe that, with
a few tweaks that would either improve, or at least not damage, the
language, C could become an ideal language for numerical software, it
is then your job to demonstrate it.  Make the changes---write yourself
a compiler, or have someone else write it---and show that the new
language is better than the old.  If it is sufficiently better,
programmers will beat a path to your mailbox, and the new language will
become popular in the same way that C became popular.  And if *you* are
not willing to put in the effort, why then should *we* be?
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris

hankd@pur-ee.UUCP (Hank Dietz) (08/25/88)

	I've been using C for most programs since 1978.  I've taught and am
currently teaching a C programming course at Purdue University.  However, C
isn't supposed to be all things to everyone:  it is a systems programming
language and has little real competition as such (Ada? Modula 2?).

	Making C a numerical applications language has never been a priority,
nor should it be.  For example, fixed-point arithmetic would never be used
by most of the originally-intended C user community; it would simply clutter
the language definition and impede the development of good quality compilers.
I personally feel that X3J11 has done an outstanding job of resisting the
"kitchen sink" syndrome, keeping the language reasonably clean and
implementable, while resolving more than a few ambiguous/omitted details.
Propose a new language if you're not happy with any existing one.

	As for the language standardization process, if you're not willing
to attend the meetings or to correspond in a reasonably formal way, I don't
think you've got much of a reason to complain.  Now, I'm a bit unhappy in
that I wasn't invited to be on X3J11 and would like to have had more input,
but even so I have had no trouble in getting X3J11 folk to listen to me.  My
number one remaining beef with X3J11 is that they changed the function
declaration syntax in an incompatible way without simultaneously providing
public-domain software to automatically convert old C programs to the new
notation...  but this is a problem I personally intend to remedy.

	So, let's not flame on about X3J11.  It isn't perfect, but it is C
and it is a better definition than we had before.  Enough said.

							-hankd

rob@raksha.eng.ohio-state.edu (Rob Carriere) (08/25/88)

In article <8374@smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>In article <225800053@uxe.cso.uiuc.edu> mcdonald@uxe.cso.uiuc.edu writes:
>>you are going to know about it only AFTER the standard gets approved,
>
>I'm pretty sure the formation of X3J11 was announced in CACM, and it
>has been well known in the C community for years (e.g. "The C Advisor"
>and other regular columns).  I don't think it made the TV network news.

Doubtless.  But we were talking about the numerical community, they
generally don't read CACM.  So, was it announced in, say publications
of the AAAS or the IEEE?  (NOTE: I'm not saying it wasn't, I don't
know, and I'm curious)

Rob Carriere

smryan@garth.UUCP (Steven Ryan) (08/25/88)

Sounds like somebody wants an extensible C.

Are you crazy?

Extensibility implies the gods are mortal and a rational mode system exists.

Shame for mentioning this in comp.lang.c.

tnvscs@eutrc3.UUCP (c.severijns) (08/25/88)

We have been using C for scientific computing for some time now, and so far we
only feel the need for a very few changes to the language (we use a non-ANSI
C compiler). One of these changes has already been made in the ANSI standard:
the possibility to pass a float as an argument to a function. The second change
we would like to see is the possibility to compile C with "intrinsic"
functions, to be able to use a floating point processor like the MC68881 more
efficiently. This requires only an extra option for the compiler.
For the rest, we consider C a good language for scientific computing that
generates code that is not much slower than FORTRAN and has the advantage of
structures. In one case where we needed complex data structures, our C version
turned out to be more than twice as fast as similar code in FORTRAN.

Camiel Severijns				UUCP: mcvax!eutrc3!eutnv1!camiel
Surface Physics Group, Dept. of Physics
Eindhoven University of Technology
The Netherlands

cik@l.cc.purdue.edu (Herman Rubin) (08/25/88)

In article <509@accelerator.eng.ohio-state.edu>, rob@raksha.eng.ohio-state.edu (Rob Carriere) writes:
> In article <8374@smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
> >In article <225800053@uxe.cso.uiuc.edu> mcdonald@uxe.cso.uiuc.edu writes:
> >>you are going to know about it only AFTER the standard gets approved,
> >
> >I'm pretty sure the formation of X3J11 was announced in CACM, and it
> >has been well known in the C community for years (e.g. "The C Advisor"
> >and other regular columns).  I don't think it made the TV network news.
> 
> Doubtless.  But we were talking about the numerical community, they
> generally don't read CACM.  So, was it announced in, say publications
> of the AAAS or the IEEE?  (NOTE: I'm not saying it wasn't, I don't
> know, and I'm curious)

It is important not just that it appear in a journal, but prominently.  If
you want input, go out and loudly proclaim it.  As a researcher, I find it
necessary to glance at more than 200 journals.  I certainly missed the
announcement in CACM (one of my lower priority journals).

I do not believe it appeared in _Science_, the journal of AAAS.  Now most
mathematicians and statisticians do not read any of the above named journals.
How about including the _Notices_ of the AMS, the _Bulletin_ of the IMS,
and the appropriate information journals of SIAM and ASA?  How about asking
the physicists and chemists and astronomers and geologists and biologists?
(Apologies to the groups left out are in order.)
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

barmar@think.COM (Barry Margolin) (08/25/88)

In article <509@accelerator.eng.ohio-state.edu> rob@raksha.eng.ohio-state.edu (Rob Carriere) writes:
>In article <8374@smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>>I'm pretty sure the formation of X3J11 was announced in CACM, and it
>Doubtless.  But we were talking about the numerical community, they
>generally don't read CACM.  So, was it announced in, say publications
>of the AAAS or the IEEE?

I believe that IEEE Computer (and maybe also IEEE Software) has a
regular column containing standards-related notices.

As for announcing such things in non-computer journals, that would
take quite a bit of foresight.  I'm sure that X3 simply has a list of
publications they regularly announce things in, rather than trying to
figure out all the possible journals that might be interested in a
particular standard.  It wouldn't seem obvious that journals of the
AAAS or AMS would be interested in a standard for a systems
programming language (that's all C has ever been intended to be, no
matter how many statisticians and scientists use it).  If the
scientific/numeric communities are interested, I think it should be
the responsibility of the editors of their journals to gather the
information, rather than relying on us CS people to know that they
care.  ANSI publishes a regular newsletter on all standards-related
activity; while I would not expect most people to read this, I WOULD
expect at least one journalist for each magazine to keep an eye on it.


Barry Margolin
Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

henry@utzoo.uucp (Henry Spencer) (08/25/88)

In article <887@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>...  I do not have the time to attend meetings on software.

In other words, you want it fixed, but you can't be bothered investing your
own time and effort in getting it fixed?  Don't expect much sympathy.
Standards are hard work; if you can't be bothered helping with it, those
who do put in long hours on them are likely to feel that you don't really
care all that much.

> What about fixed-point (_not_ integer) arithmetic?

What about it?  Last time I did something along those lines, there wasn't
any formidable difficulty in implementing it on top of integer arithmetic.
That was a long time ago, and the stuff I was doing was specialized and
simple, mind you.
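
A minimal sketch of the approach Henry describes, using a made-up 16.16
fixed-point format built on C integers (the type and function names here
are illustrative, not from any library, and the sketch ignores overflow
and rounding):

```c
#include <assert.h>

/* 16.16 fixed-point: upper 16 bits integer part, lower 16 fraction.
   Assumes long is at least 32 bits. */
typedef long fix;

#define FIX_ONE        (1L << 16)
#define int_to_fix(i)  ((fix)(i) << 16)
#define fix_to_int(f)  ((int)((f) >> 16))

fix fix_add(fix a, fix b) { return a + b; }   /* same as integer add */

fix fix_mul(fix a, fix b)
{
    /* The full product of two 16.16 numbers is 32.32; pre-shifting
       each operand by 8 keeps the result in range at the cost of the
       low 8 fraction bits of each operand. */
    return (a >> 8) * (b >> 8);
}
```

Addition and subtraction cost nothing over plain integers; only
multiplication and division need the scaling adjustment, which is why
implementing it on top of integer arithmetic is not formidable.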

> What about the use of overflow?

A nice idea, but it's hard to make it portable.

> What about division with simultaneous quotient and remainder?

Already in X3J11 C; see div() and ldiv() in section 4.10.6.  If your
compiler supplier doesn't implement them or implements them inefficiently,
complain to him, not to X3J11 or to the net.
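
For readers who have not seen them, the dpANS functions look like this
(the wrapper function is my own illustration; div and div_t are the
standard's):

```c
#include <stdlib.h>   /* div, div_t; ldiv/ldiv_t are the long versions */
#include <assert.h>

/* Quotient and remainder from a single call; an implementation is free
   to compute both with one machine divide instruction. */
int quot_rem_demo(void)
{
    div_t qr = div(17, 5);          /* 17 = 5*3 + 2 */
    assert(qr.quot == 3 && qr.rem == 2);
    return qr.quot * 5 + qr.rem;    /* reconstructs 17 */
}
```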

>What about an operation or function returning a string of values?

What about it?  Can be done right now, although a bit clumsily, using
pointers; see scanf for an example.  It's not at all clear that adding it
as an explicit construct would improve efficiency; in fact it could well
reduce it.
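
The pointer idiom in question, sketched with an invented example
function (the name and interface are mine, not from any library):

```c
/* Two results returned through pointer arguments, scanf-style. */
void minmax(const int *a, int n, int *lo, int *hi)
{
    int i;
    *lo = *hi = a[0];
    for (i = 1; i < n; i++) {
        if (a[i] < *lo) *lo = a[i];
        if (a[i] > *hi) *hi = a[i];
    }
}
```

Clumsy compared with returning a tuple, but it costs nothing the
hardware wasn't going to pay anyway: the callee writes straight into
storage the caller owns.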

>What about table-driven branches?

See the "switch" construct, which has been in C all along.  If your
compiler doesn't do this well, again, complain to the supplier.
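
The construct Henry means; a dense switch like this (example values
invented for illustration) is exactly what a decent compiler turns
into an indexed jump through a table:

```c
/* Dense case values 0..4: a compiler can emit a bounds check plus
   one indexed jump, i.e. a table-driven branch. */
int score(int grade)
{
    switch (grade) {
    case 0:  return 60;
    case 1:  return 70;
    case 2:  return 80;
    case 3:  return 90;
    case 4:  return 100;
    default: return 0;
    }
}
```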

>What 
>about inserting new operators, using the preprocessor syntax to specify the
>argument structure of these operators?

Again, perfectly possible now if you're willing to live with distasteful
syntax (function calls).  The past experiments with user control of syntax
have mostly been limited successes at best.

>In fact, what about using the 
>easy-to-use hardware operators on most machines?  A good example is &~,
>which is more useful than &, and is hardware on many machines, including
>the ones for which C was initially written...

And which any sensible compiler on those machines will use if you write
x & ~y, just as you'd expect.  See above comments on compiler defects.
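
That is, the and-not operation is already expressible in portable
source; whether it becomes a single bit-clear instruction (BIC on the
PDP-11, the machine C was first written for) is the compiler's
business, not the language's:

```c
/* Clear from x every bit that is set in y.  On a PDP-11 this can
   compile to one BIC instruction; the source stays portable. */
unsigned clear_bits(unsigned x, unsigned y)
{
    return x & ~y;
}
```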

>How many useful instructions have disappeared from hardware because they
>do not occur in the HLLs?

How many useless instructions have appeared in hardware because some clot
had the mistaken idea that they could be useful to HLLs, exacting a speed
and cost penalty from the customers as a result of the extra complexity?
Such things are always compromises.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

henry@utzoo.uucp (Henry Spencer) (08/25/88)

In article <225800053@uxe.cso.uiuc.edu> mcdonald@uxe.cso.uiuc.edu writes:
>>I suggest that you GET INVOLVED in drafting
>>the NEXT (revised) standard.
>
>The problem is, how does one do this? ...  [If you're inattentive]
>you are going to know about it only AFTER the standard gets approved,
>when the next version of your compiler comes out and your programs
>stop compiling. I never heard about Fortran 77 until my programs
>refused to run because a new compiler didn't support Hollerith fields.

If you wish to be involved in drafting standards, you are going to have
to sit up and pay attention so you know when work is in progress.  X3J11
was fairly well publicized as such things go; anyone who was seriously
monitoring language-standards activity heard about it.  Again, I'm afraid
the answer is that the only way to get involved in such things is to make
an effort to do so.  This will generally involve spending both time and
money on it.  A good first step is to join ACM's SIGPLAN (Special Interest
Group on Programming Languages); its monthly journal, SIGPLAN Notices,
publishes (a) quite a bit of drivel, and (b) a certain amount of news on
things like impending standards work.  For example, subscribers to it
were not caught unprepared by Fortran 77, since an entire (preliminary)
draft of the F77 standard appeared there.  That was kind of an extreme
case, which hasn't been repeated, but in general, if you subscribe to
the major publications of the programming-languages community, you will
not be caught by surprise by standards efforts.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

news@ism780c.isc.com (News system) (08/26/88)

In article <9@argosy.UUCP> ian@argosy.UUCP (Ian L. Kaplan) writes:

>  The Fortran 8x committee has its problems, but lack of features is
>not one of them.  The April '87 Fortran draft standard includes a
>number of "modern programming language" features, including something
>like structures ...

>"I don't know what the most popular numeric programming language will
> look like in the year 2000, but it will be named Fortran."

Deja vu. In 1963 a committee (SHARE) got together to produce FORTRAN V.  It
had structures, if-then-else, switch statements (spelled 'GOTO
<expression>'), eight types of numeric data, and a whole bunch more.  After
the committee saw what they had wrought, they decided that it was good.  But
FORTRAN V was a bad name.  So they called the language NPL (New Programming
Language).  When the National Physical Laboratory complained, the committee
changed the name again.  And voila!  PL/I was born.

   Marv Rubinstein

mcdonald@uxe.cso.uiuc.edu (08/26/88)

>I'm pretty sure the formation of X3J11 was announced in CACM, and it
>has been well known in the C community for years (e.g. "The C Advisor"
>and other regular columns).  I don't think it made the TV network news.

Chemists such as I, and other scientists, don't read CACM. Most of
us don't read ANYTHING about computers, we just use them. BUT, we
are vitally interested in serious things like changing languages. The
best source of info would probably be compiler vendors: but they don't WANT
us to know about such things. They want to control the process themselves.
I have made many suggestions to compiler vendors: they have never ever
shown any interest.

burgett@steel.COM (Michael Burgett) (08/26/88)

These discussions about the flaws of the C language in dealing with complex
floating point ops, and the *failure* of X3J11 to solicit input and rectify
these things are getting _old_....

1) C is not now, has not been in the past, and (hopefully) will not be in
the future, a language designed for writing scientific applications.

2) C was designed and implemented to remove the onus of using assembly language
to write operating systems, utilities, device drivers and their ilk.  In this
regard, it has no equal.

In light of 1 & 2... where's the beef?  C is doing what it is designed to do,
and from what I've seen of the ANSI standard, will continue to do so.  My hat
is off to the committee for not bowing to public pressure to try to make C all
things to all people (can you say PL/1... I knew you could.)

If you want to write an application demanding scientific functions, write the 
damn things in fortran and then write all the stuff that it makes sense to, in C.
(How would you like it if you hired a carpenter and he showed up with one tool
to try and add a room on your house?)  This seems to me the essence of why we 
have different languages to begin with, and all the whining, sniveling and 
crying *shouldn't* change that... just face it, to program effectively
you just might have to learn more than one language.... (shock! disbelief!!)

awww well.....  i guess i've flamed enuff for one letter....

	mike burgett		burgett!adobe@decwrl.dec.com

"my intellectual work belongs to my employer, but my flames are my own..."

henry@utzoo.uucp (Henry Spencer) (08/26/88)

In article <1290@garth.UUCP> smryan@garth.UUCP (Steven Ryan) writes:
>Sounds like somebody wants an extensible C.

It's been done, it works well, and it's readily available:  C++.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

lupton@uhccux.uhcc.hawaii.edu (Robert Lupton) (08/27/88)

Re Ansi C: I knew about it as a graduate student in astrophysics in 1984,
so it can't have been that hard to find out. I probably first saw it on
net.lang.c, which we all read.

		Robert

joseph@chromo.ucsc.edu (Joseph Reger) (08/27/88)

In article <4203@adobe.COM> burgett@steel.UUCP (Michael Burgett) writes:
>These discussions about the flaws of the C language in dealing with complex
>floating point ops, and the *failure* of X3J11 to solicit input and rectify
>these things are getting _old_....
>
>1) C is not now, has not been in the past, and (hopefully) will not be in
>the future, a language designed for writing scientific applications
>.....
>In light of 1 & 2... where's the beef?  C is doing what it is designed to do,
>and from what I've seen of the ANSI standard, will continue to do so.  My hat
>off to the committee for not bowing to public pressure to try to make C all
>things to all people (can you say PL/1... I knew you could.)
>
>	mike burgett		burgett!adobe@decwrl.dec.com
>
>"my intellectual work belongs to my employer, but my flames are my own..."
                                                   ^^^^^^^^^^^^^^^^^^^^^^^

And they are nothing to be proud of! Mr. Burgett sounds as if
he had invented the language C, and as if he were the only
authority to decide just who is permitted to use it. The humble
proposition was to make a _few_ changes that would _not_ make the
language more complex, or bigger, or more difficult to implement,
or whatever the usual "arguments" against these are. I am pulling
out of this debate now and would just like to comment that it
is ending yet another time where it sadly usually does: "Scientists
go home, you buggers program in YOUR language, not in OURS."

I thank all of you who sent me e-mail on this topic (none of
which was like Mr. Burgett's piece above). I will continue to use
C as long as the standard does not require the implementors to
code special "Scientific Application Detectors" (SAD) into it,
which would produce erroneous code if the probability of scientific
use exceeds a certain value.

Joseph D. Reger,	joseph@chromo.ucsc.edu

rob@kaa.eng.ohio-state.edu (Rob Carriere) (08/27/88)

In article <4203@adobe.COM> burgett@steel.UUCP (Michael Burgett) writes:
> [ C is not a numerical language, fortran is, so ]
>just face it, to program effectively
>you just might have to learn more than one language.... (shock! disbelief!!)

And having done so, you might then find that there is a language that
does almost everything you want, could do everything you want with
*only small changes*, and is already better than anything else around.
Can you blame people for then trying to get these minor changes done?
I am not talking about anything like a PL/1 syndrome, because I like C
for its simplicity, and I'd much rather do some work than have the
language bloated, but there are a couple of minor changes that would
greatly improve the utility of C in the numerical field.

Rob Carriere
Face it, C is just too damn *_GOOD_* for you systems guys to keep it
all to yourselves... :-)

cik@l.cc.purdue.edu (Herman Rubin) (08/27/88)

In article <4203@adobe.COM>, burgett@steel.COM (Michael Burgett) writes:
> These discussions about the flaws of the C language in dealing with complex
> floating point ops, and the *failure* of X3J11 to solicit input and rectify
> these things are getting _old_....
> 
> 1) C is not now, has not been in the past, and (hopefully) will not be in
> the future, a language designed for writing scientific applications
> 
> 2) C was designed and implemented to remove the onus of using assembly language
> to write operating systems, utilities, device drivers and their ilk.  In this
> regard, it has no equal.

Whether C was designed for writing scientific applications is absolutely 
irrelevant.  English was not designed for discussing computer issues.
It was at least somewhat recognized that it might not be possible or
desirable to eliminate the use of assembler in C.

FORTRAN was designed specifically for the IBM704, and was not intended for
subroutine libraries.  Unfortunately, this language, whose inadequacies should
have been obvious to anyone with any understanding of computer hardware and
numerical mathematics, has become so common that many of its devotees cannot
understand that they could profitably use other languages.

Many people have posted that they can do a better job of programming numerical
applications in C than in FORTRAN.  How can _you_ flame them for that?

> In light of 1 & 2... where's the beef?  C is doing what it is designed to do,
> and from what I've seen of the ANSI standard, will continue to do so.  My hat
> off to the committee for not bowing to public pressure to try to make C all
> things to all people (can you say PL/1... I knew you could.)

That a badly designed language was rejected is irrelevant.

> If you want to write an application demanding scientific functions, write the 
> damn things in fortran and then write all the stuff that makes sense to, in C.
> (How would you like it if you hired a carpenter and he showed up with one tool
> to try and add a room on your house?)  This seems to me the essence of why we 
> have different languages to begin with, and all the whining, sniveling and 
> crying *shouldn't* change that... just face it, to program effectively
> you just might have to learn more than one language.... (shock! disbelief!!)

Can you provide me with a good way to program where one _line_ is in C and
another in FORTRAN?  Of course not.  Subroutine calls, cheap when FORTRAN and
ALGOL were produced, range from expensive to very expensive.  I do not
exaggerate when I say one line.

I consider an instruction a tool, and a programming language a tool box.
It is useful to have electric drills, power saws, etc.  But a competent tool-
user knows when to use a given tool.  I expect a carpenter to know when not
to use a high-level power saw and use a low-level hand saw instead.  I expect
a programmer to know when to use an assembler instruction instead of clumsily
using C.

Also, a programmer is more like a contractor than a carpenter.  It is
sometimes even necessary for the same person to combine the tasks of a
carpenter, plumber, and electrician simultaneously.  Thus, the tool box
must contain all of the relevant tools.

BTW, I find the instruction set of any computer far simpler than any HLL.
Now the obfuscated assembler directives are another matter.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

cik@l.cc.purdue.edu (Herman Rubin) (08/27/88)

In article <1988Aug26.162706.22671@utzoo.uucp>, henry@utzoo.uucp (Henry Spencer) writes:
> In article <1290@garth.UUCP> smryan@garth.UUCP (Steven Ryan) writes:
> >Sounds like somebody wants an extensible C.
> 
> It's been done, it works well, and it's readily available:  C++.

There are gross weaknesses in C++.  It does not allow the introduction of
new operators, for example.  It does not address the problem of multiword
hardware types, using machine dependencies where they can profitably be
used (see the discussion about short x short -> long), and other such
goodies.  I have used one type when C would assume another type; C++ would
complain.  

Fortunately, the newer C++ compilers do not reduce to C; translation to C
gave such atrocious code that any other approach would be preferable.

C++ addresses a few of the weaknesses of C.  However, it ignores the worst
of the problems.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

bill@proxftl.UUCP (T. William Wells) (08/27/88)

In article <13180@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
:                                      Make the changes---write yourself
: a compiler, or have someone else write it---and show that the new
: language is better than the old.

Anticipating at least one possible complaint: compiler writing is
*hard* work.  Agreed.  But you don't have to write the whole
thing.  If you are going to make what are essentially minor
changes, you can make them in available compilers.  For example,
the Gnu compiler, which is more-or-less ANSI compatible and which
does not cost money (this is not an endorsement of Stallman et
al., just a recognition that they exist); or the Minix C compiler,
which does cost (but only ~$100); or the Amsterdam Compiler Kit
(which costs a whopping $10,000).  No doubt there are others as
well.

However, I suspect that the essential work would have to be done
in the libraries, but, given that the existing libraries are not
adequate (mostly the point of the complaints, I think), and that
numerical computing is your field, that should be, rather than a
problem, the heart of your activity.  (Urk!  The structure of
that sentence!)


---
Bill
novavax!proxftl!bill

bill@proxftl.UUCP (T. William Wells) (08/27/88)

In article <891@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
: It is important not just that it appear in a journal, but prominently.  If
: you want input, go out and loudly proclaim it.  As a researcher, I find it
: necessary to glance at more than 200 journals.  I certainly missed the
: announcement in CACM (one of my lower priority journals).
:
: I do not believe it appeared in _Science_, the journal of AAAS.  Now most
: mathematicians and statisticians do not read any of the above named journals.
: How about including the _Notices_ of the AMS, the _Bulletin_ of the IMS,
: and the appropriate information journals of SIAM and ASA?  How about asking
: the physicists and chemists and astronomers and geologists and biologists?
: (Apologies to the groups left out are in order.)
: --
: Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
: Phone: (317)494-6054
: hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

Hindsight is wonderful.  You shoulda looked in those journals.
If only you knew.  Now you do.  This is not intended as
condemnation, but rather just an observation that humans are not
omniscient.  And to emphasize what you already know: if you
want to keep on top of things, you must read the relevant
publications.

I do not think that it is the business of standard committees
(and, by extension, lots of other groups) to make exceptional
effort to be known about by people outside their fields.  To put
it bluntly, it is damn near impossible to do this even half right
and certainly a waste of effort; it is the job of those who have
an interest in the field to look in the right places and not the
job of the standards committees to shout their business from the
rooftops.  (Think of the information pollution we already have!)

---
Bill
novavax!proxftl!bill

brian@radio.astro.toronto.edu (Brian Glendenning) (08/28/88)

In article <1988Aug26.162706.22671@utzoo.uucp>, henry@utzoo (Henry Spencer) writes:
>
>It's been done, it works well, and it's readily available:  C++.
 
Does C++ solve the oft-mentioned problems with C for numerical work? Are
vectorizing C++ compilers available on "crunching" machines, e.g. Cray, Convex
and Alliant? (In fact, are vectorizing _C_ compilers available for the latter
two?)

Do C and C++ compilers generally give about the same level of optimization,
i.e. are C compilers much more mature than C++ compilers?
-- 
Brian Glendenning                INTERNET - brian@radio.astro.toronto.edu
Radio Astronomy, U. Toronto          UUCP - {uunet,pyramid}!utai!radio!brian
+1 (416) 978-5558                  BITNET - glendenn@utorphys.bitnet

henry@utzoo.uucp (Henry Spencer) (08/28/88)

In article <891@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>It is important not just that it appear in a journal, but prominently.  If
>you want input, go out and loudly proclaim it...

However, if you want input primarily from people who are competent in C and
interested in its future development, it suffices to mention it quietly in
the places such people frequent.  Which is what was done.

>As a researcher, I find it necessary to glance at more than 200 journals...

Then I'd say you can't possibly have time to read endless X3J11 drafts, and
related documents, with care and attention.  People who have never been
involved in standards work have *NO CONCEPT* of the tonnage of paper one
has to read if one wants to do a proper job of it.  This requires real
motivation, not a dilettante's casual interest.  People with that level
of motivation are going to see a quiet announcement in selected journals,
because they'll be reading those journals already.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

henry@utzoo.uucp (Henry Spencer) (08/28/88)

In article <225800058@uxe.cso.uiuc.edu> mcdonald@uxe.cso.uiuc.edu writes:
>Chemists such as I, and other scientists don't read CACM. Most of
>us don't read ANYTHING about computers, we just use them. BUT, we
>are vitally interested in serious things like changing languages...

Can you explain why you never read any of the journals that discuss
something that is of vital interest to you?
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

henry@utzoo.uucp (Henry Spencer) (08/28/88)

In article <1203@radio.toronto.edu> brian@radio.astro.toronto.edu (Brian Glendenning) writes:
>Does C++ solve the oft-mentioned problems with C for numerical work?

Probably not completely, although its extensibility makes it better than
C (for example, defining new kinds of numbers is simple).

>Are
>vectorizing C++ compilers available on "crunching" machines, e.g. Cray, Convex
>and Alliant? (In fact, are vectorizing _C_ compilers available for the latter
>two)?

The answer is probably "not yet".  However, the same comment would apply
to any other proposed solution to the problems.  The language itself is
pretty much right; getting the implementations right is important, but
is a separate problem.

>Do C and C++ compilers generally give about the same level of optimization,
>i.e. are C compilers much more mature than C++ compilers.

Most existing C++ implementations are based on C compilers to some degree,
so they're pretty much comparable.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

smryan@garth.UUCP (Steven Ryan) (08/28/88)

>>Sounds like somebody wants an extensible C.
>
>It's been done, it works well, and it's readily available:  C++.

Some people have been asking for access to machine specific features. C is
good at getting at machine features for one particular machine whether they
exist or not.

Query: Does C++ do the same or does it define its machine independent operators
in terms of specific machine features and give programmers access to the same
mechanism?

(Why bother buying an unavailable book if I can con someone else into doing
my research for me?)

henry@utzoo.uucp (Henry Spencer) (08/28/88)

In article <899@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>> >Sounds like somebody wants an extensible C.
>> 
>> It's been done, it works well, and it's readily available:  C++.
>
>There are gross weaknesses in C++...

I didn't say it was perfect, I said it worked well.  There is a difference.
Nobody expects a language to keep everybody happy.  (Personally I doubt
that any language would keep Herman Rubin happy.)  C++ is a fairly well-done
and highly usable extensible C.

>It does not allow the introduction of new operators, for example.

There is room for debate about whether dynamic alteration of language
syntax is a good idea.  C++ does provide for new operators, provided that
you are willing to use function-call syntax for them.  Call syntax is
admittedly clumsy for anything complicated, but user-defined syntax is
a real minefield for both users and implementors.

>It does not address the problem of multiword
>hardware types, using machine dependencies where they can profitably be
>used (see the discussion about short x short -> long)...

You mean, the current *implementations* do not provide for this.  There
is no reason why the implementation of a C++ type can't use hardware-
specific extensions when they exist.  The client interface can remain
machine-independent, as it generally should be.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

henry@utzoo.uucp (Henry Spencer) (08/29/88)

In article <1317@garth.UUCP> smryan@garth.UUCP (Steven Ryan) writes:
>Some people have been asking for access to machine specific features. C is
>good at getting at machine features for one particular machine whether they
>exist or not.
>... Does C++ do the same or does it define its machine independent operators
>in terms of specific machine features and give programmers access to the same
>mechanism?

C++ is essentially a superset of C, so it takes the same approach as C.
In both, there is no reason why a perceptive implementor can't provide
machine-specific hooks for users to use to implement packages which have
machine-independent interfaces.  This works rather better in C++, mind
you, because package interfaces are much nicer in C++.
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

bill@proxftl.UUCP (T. William Wells) (08/29/88)

In article <525@accelerator.eng.ohio-state.edu> rob@kaa.eng.ohio-state.edu (Rob Carriere) writes:
: And having done so, you might then find that there is a language that
: does almost everything you want, could do everything you want with
: *only small changes*, and is already better than anything else around.
: Can you blame people for then trying to get these minor changes done?
: I am not talking about anything like a PL/1 syndrome, because I like C
: for its simplicity, and I'd much rather do some work than have the
: language bloated, but there are a couple of minor changes that would
: greatly improve the utility of C in the numerical field.

So, keeping within the Spirit of C :-), what are the *small*
changes that one could make to the dpANS that would make it an
ideal :-) language for numerical computing?  Perhaps if we could
boil down these into a coherent recommendation, we could get them
fixed in some later standard.  Anyone who wants to bat that
around should probably start posting in a new series of messages,
to separate it from these anti-ANSI flames.

: Rob Carriere
: Face it, C is just too damn *_GOOD_* for you systems guys to keep it
: all to yourselves... :-)

Amen to that.

---
Bill
novavax!proxftl!bill

burgett@steel.COM (Michael Burgett) (08/29/88)

In article <4628@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
#he  had  had  invented the language C, and as if he were the only
#authority to decide just who is permitted to use it.  The  humble
#proposition was to make a _few_ changes that would _not_ make the
#language more complex, or bigger or more difficult  to  implement
#or whatever the usual "arguments" against these are. I am pulling
#out from this debate now and just would like to comment  that  it
#is  ending  yet another time where it sadly usually does: "Scien-
#tist go home, you buggers program in YOUR language not in OURS."
[...]
#Joseph D. Reger,	joseph@chromo.ucsc.edu

not so Mr. Reger, you misrepresent what I said.  I lay no special *claim* to 
the C language, nor pretend to make decisions on who may or may not use it.
I do contend that:

	a) The ANSI C Committee seems to have done their job in standardizing
	current practice (as opposed to implementing anyone's wish list.)

	b) EVERYONE is welcome to use C and I greatly ENCOURAGE this.  I have
	been evangelizing C for some time now and will continue to do so.
	What I don't approve of is attempts to make C the best language for
	all applications at the expense of its beautiful simplicity and
	compactness...  This is akin to taking a set of brushes and
	paints to the Mona Lisa because you don't like her smile... :-)

have you "scientists" considered using an extensible language (like C++ :-))
to solve some of your woes??

	Mike Burgett

"my intellectual work belongs to my employer, but my flames are my own."

(and yes I'm proud of them!)

john@uw-nsr.UUCP (John Sambrook) (08/29/88)

Mr. Rubin has been an active contributor to comp.lang.c for many
months now.  He has argued his points with great vigor and seems 
genuinely interested in proving his case.  However, it seems to me 
that few people share his concern.  

It seems likely that this debate will continue for a very long time.
While that may not be a bad thing, in and of itself, it isn't as 
satisfying as, say, an implementation of the ideas that Mr. Rubin 
has advanced.

I would like to ask Mr. Rubin what he is doing, outside of posting to
this and other USENET newsgroups, to bolster his position.  Is there
any research and/or design work in progress, or is it just talk?  It is
my feeling that such work would be useful, and that everyone would 
benefit.

Perhaps a good first start would be a carefully considered paper that
presents the fundamental issues Mr. Rubin would like to see addressed.

John Sambrook                        Internet: john@nsr.bioeng.washington.edu
University of Washington RC-05           UUCP: uw-nsr!john
Seattle, Washington  98195               Dial: (206) 548-4386


karl@haddock.ima.isc.com (Karl Heuer) (08/29/88)

In article <309@eutrc3.UUCP> tnvscs@eutrc3.UUCP (c.severijns) writes:
>We have been using C for scientific computing for some time now and so far we
>only feel the need for a very few changes to the language.  [One is passing
>float by value, which is already in ANSI C.]  The second change we would like
>to be made is the possibility to compile C with "intrinsic" function

This also is already in ANSI C.

Karl W. Z. Heuer (ima!haddock!karl or karl@haddock.isc.com), The Walking Lint

meissner@xyzzy.UUCP (Michael Meissner) (08/30/88)

In article <891@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
| I do not believe it appeared in _Science_, the journal of AAAS.  Now most
| mathematicians and statisticians do not read any of the above named journals.
| How about including the _Notices_ of the AMS, the _Bulletin_ of the IMS,
| and the appropriate information journals of SIAM and ASA?  How about asking
| the physicists and chemists and astronomers and geologists and biologists?
| (Apologies to the groups left out are in order.)

Complain to the X3 parent body then.  They are the ones that are
responsible for publishing when Ansi committees are formed, when the
public reviews are, etc.  They have a list of journals that they send
such announcements to -- maybe the journals you read didn't wish to
include it, or their backlog is too large to be able to print such
announcements.  Several announcements were made on Usenet stating that a new
C standard was started four years ago.  I participated in an early
USENIX BOF on the C standard, and others have done so in more recent
times.  I recall that somebody made the observation that if you really
wanted to reach the mass of C programmers, Ansi should have made the
announcement in the funny papers the last time this discussion came
up. :-)

In case you wish to complain, the address for the X3 secretariat is:

	X3 Secretariat
	Computer and Business Equipment Manufacturers Association
	311 First Street, N.W. Suite 500
	Washington, DC 20001-2178
-- 
Michael Meissner, Data General.

Uucp:	...!mcnc!rti!xyzzy!meissner
Arpa:	meissner@dg-rtp.DG.COM   (or) meissner%dg-rtp.DG.COM@relay.cs.net

smryan@garth.UUCP (Steven Ryan) (08/30/88)

>I didn't say it was perfect, I said it worked well.  There is a difference.
>Nobody expects a language to keep everybody happy.  (Personally I doubt
>that any language would keep Herman Rubin happy.)  C++ is a fairly well-done
>and highly usable extensible C.

That's a very nice compliment. Complacency is a sign of death.

ray@micomvax.UUCP (Ray Dunn) (09/01/88)

In article <1988Aug27.231211.15404@utzoo.uucp> henry@utzoo.uucp (Henry Spencer) writes:
>
>However, if you want input primarily from people who are competent in C and
>interested in its future development, it suffices to mention it quietly in
>the places such people frequent.  Which is what was done.
>

Hmm.  Yes and no.

If the standardizers of hammer design want to establish a better hammer
standard and want "user input" on the subject, should they advertise the
fact in the Journal of Tool and Die Making, or in the Communications of the
Association of Cabinet Makers?

If the keepers of 'C' are interested in its usefulness in various
applications areas, then one would expect them to *solicit* that input by
addressing those application people in their appropriate forums.

-- 
Ray Dunn.                      |   UUCP: ..!philabs!micomvax!ray
Philips Electronics Ltd.       |   TEL : (514) 744-8200   Ext: 2347
600 Dr Frederik Philips Blvd   |   FAX : (514) 744-6455
St Laurent. Quebec.  H4M 2S9   |   TLX : 05-824090

nevin1@ihlpb.ATT.COM (Liber) (09/01/88)

In article <891@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:

>As a researcher, I find it
>necessary to glance at more than 200 journals.  I certainly missed the
>announcement in CACM (one of my lower priority journals).

So you, as a *researcher*, wanted to know if anything spectacular was
happening in the C language development community.  Did you start your
research into this subject by looking at a journal designed for this
group?  From what you have posted so far, this does not appear to be
the case.

>I do not believe it appeared in _Science_, the journal of AAAS.  Now most
>mathematicians and statisticians do not read any of the above named journals.

And most R&D computer people do not read Science.  Do mathematicians
and statisticians make important announcements in the CACM?  I think
not.

>How about asking
>the physicists and chemists and astronomers and geologists and biologists?
>(Apologies to the groups left out are in order.)

Apologies NOT accepted!  If you can't come up with the definitive list
of places where this information should be published to reach everybody
who might even be remotely interested in this, why are you expecting
anyone else to be able to?  Also, how many of these groups are really
interested in codifying existing C practice (as the X3J11 charter
clearly mandates), or are they just interested in getting their 'wish
list' kludged into the language (as you seem to be)?
-- 
 _ __		NEVIN J. LIBER  ..!att!ihlpb!nevin1  (312) 979-4751  IH 4F-410
' )  )				NEWS FLASH ... ... ...  I GOT A PHONE!!
 /  / _ , __o  ____		(and there was much rejoicing ... yeaaa.)
/  (_</_\/ <__/ / <_	These are NOT AT&T's opinions; let them make their own.

nevin1@ihlpb.ATT.COM (Liber) (09/01/88)

In article <897@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
|In article <4203@adobe.COM>, burgett@steel.COM (Michael Burgett) writes:
 
|> My hat
|> off to the committee for not bowing to public pressure to try to make C all
|> things to all people (can you say PL/1... I knew you could.)

|That a badly designed language was rejected is irrelevant.

But the part that isn't irrelevant is *why* did PL/1 turn out, in your
words, to be a badly designed language?  We don't want to go around
repeating the mistakes of the past if we don't have to.
-- 
 _ __		NEVIN J. LIBER  ..!att!ihlpb!nevin1  (312) 979-4751  IH 4F-410
' )  )				NEWS FLASH ... ... ...  I GOT A PHONE!!
 /  / _ , __o  ____		(and there was much rejoicing ... yeaaa.)
/  (_</_\/ <__/ / <_	These are NOT AT&T's opinions; let them make their own.

cik@l.cc.purdue.edu (Herman Rubin) (09/01/88)

In article <8660@ihlpb.ATT.COM>, nevin1@ihlpb.ATT.COM (Liber) writes:
> In article <897@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
> |In article <4203@adobe.COM>, burgett@steel.COM (Michael Burgett) writes:

 
> |That a badly designed language was rejected is irrelevant.

> But the part that isn't irrelevant is *why* did PL/1 turn out, in your
> words, to be a badly designed language?  We don't want to go around
> repeating the mistakes of the past if we don't have to.

A language should be easy to read and as easy to write as possible.  The
kludges made in PL/1 to allow the use of the properties of the machine were
to use the common assembler notation, which, while precise, is difficult
to read and write.

HLLs such as C make heavy use of overloaded operators and infix notation for 
operators.  There are only a few assemblers which use infix notation, and I
know of none which use overloaded operators and weak typing.  In addition,
HLLs allow multiple operations in a single statement, array handling, and
similar goodies.

The makers of PL/1, when they came to allowing the user to use the low-level
procedures, required the users to use the clumsy assembler notation or even
worse.  I believe that a flexible HLL which comes close to accomplishing 
what both C and FORTRAN accomplish, and a lot more, can be produced.  It
might be necessary to require explicit operator precedence instead of implicit,
at least in some cases (it has been stated that this is one of the biggest
problems in a compiler; APL has completely dropped it), and possibly to 
remove some of the implicit rules introduced in some of the languages.

If there is a movement to produce a flexible HLL, I would be willing to
participate.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

cik@l.cc.purdue.edu (Herman Rubin) (09/01/88)

In article <8659@ihlpb.ATT.COM>, nevin1@ihlpb.ATT.COM (Liber) writes:
> In article <891@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
> 
> >As a researcher, I find it
> >necessary to glance at more than 200 journals.  I certainly missed the
> >announcement in CACM (one of my lower priority journals).
> 
> So you, as a *researcher*, wanted to know if anything spectacular was
> happening in the C language development community.  Did you start your
> research into this subject by looking at a journal designed for this
> group?  From what you have posted so far, this does not appear to be
> the case.
> 
> >I do not believe it appeared in _Science_, the journal of AAAS.  Now most
> >mathematicians and statisticians do not read any of the above named journals.
> 
> And most R&D computer people do not read Science.  Do mathematicians
> and statisticians make important announcements in the CACM?  I think
> not.

If mathematicians or statisticians attempt to produce a product to be used
by people in other fields, they surely make an effort to inform the users.

Research in computer science generally need not be advertised in other
journals, and research in programming languages is of this type.  Mathematicians
and statisticians are not so arrogant as to attempt to freeze
terminology for use by biologists, sociologists, physicists, computer
scientists, etc.  The committee to establish C standards is doing this.
Therefore, they have the obligation to find out how that will impact the
users.

> >How about asking
> >the physicists and chemists and astronomers and geologists and biologists?
> >(Apologies to the groups left out are in order.)
> 
> Apologies NOT accepted!  If you can't come up with the definitive list
> of places where this information should be published to reach everybody
> who might even be remotely interested in this, why are you expecting
> anyone else to be able to?  Also, how many of these groups are really
> interested in codifying existing C practice (as the X3J11 charter
> clearly mandates), or are they just interested in getting their 'wish
> list' kludged into the language (as you seem to be)?

The apologies are mine.  However, when I post a reply to this group stating
that various users should be consulted, I do not feel obligated to come up
with a complete list.  The C committee is composed of a larger number of
people and is more deliberative; it can and should be expected to come up
with a list which will leave out few users or potential users.

What does it mean to codify existing C practice?  If it means what you are
implying, it is like the French Academy attempting to keep out English and
making a mess.

BTW, there are many statistical packages.  I recommend that they not be used.
If the user does not understand the problem, they will give wrong answers.
Programming languages are somewhat similar.  Fortunately, most of the time
they give correct answers, albeit slowly.  Since a universal Turing machine
can handle all problems,
any problem can be done clumsily on any machine with any remotely reasonable
HLL.  You can walk from New York to Los Angeles, but you will probably use
an airplane instead.

Summarizing, languages (and operating systems) exist for the purpose of
enabling the user to efficiently use the capabilities of the machines to
solve problems.  Currently, they fail to do this for people who can 
understand the machine's capabilities.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

ok@quintus.uucp (Richard A. O'Keefe) (09/02/88)

In article <908@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>In article <8660@ihlpb.ATT.COM>, nevin1@ihlpb.ATT.COM (Liber) writes:
>> But the part that isn't irrelevant is *why* did PL/1 turn out, in your
>> words, to be a badly designed language?  We don't want to go around
>> repeating the mistakes of the past if we don't have to.
>
>A language should be easy to read and as easy to write as possible.  The
>kludges made in PL/1 to allow the use of the properties of the machine were
>to use the common assembler notation, which while it is precise, is difficult
>to read and write.
>
I used to use PL/I (yes, that's an "I" not a "1"), and I'm afraid I don't
quite know what Herman Rubin is getting at here.  PL/I syntax, for those
who are fortunate enough not to know it, is full of things like
	PUT FILE (OUTFILE) EDIT (THIS,THAT,THE_OTHER) (A(10)) PAGE;
Roughly,
	<main keyword> {<sub keyword> [(<argument list>)]}... ;
For another example,
	CALL PROCEDURE(ARG1, ..., ARGN);

For an example of arithmetic operations, to add two MxN arrays of
floating point numbers:
	DECLARE (A,B) BINARY FLOAT DIMENSION (1:M, 1:N);
	A = A+B;

I have no desire to praise PL/I, but I honestly don't see any resemblance
to any of the assembly languages I've ever used.

As for infix notation, I wish someone would come up with a standard
notation for sequence concatenation: I've seen "+" (which mathematical
convention reserves for commutative operations), "&" (which looks like
"and"), "*" (which makes the most sense, but is rare), and the theory
papers tend to use a sign which is a bit like ^ and a bit like the
intersection sign, and needless to say isn't in the ISO 8859/1 character set.
In the absence of an agreed notation for such a fundamental operation,
the use of functional notation has a lot to commend it.

ark@alice.UUCP (Andrew Koenig) (09/02/88)

In article <908@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
 
> A language should be easy to read and as easy to write as possible.  The
> kludges made in PL/1 to allow the use of the properties of the machine were
> to use the common assembler notation, which while it is precise, is difficult
> to read and write.

Would you mind explaining this a little more?

I don't understand what you're trying to say.
-- 
				--Andrew Koenig
				  ark@europa.att.com

dan@ccnysci.UUCP (Dan Schlitt) (09/15/88)

Yes, it is certainly true that many people who will have an interest
in the results of the standardization process for both fortran and C
didn't (and don't) know about it.  I have learned by sad experience
that that is just the way it is.  "I didn't know that the city was
planning to put in that dump right next door to me!"  "Well, you
should have.  There was the required legal notice in the paper and
there were stories on TV and in the newspaper."  "Yes, but I didn't
think they meant next door to ME."  You can lead a student to
information but you can't make him think.

I'm very cynical about the utility of public notice about events of
potential significance.  If you think that an organization of which
you are a member let you down by not giving you information about the
C standardization then write to the organization and complain.  Then
maybe it won't happen next time.  And maybe you will pay more
attention next time -- I know that I will.

ACM SIGNUM has given attention to the floating point arithmetic and
fortran standardization.  I think they missed out on C.  That is too
bad.  Those of us who are members probably goofed by not getting
something in the Newsletter.  Of course, the Fortran Forum of ACM gave
good coverage of the fortran process.  USENIX covered the C
standardization in ;login and also POSIX etc.  A quick look through
the back issues of SIAMNEWS makes me think that SIAM let its members
down on the standardization issues.

So next time let's not make the same mistake; make a new one.

dan@ccnysci.UUCP (Dan Schlitt) (09/15/88)

In article <4203@adobe.COM> burgett@steel.UUCP (Michael Burgett) writes:
>These discussions about the flaws of the C language in dealing with complex
>floating point ops, and the *failure* of X3J11 to solicit input and rectify
>these things are getting _old_....
>
>1) C is not now, has not been in the past, and (hopefully) will not be in
>the future, a language designed for writing scientific applications
>
>2) C was designed and implemented to remove the onus of using assembly language
>to write operating systems, utilities, device drivers and the ilk.  In this
>regard, it has no equal.
>
>In light of 1 & 2... where's the beef?  C is doing what it is designed to do,
>and from what I've seen of the ANSI standard, will continue to do so.  My hat
>off to the committee for not bowing to public pressure to try to make C all
>things to all people (can you say PL/1... I knew you could.)
>
>you just might have to learn more than one language.... (shock! disbelief!!)
>
If, as you claim, C is designed to "remove the onus of using assembly
language" then you should take note that that is frequently what must
be done to get good numerical code -- write in assembly language.

If floating point isn't necessary for writing your operating systems,
device drivers, etc.  then why not take ALL of the floating point
stuff out of C.  Then the numerical people will go away and not bother
you.  On the other hand .... if you need the floating point then do it
right.  That is all many of us are asking for.  If you don't use
floating point then the changes shouldn't get in your way anyhow.

There is no need to make C into FORTRAN or PL/1.  I personally don't
see the need for adding the Fortran ** operator to C.  The people who argued
for a function call made good _numerical analysis_ sense in their
arguments.  But remember that the numerical people weren't the only
ones who wanted to add useless binary operators to the language.

MY hat is off to the committee for making a number of changes in C
which will make it do floating point better.

dan@ccnysci.UUCP (Dan Schlitt) (09/15/88)

In article <1988Aug27.231342.15447@utzoo.uucp> henry@utzoo.uucp (Henry Spencer) writes:
>In article <225800058@uxe.cso.uiuc.edu> mcdonald@uxe.cso.uiuc.edu writes:
>>Chemists such as I, and other scientists don't read CACM. Most of
>>us don't read ANYTHING about computers, we just use them. BUT, we
>>are vitally interested in serious things like changing languages...
>
>Can you explain why you never read any of the journals that discuss
>something that is of vital interest to you?
>-- 
>Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
>they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

As Chair of my physics department computer committee in a previous
life I put out a notice about the fortran8x standard and offered the
loan of the document published by SIGNUM.  ONE graduate student was
interested enough to look at the document and make a couple of
comments.  This was at a stage in the process where changes would be
fairly easy.  Several computational chemists were on the distribution
list for the notice.  NONE of these big time fortran users would take
the time to look at something which was going to have a big effect on
their future programming.  I'm glad I'm not there to hear their
yells of anguish when they finally have to live with a f87 compiler.
Please note that they didn't have to read any journals ... I did that
part for them and they still weren't interested.

I would not have put out a notice about C.  They would not have even
known about the language.

As system administrator for the science division computer facility I
try to notify the users of things like the standardization which
should be of interest.  That is one reason why I put effort into
keeping our news system in good shape.  But people don't read the news
groups and they don't react to the notices.

Now where did I put that 2x4?  (Nah, no one knows that joke about
getting the mule to move by whispering in its ear.)