joseph@chromo.ucsc.edu (Joseph Reger) (08/20/88)
In article <8358@smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>I don't mean to discourage comments on the draft; however, you should be
>advised that you'll need some extremely strong arguments for making any
>substantive changes. Examples showing that the current draft is badly
>broken would help.

The draft may not be 'badly broken', but it is missing the opportunity to make C a convenient language for numerical computing as well. It is a pity that many of the 'real programmers' feel that any change that would allow C to be the language of choice for 'non-real programmers' (scientists) would somehow hurt their feelings/interests. I did not participate in the debates about the power operator, noalias, conformant arrays, etc., because I was scared off by some of the vehemence of the 'defenders of the faith'. It is sad that there never seemed to be enough time to discuss some recommendations in detail.

There are many scientists I know (mostly younger people) who have really come to like C, and we use it despite its problems and deficiencies as far as numerical computing is concerned. I strongly feel that it is an unacceptable situation that many of us have to program around these problems, although some of them could easily be fixed.

Much of today's (computational) science is done in a workstation environment, mostly under Unix. In the future this is going to be even more so, especially now that the supercomputer manufacturers are adopting Unix, too. The best compilers in these environments are the C compilers, period. Since the manufacturer often uses the same compilers for his own development, the user can be fairly confident that most of the bugs have already been eliminated. So there will be ever more scientists who program in C. Why is it such a good idea to have a growing amount of code around that contains ugly, difficult-to-understand "fixes"?

The power operator is a small issue, I agree. Noalias (no flames please, I am afraid of you) is definitely going to come, since the vector machines need it. Only it will come in many (vendor-specific) colors and flavors. Conformant arrays? We (scientists) need them very much, and I do not see how they would mean any grand problem for C --and the end of western civilization-- in the simple version proposed by David Hough (see his "Comments on Proposed ANSI C Standard").

All these problems could be solved, of course, by the inclusion of the following statement in the Draft: "Scientists and other non-real programmers are not allowed to use the programming language C". (The funny thing is that some scientists would actually like to see this statement, not only in the Draft, but everywhere.)

Joseph D. Reger, joseph@chromo.ucsc.edu
gwyn@smoke.ARPA (Doug Gwyn ) (08/21/88)
In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>The draft may not be 'badly broken' but is missing out on the opportunity
>to make C a convenient language for numerical computing as well.

I happen to use C for numerical programming, despite occasional flaws such as those you mention, primarily because it offers much better support for data structures than do other alternatives such as FORTRAN. I agree that there are some changes that could make C more convenient for such applications. Hough's suggestions are for the most part good ones, but they haven't been receiving sufficient committee support.

The fundamental problem is that IT IS MUCH TOO LATE to be making significant changes to the proposed standard. Look at all the trouble the last-minute addition of "noalias" caused. The public review period is intended as a REVIEW of work done by the committee, not as an opportunity for language design. Where were all these scientific users of C when the design work was being done? By leaving that up to people who didn't think the flaws you perceive were significant, you did not get those flaws addressed in the proposed C standard. It's easy to complain about other people's work; much easier than helping with the work. I suggest that you GET INVOLVED in drafting the NEXT (revised) standard.

Obviously I am not speaking for X3J11 officially here.
joseph@chromo.ucsc.edu (Joseph Reger) (08/23/88)
In article <8365@smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>>The draft may not be 'badly broken' but is missing out on the opportunity
>>to make C a convenient language for numerical computing as well.
>
>The fundamental problem is that IT IS MUCH TOO LATE to be making
>significant changes to the proposed standard.

It seemed to me - and I admittedly did not follow it from the very beginning - that it was always MUCH TOO LATE.

>It's easy to complain about other people's work; much easier than
>helping with the work. I suggest that you GET INVOLVED in drafting
>the NEXT (revised) standard.

I certainly will.

Joseph Reger, joseph@chromo.ucsc.edu
cik@l.cc.purdue.edu (Herman Rubin) (08/23/88)
In article <8365@smoke.ARPA>, gwyn@smoke.ARPA (Doug Gwyn) writes:
>In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>>The draft may not be 'badly broken' but is missing out on the opportunity
>>to make C a convenient language for numerical computing as well.
>
>I happen to use C for numerical programming, despite occasional flaws
>such as those you mention, primarily because it offers much better
>support for data structures than do other alternatives such as FORTRAN.
>I agree that there are some changes that could make C more convenient
>for such applications. Hough's suggestions are for the most part good
>ones, but they haven't been receiving sufficient committee support.
>
>The fundamental problem is that IT IS MUCH TOO LATE to be making
>significant changes to the proposed standard. Look at all the trouble
>the last-minute addition of "noalias" caused. The public review
>period is intended as a REVIEW of work done by the committee, not as
>an opportunity for language design. Where were all these scientific
>users of C when the design work was being done? By leaving that up
>to people who didn't think the flaws you perceive were significant,
>you did not get those flaws addressed in the proposed C standard.
>It's easy to complain about other people's work; much easier than
>helping with the work. I suggest that you GET INVOLVED in drafting
>the NEXT (revised) standard.
>
>Obviously I am not speaking for X3J11 officially here.

I use C for numerical programming, and then have to edit the resulting .s file. All of the languages, including C, are woefully deficient in letting the user use the capacities of the machines. If C is to be a good, flexible language, the committee should widely advertise for complaints about the deficiencies of the language before starting out. I would have no trouble coming up with pages of these items. But the last time I did something like this, in reply to the open invitation to attend the meeting on the IEEE floating-point convention, the result was to receive an invitation to attend! I do not have the time to attend meetings on software.

Another problem is that the language gurus are unsympathetic to ideas which run counter to their perception of computing needs. They see integer arithmetic as primarily for addressing and looping; I see integer arithmetic as important for number-crunching. What about fixed-point (_not_ integer) arithmetic? What about the use of overflow? What about division with simultaneous quotient and remainder? What about an operation or function returning a string of values? What about table-driven branches? What about inserting new operators, using the processor syntax to specify the argument structure of these operators? In fact, what about using the easy-to-use hardware operators on most machines? A good example is &~, which is more useful than &, and is hardware on many machines, including the ones for which C was initially written. Many of those machines do not even have a hardware &.

How many useful instructions have disappeared from hardware because they do not occur in the HLLs? Multiprecision arithmetic needs unsigned multiplication and division to be efficient, not floating point arithmetic. The presence of a single hardware instruction can be essential to an algorithm being worthwhile; if the instruction is in software, it is more likely to appear in hardware.
--
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN 47907
Phone: (317) 494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)
henry@utzoo.uucp (Henry Spencer) (08/23/88)
In article <4566@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>The draft may not be 'badly broken' but is missing out on the opportunity
>to make C a convenient language for numerical computing as well...

Well, remember two things. First, there was opportunity for input along these lines earlier, and little was received; it is now much too late for major changes. Second, X3J11's mission was to standardize an existing language, not to invent a new one; they did make some small steps toward making C friendlier for numerical work, and that is probably about all one should expect from a standards committee.

If you really want to see C improved as a language for numerical computing, the first thing to do is to scream at your compiler supplier until he/she/it does some of the things you want. Then, when the time rolls around for the next revision of the C standard, you can propose changes based on *actual experience*. This will carry a lot more weight than untried inventions. Given the time lags involved in all this, if you are serious about it, the time to start haranguing your supplier is *now*.
--
Intel CPUs are not defective,  | Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)) (08/23/88)
In article <887@l.cc.purdue.edu>, cik@l.cc.purdue.edu (Herman Rubin) writes:
>I use C for numerical programming, and then have to edit the resulting .s
>file. All of the languages, including C, are woefully deficient in letting
>the user use the capacities of the machines. If C is to be a good flexible
>language, the committee should widely advertise for complaints about the
>deficiencies of the language before starting out.

On the contrary: C is NOT woefully deficient for the vast majority of applications to which the vast majority of "paying users" are interested in applying it. As later comments make clear, the kinds of users Mr. Rubin has in mind are rather different. The fact of the matter is, hardly anyone thinks that "fixed-point arithmetic" (as opposed to integer) is important. It just does not come up in the vast majority of uses to which computers are put.

Developing software is an expensive proposition. Everything added to a language has to be implemented somewhere, by someone. Then it has to be debugged, supported, and maintained. There are only two ways this will happen: if someone is willing to pay for it, or if someone is willing to do it out of their own love for the subject.

>I would have no trouble coming up with pages of these items. But the last
>time I did something like this, in reply to the open invitation to attend
>the meeting on the IEEE floating-point convention, was to receive an
>invitation to attend! I do not have the time to attend meetings on software.

Ah, so Mr. Rubin is willing to COMPLAIN, but he is NOT willing to do the work out of his own love for the subject. He certainly gives no indication that he is willing (or able) to pay to have it done either.

>Another problem is that the language gurus are unsympathetic to ideas which
>run counter to their perception of computing needs.

I am a "language guru", though my interests happen to be in parallel programming languages. Again, why should I care what Mr. Rubin thinks "computing needs" are when he can't provide money, isn't willing to invest his own time, and can only provide the most specialized examples of what such features might be used for?

>They see integer arithmetic as primarily for addressing and looping; I see
>integer arithmetic as important for number-crunching. What about fixed-point
>(_not_ integer) arithmetic? What about the use of overflow? What about
>division with simultaneous quotient and remainder? What about an operation
>or function returning a string of values? What about table-driven branches?
>What about inserting new operators, using the processor syntax to specify
>the argument structure of these operators? In fact, what about using the
>easy-to-use hardware operators on most machines? A good example is &~,
>which is more useful than &, and is hardware on many machines, including
>the ones for which C was initially written. Many of those machines do not
>even have a hardware &.

What about all these things? Being absolutely brutal about it: Why should I (or other readers) care? What will it gain us to worry about this?

>How many useful instructions have disappeared from hardware because they
>do not occur in the HLLs?

Along the same brutal lines, my answer is: No USEFUL instructions have disappeared at all. What has disappeared are a lot of non-essential ideas that were tossed in back in the days when computer architecture was a new field, with a large research component. No one really knew what would turn out to be "useful". Well, for better or for worse, computer architecture isn't like that any more. Computer design is a multi-billion dollar industry. It is driven, not by what people might WANT in some abstract sense, but by what they are willing and able to pay for. THAT is the only workable definition of "useful", and on that scale the things Mr. Rubin wants have long ago fallen to the bottom of the list.

>Multiprecision arithmetic needs unsigned multiplication and division to be
>efficient, and not floating point arithmetic. The presence of a single
>hardware instruction can be essential to an algorithm being worthwhile; if
>the instruction is in software, it is more likely to appear in hardware.

It's painful to see economics dominating a field one loves and pushing it in directions one is not inclined to go. I'm not unsympathetic to Mr. Rubin's position; my own background, way back when, is in mathematics (complex analysis and a bit of analytic number theory). Even the work I do now is beyond the current "commercial" leading edge, and I am sometimes frustrated by the way hardware manufacturers put roadblocks in the way of doing "obviously useful" things, because they are too busy heading in other directions. But that's life.

The USEFUL thing for Mr. Rubin to do, if he really thinks these issues are important, is to work at convincing others of it. Not by complaining in this and other newsgroups about how he is being ignored, but exactly by spending some time with those committees, by offering some real alternatives, by showing how what he proposes is useful to people other than himself. Frankly, I doubt anything he can do will ever get major commercial ventures interested. But that doesn't mean he can't get other researchers interested. Many people are able to design and build special-purpose hardware and software today; if Mr. Rubin talked to some of them, he might discover that many good research hardware hackers have the tools, but are lacking interesting problems.

I will say, however, that his chances of getting people interested would improve markedly if he stopped complaining about how he didn't "have the time to attend meetings on software". Very few computer scientists have the time to attend meetings on statistics either.

-- Jerry
dhesi@bsu-cs.UUCP (Rahul Dhesi) (08/24/88)
In article <887@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
[wish list for HLLs]
I agree that many of the features wished for ought to be in higher-
level languages. But to put all of them in C would no longer leave
the relatively small, simple, low-level language that C was designed
to be.
Nearly all of the features that Herman Rubin wishes to see *are*
already in HLLs, only not all are in each HLL. C++ has some. Ada has
*many* of them, especially fixed point arithmetic and functions
returning structured values.
The real problem is not with C designers. The real problem is with
Fortran designers, who have always had an explicit mandate to design a
language for scientific computing, and have continued to fail miserably
to achieve this. In a way the C users who do numerical computing want
to put on C the burden that Fortran was supposedly designed to carry.
The trouble with doing so is that other users will lose. Each new
feature added to a language increases the complexity of the language
translator, and *all* users, even those who don't need to use these
features, will pay in money, disk space, and CPU time.
--
Rahul Dhesi UUCP: <backbones>!{iuvax,pur-ee,uunet}!bsu-cs!dhesi
prh@actnyc.UUCP (Paul R. Haas) (08/24/88)
Given:
1. ANSI C (as proposed) does not support numerical computing adequately.
2. There is not enough time to fix it for the current standard.

There are several ways of coping:
1. Use Fortran.
2. Hope your compiler vendor comes up with reasonable extensions.
3. Do something to encourage your compiler vendor to come up with something reasonable.

I would favor writing a standard for "correct" math extensions to the C language. If the standard is adopted by at least one compiler vendor (or producer, so as to include the FSF), then you can show prior art for the next round of X3J11. If the "correct" math extensions are simple enough to implement, then many manufacturers will put them into their compilers. Such a proposal could be produced by an individual, an independent committee, or a committee from one of the user groups (ACM, IEEE, /usr/group, Usenix, etc.). A committee could meet in person or electronically, etc. Unfortunately, I lack the skills to produce such a document.
----
Paul Haas, uunet!actnyc!prh (if that doesn't work: haas@msudoc.egr.msu.edu)
chasm@killer.DALLAS.TX.US (Charles Marslett) (08/24/88)
In article <36243@yale-celray.yale.UUCP>, leichter@venus.ycc.yale.edu (Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)) writes:
>On the contrary: C is NOT woefully deficient for the vast majority of
>applications to which the vast majority of "paying users" are interested in
>applying it.

I find this comment and the attitude of the author woefully parochial -- I do not program in COBOL, and I might not even recognize either a data entry language or a database language if it hit me in the face, but I do know that more money (real dollars, payroll hours, or however you want to look at it) is spent on programs that are much more difficult to write in C than in the language they are written in (and in some cases -- heresy -- that language is even 8086 assembly language!). I am quite certain that spreadsheets garner more user dollars than C compilers for any computers other than Crays and Suns (and Fortran compilers are probably ahead of C compilers on at least the Crays). C is rapidly catching up with Pascal as the second most well known language, but it has a long way to go before it becomes as well known (and perhaps as useful) as BASIC (more heresy?).

For my purposes, C is the language of choice most of the time (by a fair margin -- I have no second choice, except maybe Modula were C to vanish from the face of the earth). But C is not a universal language, and she does not appear to be expanding into other areas of applicability any more rapidly than her elder brother and sister, FORTRAN and LISP. And I think this is both A GOOD THING, and the reason that it is unlikely to be a major language 20 years from now. I will have plenty of spare time in 20 years to learn several new small languages, and I have no real need to program in Ada or PL/I.

(How do you like my personification of programming languages? Shall we create a few mythic tales to describe her birth?)

Charles Marslett
chasm@killer.dallas.tx.us
henry@utzoo.uucp (Henry Spencer) (08/24/88)
In article <4581@saturn.ucsc.edu> joseph@chromo.ucsc.edu (Joseph Reger) writes:
>It seemed to me - and I admittedly did not follow it from the very beginning -
>that it was always MUCH TOO LATE.

X3J11 has been trying to get the %#@%$% thing out the door for quite some time now. The combination of lengthy public-review cycles and official meetings held only quarterly means that a standard which needs *three* public-review cycles will be in "almost finished, no substantive changes without a damn good reason" status for a long time. Sounds like you came on the scene after that phase started.
--
Intel CPUs are not defective,  | Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
ian@argosy.UUCP (Ian L. Kaplan) (08/25/88)
In article <3732@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>The real problem is not with C designers. The real problem is with
>Fortran designers, who have always had an explicit mandate to design a
>language for scientific computing, and have continued to fail miserably
>to achieve this. In a way the C users who do numerical computing want
>to put on C the burden that Fortran was supposedly designed to carry.

The Fortran 8x committee has its problems, but lack of features is not one of them. The April '87 Fortran draft standard includes a number of "modern programming language" features, including something like structures (referred to as derived types) and modules, with imports and exports. The real problem with the Fortran standardization process is the inability of the Fortran community to arrive at a standard. The decade is almost over. Soon it will be Fortran 9x.

Ian Kaplan

"I don't know what the most popular numeric programming language will
 look like in the year 2000, but it will be named Fortran."

These opinions are my own.
chris@mimsy.UUCP (Chris Torek) (08/25/88)
In article <5282@killer.DALLAS.TX.US> chasm@killer.DALLAS.TX.US (Charles Marslett) writes:
>In article <36243@yale-celray.yale.UUCP> leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerry Leichter (LEICHTER-JERRY@CS.YALE.EDU)leichter@venus.ycc.yale.edu
>(Jerr writes:

[A rather unusual name :-) .]

>>On the contrary: C is NOT woefully deficient for the vast majority of
>>applications to which the vast majority of "paying users" are interested in
>>applying it.

[back to chasm@killer:]

>I find this comment and the attitude of the author woefully parochial
>-- I do not program in COBOL and I might not even recognize a either a
>data entry language or a data base language if it hit me in the face,
>but I do know that more money ... is spent on programs that are [done
>in other languages] .... C is not a universal language and she does
>not appear to be expanding into other areas of applicability any more
>rapidly than her elder brother and sister, FORTRAN and LISP. And I
>think this is both A GOOD THING, and the reason that it is unlikely to
>be a major language 20 years from now.

This is curious, because I see Jerry Leichter and Charles Marslett as basically in agreement---so why should this attitude be `woefully parochial'? That C does not make a good functional programming language is no surprise; that people who pay for programs written in C are not paying for such code should also be no surprise; and hence that there is no great push for C to be augmented with everything out of Miranda and FP combined should likewise be no surprise.

To return somewhat to the original subject: If you believe that, with a few tweaks that would either improve, or at least not damage, the language, C could become an ideal language for numerical software, it is then your job to demonstrate it. Make the changes---write yourself a compiler, or have someone else write it---and show that the new language is better than the old. If it is sufficiently better, programmers will beat a path to your mailbox, and the new language will become popular in the same way that C became popular. And if *you* are not willing to put in the effort, why then should *we* be?
--
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain: chris@mimsy.umd.edu  Path: uunet!mimsy!chris
hankd@pur-ee.UUCP (Hank Dietz) (08/25/88)
I've been using C for most programs since 1978, and I've taught and am currently teaching a C programming course at Purdue University. However, C isn't supposed to be all things to everyone: it is a systems programming language and has little real competition as such (Ada? Modula 2?). Making C a numerical applications language has never been a priority, nor should it be. For example, fixed-point arithmetic would never be used by most of the originally intended C user community; it would simply clutter the language definition and impede the development of good quality compilers. I personally feel that X3J11 has done an outstanding job of resisting the "kitchen sink" syndrome, keeping the language reasonably clean and implementable, while resolving more than a few ambiguous/omitted details. Propose a new language if you're not happy with any existing one.

As for the language standardization process: if you're not willing to attend the meetings nor to correspond in a reasonably formal way, I don't think you've got much of a reason to complain. Now, I'm a bit unhappy in that I wasn't invited to be on X3J11 and would like to have had more input, but even so I have had no trouble in getting X3J11 folk to listen to me. My number one remaining beef with X3J11 is that they changed the function declaration syntax in an incompatible way without simultaneously providing public-domain software to automatically convert old C programs to the new notation... but this is a problem I personally intend to remedy.

So, let's not flame on about X3J11. It isn't perfect, but it is C, and it is a better definition than we had before. Enough said.

-hankd
smryan@garth.UUCP (Steven Ryan) (08/25/88)
Sounds like somebody wants an extensible C. Are you crazy? Extensibility implies the gods are mortal and a rational mode system exists. Shame for mentioning this in comp.lang.c.
tnvscs@eutrc3.UUCP (c.severijns) (08/25/88)
We have been using C for scientific computing for some time now, and so far we feel the need for only a very few changes to the language (we use a non-ANSI C compiler). One of these changes has already been made in the ANSI standard: the possibility to pass a float as an argument to a function. The second change we would like to see is the possibility to compile C with "intrinsic" functions, to be able to use a floating point processor like the MC68881 more efficiently. This requires only an extra option for the compiler. For the rest, we consider C a good language for scientific computing that generates code not much slower than FORTRAN, with the advantage of structures. In one case where we needed complex data structures, our C version turned out to be more than twice as fast as similar code in FORTRAN.

Camiel Severijns                        UUCP: mcvax!eutrc3!eutnv1!camiel
Surface Physics Group, Dept. of Physics
Eindhoven University of Technology
The Netherlands
henry@utzoo.uucp (Henry Spencer) (08/25/88)
In article <887@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>... I do not have the time to attend meetings on software.

In other words, you want it fixed, but you can't be bothered investing your own time and effort in getting it fixed? Don't expect much sympathy. Standards are hard work; if you can't be bothered helping with it, those who do put in long hours on them are likely to feel that you don't really care all that much.

>What about fixed-point (_not_ integer) arithmetic?

What about it? Last time I did something along those lines, there wasn't any formidable difficulty in implementing it on top of integer arithmetic. That was a long time ago, and the stuff I was doing was specialized and simple, mind you.

>What about the use of overflow?

A nice idea, but it's hard to make it portable.

>What about division with simultaneous quotient and remainder?

Already in X3J11 C; see div() and ldiv() in section 4.10.6. If your compiler supplier doesn't implement them or implements them inefficiently, complain to him, not to X3J11 or to the net.

>What about an operation or function returning a string of values?

What about it? Can be done right now, although a bit clumsily, using pointers; see scanf for an example. It's not at all clear that adding it as an explicit construct would improve efficiency; in fact it could well reduce it.

>What about table-driven branches?

See the "switch" construct, which has been in C all along. If your compiler doesn't do this well, again, complain to the supplier.

>What about inserting new operators, using the processor syntax to specify
>the argument structure of these operators?

Again, perfectly possible now if you're willing to live with distasteful syntax (function calls). The past experiments with user control of syntax have mostly been limited successes at best.

>In fact, what about using the easy-to-use hardware operators on most
>machines? A good example is &~, which is more useful than &, and is
>hardware on many machines, including the ones for which C was initially
>written...

And which any sensible compiler on those machines will use if you write x & ~y, just as you'd expect. See above comments on compiler defects.

>How many useful instructions have disappeared from hardware because they
>do not occur in the HLLs?

How many useless instructions have appeared in hardware because some clot had the mistaken idea that they could be useful to HLLs? Exacting a speed and cost penalty from the customers as a result of the extra complexity, too. Such things are always compromises.
--
Intel CPUs are not defective,  | Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
news@ism780c.isc.com (News system) (08/26/88)
In article <9@argosy.UUCP> ian@argosy.UUCP (Ian L. Kaplan) writes:
>The Fortran 8x committee has its problems, but lack of features is
>not one of them. The April '87 Fortran draft standard includes a
>number of "modern programming language" features, including something
>like structures ...
>
>"I don't know what the most popular numeric programming language will
> look like in the year 2000, but it will be named Fortran."

Deja vu. In 1963 a committee (Share) got together to produce FORTRAN V. It had structures, if-then-else, switch statements (spelled 'GOTO <expression>'), eight types of numeric data, and a whole bunch more. After the committee saw what they had wrought, they decided that it was good. But FORTRAN V was a bad name, so they called the language NPL (New Programming Language). When the Naval Physics Lab complained, the committee changed the name again. And voila! PL/I was born.

Marv Rubinstein
henry@utzoo.uucp (Henry Spencer) (08/26/88)
In article <1290@garth.UUCP> smryan@garth.UUCP (Steven Ryan) writes:
>Sounds like somebody wants an extensible C.

It's been done, it works well, and it's readily available: C++.
--
Intel CPUs are not defective,  | Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
cik@l.cc.purdue.edu (Herman Rubin) (08/27/88)
In article <1988Aug26.162706.22671@utzoo.uucp>, henry@utzoo.uucp (Henry Spencer) writes:
> In article <1290@garth.UUCP> smryan@garth.UUCP (Steven Ryan) writes:
> >Sounds like somebody wants an extensible C.
>
> It's been done, it works well, and it's readily available: C++.

There are gross weaknesses in C++.  It does not allow the introduction of
new operators, for example.  It does not address the problem of multiword
hardware types, or the use of machine dependencies where they can
profitably be exploited (see the discussion about short x short -> long),
and other such goodies.  I have used one type where C would assume another;
C++ would complain.  Fortunately, the newer C++ compilers do not reduce to
C; that route gave such atrocious code that almost any other way would be
preferable.  C++ addresses a few of the weaknesses of C.  However, it
ignores the worst of the problems.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN 47907
Phone: (317) 494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)
bill@proxftl.UUCP (T. William Wells) (08/27/88)
In article <13180@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
: Make the changes---write yourself
: a compiler, or have someone else write it---and show that the new
: language is better than the old.
Anticipating at least one possible complaint: compiler writing is
*hard* work. Agreed. But you don't have to write the whole
thing. If you are going to make what are essentially minor
changes, you can do them in available compilers: for example,
the GNU compiler, which is more-or-less ANSI compatible and
does not cost money (this is not an endorsement of Stallman et
al., just recognition that they exist); the Minix C compiler,
which does cost money (but only ~$100); or the Amsterdam
Compiler Kit (which costs a whopping $10,000).  No doubt there
are others as well.
However, I suspect that the essential work would have to be done
in the libraries, but, given that the existing libraries are not
adequate (mostly the point of the complaints, I think), and that
numerical computing is your field, that should be, rather than a
problem, the heart of your activity. (Urk! The structure of
that sentence!)
---
Bill
novavax!proxftl!bill
smryan@garth.UUCP (Steven Ryan) (08/28/88)
>>Sounds like somebody wants an extensible C.
>
>It's been done, it works well, and it's readily available: C++.

Some people have been asking for access to machine specific features.  C is
good at getting at machine features for one particular machine, whether
they exist or not.

Query: Does C++ do the same, or does it define its machine independent
operators in terms of specific machine features and give programmers access
to the same mechanism?  (Why bother buying an unavailable book if I can con
someone else into doing my research for me?)
henry@utzoo.uucp (Henry Spencer) (08/28/88)
In article <899@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>> >Sounds like somebody wants an extensible C.
>>
>> It's been done, it works well, and it's readily available: C++.
>
>There are gross weaknesses in C++...

I didn't say it was perfect, I said it worked well.  There is a difference.
Nobody expects a language to keep everybody happy.  (Personally I doubt
that any language would keep Herman Rubin happy.)  C++ is a fairly
well-done and highly usable extensible C.

>It does not allow the introduction of new operators, for example.

There is room for debate about whether dynamic alteration of language
syntax is a good idea.  C++ does provide for new operators, provided that
you are willing to use function-call syntax for them.  Call syntax is
admittedly clumsy for anything complicated, but user-defined syntax is a
real minefield for both users and implementors.

>It does not address the problem of multiword
>hardware types, using machine dependencies where they can profitably be
>used (see the discussion about short x short -> long)...

You mean, the current *implementations* do not provide for this.  There is
no reason why the implementation of a C++ type can't use hardware-specific
extensions when they exist.  The client interface can remain
machine-independent, as it generally should be.
-- 
Intel CPUs are not defective,   |     Henry Spencer at U of Toronto Zoology
they just act that way.         | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
henry@utzoo.uucp (Henry Spencer) (08/29/88)
In article <1317@garth.UUCP> smryan@garth.UUCP (Steven Ryan) writes:
>Some people have been asking for access to machine specific features.  C is
>good at getting at machine features for one particular machine whether they
>exist or not.
>... Does C++ do the same or does it define its machine independent operators
>in terms of specific machine features and give programmers access to the same
>mechanism?

C++ is essentially a superset of C, so it takes the same approach as C.  In
both, there is no reason why a perceptive implementor can't provide
machine-specific hooks for users to use to implement packages which have
machine-independent interfaces.  This works rather better in C++, mind you,
because package interfaces are much nicer in C++.
-- 
Intel CPUs are not defective,   |     Henry Spencer at U of Toronto Zoology
they just act that way.         | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
john@uw-nsr.UUCP (John Sambrook) (08/29/88)
Mr. Rubin has been an active contributor to comp.lang.c for many months
now.  He has argued his points with great vigor and seems genuinely
interested in proving his case.  However, it seems to me that few people
share his concerns.

It seems likely that this debate will continue for a very long time.  While
that may not be a bad thing in and of itself, it isn't as satisfying as,
say, an implementation of the ideas that Mr. Rubin has advanced.

I would like to ask Mr. Rubin what he is doing, outside of posting to this
and other USENET newsgroups, to bolster his position.  Is there any
research and/or design work in progress, or is it just talk?  It is my
feeling that such work would be useful, and that everyone would benefit.
Perhaps a good first step would be a carefully considered paper that
presents the fundamental issues Mr. Rubin would like to see addressed.

-- 
John Sambrook                       Internet: john@nsr.bioeng.washington.edu
University of Washington RC-05          UUCP: uw-nsr!john
Seattle, Washington  98195              Dial: (206) 548-4386
karl@haddock.ima.isc.com (Karl Heuer) (08/29/88)
In article <309@eutrc3.UUCP> tnvscs@eutrc3.UUCP (c.severijns) writes:
>We have been using C for scientific computing for some time now and so far we
>only feel the need for a very few changes to the language.  [One is passing
>float by value, which is already in ANSI C.]  The second change we would like
>to be made is the possibility to compile C with "intrinsic" function

This also is already in ANSI C.

Karl W. Z. Heuer (ima!haddock!karl or karl@haddock.isc.com), The Walking Lint
smryan@garth.UUCP (Steven Ryan) (08/30/88)
>I didn't say it was perfect, I said it worked well.  There is a difference.
>Nobody expects a language to keep everybody happy.  (Personally I doubt
>that any language would keep Herman Rubin happy.)  C++ is a fairly well-done
>and highly usable extensible C.

That's a very nice compliment.  Complacency is a sign of death.