miker@wjvax.UUCP (Michael Ryan) (03/31/88)
In article <297@ho7cad.ATT.COM>, ekb@ho7cad.ATT.COM (Eric K. Bustad) writes:
> In article <1238@wjvax.UUCP>, miker@wjvax.UUCP (Michael Ryan) writes:
> > [stuff deleted]
> > why not use the cdecl keyword, as supported by Msoft 5.0 ....
> > void cdecl foo( int x, double y)
> > [more stuff deleted]
> No one else has asked, so I guess that I must reveal my ignorance.
> Can someone explain the meaning of Microsoft's "cdecl" keyword?
> = Eric

As the perpetrator I offer a quick [trans/exp]lation for those who haven't
had the pleasure of using usoft C.

cdecl
-----
re: K & R appendix A, pg. 180: 'Some implementations also reserve the
words "fortran" and "asm."'

OK, usoft reserves the words 'fortran', 'pascal', and 'cdecl', all for
similar reasons.  These words signify that the object so identified is to
be handled using that language's conventions (e.g. how to push arguments
onto the stack, how large/small an integer is, ...).

ex:
---
	double pascal dave( int x, double y)

dave here is a Pascal routine that would pass arguments in order, blah,
blah, with Pascal integer and double-precision sizes, blah, blah...

ex:
---
	char cdecl steve( int f, char g)

steve is a C language routine that pushes arguments according to C
conventions, g then f, with appropriate sizes, etc.

cdecl -> C DEClaration.  Thus, mixed-language modules can be tagged by
language and LINKED together.  usoft takes great pride in their
mixed-language capabilities.

now you know ...
-- michael
--
==== *michael j ryan   *{..!pyramid,..!decwrl!qubix}!wjvax!miker
*Watkins-Johnson Co., San Jose CA : (408) 435 1400 x3079
* above views are not necessarily those of Watkins-Johnson Co.
gwyn@brl-smoke.ARPA (Doug Gwyn ) (03/31/88)
In article <1242@wjvax.UUCP> miker@wjvax.UUCP (Michael Ryan) writes:
>	char cdecl steve( int f, char g)
>steve is a C language routine that pushes arguments according to C conventions,
>g then f, with appropriate sizes, etc.

Of all the useless additions to C, this one has to take the cake!

	#define cdecl	/*nothing*/
dhesi@bsu-cs.UUCP (Rahul Dhesi) (04/01/88)
In article <7595@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>)
writes (about the "cdecl" keyword):
>Of all the useless additions to C, this one has to take the cake!
>#define cdecl /*nothing*/

The real usefulness of cdecl comes when one is doing mixed language
programming.  Now if you compile a routine that needs to be linked with a
Pascal program, you may compile it thus:

	cc -pascal mystuff.c

If mystuff.c uses any C library function, you still want that library
function to be called using the C calling convention, even though you
asked the compiler to generate code for the Pascal calling convention.
If mystuff.c includes a header file declaring library functions, and the
header file uses the cdecl keyword for these, then the compiler will
correctly use the Pascal or C calling convention, whichever is needed by
the called routine.

With a little foresight Microsoft could have just used the same calling
convention in all its language translators.  The "cdecl" keyword is a
belated fix for a problem that need not have existed.
-- 
Rahul Dhesi         UUCP:  <backbones>!{iuvax,pur-ee,uunet}!bsu-cs!dhesi
shirono@grasp.cis.upenn.edu (Roberto Shironoshita) (04/01/88)
In article <2521@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>[ cc -pascal mystuff.c ]
>If mystuff.c uses any C library function, you still want that library
>function to be called using the C calling convention, even though you
>asked the compiler to generate code for the Pascal calling convention.

I would much rather do away with the switch to generate foreign calling
conventions, and assume anything is a C function unless otherwise
declared.

>With a little foresight Microsoft could have just used the same calling
>convention in all its language translators.

I am of the belief that most languages have their own calling
conventions: FORTRAN uses pass-by-reference for everything (at least,
that's what I heard last); PASCAL uses pass-by-value for regular
parameters, and pass-by-reference for VAR parameters.  C uses
pass-by-value for everything.

Roberto Shironoshita
-------------------------------------------------------------------------
1- The University doesn't know I exist.  | Internet:
2- Of course I may be wrong.             |  shirono@grasp.cis.upenn.edu
gwyn@brl-smoke.ARPA (Doug Gwyn ) (04/01/88)
In article <2521@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>	cc -pascal mystuff.c

This is even more misguided (if possible) than cdecl.  C and Pascal do
not correspond 1-1, so no global mapping such as this should be applied
to arbitrary C code.  Pascal interfaces should be flagged AT THE
INTERFACE (only).
barmar@think.COM (Barry Margolin) (04/02/88)
In article <3867@super.upenn.edu> shirono@grasp.cis.upenn.edu (Roberto Shironoshita) writes:
>In article <2521@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>>With a little foresight Microsoft could have just used the same calling
>>convention in all its language translators.
>I am of the belief that most languages have their own calling
>conventions: FORTRAN uses pass-by-reference for everything (at least,
>that's what I heard last); PASCAL uses pass-by-value for regular
>parameters, and pass-by-reference for VAR parameters.  C uses
>pass-by-value for everything.

Yes, but that doesn't preclude them all using the same internal mechanism
to actually make the call.  For example, if the internal mechanism is
pass-by-value, the languages whose semantics specify pass-by-reference
can automatically pass addresses.  Alternatively, if the internal
mechanism is pass-by-reference, by-value languages can simply copy
arguments to stack temporaries and pass references to those locations
(just as is done when a computed value is passed by a by-reference
language).

Another possibility is for the caller to pass a flag indicating whether
the arguments are values or references.  If the callee is in a language
that has the opposite semantics it can do the appropriate conversion at
that time.  Since most calls would be between modules in the same
language, this would be pretty cheap.

Barry Margolin
Thinking Machines Corp.

barmar@think.com
uunet!think!barmar
Devin_E_Ben-Hur@cup.portal.com (04/02/88)
Doug Gwyn writes:
> In article <1242@wjvax.UUCP> miker@wjvax.UUCP (Michael Ryan) writes:
> >	char cdecl steve( int f, char g)
> > is a C language routine that pushes arguments according to C conventions,
> > g then f, with appropriate sizes, etc.
>
> Of all the useless additions to C, this one has to take the cake!
> #define cdecl /*nothing*/

Actually, there is a reason for both MSC and TC to have introduced the
cdecl modifier.  Both these compilers support multi-language environments
using modifier keywords such as "pascal" & "fortran" to specify the
linkage and naming conventions.  They also both support compile switches
which allow C functions to be compiled using the foreign language
conventions as the default.  This is convenient if you intend to have
your C functions used as library routines for the foreign language.  It
also is nice because it produces faster and more compact code (the callee
pops args using RET <n> instead of the caller having to restore the stack
frame after every function call).

A problem arises when using standard library functions while having the
foreign language switch on.  Unless the user has bought and re-compiled
the library source code, the linkage conventions will be wrong!  This is
the justification for cdecl.  All the prototypes for library functions
are declared as cdecl functions in the .H files supplied by the vendor to
allow the user to take advantage of the better code quality without
having to shell out the $$$ for the library source.

Please don't be so quick to flame extensions that you don't understand.
The MS-DOS/80x86 environment is not pretty.  Some of us have to sweat
blood to get high-quality applications to run fast and fit in a severely
constrained, somewhat-crippled architecture.  Borland's & uSoft's
extensions for supporting the 80x86 environment (near, far, huge, cdecl,
pascal, fortran, interrupt, asm, pseudo-registers, etc.) are a blessing
to those of us without the luxury of large virtual linear address spaces
and real operating systems.

ucbvax!sun!portal!devin.e.ben-hur%cupertino.pcc
chang@premise.ZONE1.COM (John Chang) (04/02/88)
Roberto Shironoshita writes:
> Rahul Dhesi writes:
> >[ cc -pascal mystuff.c ]
> >If mystuff.c uses any C library function, you still want that library
> >function to be called using the C calling convention, even though you
> >asked the compiler to generate code for the Pascal calling convention.
>
> I would much rather do away with the switch to generate foreign
> calling conventions, and assume anything is a C function unless
> otherwise declared.
>
> >With a little foresight Microsoft could have just used the same calling
> >convention in all its language translators.
>
> I am of the belief that most languages have their own calling
> conventions: FORTRAN uses pass-by-reference for everything (at least,
> that's what I heard last); PASCAL uses pass-by-value for regular
> parameters, and pass-by-reference for VAR parameters.  C uses
> pass-by-value for everything.
  ^^^^^^^^^^
What about arrays?

The calling convention we're talking about here isn't call-by-name or
call-by-value.  It's simply the order in which parameters are passed on
the stack.  And in C, that order *must* be right to left in order to
support variable length functions like printf.  If it were left to
right, then the printf function wouldn't know where to find its format
string.

However, the right-to-left parameter passing takes more code space (one
extra instruction to fix the stack after a function call) and hence
takes longer to execute than the 'pascal' calling convention
(left-to-right parameter passing).  That's the reason the cdecl and
pascal keywords exist.  Although Microsoft could have made all their
languages use the same parameter-passing convention, this would have
meant generating less efficient code.  The extra instruction needed to
pop the stack after a function call seems unimportant, yet I once heard
Microsoft claim that the change to the pascal calling convention made
Microsoft Windows "substantially" smaller and faster.  I don't remember
the figures, but they were hard to believe.

If you're writing functions with a variable number of parameters, and
want to take advantage of the more efficient pascal calling convention,
you still have to worry about the portability of these keywords.  One
solution is to #define environment-dependent tokens for the keywords.

How about the far and huge keywords?  Parsing declarations was bad
enough without these.  Quick, what's the difference between

	far int (*f)()    and    int far (*f)()

But this is more a gripe about the Intel architecture than anything else.

John Chang                  { ...harvard!eddie,...cbosgd!mirror}!premise!chang
Premise, Inc.               chang@premise.zone1.com
3 Cambridge Center
Cambridge, MA  02142        (617) 225-0422

When you have the advantage, get the money out.
bobmon@silver.bacs.indiana.edu (Skizofrenio the Elder...Younger) (04/02/88)
Barry Margolin proposes that all language compilers (on a given
architecture, perhaps) could internally handle function-calling in the
same manner.  But that means that no language would be as efficient as
it could be.  In particular, Pascal can check some things at compile
time by forbidding options such as variable-length parameter lists.  Who
would want to write a Pascal compiler that has to silently add back in
the functionality to decide how many arguments are being received?

(Turbo C goes the other way, and allows "Pascal-type" declarations for
functions with a fixed number of arguments.  Then you have to be sure
that your standard library calls are forced to "C-type" anyway.  They
don't recommend all this for beginners.)

On another level, I assume (no, I don't know) that C has a standard for
whether the caller or the callee saves working registers.  How could you
internally adjust for another language with a conflicting standard?
Other than by a programmer-specified switch that says "violate the
standard at this point"?

disclaimer: If I knew what I was talking about, I probably wouldn't
bother to say anything.
gwyn@brl-smoke.ARPA (Doug Gwyn ) (04/03/88)
In article <4259@cup.portal.com> Devin_E_Ben-Hur@cup.portal.com writes:
> Please don't be so quick to flame extensions that you don't understand.

Oh, I understand it all right.  But it is a botch.  cdecl acts as a
kludge to counteract a more global kludge, rather than solving the
inter-language linkage problem directly at the interfaces involved.  The
very fact that you have to declare that the C source code is intended to
follow C rules and not some other rules is ludicrous.

>The MS-DOS/80x86 environment is not pretty.

One of the reasons for this is that instead of devising elegant, clean
solutions to some of the genuine problems, ill-advised kludges have been
foisted off on the programming public.  There are other segmented
architectures that have not turned into such a morass of incompatible
conventions; some support inter-language linkage.  The 80x86 world has
no excuse for the mess it has gotten itself into.
mjs@ernie.Berkeley.EDU (Marc J. Sabatella) (04/03/88)
Microsoft (and Borland) both push use of the pascal keyword (and the
corresponding cdecl) for efficiency purposes as well as for mixed-language
applications.  The Pascal calling sequence is more efficient than C's, so
gains in space and time can be achieved by compiling an entire file with
Pascal calling conventions, even if there is no Pascal code being linked
anywhere.  This is why they include a command-line option to process a
whole file as Pascal-calling, and it is also why there is a cdecl keyword
to override this.  If you include the correct header files for all
library functions, they are all declared cdecl in the prototype, so your
source file can be compiled using Pascal or C conventions with no change,
and without ever using the cdecl or pascal keywords yourself.

The reason Pascal sequences are more efficient than C is that varargs
constructions are not allowed.  The called routine knows EXACTLY how many
arguments it has been passed, and hence it can clean up the stack itself.
This can save space, because a routine that is called several times will
only have the code to restore the stack appear once, as opposed to C, in
which it must appear after each call.  In addition, the 80x86 has a form
of the return instruction that takes an argument: "RET n".  This says to
pop the return address (and load it into the PC) and, while you're at it,
adjust the stack pointer by n.  Thus the procedure return and stack
restoration may be combined into one instruction, provided the called
routine knows how many bytes of arguments it got.

By the way, Pascal compilers often push the arguments left to right,
whereas C compilers almost always go right to left (so the first
argument is on top of the stack) because implementing varargs is much
easier that way [besides, I find it more convenient to access the
parameters, anyhow].  This is another difference between Pascal and C
conventions, but it is really only a byproduct of the above discussion;
there is no reason Pascal compilers can't push arguments right to left.

Marc Sabatella
mjs@ernie.Berkeley.EDU
james@bigtex.uucp (James Van Artsdalen) (04/04/88)
In article <185@premise.ZONE1.COM>, chang@premise.ZONE1.COM (John Chang) wrote:
> > [...] C uses pass-by-value for everything.
> ^^^^^^^^^^ What about arrays?

In C, arrays are _never_ passed to functions.  Only the addresses of
arrays.  These are passed by value only.

> The extra instruction needed to pop the stack after a function call
> seems unimportant, yet I once heard Microsoft claim that the change to
> pascal calling convention made Microsoft Windows "substantially"
> smaller and faster.  I don't remember the figures, but they were
> hard to believe.

Hard to believe, but true.  Ultima IV became several K smaller and
noticeably quicker after the conversion.  Unfortunately, I don't remember
the numbers either, but it was enough to be worth the trouble (though to
be fair, a game must make especially pessimistic assumptions concerning
available memory and processor speed).
-- 
James R. Van Artsdalen   ...!uunet!utastro!bigtex!james   "Live Free or Die"
Home: 512-346-2444  Work: 328-0282; 110 Wild Basin Rd. Ste #230, Austin TX 78746
chip@ateng.UUCP (Chip Salzenberg) (04/05/88)
In article <185@premise.ZONE1.COM> chang@premise.ZONE1.COM (John Chang) writes:
>However, the right to left parameter passing takes more code space
>(one extra instruction to fix the stack after a function call) and
>hence takes longer to execute than 'pascal' calling convention
>(left-to-right parameter passing).

The Pascal calling sequence is faster than C's, but for a different
reason.  The Pascal calling sequence is not faster because of its
backwards (:-]) order, but because it does not allow variadic functions.
Since the called function is always called with the same number of
arguments, it can clean up the stack when it returns instead of making
the caller do it.
-- 
Chip Salzenberg                 "chip@ateng.UU.NET" or "codas!ateng!chip"
A T Engineering                 My employer's opinions are a trade secret.
       "Anything that works is better than anything that doesn't."
bright@Data-IO.COM (Walter Bright) (04/05/88)
In article <18733@think.UUCP> barmar@fafnir.think.com.UUCP (Barry Margolin) writes:
>In article <3867@super.upenn.edu> shirono@grasp.cis.upenn.edu (Roberto Shironoshita) writes:
>>I am of the belief that most languages have their own calling
>>conventions.
>Yes, but that doesn't preclude them all using the same internal
>mechanism to actually make the call.

In the quest for ever more performance, this isn't done, because the
semantics of a language are taken advantage of to minimize function-call
overhead, which is frequently THE major cost in a program.  For example,
in Pascal the callee knows how many parameters there were on the stack,
so the callee can do the stack cleanup (on the 8086 there is a special
instruction for this).  For C, the callee doesn't know how many
parameters there are, so the caller must clean up the stack.  I presume
that uSoft chose Pascal calling sequences for OS/2 and Windows to take
advantage of the improved function-call efficiency possible.

>Another possibility is for the caller to pass a flag indicating
>whether the arguments are values or references.

No compiler vendor would ever implement this unless it is done in
hardware.  There are too many benchmark wars.  As a programmer, I also
wouldn't want the overhead of this; my programs are too slow anyway
(they take a finite time!).

There is a hidden gem in the ANSI C spec.  That is, all functions are
presumed to have a fixed number of parameters unless they are
specifically prototyped as having variable args, and that prototype must
appear before any use of the function.  The big advantage here is that
now the compiler can select the most efficient function-call sequence on
a case-by-case basis.  This can be a big win, and would eliminate a
major reason for coding in assembly (arguments could be passed in
registers!).  The only problem would be backwards compatibility with
programs that don't prototype their varargs functions and/or don't
#include the proper .h files.

By the way, when are the unix compilers going to be updated to support
function prototyping?  Unix compilers are starting to look primitive
compared to PC compilers...  :-)
guy@gorodish.Sun.COM (Guy Harris) (04/05/88)
> There is a hidden gem in the ANSI C spec.  That is, all functions are
> presumed to have a fixed number of parameters unless they are specifically
> prototyped as having variable args, and that prototype must appear before
> any use of the function.  The big advantage here is that now the compiler
> can select the most efficient function call sequence on a case-by-case
> basis.  This can be a big win, and would eliminate a major reason for
> coding in assembly (arguments could be passed in registers!).

Umm, the lack of function prototyping certainly didn't stop *us* from
passing arguments in registers; I don't know if MIPS's or Pyramid's
compilers have it, but if they don't it didn't stop them either.  If the
compiler can somehow detect, when compiling a function, that the
function is intended to be called in "varargs" form, it can have it dump
the requisite arguments onto the stack.  In our compiler, it detects
this by having "va_alist" be a #define for "__builtin_va_alist", and
having the compiler recognize that magic name; I think MIPS detects
attempts to take the address of an argument.  I don't know how Pyramid
handles varargs.  Our scheme only requires functions that "cheat" to
stop doing so and use "varargs"; MIPS's may not even require that.
greg@csanta.UUCP (Greg Comeau) (04/05/88)
In article <2521@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
>In article <7595@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>)
>writes (about the "cdecl" keyword):
>>Of all the useless additions to C, this one has to take the cake!
>>#define cdecl /*nothing*/
>
>The real usefulness of cdecl comes when one is doing mixed language
>programming.

Right.  Another equally valid reason for having cdecl and pascal is that
you may choose to use the pascal modifier on your C routine even though
you do not have any mixed-model code.  Why would you want to do this?
Because C builds the stack with the current arguments every time it goes
to do a subroutine call, and it does not know or care how many arguments
there actually are.  Pascal does.

So?  Well, this means that the overhead for pascal function calls during
execution is lower, because pascal-type functions can clean up the stack
when they are finished with a 'ret #bytes' instruction.  With the
"standard" C calling conventions, the caller usually does the cleaning
up with an increment (decrement?) of the sp *after* the call to the
function.

Result is: faster & smaller code.  Not too often that you can kill them
two birds with the same stone.
greggy@infmx.UUCP (greg yachuk) (04/06/88)
In article <213@ateng.UUCP>, chip@ateng.UUCP (Chip Salzenberg) writes:
> In article <185@premise.ZONE1.COM> chang@premise.ZONE1.COM (John Chang) writes:
> >However, the right to left parameter passing takes more code space
> >(one extra instruction to fix the stack after a function call) and
> >hence takes longer to execute than 'pascal' calling convention
    ?????????????????????????????
> >(left-to-right parameter passing).
>
> The Pascal calling sequence is faster than C, but for a different reason.
> The Pascal calling sequence is not faster because of its backwards (:-])
> order, but because it does not allow variadic functions.  Since the called
> function is always called with the same number of arguments, it can clean
  ??????????????????????????????????????
> up the stack when it returns instead of making the caller do it.

I realize that this is somewhat of a digression from the original
posting, and that I might be stating the obvious, but I figured that I
should get my 2000 lira in sometime.

My first thought was that both of the above posters were wrong, but when
put in the context of the IBM PC (see newsgroup line), it seems that
they are both sort-of correct.  The reason that one is faster than the
other is not because of anything inherent in the order of pushing
parameters, or in who can clean up the stack, but rather in the 80x8x
instruction set.

The 80x8x return instruction can take an argument which specifies how
many bytes are to be added to the stack pointer after popping the return
address.  This is (probably) faster than returning and then increasing
the stack pointer as two separate instructions (as is usually generated
by C compilers).  Since this argument is a constant word that follows
the RET instruction (in the OBJ or EXE), it must be fixed.  Hence,
Pascal can take advantage of it, but C cannot (because of varargs)
(except for very clever compilers :-).

If it wasn't for this instruction (or on machines that do not provide
it), the Pascal model would have to pop the return address, pop the args
off the stack, and then return (either through an indirect jump, or by
again pushing the return address, and then returning).  This would
probably take more execution time than using the C model, and possibly
even more code.  So while both of the above posters are correct about
who can clean up the stack and when, it is only because of the RET
instruction that the Pascal model can result in smaller code AND faster
execution.

As an aside, I attended a Microsoft Windows course last summer, and the
instructor mentioned that for Windows Release 1.0x, they compiled the
system with C and Pascal calling conventions, and the latter was 17%
smaller.  This makes sense if some functions are called more than once.
However, smaller is not necessarily the same as faster (it is in this
case, I hope :-), and if it were not for this Intel RET instruction, it
could actually be slower.

Greg Yachuk  Informix Software Inc., Menlo Park, CA  (415) 322-4100
pyramid!infmx!greggy	!yes, I chose that login myself, wutzit tooya?
So, like, uh, where do you guys get all these way cool .sig's, anyways?
tanner@ki4pv.uucp (Dr. T. Andrews) (04/08/88)
In article <185@premise.ZONE1.COM>, chang@premise.ZONE1.COM (John Chang) writes:
) The calling convention we're talking about here isn't call-by-name or
) call-by-value. It's simply the order in which parameters are passed
) on the stack. And in C, that order *must* be right to left in order
) to support variable length functions like printf.
Not true. I remember many dark moons ago (real-world example, theory
experts please ignore) a compiler, on a z-80 yet, which pushed its
args left-to-right. It handled [a pre-determined list of] variadic
functions as special cases, pushing (invisibly to the user) an extra
word containing the argument count.
--
{allegra clyde!codas decvax!ucf-cs ihnp4!codas killer}!ki4pv!tanner
wes@obie.UUCP (Barnacle Wes) (04/11/88)
In article <7595@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>)
writes (about the "cdecl" keyword):
| Of all the useless additions to C, this one has to take the cake!
| #define cdecl /*nothing*/

In article <2521@bsu-cs.UUCP> dhesi@bsu-cs.UUCP (Rahul Dhesi) writes:
> The real usefulness of cdecl comes when one is doing mixed language
> programming.

In article <121@csanta.UUCP>, greg@csanta.UUCP (Greg Comeau) writes:
% [...] this means that the overhead for pascal function calls during
% execution is faster because pascal type functions can clean up the
% stack when they are finished with a 'ret #bytes' instruction.  With the
% "standard" C calling conventions, the caller usually does the cleaning
% up with an increment (decrement?) of the sp *after* the call to the function.
%
% Result is: faster & smaller code.  Not too often that you can kill them two
% birds with the same stone.

So why should we force these MS-DOSisms on the rest of the C-speaking
world?  This is yet another kludge around the Intel iAPX?86 architecture
(or lack thereof :-), just like "near" and "far".  Tell me, what does
"near" mean on a 68000 system?

I can call C routines from the Unix fortran compiler WITHOUT declaring
the C routine to be of "fortran calling sequence."  Let's not put these
kludges in a C language standard.  If your programming environment needs
them, let them be (non-portable) extensions to the standard needed to
support your (non-) operating system!
-- 
   /\             - "Against Stupidity,  -    {backbones}!
  /\/\  .   /\    -  The Gods Themselves -  utah-cs!utah-gr!
 /    \/ \/\/  \  -  Contend in Vain."   -  uplherc!sp7040!
/ U i n T e c h \ -       Schiller       -      obie!wes
flaps@utcsri.UUCP (Alan J Rosenthal) (04/11/88)
chang@premise.ZONE1.COM (John Chang) wrote that in C parameters must be
pushed in right-to-left order to support functions with a variable
number of arguments (henceforth known as `variadic' functions).

tanner@ki4pv.uucp (T. Andrews) cited a compiler which pushes its
arguments left-to-right, handling a pre-determined list of variadic
functions (i.e. including printf()) as special cases, pushing (invisibly
to the user) an extra word containing the argument count.

My comment is: Such a compiler would not correctly compile all correct C
programs.  In C it is permissible to call any function (including
user-defined ones) with the wrong number of arguments, so long as any
arguments not actually passed are not accessed by the function being
called.  The accessing can be done in printf style, although variable
types and unlimited numbers of arguments are not supported.  (It is also
possible to use the <varargs.h> package on compilers which support it,
but presumably the cited compiler did not, which is fine.  The varargs.h
package provides functionality like that required to implement printf,
at the expense of additional awkwardness.)

The new ANSI C standard will change all this (as well as nearly
everything else!).  It prohibits calls to variadic functions except in
the scope of a declaration which says that that function is variadic; I
believe that it also requires that varargs.h be used (but they've
changed the name).  This new rule will validate the cited compiler's
behaviour if it is upgraded to be an ANSI standard C compiler, and also
sort of retroactively validates its previous behaviour by saying that
the old standard's allowing of implicitly variadic functions wasn't very
important.

ajr
gwyn@brl-smoke.ARPA (Doug Gwyn ) (04/12/88)
In article <5980@utcsri.UUCP> flaps@utcsri.UUCP (Alan J Rosenthal) writes:
>In C it is permissible to call any function (including
>user-defined ones) with the wrong number of arguments, so long as any
>arguments not actually passed are not accessed by the function being
>called.

Not true in general.  If you read Dennis's anti-noalias diatribe, near
the end you may recall that he identified two botches in original C, one
of which was that printf()-like functions existed without adequate
language support.  The reason for the ",..." approach of ANSI C is to
provide adequate language support for variadic arguments.
karl@haddock.ISC.COM (Karl Heuer) (04/13/88)
In article <5980@utcsri.UUCP> flaps@utcsri.UUCP (Alan J Rosenthal) writes:
>tanner@ki4pv.uucp (T. Andrews) cited a compiler which [handles pre-defined
>variadic functions but not user-defined ones]
>
>Such a compiler would not correctly compile all correct C programs.  In C
>it is permissible to call any function (including user-defined ones) with
>the wrong number of arguments, so long as any arguments not actually
>passed are not accessed by the function being called.

I don't believe this has ever been guaranteed.  Dennis Ritchie recently
mentioned this as one of the internal inconsistencies of K&R C (variadic
functions are forbidden, yet printf exists).

Karl W. Z. Heuer (ima!haddock!karl or karl@haddock.isc.com), The Walking Lint
Followups to comp.lang.c only.
ray@micomvax.UUCP (Ray Dunn) (04/13/88)
In article <4259@cup.portal.com> Devin_E_Ben-Hur@cup.portal.com writes: > Please [Doug] don't be so quick to flame extensions that you don't > understand. In article <7606@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn) <gwyn>) writes: > Oh, I understand it all right. But it is a botch. cdecl acts as a > kuldge to counteract a more global kludge, rather than solving the > inter-language linkage problem directly at the interfaces involved. What do you suggest as an alternative? To put the onus on the *other* languages? Sure, that is possible, but who says 'C' has to be the reference language? The Microsoft approach is *not* a kludge, in fact within the MSDos environment the inter-language abilities are clean, useful, and fairly consistent. [Good grief, I'm actually defending Microsoft.....] >....The >80x86 world has no excuse for the mess it has gotten itself into. We all have our favourite rankings of machine architectures. They are usually in reverse order than the number of units sold (:-(. It is immaterial whether one likes or dislikes the architecture or "mess" of the 80x86 world or whatever it has "gotten itself into". It exists. It has to be lived with - although, in fact, when programming in other than assembler it *rarely* has to be given a second thought! Let us not hide our heads in the C-sand. The tools have to be provided. cdecl is such a tool. In fact, together with its pascal, fortran and associated keywords and the other inter-language features, it is part of a consistant and elegant solution to something that isn't even addressed in other architectures and operating systems (and I wont even name any n*mes). The full capabilities and libraries of various languages are open for mutual use by all. If you have been following the various C/Fortran/etc wars going on in comp.whatever you might appreciate why this might be important. 
If I could be so bold: from the documentary evidence of the exchange on this subject, at the time of Doug's initial one-line sarcastic dismissal of cdecl he either did *NOT* understand its full ramifications, or he still does not understand its usefulness or the power hybrid-language programming can provide.

> The very fact that you have to declare that the C source code is
> intended to follow C rules and not some other rules is ludicrous.

Yup, he doesn't understand.  Sure, on the fa'C'e of it, to have a declaration in 'C' saying "this is a C declaration" is daft!  In a wider context than the narrow one of 'C', however, it indeed makes sense.  If the "main" language is Pascal (or Fortran), then why not put the onus on the C parts to declare that they should be Pascal- (or Fortran-) like?

The inter-language interfaces *are* handled at the interfaces involved - the external variable and procedure interfaces!  Now this can be specified locally at individual interfaces, or *globally*, not only as a shorthand, but also, if you like, as lip service to the fact that Pascal is the reference language.  This Pascalitis can be overridden in specific instances by a cdecl, for the various good reasons alluded to in various other informative postings.

(I also happen to very much like the simultaneous code and space optimization of procedure calls available when the caller does not have to worry about parameter cleanup - something I've always thought of as a minor irritant in 'C'.)

Ray Dunn.
..{philabs,mnetor}!micomvax!ray
henry@utzoo.uucp (Henry Spencer) (04/13/88)
>... In C it
>is permissible to call any function (including user-defined ones) with the
>wrong number of arguments, so long as any arguments not actually passed are
>not accessed by the function being called.

Sorry, wrong.  You are confusing what you can get away with in certain common implementations (notably, on the VAX) with what is guaranteed by C.  With a few sort-of-kludged-in exceptions like printf, type and number of arguments must match if you want to be sure it will work.
-- 
"Noalias must go.  This is     | Henry Spencer @ U of Toronto Zoology
non-negotiable."  --DMR        | {allegra,ihnp4,decvax,utai}!utzoo!henry
gwyn@brl-smoke.ARPA (Doug Gwyn ) (04/14/88)
In article <982@micomvax.UUCP> ray@micomvax.UUCP (Ray Dunn) writes:
>What do you suggest as an alternative?  To put the onus on the *other*
>languages?  Sure, that is possible, but who says 'C' has to be the reference
>language?

Your entire response shows that you missed what I was saying.

Point 1:  There need not be a "reference language".  However, access to system-provided objects and functions (e.g. ROM facilities) may well be designed with a particular language's interface techniques in mind.  That by no means necessitates the concept of a single system-wide "reference language", however.

Point 2:  Each language implementation can provide its own extensions for interfacing to foreign objects and functions.  Thus "__pascal" would be a reasonable type qualifier to add to the C language in such an environment.

Point 3:  There is not a one-to-one mapping between C and other languages even at the external interface level, so it is inappropriate to apply a blanket mapping to an entire C source file.  A local mapping should be used, for example in the external declarations for foreign externs in a header.  I think the blanket mapping facility was most likely provided so that programmers didn't have to be aware of what they were doing, which in my experience is not an approach to be recommended.  The other likely reason would be to speed up the interface, but all that shows is that a poor choice was made for the C interface design.  It need not be slower than a Pascal interface.  People who wish to continue the discussion on this point are advised to read Bell Labs CSTR #102 before wasting any more time.

Point 4:  My home computer uses a scheme exactly as I have described; it even has a segmented architecture.  Don't accuse me of lack of experience!
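The "local mapping" of Point 3 might look like the following sketch. The `__pascal` qualifier here is hypothetical (it is the suggested spelling above, not a description of any shipping compiler), and the macro collapses it to plain C so the sketch compiles anywhere; `rom_add` and `parse_digit` are invented names:

```c
/* Foreign-convention externs are tagged locally, at their individual
 * declarations, rather than by a file-wide default.  "__pascal" is an
 * assumed vendor extension; without it, the macro leaves ordinary C. */
#ifndef HAS_PASCAL_QUALIFIER
#define __pascal /* nothing */
#endif

extern int __pascal rom_add(int a, int b);  /* foreign (Pascal) routine */
extern int parse_digit(char c);             /* ordinary C function      */

/* Stub definitions, purely so this sketch is self-contained: */
int __pascal rom_add(int a, int b) { return a + b; }
int parse_digit(char c) { return c - '0'; }
```

The rest of the source file never mentions the foreign convention at all, which is the point: only the declarations at the interface carry it.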
ok@quintus.UUCP (Richard A. O'Keefe) (04/14/88)
In article <982@micomvax.UUCP>, ray@micomvax.UUCP (Ray Dunn) writes [defending Microsoft's 'cdecl'].

If the goal is to provide intercallability between several languages using different calling conventions, one could do worse than to imitate Apollo.  Way back when, Apollo provided a Fortran/Pascal system, and made Fortran and Pascal use the same calling convention.  (Actually, this seems rather silly to me.  Why pass something by address when you can pass it by value?)  And then they added C.  So what did they do?  They added a std$_call keyword (I don't remember the exact spelling), so that you only have to provide this keyword when you are calling something which isn't C.  That's the right way to go.  Anything which requires a different calling convention couldn't have been part of a "portable" C program, because it couldn't have been written in C.  Requiring a special keyword just so a language can use its own calling convention seems, um, odd.

> In article <7606@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn) writes:
> > Oh, I understand it all right.  But it is a botch.  cdecl acts as a
> > kludge to counteract a more global kludge, rather than solving the
> > inter-language linkage problem directly at the interfaces involved.

Ray Dunn replied:
> What do you suggest as an alternative?  To put the onus on the *other*
> languages?

No, Doug Gwyn is right.  The "interfaces involved" are the extern declarations.  His point is that you should need a special keyword only when something in language X imports something from language Y, or exports something to language Y, where X != Y.  For example: file foodef.c defines a function foo() in C, file bazdef.p defines a function baz() in Pascal, and there are a C file see.c and a Pascal file pas.p using both functions.
One might write:

	/* in see.c */
	extern int foo(int, int);
	extern$pascal int baz(int, int*);

	(* in pas.p *)
	function baz(i: integer; var j: integer): integer; external;
	function foo(i: integer; j: integer): integer; external "C";

You see the point?  You don't need anything outside the standard for language X to call a subroutine written in language X.  It's only if you call some other language that the interface has to say which one.
daveb@geac.UUCP (David Collier-Brown) (04/14/88)
Parameter-passing conventions are something to be resolved at what we typically call link time - in fact, at the time the two routines are bound together, either before linking (at a cost to the compiler, as epitomized by Ada[1,2]) or during linking (at a higher cost to a typically kludgy linking program).  They should be neither programmer-visible nor interpreted at run time (as I once did on an iAPX-something-86).

--dave c-b

[1] Tm, The God of War (Ada Joint Program Office).
[2] It actually does just that in an early Honeywell implementation.
-- 
David Collier-Brown.                 | {mnetor yunexus utgpu}!geac!daveb
Geac Computers International Inc.,   | Computer Science loses its
350 Steelcase Road, Markham, Ontario,| memory (if not its mind)
CANADA, L3R 1B3 (416) 475-0525 x3279 | every 6 months.
tneff@atpal.UUCP (Tom Neff) (04/14/88)
Correct me if I'm wrong, but wasn't /cdecl/ added solely for the purpose of overriding the interface model when a user specifies (to compilers offering this) that a module is to be compiled with, say, Pascal interfaces?  The idea being that the vendor's RTL interfaces have to stay 'C'-compatible even if all the user's entry points use another model.

I thought that was why all the header files use /cdecl/ in the prototypes; I also thought that, for the same reason, it would normally be unnecessary for a user to add /cdecl/ to his own code unless he was writing a module with several different interface models at once.  (Sounds messy.)

I agree that this could be done via #pragma, and possibly should.

TMN
-- 
Tom Neff
franka@mmintl.UUCP (Frank Adams) (04/19/88)
In article <146@obie.UUCP> wes@obie.UUCP (Barnacle Wes) writes:
>So why should we force these MS-DOSisms on the rest of the C-speaking
>world? ...  If your programming environment needs them, let
>them be (non-portable) extensions to the standard needed to support
>your operating system!

That's exactly what they are!  I didn't hear anybody suggesting that they be standardized; the question that brought about the whole discussion was how to use them in the MS-DOS environment.

It is true that, with the (proposed) ANSI standard, one could define a calling convention for C on the 8086 which was as efficient as the Pascal convention: namely, the callee pops arguments for fixed-argument-list functions, and the caller pops for variadic functions.  But the C calling convention for the machine antedates the ANSI standard; and with K&R C, the calling program does not know whether a function is variadic.  Thus, there are good reasons for providing the pascal convention as a compilation default; and this necessitates a cdecl keyword.

(Yes, one *could* redefine the calling interface.  This would invalidate a lot of existing code -- assembler routines are quite common in 8086 products.)
-- 
Frank Adams                           ihnp4!philabs!pwa-b!mmintl!franka
Ashton-Tate          52 Oakland Ave North         E. Hartford, CT 06108
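The tradeoff can be sketched in source form.  The `pascal` and `cdecl` keywords are the Microsoft extensions under discussion; they are stubbed out here as empty macros so the sketch compiles as ordinary C, and the comments describe the conventions they would select on the real compiler:

```c
#include <stdarg.h>

/* Stand-ins for the MS C 5.x keywords, so this compiles anywhere.
 * On the real compiler they select the calling convention. */
#define pascal /* callee pops the arguments: smaller, faster call sites */
#define cdecl  /* caller pops: needed when the callee can't know the count */

int pascal add2(int a, int b)   /* fixed arity: callee-pop is safe here */
{
    return a + b;
}

int cdecl sum_n(int n, ...)     /* variadic: only the caller knows how
                                   much was pushed, so caller must pop */
{
    va_list ap;
    int s = 0;

    va_start(ap, n);
    while (n-- > 0)
        s += va_arg(ap, int);
    va_end(ap);
    return s;
}
```

With pascal as the compilation default, most functions get the cheaper convention, and cdecl marks the exceptions; that is exactly the asymmetry being argued about above.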
ray@micomvax.UUCP (Ray Dunn) (04/22/88)
In article <7682@brl-smoke.ARPA> (Doug Gwyn) writes:
>In article <982@micomvax.UUCP> ray@micomvax.UUCP (Ray Dunn) writes:
>>What do you suggest as an alternative?  To put the onus on the *other*
>>languages?  Sure, that is possible, but who says 'C' has to be the reference
>>language?
>
>Your entire response shows that you missed what I was saying.

Your entire response shows that you missed what I was saying.  Your pointed (:-) reply does not quote my posting which it is supposed to be rebutting, I presume because most of the points in it are not rebutted.

>Point 4: My home computer uses a scheme exactly as I have described; it
>	even has a segmented architecture.  Don't accuse me of lack of
>	experience!

Tch.  Tch.  Doug.  Now we are in fantasy land.  This is a fine example of going off at half-cock, as in the recent "goto's" fiasco.  I will pay a handsome reward to the first person who can point out any accusation of lack of *experience* in my posting.

I wonder what an analyst of Freudian bent would make of all of this?  Followups, perhaps, to alt.flame.

Ray Dunn.
..{philabs, mnetor}!micomvax!ray
Glenn_A_Story@cup.portal.com (04/23/88)
>In article <185@premise.ZONE1.COM>, chang@premise.ZONE1.COM (John Chang) wrote:
>> > [...] C uses pass-by-value for everything.
>>       ^^^^^^^^^^  What about arrays?
>In C, arrays are _never_ passed to functions.  Only the addresses of arrays.
>These are passed by value only.

Wait a minute!  What's the difference between "the address of 'x' passed by value" and "'x' passed by reference"?  I've always considered C's passing of addresses as a form of call by reference, especially in the case of arrays, where you don't even explicitly specify "address of" (&).

Regards,
Glenn
limes@sun.uucp (Greg Limes) (04/26/88)
In article <4737@cup.portal.com> Glenn_A_Story@cup.portal.com writes:
>Wait a minute!  What's the difference between "the address of 'x' passed
>by value" and "'x' passed by reference"?

It all depends on what the called function is expecting.  In the first, it is expecting an address, and is free to muck about with the formal parameter holding it; this is the convention used by C.  In the second, the function is expecting something of the same type as 'x', and if this value is changed, the value of 'x' in the caller (who is passing 'x') is changed.  If I remember correctly, this convention is used in FORTRAN and Pascal.

Thus, if I have a library function written in FORTRAN that expects an integer, my calling function written in C must call that function with a pointer to an integer; likewise, if I write a C function that will be called by a FORTRAN program, then if FORTRAN passes an integer, C must expect a pointer to an integer.
-- 
Greg Limes [limes@sun.com]			frames to /dev/fb
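The distinction is easy to demonstrate in C itself: the address is copied, so the callee can change what it points at, but not the caller's pointer variable.  A small sketch (function names invented for illustration):

```c
#include <stddef.h>

/* C's only mechanism: everything, including addresses, is passed
 * by value.  The address is a copy; what it points at is shared. */
void set_to_ten(int *p)
{
    *p = 10;     /* writes through the copied address: caller's int changes */
    p = NULL;    /* changes only this function's local copy of the address  */
}

/* Returns 1 if the caller's int changed but its pointer did not. */
int demo(void)
{
    int x = 5;
    int *q = &x;

    set_to_ten(q);
    return x == 10 && q == &x;
}
```

In a true pass-by-reference language, assigning to the formal parameter itself would also be visible in the caller; in C there is no way to express that.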
tainter@ihlpg.ATT.COM (Tainter) (04/26/88)
In article <50731@sun.uucp>, limes@sun.uucp (Greg Limes) writes:
> In article <4737@cup.portal.com> Glenn_A_Story@cup.portal.com writes:
>> Wait a minute!  What's the difference between "the address of 'x' passed
>> by value" and "'x' passed by reference"?
> In the first [address of X], the caller is expecting an address, and is free
> to muck about with the formal parameter holding it.

This is the answer to his question.  It gives the former a pure superset of the capabilities provided by the latter.  Of course, the syntax to use these two parameter types equivalently is different as well.

The rest of Greg Limes' response was tangential - simply a discussion of what makes the two types similar.

--j.a.tainter