nagel@ics.uci.edu (Mark Nagel) (06/10/88)
I have been using function prototypes a bit in LightspeedC for the Macintosh
and I was under the impression that the following was a correct prototype:
char *foo(char *c, int i);
for the function:
char *foo(c, i)
char *c;
int i;
{
/* function body */
}
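[For comparison, under the proposed ANSI rules the new-style definition matching that prototype repeats the parameter declarations in the function header. A minimal sketch; the body here is only a placeholder:]

```c
#include <assert.h>
#include <stddef.h>

/* Prototype, as it would appear in a header: */
char *foo(char *c, int i);

/* New-style (ANSI) definition: the header repeats the prototype's
   parameter declarations instead of listing names and then types. */
char *foo(char *c, int i)
{
    return (i > 0) ? c : NULL;   /* placeholder body */
}
```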
A friend of mine who is installing the GNU C compiler here tells me that the
ANSI standard has changed this so that the function body header must look
like the function prototype. The GNU documentation says that something like
the above is now an error. While I don't mind altering my habits to adapt to
the new way (if this is really the case), I am curious as to why this change
was adopted. It seems to me that prototypes were designed such that modules
compiled separately would be able to determine the proper arguments without
access to the actual source of the other module(s). What added benefit does
forcing duplication of the function prototype offer? Or have we misunderstood
the GNU documentation? I've read this group for quite a while and I thought
I would have heard about so drastic a change, but perhaps I missed it.
Mark D. Nagel
Department of Information and Computer Science, UC Irvine
nagel@ics.uci.edu (ARPA)
{sdcsvax|ucbvax}!ucivax!nagel (UUCP)
I'm not a graduate student, but I play one on TV...
gwyn@brl-smoke.ARPA (Doug Gwyn ) (06/11/88)
In article <654@orion.cf.uci.edu> nagel@ics.uci.edu (Mark Nagel) writes:
>A friend of mine who is installing the GNU C compiler here tells me that the
>ANSI standard has changed this so that the function body header must look
>like the function prototype.  The GNU documentation says that something like
>the above is now an error.

It's not an error, so long as the argument types are what you would get
from the default argument promotions (as they are in this case).
However, it is not a good idea to mix new-style (prototype) and
old-style function syntax.  The proposed ANSI C provides rules for
resolving the collision of these two ways of doing business, but that's
intended for grandfathering in existing code, not to be used for new
code.  New code should consistently use prototype style for both
declarations and definitions.
gunars@spsspyr.UUCP (Gunars V. Lucans) (06/14/88)
In article <8073@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>... New code should consistently use prototype style for both declarations
>and definitions.

Can anyone suggest how to code new-style prototype definitions in a
manner that would be portable to older compilers?  For declarations,
the following would suffice (from T. Plum's "Reliable Data Structures
in C"):

#ifdef PROTO_OK
#define PARMS(x) x
#else
#define PARMS(x) ()
#endif

void foo PARMS( (int arg1, char *arg2) );

Definitions are another matter.  Is there an alternative (other than
not using prototypes at all) to:

void foo (
#ifdef PROTO_OK
int
#endif
arg1,
#ifdef PROTO_OK
char *
#endif
arg2)
{
	<body>
}

What is the general level of compliance to the proposed standard in
existing UNIX compilers?  How soon can we expect the majority of them
to be ANSI conforming, given that the market for UNIX compilers is
different than that for MS-DOS compilers?
____________________________________________________________________________
Gunars V. Lucans -- SPSS Inc, Chicago -- ...!att!chinet!spsspyr!gunars
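[Plum's PARMS device can be spelled out end to end as below. This is a sketch: PROTO_OK would normally be set by the build for compilers that accept prototypes; tying it to __STDC__ here lets the same code compile either way, and the function count_upto is invented for illustration.]

```c
#include <assert.h>

#ifdef __STDC__
#define PROTO_OK           /* stand-in for a build-time setting */
#endif

#ifdef PROTO_OK
#define PARMS(x) x
#else
#define PARMS(x) ()
#endif

/* Declaration: a checked prototype on new compilers, "()" on old ones. */
int count_upto PARMS( (char *s, int stop) );

/* The definition cannot use PARMS; it needs its own #ifdef. */
#ifdef PROTO_OK
int count_upto(char *s, int stop)
#else
int count_upto(s, stop)
char *s;
int stop;
#endif
{
    int n = 0;

    /* count characters before the first occurrence of stop */
    while (s[n] != '\0' && s[n] != stop)
        n++;
    return n;
}
```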
chris@mimsy.UUCP (Chris Torek) (06/14/88)
In article <273@spsspyr.UUCP> gunars@spsspyr.UUCP (Gunars V. Lucans) writes:
>... For declarations, the following would suffice (from T.Plum's ...
>[magic PARMS macro]
>	void foo PARMS( (int arg1, char *arg2) );

I use something like this already (although I used

/*
 * A rather ugly way to hide prototypes from the old compiler.
 */
#ifndef _PROTO_
#if defined(__STDC__) || defined(c_plusplus) || defined(__cplusplus)
#define _PROTO_(x) x
#else
#define _PROTO_(x) ()
#endif
#endif

int isalpha _PROTO_((int _c));

etc.).

>Definitions are another matter.

Indeed.

>Is there an alternative (other than not using prototypes at all) to:
>
>	void foo (
>	#ifdef PROTO_OK
>	int
>	#endif
>	arg1,
>	#ifdef PROTO_OK
>	char *
>	#endif
>	arg2)
>	{
>	<body>
>	}

This is missing one section:

#ifndef PROTO_OK
int arg1;
char *arg2;
#endif

I think I prefer

#ifdef PROTO_OK
void foo(int arg1, char *arg2)
#else
void foo(arg1, arg2)
int arg1;
char *arg2;
#endif
{

even if it does name everything twice (or three times!).
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain: chris@mimsy.umd.edu	Path: uunet!mimsy!chris
Devin_E_Ben-Hur@cup.portal.com (06/15/88)
Someone asked about how to use the pre-processor to support pre-dpANS
function definitions.  Try this (admittedly ugly) scheme:

#ifdef __STDC__
#define _PROTO1_(t1,a1) (t1 a1)
#define _FNCDF1_(t1,a1) (t1 a1)
#define _PROTO2_(t1,a1,t2,a2) (t1 a1, t2 a2)
#define _FNCDF2_(t1,a1,t2,a2) (t1 a1, t2 a2)
/* ... _xxxxxN_ */
#else
#define _PROTO1_(t1,a1) ()
#define _FNCDF1_(t1,a1) (a1) t1 a1;
#define _PROTO2_(t1,a1,t2,a2) ()
#define _FNCDF2_(t1,a1,t2,a2) (a1,a2) t1 a1; t2 a2;
/* ... _xxxxxN_ */
#endif

extern int foo _PROTO2_(int,an_integer, char *,a_char_ptr);

int foo _FNCDF2_(int,an_integer, char *,a_char_ptr)
{
	/* ... */
}
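[The two-argument macros above can be exercised with a concrete function. A sketch: the function weight and its body are invented for illustration; on an ANSI compiler (__STDC__ defined) both macros expand to prototype syntax, on an older one to the old-style forms.]

```c
#include <assert.h>

#ifdef __STDC__
#define _PROTO2_(t1,a1,t2,a2) (t1 a1, t2 a2)
#define _FNCDF2_(t1,a1,t2,a2) (t1 a1, t2 a2)
#else
#define _PROTO2_(t1,a1,t2,a2) ()
#define _FNCDF2_(t1,a1,t2,a2) (a1,a2) t1 a1; t2 a2;
#endif

/* Declaration and definition share one spelling of the argument list. */
extern int weight _PROTO2_(int,units, char *,label);

int weight _FNCDF2_(int,units, char *,label)
{
    /* invented body: units times the length of label */
    int len = 0;

    while (label[len] != '\0')
        len++;
    return units * len;
}
```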
gwyn@brl-smoke.ARPA (Doug Gwyn ) (06/15/88)
In article <273@spsspyr.UUCP> gunars@spsspyr.UUCP (Gunars V. Lucans) writes:
>Definitions are another matter.  Is there an alternative (other than not
>using prototypes at all) to:
>	void foo (
>	#ifdef PROTO_OK
>	int
>	#endif
>	arg1,
>	#ifdef PROTO_OK
>	char *
>	#endif
>	arg2)
>	{
>	<body>
>	}

The above isn't even correct.  Try

#if __STDC__
void foo( int arg1, char *arg2 )
#else
void foo( arg1, arg2 )
int arg1;
char *arg2;
#endif
	{
	/* body */
	}

>What is the general level of compliance to the proposed standard in existing
>UNIX compilers?  How soon can we expect the majority of them to be ANSI
>conforming, given that the market for UNIX compilers is different than that
>for MS-DOS compilers?

Obviously AT&T must be planning to release an ANSI-compatible C compiler
as soon as they can after the standard stabilizes.  Give vendors who use
the AT&T CCS as a base about 6 months to a year after that to pick it
up.  Those who base their compiler on 4BSD PCC have a harder task ahead
of them, although Chris Torek and others have been trying to bring the
4BSD CCS up to ANSI/POSIX standards (not done yet).  GNU CC is already
almost there.  Most other C vendors I know of are preparing ANSI C
releases.  My guess is that two years after the official standard you'll
be able to obtain a Standard-conforming implementation for practically
all systems worth worrying about.
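[Gwyn's pattern, applied to a complete small function so it can be compiled as given. The function clamp and its body are placeholders for illustration; a modern compiler defines __STDC__ and takes the prototype branch.]

```c
#include <assert.h>

#if __STDC__
int clamp(int v, int lo, int hi)
#else
int clamp(v, lo, hi)
int v;
int lo;
int hi;
#endif
{
    /* force v into the range [lo, hi] */
    if (v < lo)
        return lo;
    if (v > hi)
        return hi;
    return v;
}
```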
throopw@xyzzy.UUCP (Wayne A. Throop) (06/15/88)
> chris@mimsy.UUCP (Chris Torek)
>> gunars@spsspyr.UUCP (Gunars V. Lucans)
>>... For declarations, [...]
>>	void foo PARMS( (int arg1, char *arg2) );
> I use something like this already [...]
>>Definitions are another matter.
> Indeed.
>>Is there an alternative (other than not using prototypes at all) to:
>>	[... using ifdef around each argument of a function ...]
> I think I prefer [... to put the ifdefs around the whole definition ...]
>	#ifdef PROTO_OK
>	void foo(int arg1, char *arg2)
>	#else
>	void
>	foo(arg1, arg2)
>	int arg1;
>	char *arg2;
>	#endif
>	{
> even if it does name everything twice (or three times!).

I defined three macros, PT (prototype), PF (prototype formals), and PP
(prototype punctuation).  Thus, for declarations, the familiar

void *foo PT(( char *, int ));

and for the more troublesome definitions, the odd-looking-at-first

void foo PF(( cp, i),
char *cp PP
int i PP
)
{
	/* body */
}

This has the same amount of redundancy as the old-style declarations.
On the other hand, it has the new requirement that the variables be in
the correct order in both lists instead of just in the first, but this
doesn't seem too onerous.  I'll leave the definitions of the macros as
exercises for the reader... they aren't too difficult.

I do have one question for the net, however.  Clearly, we can see that
PP expands to either a comma or a semicolon.  Are there any
preprocessors that would barf on this?  I haven't discovered any so
far, knock plastic.
--
Nature abhors a hero. --- Solomon Short
Wayne Throop      <the-known-world>!mcnc!rti!xyzzy!throopw
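[One plausible set of definitions for Throop's PT, PF, and PP macros. This is a guess at the intent, not his actual code, and it differs from his example in one detail: here the closing parenthesis (or final semicolon) is produced by PF itself, so no trailing PP is written before the close. The function skip is invented for illustration.]

```c
#include <assert.h>

#ifdef __STDC__
#define PT(args) args               /* keep the prototype list */
#define PP ,                        /* separator inside a prototype */
#define PF(names, decls) ( decls )  /* definition uses the typed list */
#else
#define PT(args) ()
#define PP ;                        /* separator between old-style decls */
#define PF(names, decls) names decls ;
#endif

/* Declaration: */
char *skip PT(( char *, int ));

/* Definition: name list first, then the typed list joined by PP. */
char *skip PF(( s, n ),
char *s PP
int n )
{
    /* advance at most n characters, stopping at the terminator */
    while (n-- > 0 && *s != '\0')
        s++;
    return s;
}
```

Under __STDC__ this expands to `char *skip ( char *s , int n )`; otherwise to the old-style `char *skip ( s, n ) char *s ; int n ;`.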
scs@athena.mit.edu (Steve Summit) (06/17/88)
In article <8073@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>However, it is not a good idea to mix new-style (prototype) and old-style
>function syntax.  The proposed ANSI C provides rules for resolving the
>collision of these two ways of doing business, but that's intended for
>grandfathering in existing code, not to be used for new code.  New code
>should consistently use prototype style for both declarations and
>definitions.

How important is this?  I'm starting to use prototypes for external
declarations, because they're easy to #ifdef out and they give me some
(not all :-( ) of the advantages of lint on systems without it, but I'm
going to use the old style in function definitions (i.e. the thing that
sits at the top of the actual body of the function) for quite a while,
to ensure portability to non-ANSIfied compilers (such as my '11 at home
which is not likely to get an ANSI-style compiler, ever).

How much, and why, are old-style declarations disparaged in new-style
code?  Microsoft's compiler warns you (with the maximum warning level
set); I wish it wouldn't, because then the maximum warning level (/W3)
would be almost as good as lint.

Is it wrong for a compiler to construct some kind of Miranda prototype,
when an old-style function definition is seen, which could be used for
lint-style argument mismatch warnings (*not* ANSI-style implicit
argument coercions)?

Steve Summit
scs@adam.pika.mit.edu
gwyn@brl-smoke.ARPA (Doug Gwyn ) (06/18/88)
In article <5808@bloom-beacon.MIT.EDU> scs@adam.pika.mit.edu (Steve Summit) writes:
>How much, and why, are old-style declarations disparaged in new-style code?

Mixing of old- and new-style declarations/definitions should work okay
so long as the functions take a fixed number of arguments, each of
which has a type that is not changed by the old-style argument widening
rules.  The main reason that the interface is allowed to differ between
old- and new-style functions is to permit implementations to avoid
widening, for example of float to double or of char to int.  It also
makes for stricter type checking.

>Is it wrong for a compiler to construct some kind of Miranda prototype, when
>an old-style function definition is seen, which could be used for lint-style
>argument mismatch warnings (*not* ANSI-style implicit argument coercions)?

The compiler can issue any warnings it wishes, but to be Standard
conforming it must follow the rules (i.e. produce valid code from a
valid program).
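[The widening behavior described above can be observed on a modern compiler by comparing a prototyped function with a variadic one, since a variadic call applies the same default promotions as an unprototyped one. A sketch; the function names are invented:]

```c
#include <assert.h>
#include <stdarg.h>

/* With a prototype in scope, the argument is converted to the declared
   parameter type, so a float parameter really receives a float: */
float half(float f)
{
    return f / 2.0f;
}

/* In a variadic call the default promotions apply: a float argument
   travels as double and must be fetched as double on the other side. */
double first_vararg(int n, ...)
{
    va_list ap;
    double d;

    va_start(ap, n);
    d = va_arg(ap, double);   /* not va_arg(ap, float) */
    va_end(ap);
    return d;
}
```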
henry@utzoo.uucp (Henry Spencer) (06/19/88)
> How much, and why, are old-style declarations disparaged in > new-style code? ... Officially, the old-style declarations are in the standard primarily for backward compatibility, with a strong hint that they will eventually be removed. In practice, it will be many years before implementors even think seriously about rejecting them. -- Man is the best computer we can | Henry Spencer @ U of Toronto Zoology put aboard a spacecraft. --Von Braun | {ihnp4,decvax,uunet!mnetor}!utzoo!henry