[comp.std.c] ANSI prototypes, the right choice...

peter@sugar.hackercorp.com (Peter da Silva) (02/05/91)

For all you Lattice-C programmers, I have a little hint for writing more
portable programs:

	int foo(int a, int b);

This is *not* compatible with a function declared:

	int foo(a, b)
	int a, b;
	{
		...
	}

While some compilers will accept this declaration while this prototype is
in scope, it is not portable and should at least generate a warning. If you
use a prototype, *define* the function in prototype style as well. If you
don't want to, then leave the prototype out too.
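
For instance, the consistent all-prototype version looks something like
this (the function body is just for illustration):

	int foo(int a, int b);

	int foo(int a, int b)
	{
		return a + b;
	}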
-- 
Peter da Silva.   `-_-'
<peter@sugar.hackercorp.com>.

giguere@csg.uwaterloo.ca (Eric Giguere) (02/05/91)

In article <7708@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
>For all you Lattice-C programmers, I have a little hint for writing more
>portable programs:
>
>	int foo(int a, int b);
>
>This is *not* compatible with a function declared:
>
>	int foo(a, b)
>	int a, b;
>	{
>		...
>	}

Strictly speaking, this is not true.  What you are describing must be
a Lattice-specific warning.  Problems WOULD arise if you had the
following declarations:

    int foo( char a, char b );

    int foo( a, b )
      char a, b;
      {
        ...
      }

The prototype won't agree because the "char a, b" in the second declaration
will get promoted to "int a, b" using the old-style default promotion rules.
(I.e., char & short --> int, float --> double). 
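
If you really want to keep the old-style definition with its char
parameters, the prototype that agrees with it is the promoted one,
something like:

    int foo( int a, int b );   /* matches "int foo(a, b) char a, b;" */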

If you're not going to be using prototypes all the time, then any function
you define in the new style, such as:

	int fubar( char a, float b, int c )
      {
      }

should actually look like:

	int fubar( int a, double b, int c )
      {
      }

If you take a look at the ANSI standard library you'll see they made all
the library functions look like this so that lack of prototypes won't
bite you...
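
Look at <ctype.h> or <math.h>, for instance -- the declarations there use
only the widened types, roughly:

    int    toupper( int c );            /* not char c   */
    double pow( double x, double y );   /* never float  */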

--
Eric Giguere                                       giguere@csg.UWaterloo.CA
           Quoth the raven: "Eat my shorts!" --- Poe & Groening

gwyn@smoke.brl.mil (Doug Gwyn) (02/06/91)

In article <7708@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
>	int foo(int a, int b);
>This is *not* compatible with a function declared:
>	int foo(a, b)
>	int a, b;
>	{
>	}
>While some compilers will accept this declaration while this prototype is
>in scope, it is not portable and should at least generate a warning.

No, this is perfectly okay.  If Lattice C has a problem with it, it is a
bug in Lattice C.

There ARE function interfaces where the "old style" definition would not
be compatible with a prototype using the same types, but only when default
argument promotions change the type, which is not the case for int.
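
For example (f and g being whatever functions you like):

	int f(int, int);	/* fine with:  int f(a, b) int a, b; {...}    */
	int g(char, float);	/* NOT fine with "char a; float b;" -- those
				   promote to int and double in an old-style
				   definition */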

peter@sugar.hackercorp.com (Peter da Silva) (02/09/91)

In article <1991Feb5.155953.26790@maytag.waterloo.edu> giguere@csg.uwaterloo.ca (Eric Giguere) writes:
> Strictly speaking, this is not true.

Point. But...

>     int foo( char a, char b );

>     int foo( a, b )
>       char a, b;
>       {
>         ...
>       }

Yeh, stuff like that. Apparently Lattice C for the Amiga accepts this and
silently fixes it up. Some UNIX compilers seem to do the same thing -- I
keep getting UNIX and Lattice-C/Amiga source that looks like this, so I get
to go through fixing up all the definitions before it will compile. It's
getting old.

Please... folks... write your code consistently or don't bother.
-- 
Peter da Silva.   `-_-'
<peter@sugar.hackercorp.com>.

scs@adam.mit.edu (Steve Summit) (02/09/91)

In article <7708@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
[mixture of new-style prototype declaration and old-style definition]
>While some compilers will accept this declaration while this prototype is
>in scope, it is not portable and should at least generate a warning.

In article <15089@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
>No, this is perfectly okay.  If Lattice C has a problem with it, it is a
>bug in Lattice C.

Not necessarily a bug, although I'd probably be annoyed by it.
Old-style declarations and definitions are both officially
obsolescent (sections 3.9.4 and 3.9.5); subject to deletion, at
X3J11's whim, in a future version of the Standard.  As the
Rationale says,

	Characterizing the old style as obsolescent is meant to
	discourage its use, and serve as a strong endorsement by
	the Committee of the new style.  It confidently expects
	that approval and adoption of the prototype style will
	make it feasible for some future C Standard to remove the
	old syntax.

K&R2 additionally asserts (p. 202) that "Mixtures are to be
avoided if possible."  Since compilers may issue any warning
messages they want to, I suspect that Lattice is just trying to
prod people towards the Party line.

(I'm actually not trying to defend the deprecation of the old
style; in fact I prefer that style.  Prototypes don't do anything
for me that lint doesn't do, and they're a pain to use during
transition.  But it's clear they're here to stay, so you don't
need to chastise me for being hidebound.)

                                            Steve Summit
                                            scs@adam.mit.edu

peter@sugar.hackercorp.com (Peter da Silva) (02/11/91)

In article <1991Feb9.075215.26939@athena.mit.edu> scs@adam.mit.edu writes:
> Since compilers may issue any warning
> messages they want to, I suspect that Lattice is just trying to
> prod people towards the Party line.

You have it backwards. Lattice accepts mixtures. No other Ansi-compatible
compiler I've used does... including Manx. Most of them take advantage of
the prototypes to generate faster, more efficient function calls. If you
mix declarations you *will* break code. If not now with your current compiler
then eventually with a future compiler.

Please, folks. It gets tiring fixing broken code... either go all the way
with ANSI or stick with K&R.
-- 
Peter da Silva.   `-_-'
<peter@sugar.hackercorp.com>.

henry@zoo.toronto.edu (Henry Spencer) (02/12/91)

In article <1991Feb11.030811.25074@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
>You have it backwards. Lattice accepts mixtures. No other Ansi-compatible
>compiler I've used does...

How curious; an ANSI-conforming compiler has to accept mixtures.  Given
some attention to parameter types, a program which prototypes a function and
then gives an old-style definition of it is completely, 100% ANSI-conforming,
and any compiler which refuses to accept it is not.
-- 
"Read the OSI protocol specifications?  | Henry Spencer @ U of Toronto Zoology
I can't even *lift* them!"              |  henry@zoo.toronto.edu  utzoo!henry

dhesi%cirrusl@oliveb.ATC.olivetti.com (Rahul Dhesi) (02/12/91)

My experience is that if you want to keep your code reasonably
portable to non-ANSI-C environments but also take advantage of
the additional compile-time checking provided by function
prototypes, then the best way to do this is mix the old and
new styles like this:

     int myfunc PARAMS((int i, char *p));
     ...

     int myfunc (i, p)
     int i;
     char *p;
     {...}
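
Here PARAMS is the usual hide-the-prototypes macro; a minimal definition
runs something like this (the exact spelling varies from package to
package):

#ifdef __STDC__
#define PARAMS(args) args
#else
#define PARAMS(args) ()
#endif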

What's good about this style is:  (a) It is compatible with both ANSI
and non-ANSI C environments;  (b) In an ANSI C environment, most
reasonable compilers will give you all of the intra-file checking that
you get with new-style definitions; (c) In a UNIX environment lint is
still fully usable and will do cross-file checking (which ANSI C
compilers can't do).

The only disadvantage of the above strategy is that it does not take
advantage of the slight increase in efficiency that you get if you have
char or float parameters and they are declared in the new style so that
runtime widening is not needed.  I don't think this advantage is enough
to sacrifice portability by using new-style definitions.
--
Rahul Dhesi <dhesi%cirrusl@oliveb.ATC.olivetti.com>
UUCP:  oliveb!cirrusl!dhesi

scs@adam.mit.edu (Steve Summit) (02/12/91)

In article <1991Feb11.030811.25074@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
>In article <1991Feb9.075215.26939@athena.mit.edu> scs@adam.mit.edu writes:
>> Since compilers may issue any warning
>> messages they want to, I suspect that Lattice is just trying to
>> prod people towards the Party line.
>
>You have it backwards. Lattice accepts mixtures. No other Ansi-compatible
>compiler I've used does... including Manx. Most of them take advantage of
>the prototypes to generate faster, more efficient function calls. If you
>mix declarations you *will* break code.

I have what backwards?

You'll have to provide an example of this code that *will* break.
If I have

	f(i, f, d)
	int i;
	float f;
	double d;

and somewhere else I invoke

	f(1, 2., 3.);

it's going to work, regardless of whether I have a prototype for
f in scope or not, and regardless of any optimizations the
compiler is trying to do.

The standard (section 3.3.2.2) makes it clear that function calls
without prototypes in scope are to be treated as they have always
been:

	If the expression that denotes the called function has a
	type that does not include a prototype, the integral
	promotions are performed on each argument[,] and
	arguments that have type float are converted to double.

A simpleminded, incorrect prototype for f such as

	extern f(int, float, double);

could certainly cause trouble, as it would allow the compiler
to pass the second argument unwidened as a float, but this is not
a consistent prototype for f.  The correct prototype for the
above definition of f, if you want to mix things, is

	extern f(int, double, double);

This particular issue (the confusing appearance of prototypes for
old-style function definitions containing float parameters) is
discussed in the comp.lang.c Frequently Asked Questions list.

A compiler's "efficient" calling sequences do not enter into this
question.  Calls to functions without prototypes in scope are
made as if a prototype were spontaneously invented, based only on
information visible in the call, and taking into account the
float => double promotion rule quoted above.  (I have heard
people speak of a clause explicitly mentioning this on-the-fly
prototype construction, but it must have been in an earlier
draft.)

Typically, the case in which the "efficient" calling sequences
cannot be used is functions with a variable number of arguments.
However, ANSI C explicitly requires that these functions be
defined using new-style syntax and called with prototypes in
scope.  Old-style functions may be assumed to be fixed-argument.
(This is actually consistent with the original definition of C;
printf has always been an anomaly.)  X3.159 section 3.3.2.2
(quoted above) continues (still talking about called functions
without prototypes):

	If the number of arguments does not agree with the number
	of parameters, the result is undefined.

This makes it clear that a compiler may (and in fact, should) use
its efficient calling sequences when prototypes are not in scope.
The variable upon which to base the choice of function-call
linkage is not the presence or absence of prototypes, but the
presence or absence of variable-length argument lists.  (I
suspect that it is a misunderstanding of this point that is
causing the confusion.)

None of this is accidental; the requirement that functions with
variable numbers of arguments be called with a prototype in scope
was made precisely so that a compiler could use one function
calling mechanism for them, and another one for all others.
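
For the record, a variadic function under ANSI C ends up looking something
like this (the name is invented, but the shape is what the standard
requires):

	#include <stdarg.h>

	int sum(int count, ...);	/* prototype in scope at every call */

	int sum(int count, ...)
	{
		va_list ap;
		int total = 0;

		va_start(ap, count);
		while (count-- > 0)
			total += va_arg(ap, int);  /* narrow args arrive promoted */
		va_end(ap);
		return total;
	}

A call such as sum(3, 1, 2, 3) then goes through the prototype, so the
compiler knows to use its variable-argument linkage for this one function
and its ordinary linkage everywhere else.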

To be sure, mixing prototypes with old-style definitions is
tricky, and error-prone if the old-style definitions include
"narrow" parameters (i.e. float).  This probably explains K&R2's
suggestion that "mixtures are to be avoided if possible."
However, correct mixtures are supposed to work.

If a compiler sees

	extern f(int, double, double);
	f(1, 2., 3.);

and emits code for an "efficient" function call, and then turns
around and compiles a file containing

	f(i, f, d)
	int i;
	float f;
	double d;

into code which expects a different, incompatible, "inefficient"
stack frame, that compiler is nonconforming.  (Says he, sticking
his neck out.  "I'm not a language lawyer, but I play one on
comp.lang.c.")

As it stands, the real reason to avoid old-style function
definitions is that they are officially "obsolescent."  There are
a number of situations in which mixing old-style definitions and
prototype declarations is fairly silly -- if you've got a
compiler which accepts the prototypes, why not use prototype-
style definitions as well?  (On the other hand, there are
situations in which it makes sense.  #ifdefs within function
definitions are ugly, so a viable interim strategy is to use
old-style definitions, augmented by prototypes inside
#ifdef __STDC__ so that old compilers won't choke on them but new
compilers won't scream about "call of function with no prototype
in scope.")

>Please, folks. It gets tiring fixing broken code... either go all the way
>with ANSI or stick with K&R.

Good advice, for those whose code will never see a pre-ANSI
compiler.  The rest of us have to straddle the fence a bit.
(In my own code, I use mostly K&R style, except for functions
with variable numbers of arguments.)

Followups to comp.std.c; the Amiga folks are probably getting
sick of this.

                                            Steve Summit
                                            scs@adam.mit.edu

sef@kithrup.COM (Sean Eric Fagan) (02/12/91)

In article <2941@cirrusl.UUCP> dhesi%cirrusl@oliveb.ATC.olivetti.com (Rahul Dhesi) writes:
>     int myfunc PARAMS((int i, char *p));
>     int myfunc (i, p)
>     int i;
>     char *p;
>     {...}

>What's good about this style is:  (a) It is compatible with both ANSI
>and non-ANSI C environments;

And

	myfunc PARAMS((char c, float f));
	myfunc (c,f) char c; float f; {}

is *not* right; it is an error (the types don't agree after promotion), as
much as declaring myfunc to be a float the second time is.

-- 
Sean Eric Fagan  | "I made the universe, but please don't blame me for it;
sef@kithrup.COM  |  I had a bellyache at the time."
-----------------+           -- The Turtle (Stephen King, _It_)
Any opinions expressed are my own, and generally unpopular with others.

ridder@elvira.enet.dec.com (Hans Ridder) (02/13/91)

In article <1991Feb11.030811.25074@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
>You have it backwards. Lattice accepts mixtures. No other Ansi-compatible
>compiler I've used does... including Manx.
                                      ^^^^!!!!

While I agree with you about not mixing old and new C styles, I can't
help but think you are generalizing just a bit here.  I happen to know
that Microsoft C for the PC (Ptui!) allows this type of atrocity.  I'd
imagine it's a rather popular compiler too....

Perhaps you could back up your claim with some names?

>Peter da Silva.   `-_-'

-hans
------------------------------------------------------------------------
  Hans-Gabriel Ridder			Digital Equipment Corporation
  ridder@elvira.enet.dec.com		Customer Support Center
  ...decwrl!elvira.enet!ridder		Colorado Springs, CO

peter@sugar.hackercorp.com (Peter da Silva) (02/13/91)

In article <1991Feb11.164636.22675@zoo.toronto.edu> henry@zoo.toronto.edu (Henry Spencer) writes:
> How curious; an ANSI-conforming compiler has to accept mixtures.  Given
> some attention to parameter types, a program which prototypes a function and
> then gives an old-style definition of it is completely, 100% ANSI-conforming,
> and any compiler which refuses to accept it is not.

I think the key here is the phrase "Given some attention to parameter types".

That is, "int foo(int, int);" is compatible with "int foo(a, b) char a, b;",
but "int foo(char, char);" isn't. A compiler that accepts the latter as
compatible with a non-prototyped definition without so much as a warning
isn't ANSI compliant, no?

And it's that usage that's causing the problem.
-- 
Peter da Silva.   `-_-'
<peter@sugar.hackercorp.com>.

peter@sugar.hackercorp.com (Peter da Silva) (02/13/91)

In article <1991Feb12.005726.22447@athena.mit.edu> scs@adam.mit.edu writes:
> 	f(i, f, d)
> 	int i;
> 	float f;
> 	double d;

> 	f(1, 2., 3.);

> A simpleminded, incorrect prototype for f such as

> 	extern f(int, float, double);

> could certainly cause trouble,

Certainly. And I'm running into more and more code that does that very
thing, particularly from Lattice-C programmers on the Amiga. Which is
why I brought it up.

> This probably explains K&R2's
> suggestion that "mixtures are to be avoided if possible."

My point exactly.

> Good advice, for those whose code will never see a pre-ANSI
> compiler.  The rest of us have to straddle the fence a bit.

Yes, the other point is "we need a *working* unprotoize that puts casts
into the code, so we can use ANSI prototypes and have a hope in hell
of making them portable".

I've run into other problems with fence-straddling code on older
compilers just for this very reason. It becomes quite painful to
go through and fix every place where people depended on the prototype
instead of casting arguments to functions.
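
The classic case: with

	void setscale(double);

in scope you can write setscale(2) and the prototype quietly converts the
int to a double; take the prototype away for an old compiler and the call
has to become setscale((double)2), or the function receives garbage.
(setscale is just an example name.)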

> Followups to comp.std.c; the Amiga folks are probably getting
> sick of this.

It's the Amiga folks who need to pay attention to this.
-- 
Peter da Silva.   `-_-'
<peter@sugar.hackercorp.com>.

rfg@NCD.COM (Ron Guilmette) (02/14/91)

In article <1991Feb13.105852.540@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
>
>Yes, the other point is "we need a *working* unprotoize that puts casts
>into the code, so we can use ANSI prototypes and have a hope in hell
>of making them portable.

Hey!  Cut me some slack!  I'm doing the best I can! :-)

Wadaya want for nuttin'? :-)

But seriously folks, someday unprotoize will take this:

	void *(*foobar (int, float))(float, struct s*);

	struct s {
		double (*func_ptr[10]) (char, enum e);
	} svar;

	void *vp;

	void f () {  svar.func_ptr[3] = (double (*)(char, enum e)) vp; }

and turn it into this:

	void *(*foobar ())();

	struct s {
		double (*func_ptr[10]) ();
	} svar;

	void *vp;

	void f () {  svar.func_ptr[3] = (double (*)()) vp; }

But don't hold your breath.

-- 

// Ron Guilmette  -  C++ Entomologist
// Internet: rfg@ncd.com      uucp: ...uunet!lupine!rfg
// Motto:  If it sticks, force it.  If it breaks, it needed replacing anyway.

andrew@teslab.lab.OZ (Andrew Phillips) (02/19/91)

In article <1991Feb11.030811.25074@sugar.hackercorp.com> peter@sugar.hackercorp.com (Peter da Silva) writes:
>You have it backwards. Lattice accepts mixtures. No other Ansi-compatible
>compiler I've used does...

It depends on what you mean by "mixtures".  If you mean using float,
short or char in prototypes and old style function definitions, ie:

double f(float x);

double f(x) float x; { ... }

then this would be a big problem and I would expect any ANSI compiler
which found these in the same source file to generate an error.  If
Lattice 5.10 does this, then that is very bad, since the caller is
going to pass a float and the callee expects a double.

HOWEVER, I know that a lot of ANSI draft compilers, which had
prototypes but came out before the ANSI new style function
definitions were invented, allowed this (but the caller would pass x
as a double so it was OK).  Examples are older Lattice compilers
(4.0?) and Microsoft C 4.0 (MSDOS) and many UNIX compilers for a long
time.  So this is definitely a problem to look out for if you are
compiling old code (esp. if the prototype does not appear in the same
source file as the function defn).

OF COURSE, all new code should be written using prototypes and the
new style function definitions.  But mixing old and new is OK if you
avoid float, short and char in prototypes.

Andrew.
-- 
Andrew Phillips (andrew@teslab.lab.oz.au) Phone +61 (Aust) 2 (Sydney) 289 8712

mwm@pa.dec.com (Mike (My Watch Has Windows) Meyer) (02/20/91)

In article <1205@teslab.lab.OZ> andrew@teslab.lab.OZ (Andrew Phillips) writes:
   It depends on what you mean by "mixtures".  If you mean using float,
   short or char in prototypes and old style function definitions, ie:

   double f(float x);

   double f(x) float x; { ... }

   then this would be a big problem and I would expect any ANSI compiler
   which found these in the same source file to generate an error.  If
   Lattice 5.10 does this, then that is very bad, since the caller is
   going to pass a float and the callee expects a double.

   HOWEVER, I know that a lot of ANSI draft compilers, which had
   prototypes but came out before the ANSI new style function
   definitions were invented, allowed this (but the caller would pass x
   as a double so it was OK).  Examples are older Lattice compilers
   (4.0?)

Lattice (SAS) 5.10 accepts this. Note that it's billed as a "pre-ANSI"
compiler; the 5.1 release was an upgrade to add AmigaDOS 2.0 support,
not ANSI.

What this compiler does when it sees an old-style definition that has
a prototype in scope is to check whether the types are "compatible"
(meaning all of them are identical before implicit promotions), and if
so compile the function as if it had been defined in prototype style.
After all, the compiler had a prototype, and the function definition
agreed with it.  Turns out this causes problems when you try to write
code that needs to port across multiple ANSI compilers and non-ANSI
compilers.

	<mike
--
Cats will be cats and cats will be cool			Mike Meyer
Cats can be callous and cats can be cruel		mwm@pa.dec.com
Cats will be cats, remember this words!			decwrl!mwm
Cats will be cats and cats eat birds.