[comp.lang.c] Func Protos with K&R Func Defs

david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) (02/28/91)

I do this all the time:

in WcCreate.h:

    #ifdef _FUNCTION_PROTOTYPES
    ...
    extern void WcWidgetCreation ( Widget root );
    ...
    #else
    ...
    extern void WcWidgetCreation();
    ...
    #endif

in WcCreate.c:

    void WcWidgetCreation ( root )
        Widget       root;
    {
        ...
    }

This seems like the easiest way to use prototyped function declarations
when your compiler supports it, and K&R function definitions in any
case.  Then only the *.h files need to have #ifdef's.

Then "gcc -ansi -D_FUNCTION_PROTOTYPES" still does return type and
argument checking across function calls, as long as it sees the
prototyped function declaration before it is used somewhere.  And "cc"
still does little or no checking...

Does anybody have any problems with this on their compilers?

-------------------------------------------------------------------------
David Smyth				david@jpl-devvax.jpl.nasa.gov
Senior Software Engineer,		seismo!cit-vax!jpl-devvax!david
X and Object Guru.			(818)393-0983
Jet Propulsion Lab, M/S 230-103, 4800 Oak Grove Drive, Pasadena, CA 91109
------------------------------------------------------------------------- 
	One of these days I'm gonna learn:  Everytime I throw
	money at a problem to make it disappear, the only thing
	that disappears is my money...
-------------------------------------------------------------------------

jik@athena.mit.edu (Jonathan I. Kamens) (02/28/91)

  That may be the easiest way to do things, but it's not legal ANSI C.

  In ANSI C, a function declaration must match its definition.  That means
that if the declaration is prototyped, then the definition must also be
prototyped, with the same types; if the declaration is not prototyped, then
the definition cannot be prototyped either.

  This isn't just a matter of being picky, either; the differences between how
prototyped and unprototyped functions are compiled will cause your programs to
break if your declarations don't match your definitions.  That's why the file
that defines functions should always include the header file that declares
them, so you'll get errors if the declaration and the definition don't match.

  An example of where this might fail: suppose that I have this function
definition:

	random(a, b, c, d, e)
	char a, b, c, d, e;
	{
	     printf("%c %c %c %c %c\n", a, b, c, d, e);
	}

Since this is an old-style definition, the compiler will assume when compiling
it that all arguments to the function have been promoted to ints, and will
therefore reference the arguments on the stack accordingly.  Now suppose I
have this file that uses the function defined in the file above:

	extern random(char a, char b, char c, char d, char e);
	
	main()
	{
	     random('a', 'b', 'c', 'd', 'e');
	}

When compiling this file, the compiler may, if it so chooses, only leave one
byte on the stack for each argument.  Since there is a prototype which
declares the arguments as chars, the compiler does not have to promote them or
leave extra space on the stack for them.

  Now, you may get lucky for a while and only run into compilers that leave
the same amount of space on the stack for chars and ints.  But eventually
you're going to run into a compiler that doesn't.  And you're going to lose.

  Summary: Function declarations and definitions must match.  If you have
#ifdef's in your headers to decide whether or not to use prototypes, then you
must #ifdef your definitions similarly.

-- 
Jonathan Kamens			              USnail:
MIT Project Athena				11 Ashford Terrace
jik@Athena.MIT.EDU				Allston, MA  02134
Office: 617-253-8085			      Home: 617-782-0710

scs@adam.mit.edu (Steve Summit) (02/28/91)

In article <11614@jpl-devvax.JPL.NASA.GOV> david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) writes:
>I do this all the time:
>
>    #ifdef _FUNCTION_PROTOTYPES
>    extern void WcWidgetCreation ( Widget root );
>    #else
>    extern void WcWidgetCreation();
>    #endif
>
>    void WcWidgetCreation ( root )
>        Widget       root;
>    {
>        ...
>
>This seems like the easiest way to use prototyped function declarations
>when your compiler supports it, and K&R function definitions in any
>case.  Then only the *.h files need to have #ifdef's.

Indeed, and this is essentially the technique I use.  (In
external function declarations, I omit the #else, and leave the
nonprototyped form visible to both kinds of compilers, which adds
a bit of consistency checking.)

This technique works well, although there are two important
caveats which require some care in applying; they are why mixing
prototyped declarations with "old style" definitions is not
generally recommended.

The two caveats are:

     1.	The prototype declaration must use the widened types (int
	or double) for any parameters in the old-style definition
	which are "narrow" (char, short, or float).

     2.	The prototype must not contain an ellipsis, and hence the
	function must not accept a variable number of arguments.

Caveat 2 means that this "reduced-#ifdef" technique cannot be
used everywhere; functions which take a variable number of
arguments must still be defined with #ifdefs or other tricks if
the code is to be acceptable to both kinds of compilers.

In article <1991Feb28.021715.18153@athena.mit.edu> jik@athena.mit.edu (Jonathan I. Kamens) writes:
>  That may be the easiest way to do things, but it's not legal ANSI C.
>  In ANSI C, a function declaration must match its definition.  That means
>that if the declaration is prototyped, then the definition must also be
>prototyped, with the same types; if the declaration is not prototyped, then
>the definition cannot be prototyped either.
>  [mentions problem with parameters of type char; this is one of the caveats]
>  Summary: Function declarations and definitions must match.  If you have
>#ifdef's in hour headers to decide whether or not to use prototypes, then you
>must #ifdef your definitions similarly.

This advice is overly restrictive.  (In particular, the third
quoted sentence does not reflect a requirement of the Standard.)
The relevant sections from ANSI X3.159 are 3.3.2.2:

	If the function is defined with a type that is not compatible
	with the type (of the expression) pointed to by the expression
	that denotes the called function, the behavior is undefined.

and 3.5.4.3:

	For two function types to be compatible, both shall specify
	compatible return types.  Moreover, the parameter type lists, if
	both are present, shall agree in the number of parameters and in
	use of the ellipsis terminator; corresponding parameters shall
	have compatible types.  If one type has a parameter type list
	and the other type is specified by a function declarator that is
	not part of a function definition and that contains an empty
	identifier list, the parameter list shall not have an ellipsis
	terminator and the type of each parameter shall be compatible
	with the type that results from the application of the default
	argument promotions.  If one type has a parameter type list and
	the other type is specified by a function definition that
	contains a (possibly empty) identifier list, both shall agree in
	the number of parameters, and the type of each prototype
	parameter shall be compatible with the type that results from
	the application of the default argument promotions to the type
	of the corresponding identifier.

That's a lot of words, but if you read it carefully, you'll find
that the four sentences cover:

     1.	the return type

and the cases in which:

     2.	both declaration and definition have prototypes
     3.	definition (or one declaration) has a prototype, another
	declaration does not
     4.	declaration has a prototype, definition is old-style

There is a minor omission in the fourth sentence; it should
probably emphasize that the type with a parameter type list shall
not have an ellipsis terminator.

We are most interested in the fourth sentence, which says that
the correct (new-style) prototype declaration for the (old style)
function definition

	x(c, s, i, f, d)
	char c;
	short s;
	int i;
	float f;
	double d;

is

	extern x(int, int, int, double, double);

Obviously it is best to avoid such anomalous and apparently
contradictory declarations, by avoiding parameters of type char,
short, and float.  (Many people suspect gcc to be buggy when it
correctly diagnoses, with a long, verbose error message, a float
in a prototype erroneously corresponding to a float in an
old-style definition, which is why a question on this part of the
problem appears in the comp.lang.c frequently-asked questions
list.)

However, as long as either

     1.	the old-style definition contains no char, short, or float
	parameters 
or
     2. the prototype declaration is careful to use the promoted
	type of each "narrow" parameter

and as long as the prototype does not contain an ellipsis, the
"mixture" is guaranteed to work.

If you want to be safe and follow the sheep, use prototypes
consistently.  If you want to keep using old tools (compilers,
lint, etc.), and if you're not using an automated ANSI->K&R tool,
and if you don't like #ifdefs or other preprocessor tricks in
function definitions, use these mixtures in good health.  (Just
be very careful of those narrow parameters and ellipses.)

                                            Steve Summit
                                            scs@adam.mit.edu

gwyn@smoke.brl.mil (Doug Gwyn) (03/01/91)

In article <1991Feb28.021715.18153@athena.mit.edu> jik@athena.mit.edu (Jonathan I. Kamens) writes:
>  In ANSI C, a function declaration must match its definition.  That means
>that if the declaration is prototyped, then the definition must also be
>prototyped, with the same types; if the declaration is not prototyped, then
>the definition cannot be prototyped either.

This is not true.  The types must be compatible, but subject to that
constraint the use of "old-style definitions" is compatible with the
use of prototypes.  (Basically, this means that the declared
parameter types must be those resulting from default argument
promotion, i.e. "widened" types.)

henry@zoo.toronto.edu (Henry Spencer) (03/01/91)

In article <1991Feb28.021715.18153@athena.mit.edu> jik@athena.mit.edu (Jonathan I. Kamens) writes:
>  In ANSI C, a function declaration must match its definition.  That means
>that if the declaration is prototyped, then the definition must also be
>prototyped, with the same types; if the declaration is not prototyped, then
>the definition cannot be prototyped either.

Not so.  The types of declaration and definition must match, after some
complicated considerations of default argument promotions etc. are taken
into account.  But there is no requirement that use and definition match
in the sense of using prototypes.  It is quite permissible to provide
a prototype for an old-style function or use a new-style definition for 
an unprototyped function.  You just have to know what you are doing; it
is not quite as easy as it looks.  The example Jonathan gives is indeed
illegal, but it's simple to fix it.
-- 
"But this *is* the simplified version   | Henry Spencer @ U of Toronto Zoology
for the general public."     -S. Harris |  henry@zoo.toronto.edu  utzoo!henry

david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) (03/01/91)

jik@athena.mit.edu (Jonathan I. Kamens) writes:
>
>  In ANSI C, a function declaration must match its definition.  That means
>that if the declaration is prototyped, then the definition must also be
>prototyped, with the same types; if the declaration is not prototyped, then
>the definition cannot be prototyped either.
>
>  This isn't just a matter of being picky, either; the differences between how
>prototyped and unprototyped functions are compiled will cause your programs to
>break if your declarations don't match your definitions.  That's why the file
>that defines functions should always include the header file that declares
>them, so you'll get errors if the declaration and the definition don't match.

I always include the file which has the prototyped declarations in the
file which has the K&R style definitions.  You are correct, the following
does not compile:

----- random.h -----
extern void random(char a, char b, char c, char d, char e);
----- random.c -----
#include "random.h"

void random(a, b, c, d, e)
char a, b, c, d, e;
{
    printf("%c %c %c %c %c\n", a, b, c, d, e);
}
----- main.c -----
#include "random.h"

int main()
{
    random('a', 'b', 'c', 'd', 'e');
}
----- how I tried compiling it -----
bugs:david <34> gcc -ansi main.c random.c -o ran
random.c: In function random:
random.c:5: argument `a' doesn't match function prototype
random.c:5: a formal parameter type that promotes to `int'
random.c:5: can match only `int' in the prototype
random.c:5: argument `b' doesn't match function prototype
random.c:5: a formal parameter type that promotes to `int'
random.c:5: can match only `int' in the prototype
random.c:5: argument `c' doesn't match function prototype
random.c:5: a formal parameter type that promotes to `int'
random.c:5: can match only `int' in the prototype
random.c:5: argument `d' doesn't match function prototype
random.c:5: a formal parameter type that promotes to `int'
random.c:5: can match only `int' in the prototype
random.c:5: argument `e' doesn't match function prototype
random.c:5: a formal parameter type that promotes to `int'
random.c:5: can match only `int' in the prototype

Bleech!  I stand corrected - I see I do have to use that
horribly ugly programming style of having both the
declarations in include files and the definitions in
the source files BOTH enclosed by #ifdefs.

Bleech!  Gag!  *&@#$^ &*#Q$ &@*#$ @*&#$ !!!


scs@adam.mit.edu (Steve Summit) (03/01/91)

In article <1991Feb28.072947.28885@athena.mit.edu>, I wrote:
>[carefully mixing prototype declarations with old-style function
>definitions] works well, although there are two important
>caveats which require some care in applying, which is why mixing
>prototyped declaration with "old style" definitions is not
>generally recommended.

I left out a third, important caveat: "old style" definitions (as
well as external function declarations without prototypes) are
officially "obsolescent," and may be removed from future revisions
of the standard.  (See X3.159 sections 3.9.4 and 3.9.5 .)
Therefore, my last paragraph should read:

> If you want to keep using old tools (compilers,
> lint, etc.), and if you're not using an automated ANSI->K&R tool,
> and if you don't like #ifdefs or other preprocessor tricks in
> function definitions,

and if you won't mind having to rewrite your code in ten years or
so,

> use these mixtures in good health.

(This is not supposed to sound sarcastic; you should think about
what will happen to your code ten years down the line.  A
surprising amount of code, coddled by compilers which continued
to accept them, still used =+ and the like ten years after =op
was deprecated.)

                                            Steve Summit
                                            scs@adam.mit.edu

david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) (03/01/91)

scs@adam.mit.edu (Steve Summit) writes that ANSI C may well stop
supporting old-style function definitions at some time in the future.
Therefore he recommends that writing K&R style definitions is probably
a bad idea in the long term.

Well, the problem is this: ANSI is not portable, K&R is.

Oh, I know I know I know.  But hear me out:

I actually wanted to use ANSI for the Widget Creation Library, and
initially implemented it using ANSI.  Problem:  Many, many people could
not compile it!  So, I went back to K&R.  Now it's portable.  The reason
I provide the ANSI function protos in decls is just in case somebody,
somewhere ONLY has an ANSI compiler (which of course still understands
K&R function definitions).

It is true that Wcl is not totally portable, and perhaps one of the
problems is that I may not have been careful enough somewhere in making
certain that I'm never using non-promoted arguments to functions.

It seems that one *must* write code which works with or without
prototypes just to be sure that it is portable.  I.e., perhaps the
correct approach is the one suggested by the person who wrote that
whenever you have ANSI prototypes, you should also have K&R
declarations as well as K&R definitions.

Perhaps even more important than portability is simply interoperability
on a single platform.  If your functions work identically with or
without prototypes (int-size or double arguments only) then people can
mix libraries and applications which have been built using different
compilers.

So, I'll bet that if a C compiler did NOT compile K&R C, it probably
would in fact be a lot less useful than one that did.  Since compilers
are generally written by people who for one reason or another want the
compiler to be useful to many people, I doubt if we will see good
compilers which understand only ANSI declarations and definitions.

I could very, very, very easily be wrong.  I'm not a prophet, like
Saddam or anything ;^)


scs@adam.mit.edu (Steve Summit) (03/01/91)

In article <11634@jpl-devvax.JPL.NASA.GOV> david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) writes:
>scs@adam.mit.edu (Steve Summit) writes that ANSI C may well stop
>supporting old-style function definitions at some time in the future.
>Therefore he recommends that writing K&R style definitions is probably
>a bad idea in the long term.
>Well, the problem is this: ANSI is not portable, K&R is.
>I actually wanted to use ANSI for the Widget Creation Library, and
>initially implemented it using ANSI.  Problem:  Many, many people could
>not compile it!  So, I went back to K&R.  Now its portable.

I was so busy covering all the bases it was probably hard to tell
what I was recommending.  It sort of depends on who's asking.
Remember, programming is an essentially pragmatic activity; we
don't have to be idealists about these things.  The best
long-term strategy is not necessarily the best short-term
strategy.  (I *hate* that attitude, and I'd rather it didn't
apply, but I suppose sometimes it does.)

In my own work, I use the "disrecommended" old-style function
definitions almost exclusively (with the exception of varargs
functions).  Most of the time, I don't even bother with a bunch
of externally-visible prototype declarations: they're not
required, and all they do for me is cost me effort to create and
figure out where to put them, and cause extra recompilations
because the right place to put them is in header files which
there then have to be more of, and more widely #included.
(Many people claim that prototypes are also good for cross-file
argument type checking; I use lint for that.)

Another popular technique is to write code using new-style,
prototype function definitions, using a simple converter when the
source must be passed to an old-style tool.  If you always write
your function definitions in the same, stylized way, the
converter can be quite simple indeed, practically an awk or sed
script.

However, if you're shipping code all over the place, it's
probably easier if you ship old-style code, rather than having to
ship the conversion tool (and teach people how to use it).

My previous two articles on this topic did *not* come out and say
"I use old-style function definitions for portability; doing so
is clearly superior and everybody else should, too."  There
aren't always easy, simplistic answers; that's why programming
can be hard work and why debates here (and elsewhere) can rage
for so long.  Whether I like it or not, the use of modern,
prototyped declarations and function definitions is the currently
recommended practice, and I would not be doing anybody a favor if
I went around actively discouraging their use.

Certainly, there are situations in which interim avoidance of
full-scale function prototypes is advisable; it sounds like the
X toolkit is a good example.

There are several internally-consistent strategies I can think
of, with a number of tradeoffs between them:

Strategy 1: "Reactionary."  Use "Old C" exclusively.

Of course, "Old C" is not as precisely defined as "ANSI C;" there
are a number of variants.  Someone using this strategy codes only
for a small number of machines, none of which has an ANSI
compiler.  The code is not terribly portable outside of that
group of machines.  The coding style is tailored to the "Old C"
dialects present on the old machines in question.

Strategy 2: "Curmudgeon."  Use the intersection of "Old C" and
ANSI C.

This is probably the most demanding strategy, because you can't
use any ANSI features, and you can't use a number of "Old C"
features which have been disallowed, or had their definitions
changed, by X3J11.  Like all good, simplistic tradeoffs, however,
this most demanding strategy is also the most portable, at least
for now.

You can modify this strategy, to encompass certain kinds of ANSI
features, if necessary.  For example, any functions which accept
a variable number of arguments must be defined for an ANSI
compiler using a new style, prototype definition, inside #ifdef
__STDC__.  You can use newer standard library routines (strtol,
vfprintf, etc.), even if your old systems don't support them, if
you can provide your own implementations.

Strategy 3: "Schizophrenic."  Use old-style function definitions
and new-style, prototype declarations, keeping the declarations
inside #ifdef __STDC__ or the equivalent.

This is like 2, except that you start adding prototypes.  (It's
therefore more work, so maybe 2 isn't the most demanding after
all.  Never mind.)  There are a couple of reasons for adding the
prototypes, while leaving the definitions old style: to placate
people who have "grown up" with prototypes, and are used to
seeing them, and to placate new compilers, which often issue
warnings for function calls without prototypes in scope.
If the code is compiled on systems with new compilers but without
lint, the prototypes can help keep the calls correct there.

There is a drawback to this "fence sitting" strategy, though: if
several people are developing, maintaining, and modifying the
code, some with new compilers and some with old ones, it's easy
for the function prototypes to get screwed up, because only half
the programmers care about them, and only half the compilers see
(and check) them.  (You can use some Stupid Preprocessor Tricks
to arrange for there to be one set of declarations and/or
definitions, rather than two, but they typically require
counterintuitive double parentheses.  Several people claim this
is a reasonable tradeoff; I'll not argue with them.)

Strategy 4: "Liberal."  Use a subset of ANSI C, such that simple
tools can reliably convert the code back to old-style, when
necessary.

This is a nice, forward-thinking strategy.  (Until you run it
through the converter, the code is both 100% ANSI compliant and
doesn't run afoul of the "future directions," namely that
old-style support may go away some day.)  It's a bit of work to
steer clear of new features (especially those in the
preprocessor, such as # and ##), which can't be converted back to
old-style trivially, and to stay within whatever formatting
conventions the converters use, if simplistic.

Strategy 5: "Don't look back."  Use ANSI C exclusively and
comprehensively.

Obviously, you can use this only if you aren't likely to move the
code to a system with an old compiler, and you aren't interested
in making it easy for anyone else who might have to.  It's also
the easiest strategy, since all the new "Programming in C" books
describe ANSI C.  (Soon enough, converters will exist which will
do a good job of translating function prototypes back to
old-style, but you'd need a full compiler, with a "K&R" back-end
"code generator," to translate code which makes full use of any
and all available ANSI features, as whole-hog "Don't Look Back"
code inevitably would.)


We could sit around and discuss the interlocking tradeoffs
between these and other strategies for a long, long time.  If
widespread portability is important, choose 2 or 3.  If your main
development compiler accepts prototypes, strategy 4 is
attractive; if most of your work is under older compilers,
strategies 2 and 3 look better.  If you'd rather not worry about
the caveats associated with mixing prototypes and old-style code
(these have been discussed in this thread already), lean towards
5, or maybe 4.  If you'd like to avoid a wholesale rewrite on the
day when the "obsolescent" old-style forms become truly obsolete,
definitely use 4 or 5.  If you believe that decent automated
converters will certainly be available by then, don't worry about
using 2 and 3.

Already, large numbers of people appear to be jumping in to
strategy 5 with both feet.  This strategy appears to hold the
fewest surprises (you don't have to worry about f(x) float x;
stuff), but it is a little surprising how fast the cry of "I just
picked up this nice program but it's written in ANSI C and my
old compiler chokes on it; what should I do?" has become
widespread.


I don't know whether a future revision of the Standard will
actually delete support for old-style functions or not.  Karl
Heuer and I discussed it once, and agreed that the situation was
comparable to the old =op operators (=+, =-, etc.).  These were
already obsolescent in 1978, and the compilers were starting to
issue warnings for them.  Ten years later, most (but not all!) of
the =op's in old code had been weeded out, and X3J11 could delete
support for them with impunity.

On the other hand, there were fewer compilers and fewer lines of
code around in the late 70's, so there probably wasn't the kind
of backwards pressure then (to keep using =op in code, for
portability's sake) as there is now (to keep using old-style
functions).  If the reactionaries, curmudgeons, and
schizophrenics keep it up, there may be so much "old" code
around when the standard comes up for review that the old
functions will be left in (or supported by most compilers as an
extension.)

On the other other hand, there's an awful lot of extra baggage in
an ANSI compiler to handle both types of function declarations,
and it would be best to pare that down eventually.  Since
deleting prototypes isn't an option, old-style functions probably
ought to go some day.  (Anyway, I'm sure there will be good
converters available by then, to automate the conversion of any
remaining old-style code.)

Personally, I'll probably switch over to prototype-style
definitions within the next few years, although I'll never be big
on prototyped external declarations.  (The implicit declaration
rule remains in force, and I don't think it's slated for removal,
so it will remain legal to define int-return, fixed-number-of-
arguments functions using new-style, prototype syntax, and to
call them without any external declarations in scope at all,
prototypes or no.  Once old-style syntax has been fully retired,
this could be classed as a "New Reactionary" or "Wishful Thinker"
strategy.)

                                            Steve Summit
                                            scs@adam.mit.edu

usenet@nlm.nih.gov (usenet news poster) (03/01/91)

In article <11614@jpl-devvax.JPL.NASA.GOV> david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) writes:
>I do this all the time:
>in WcCreate.h:
>    #ifdef _FUNCTION_PROTOTYPES
>    ...
>    extern void WcWidgetCreation ( Widget root );
>    ...
>    #else
>    ...
>    extern void WcWidgetCreation();
>    ...
>    #endif

Better still:

#ifdef _FUNCTION_PROTOTYPES
#define PROTO(x) x
#else
#define PROTO(x) ()
#endif

extern void WcWidgetCreation PROTO((Widget root));
extern void WcWidgetCompaction PROTO((Widget leaves));
extern void WcWidgetDestruction PROTO((Widget branches));
extern void WcWidgetDemolition PROTO((Widget knots, Widget tegdiw, ...));
extern void WcWidgetEmulation PROTO((Widget worms));


  Warren Gish                           phone:  (301) 496-2475
  National Center                       FAX: (301) 480-9241
     for Biotechnology Information      Internet:  gish@ncbi.nlm.nih.gov
  National Library of Medicine
  Building 38A, Room 8S-806
  8600 Rockville Pike
  Bethesda, MD 20894-0001

jdarnold@pondsquid.mv.com (Jonathan Arnold) (03/01/91)

usenet@nlm.nih.gov (usenet news poster) writes:
> In article <11614@jpl-devvax.JPL.NASA.GOV> david@jpl-devvax.JPL.NASA.GOV (Dav
> #ifdef _FUNCTION_PROTOTYPES
> #define PROTO(x) x
> #else
> #define PROTO(x) ()
> #endif
>   Warren Gish                           phone:  (301) 496-2475

A friend and I wrote a medium-sized program and ran into this exact
problem - he has a fairly old, vanilla UNIX cc, without ANSI, while I'm
using Turbo C++ V1.0 and didn't want to give up prototypes.  This is the
solution we came up with (quite similar to the above):

#ifdef  USE_PROTOS
#define PROTO( def )    def
#define NOARGLIST       ( void )
#define ARGLIST( args ) (
#define NFARG( def )    def,
#define FARG( def )     def )
#else
#define PROTO( def )    ()
#define NOARGLIST       ()
#define ARGLIST( args ) args
#define NFARG( def )    def;
#define FARG( def )     def;
#endif

As examples:
      PROTOTYPES
      ----------
int tkline PROTO( (char *, char *, char **, int, char *, char *) );
char *gettoken PROTO( (char *, char *, int, char *, char *) );
int gettkline PROTO( ( FILE *, char *, int, int *, char **, int) );
int getline PROTO( (FILE *, char *, int) );

      ACTUAL FUNCTION DEFINITIONS
      ---------------------------
int tkline ARGLIST( ( bufP, tbufP, tokv, tokmax, vsepP, isepP ) )
	NFARG ( char *bufP )		/* Buffer ptr */
	NFARG ( AREG1 char *tbufP)	/* Token buffer pointer */
	NFARG ( AREG2 char **tokv)	/* Arg vectors */
	NFARG ( int tokmax)		/* Max # tokens */
	NFARG ( char *vsepP)		/* Visible separators */
	FARG ( char *isepP)		/* Invisible separators */
{
   .....
}


and

int gettkline ARGLIST( (fP, bufP, bufsize, tokcP, tokv, tokmax) )
	NFARG( FILE *fP )		/* Input file ptr */
	NFARG( char *bufP )		/* Buffer ptr */
	NFARG( int bufsize )	/* Room in buffer */
	NFARG( int *tokcP )		/* Variable to hold token count */
	NFARG( char **tokv )		/* Arg vectors */
	FARG( int tokmax )		/* Max # tokens */


It is kind of ugly, but it works and you get full function prototyping.
What does everyone out there think of this method?


--
Jonathan Arnold                  | Blue Sky Productions
Bus. Phone: (603)894-5336        | 59 Stiles Rd, Ste. 106
Home Phone: (617)335-5457        | Salem NH 03079
uucp: jdarnold@pondsquid.MV.COM
  or ...{decvax|elrond|harvard}!zinn!pondsquid!jdarnold

brianh@hpcvia.CV.HP.COM (brian_helterline) (03/02/91)

scs@adam.mit.edu (Steve Summit) writes:
:In article <11614@jpl-devvax.JPL.NASA.GOV> david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) writes:
:>I do this all the time:
:>
:>    #ifdef _FUNCTION_PROTOTYPES
:>    extern void WcWidgetCreation ( Widget root );
:>    #else
:>    extern void WcWidgetCreation();
:>    #endif
:>
:>    void WcWidgetCreation ( root )
:>        Widget       root;
:>    {
:>        ...
:>
:>This seems like the easiest way to use prototyped function declarations
:>when your compiler supports it, and K&R function definitions in any
:>case.  Then only the *.h files need to have #ifdef's.

:Indeed, and this is essentially the technique I use.  (In
:external function declarations, I omit the #else, and leave the
:nonprototyped form visible to both kinds of compilers, which adds
:a bit of consistency checking.)

:This technique works well, although there are two important
:caveats which require some care in applying, which is why mixing
:prototyped declaration with "old style" definitions is not
:generally recommended.
:
:The two caveats are:
:
:     1.	The prototype declaration must use the widened types (int
:	or double) for any parameters in the old-style definition
:	which are "narrow" (char, short, or float).
:
I also use prototypes with "old-style" definitions, but I let the compiler
generate the prototypes, and it uses the widened types when necessary, so
I don't have to worry about it.  E.g.

int Function( arg1 )
float arg1;
{ return 0; }

would produce the prototype: int Function( double arg1 );

[caveat 2 deleted]

peter@ficc.ferranti.com (Peter da Silva) (03/02/91)

In article <15354@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
> This is not true.  The types must be compatible, but subject to that
> constraint the use of "old-style definitions" is compatible with the
> use of prototypes.

Only if the prototype is in scope when it sees the definition. The compiler
might generate a different calling sequence for prototyped functions because
it knows the argument count.
-- 
Peter da Silva.  `-_-'  peter@ferranti.com
+1 713 274 5180.  'U`  "Have you hugged your wolf today?"

scs@adam.mit.edu (Steve Summit) (03/03/91)

In article <MLT9QB7@xds13.ferranti.com> peter@ficc.ferranti.com (Peter da Silva) writes:
>In article <15354@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
>> The [function] types must be compatible, but subject to that
>> constraint the use of "old-style definitions" is compatible with the
>> use of prototypes.
>
>Only if the prototype is in scope when it sees the definition. The compiler
>might generate a different calling sequence for prototyped functions because
>it knows the argument count.

Doug's statement is true, without exception, and reflects clear
language in X3.159 (which I have quoted from in two previous
articles on this subject), which defines exactly what is required
for two function types to be compatible.

If a function takes a fixed number of arguments, none of which is
a "narrow" type (char, short, or float), it may be defined using
old or new style syntax, defined with or without a prototype in
scope, and called with or without a prototype in scope, and under
any of these conditions the compiler is obliged to generate
consistent, compatible calling sequences.

It is only useful and legal for a compiler to use a "different"
calling sequence for functions which take variable numbers of
arguments, and such functions are specifically required to be
defined using new-style, prototype syntax, and to be called with
a valid prototype in scope.

If you understand this, you can stop reading now.  The rest of
this article is mostly taken from a comp.std.c article of mine
last month.

Peter stated that:
> The compiler
> might generate a different calling sequence for prototyped functions because
> it knows the argument count.

This is a very misleading statement.  The correct statement is

	The compiler might generate a different calling sequence
	for a function prototyped with an ellipsis, because it
	does NOT know the argument count.

As I explained in <1991Feb12.005726.22447@athena.mit.edu>,
old-style functions may be assumed to be fixed-argument.  (This
is actually consistent with the original definition of C; printf
has always been an anomaly.)  A compiler may (and in fact,
should) use its efficient calling sequences when prototypes are
not in scope.  The variable upon which to base the choice of
function-call linkage is not the presence or absence of
prototypes, but the presence or absence of variable-length
argument lists.  (I suspect that it is a misunderstanding of this
point that is causing the confusion.)

None of this is accidental; the requirement that functions with
variable numbers of arguments be called with a prototype in scope
was made precisely so that a compiler could use one function
calling mechanism for them, and another one for all others.

If a compiler somewhere is using a different, presumably older,
presumably less efficient calling sequence for non-prototyped
functions, it is doing its customers a double disservice.  First
of all, it is confusing them badly, and probably incorrectly
rejecting compliant code.  Secondly, there is no reason to use
the less efficient calling sequence for old functions; the
"newer," presumably more efficient (though not admitting of
variable numbers of arguments) calling sequence can and should be
used.  Efficiency is, after all, the most important
consideration, and the main reason prototypes were adopted.
(Don't buy this line that they're for checking function call
consistency across source files, "for your convenience."  If
they're for your convenience, then why do you have to work to
supply them?)

                                            Steve Summit
                                            scs@adam.mit.edu

gwyn@smoke.brl.mil (Doug Gwyn) (03/03/91)

In article <MLT9QB7@xds13.ferranti.com> peter@ficc.ferranti.com (Peter da Silva) writes:
>In article <15354@smoke.brl.mil> gwyn@smoke.brl.mil (Doug Gwyn) writes:
>> This is not true.  The types must be compatible, but subject to that
>> constraint the use of "old-style definitions" is compatible with the
>> use of prototypes.

>Only if the prototype is in scope when it sees the definition.

Wrong.

ok@goanna.cs.rmit.oz.au (Richard A. O'Keefe) (03/03/91)

In article <g8w5X2w163w@pondsquid.mv.com>, jdarnold@pondsquid.mv.com (Jonathan Arnold) writes:
> usenet@nlm.nih.gov (usenet news poster) writes:
[about a way of coding for both ANSI and Classic C, with examples]
> int tkline ARGLIST( ( bufP, tbufP, tokv, tokmax, vsepP, isepP ) )
> 	NFARG ( char *bufP )		/* Buffer ptr */
> 	NFARG ( AREG1 char *tbufP)	/* Token buffer pointer */
> 	NFARG ( AREG2 char **tokv)	/* Arg vectors */
> 	NFARG ( int tokmax)		/* Max # tokens */
> 	NFARG ( char *vsepP)		/* Visible separators */
> 	FARG ( char *isepP)		/* Invisible separators */
> It is kind of ugly, but it works and you get full function prototyping. 
> What does everyone out there think of this method?

That it is redundant when there is no need at all for redundancy.
All you have to do is take your ANSI C headers, stick a few more commas
in, and add an identifier:

#ifdef	__STDC__

#define	PROTO(x)	x

#define	A1(t1,v1) \
	(t1 v1)
#define A2(t1,v1,t2,v2) \
	(t1 v1, t2 v2)
#define A3(t1,v1,t2,v2,t3,v3) \
	(t1 v1, t2 v2, t3 v3)
#define A4(t1,v1,t2,v2,t3,v3,t4,v4) \
	(t1 v1, t2 v2, t3 v3, t4 v4)
#define A5(t1,v1,t2,v2,t3,v3,t4,v4,t5,v5) \
	(t1 v1, t2 v2, t3 v3, t4 v4, t5 v5)
#define A6(t1,v1,t2,v2,t3,v3,t4,v4,t5,v5,t6,v6) \
	(t1 v1, t2 v2, t3 v3, t4 v4, t5 v5, t6 v6)

#else

#define	PROTO(x)	()

#define	A1(t1,v1) \
	(v1) t1 v1;
#define A2(t1,v1,t2,v2) \
	(v1,v2) t1 v1; t2 v2;
#define A3(t1,v1,t2,v2,t3,v3) \
	(v1,v2,v3) t1 v1; t2 v2; t3 v3;
#define A4(t1,v1,t2,v2,t3,v3,t4,v4) \
	(v1,v2,v3,v4) t1 v1; t2 v2; t3 v3; t4 v4;
#define A5(t1,v1,t2,v2,t3,v3,t4,v4,t5,v5) \
	(v1,v2,v3,v4,v5) t1 v1; t2 v2; t3 v3; t4 v4; t5 v5;
#define A6(t1,v1,t2,v2,t3,v3,t4,v4,t5,v5,t6,v6) \
	(v1,v2,v3,v4,v5,v6) t1 v1; t2 v2; t3 v3; t4 v4; t5 v5; t6 v6;

#endif

The example then becomes
    int tkline A6(
	char*,       bufP,		/* Buffer pointer */
	AREG1 char*, tbufP,		/* Token buffer pointer */
	AREG2 char**,tokv,		/* Arg vectors */
	int,         tokmax,		/* Max # of tokens */
	char*,       vsepP,		/* Visible separators */
	char*,       isepP)		/* Invisible separators */
	{ ... }

All done by kindness.
-- 
The purpose of advertising is to destroy the freedom of the market.

dbrooks@osf.org (David Brooks) (03/05/91)

jik@athena.mit.edu (Jonathan I. Kamens) writes:
|>   Now, you may get lucky for a while and only run into compilers that leave
|> the same amount of space on the stack for chars and ints.  But eventually
|> you're going to run into a compiler that doesn't.  And you're going to lose.

For all the lambasting Jonathan has gotten, he speaks truth here
(although I would put "un" before "lucky").

I recently saw the dangerous situation referred to in previous posts:
some routines with narrow arguments, and an attempt to cover both
prototyping and non-prototyping compilers with #ifdefs.

So, player A compiles the library with prototypes. She hands the
binary to player B, who compiles an application without using
prototypes (because he's got an old compiler).  Potential chaos
results: the application widens its arguments and the library expects
them unwidened.

Unfortunately, the whole mess was tested with gcc as the prototyping
compiler, and our gcc was built so that it leaves the same amount of
space on the stack for ints and chars (I understand this is a gcc
configuration option).  So the chaos remained potential, and
unnoticed, until it met up with a customer whose compiler happened to
like saving space on the stack.

This was not nice from a customer service standpoint.  There's a moral
here somewhere; maybe it's "read the FAQ".
-- 
David Brooks				dbrooks@osf.org
Systems Engineering, OSF		uunet!osf.org!dbrooks
"It's not easy, but it is simple."

peter@ficc.ferranti.com (Peter da Silva) (03/07/91)

In article <1991Mar2.181953.15401@athena.mit.edu> scs@adam.mit.edu writes:
> Doug's statement is true, without exception, and reflects clear
> language in X3.159 (which I have quoted from in two previous
> articles on this subject), which defines exactly what is required
> for two function types to be compatible.

So you're saying that the compiler is not allowed to take advantage of the
prototype information and use (for example) callee saves or special LINK
instructions when calling prototyped functions?

This conflicts with...

> Secondly, there is no reason to use
> the less efficient calling sequence for old functions; the
> "newer," presumably more efficient (though not admitting of
> variable numbers of arguments) calling sequence can and should be
> used.

I don't believe this statement is true. If no prototype is
in scope, the compiler cannot make any assumptions about the calling
sequence: it may be dealing with older code that uses a pre-ANSI method
of handling variadic functions, for example. For that matter, the customer
might be working with older, pre-ANSI libraries without source code
available.

> Efficiency is, after all, the most important
> consideration, and the main reason prototypes were adopted.
> (Don't buy this line that they're for checking function call
> consistency across source files, "for your convenience."  If
> they're for your convenience, then why do you have to work to
> supply them?)

It's not just a matter of "convenience". You need to have function
prototypes to generate correct code in the general case, due to (among
other things) the common use of uncast null pointer constants in
function calls and other VAXisms that depend on sizeof(int) ==
sizeof(char *).

Every other Algol-type language I know of, even the execrable PL/M, does
require full prototypes for all functions. The compromise in ANSI C is
a reasonable compromise with current usage, and a long-needed enhancement
to the language.

In an environment where things like X-windows are accepted as the best
that we can do, efficiency arguments for prototypes must be taken with
a grain of salt.
-- 
Peter da Silva.  `-_-'  peter@ferranti.com
+1 713 274 5180.  'U`  "Have you hugged your wolf today?"

torek@elf.ee.lbl.gov (Chris Torek) (03/07/91)

In article <BPX9AT9@xds13.ferranti.com> peter@ficc.ferranti.com
(Peter da Silva) writes:
>So you're saying that the compiler is not allowed to take advantage of the
>prototype information and use (for example) callee saves or special LINK
>instructions when calling prototyped functions?

This statement is correct (in a strict logical sense), but all the
implications it makes are wrong.  That is, the statement

	[compiler treats prototyped functions specially] implies
		[compiler may not take advantage of...]

is true but only because the clause [compiler treats prototyped
functions specially] is false.  Aside from default promotions,
prototyped functions and non-prototyped functions are treated
identically.  In particular, the compiler may take advantage of
special LINK or RET instructions for non-prototyped functions as
well as for the prototyped versions.

The only special case is varargs functions, which *must* be declared
using the new prototype varargs syntax.  All other functions may be
assumed to take fixed argument lists.  In other words,

	f(fmt, a1, a2, a3, a4, a5) char *fmt; {
		printf(fmt, a1, a2, a3, a4, a5);
	}
	...
		f("%d\n", 3);

is illegal, because the caller did not supply the proper number and
type of arguments.
-- 
In-Real-Life: Chris Torek, Lawrence Berkeley Lab EE div (+1 415 486 5427)
Berkeley, CA		Domain:	torek@ee.lbl.gov

peter@ficc.ferranti.com (Peter da Silva) (03/08/91)

In article <10714@dog.ee.lbl.gov> torek@elf.ee.lbl.gov (Chris Torek) writes:
> This statement is correct (in a strict logical sense), but all the
> implications it makes are wrong.  That is, the statement

> 	[compiler treats prototyped functions specially] implies
> 		[compiler may not take advantage of...]

> is true but only because the clause [compiler treats prototyped
> functions specially] is false.

This means, then, that an ANSI compiler is allowed to make the sorts of
optimisations that prototypes allow it to safely make, without the prototypes
being in scope? This seems to rather the optimisations that an ANSI compiler
can make...
-- 
Peter da Silva.  `-_-'  peter@ferranti.com
+1 713 274 5180.  'U`  "Have you hugged your wolf today?"

scs@adam.mit.edu (Steve Summit) (03/09/91)

In article <USY9_VA@xds13.ferranti.com> peter@ficc.ferranti.com (Peter da Silva) writes:
>This means, then, that an ANSI compiler is allowed to make the sorts of
>optimisations that prototypes allow it to safely make, without the prototypes
>being in scope?

Some of them.  If no prototype is in scope during a function
call, a compiler:

     1.	should use a calling convention which assumes a fixed
	number of arguments, if there is a different convention
	which is used for functions with a variable number of
	arguments;

     2.	must perform the "integral promotions" (char and short to
	int) and promote float to double (see ANSI X3.159 section
	3.3.2.2), and must therefore not use any special "narrow"
	argument passing conventions; and

     3.	may pass values in registers (as opposed to "on a
	stack," if this distinction even means anything) as long
	as doing so does not conflict with 2.

All of this can be deduced from language in the Standard, which
(though verbose and occasionally convoluted) is quite
unambiguous.

>This seems to rather the optimisations that an ANSI compiler
>can make...

There's a missing, so I can't figure out what comment this makes
on ANSI prototypes.  The rules relating prototyped and old-style
functions are somewhat complicated, but they make sense, and are
formulated correctly to allow maximum interoperability (and, yes,
portability) between the two forms.  Given the desire to adopt
prototypes at all, no other decisions could have (consistently)
been made, other than perhaps to disallow old-style functions
altogether.

More reruns (in a futile attempt to hammer the point home):
Old-style functions may be assumed to be fixed-argument.  (This
is actually consistent with the original definition of C; printf
has always been an anomaly.)  A compiler may (and in fact,
should) use its efficient calling sequences when prototypes are
not in scope.  The variable upon which to base the choice of
function-call linkage is not the presence or absence of
prototypes, but the presence or absence of variable-length
argument lists.  (I suspect that it is a misunderstanding of this
point that is causing the confusion.)

Also, to repeat what Chris just said (with a quote from an
earlier article of mine):

The statement

> The compiler
> might generate a different calling sequence for prototyped functions because
> it knows the argument count.

is quite misleading.  The correct statement is

	The compiler might generate a different calling sequence
	for a function prototyped with an ellipsis, because it
	does NOT know the argument count.

If there are still questions on any of this, please send mail; I
think enough has been posted on this by now.

                                            Steve Summit
                                            scs@adam.mit.edu

peter@ficc.ferranti.com (Peter da Silva) (03/12/91)

In article <1991Mar8.204142.14568@athena.mit.edu> scs@adam.mit.edu writes:
> >This seems to rather the optimisations that an ANSI compiler
> >can make...

> There's a missing, so I can't figure out what comment this makes
> on ANSI prototypes.

The missing is "limit", you should be able to that from context. Just
at the sentence for a while, and it will make.

> The rules relating prototyped and old-style
> functions are somewhat complicated, but they make sense, and are
> formulated correctly to allow maximum interoperability (and, yes,
> portability) between the two forms.

Perhaps there should be a lengthy and detailed discussion on this in
the std or an associated docco, because of the compilers that get it
wrong and all the folks who are having so much trouble writing code
that satisfies both sides. I sure haven't found the balance yet.

Sigh...
-- 
Peter da Silva.  `-_-'  peter@ferranti.com
+1 713 274 5180.  'U`  "Have you hugged your wolf today?"

scs@adam.mit.edu (Steve Summit) (03/12/91)

In article <A+-95J6@xds13.ferranti.com> peter@ficc.ferranti.com (Peter da Silva) writes:
>In article <1991Mar8.204142.14568@athena.mit.edu> scs@adam.mit.edu writes:
>> >This seems to rather the optimisations that an ANSI compiler
>> >can make...
>
>> There's a missing, so I can't figure out what comment this makes
>> on ANSI prototypes.
>
>The missing is "limit", you should be able to that from context. Just
>at the sentence for a while, and it will make.

It didn't.  (The word might as easily have been "augment.")  I
still don't understand how prototypes can be thought to be less
useful, or the optimizations which they permit less potent, just
because some of the optimizations are also permitted (and
implicitly required) even in the absence of prototypes.  (If
someone wants old programs, which implement variable-length
argument lists without using ellipses, to keep working, that
would have some additional bearing on the choice of
implementations and optimizations, but those programs are *not*
required to work, and no one is doing anybody any favors by
coddling them.)

>Perhaps there should be a lengthy and detailed discussion on this in
>the std or an associated docco,

It may not be lengthy and detailed, but I think there's adequate
documentation right there in the Standard.  When I was quoting
chapter and verse in several earlier articles on this subject, I
forgot to check the Rationale, which makes the same point I was
making, quite explicitly (Rationale section 3.3.2.2, discussing
compatibility between prototyped and non-prototyped declarations
of functions with no "narrow" types and no variable argument
lists):

	This provision constrains the latitude of an implementor
	because the parameter passing of prototype and non-
	prototype function calls must be the same for functions
	accepting a fixed number of arguments.  Implementations
	in environments where efficient function calling
	mechanisms are available must, in effect, use the
	efficient calling sequence either in all "fixed argument
	list" calls or in none.

>because of the compilers that get it
>wrong and all the folks who are having so much trouble writing code
>that satisfies both sides. I sure haven't found the balance yet.

If there are compilers getting this wrong, let's get 'em fixed
now, before they induce too many people to write incorrect code.
Name names here, or in comp.std.c, if you have to.

I think "the folks who are having so much trouble writing code"
are, for the most part, making their own lives miserable
by insisting on trying to declare "narrow" arguments -- char,
short, and float.  "Doctor, doctor, it hurts when I do that!"
"Well, then don't do that."

If you steer clear of narrow arguments, you can code away to your
heart's content, rarely worrying about whether declarations
and/or definitions use old-style or prototype syntax.  (There are
two exceptions.  First, functions with variable-length argument
lists must be defined using prototyped syntax, and must be called
with prototyped declarations in scope.  If the code is to be
portable to pre-ANSI compilers, #ifdefs are required in both
declarations and definitions of varargs functions.  Second, it
may be important to emphasize prototyped declarations to help
insure correct calls, if lint is not available or is not being
used.)

What other cases are there that it's hard to find a balance for?

                                            Steve Summit
                                            scs@adam.mit.edu

peter@ficc.ferranti.com (Peter da Silva) (03/13/91)

In article <1991Mar12.052025.18801@athena.mit.edu> scs@adam.mit.edu writes:
> If there are compilers getting this wrong, let's get 'em fixed
> now, before they induce too many people to write incorrect code.
> Name names here, or in comp.std.c, if you have to.

Lattice C for the Amiga seems to accept prototypes with narrow
arguments and old-style definitions in the same scope.

> I think "the folks who are having so much trouble writing code"
> are, for the most part, making their own lives miserable
> by insisting on trying to declare "narrow" arguments -- char,
> short, and float.  "Doctor, doctor, it hurts when I do that!"
> "Well, then don't do that."

Well, I don't know. I keep seeing people saying things like "I tried to
get this working both with and without prototypes, and finally gave up."
Or "I think this should work on a pre-ANSI compiler" followed by gobs
of complaints. And all the people saying "for portability, for now, stick
to K&R1".
-- 
Peter da Silva.  `-_-'  peter@ferranti.com
+1 713 274 5180.  'U`  "Have you hugged your wolf today?"

amodeo@dataco.UUCP (Roy Amodeo) (03/14/91)

In article <4865@goanna.cs.rmit.oz.au> ok@goanna.cs.rmit.oz.au (Richard A. O'Keefe) writes:
>
>All you have to do is take your ANSI C headers, stick a few more commas
>in, and add an identifier:
>
>#ifdef	__STDC__
>
>#define	PROTO(x)	x
>
>	...
>#define A2(t1,v1,t2,v2) \
>	(t1 v1, t2 v2)
>	...
>
>#else
>
>#define	PROTO(x)	()
>
>	...
>#define A2(t1,v1,t2,v2) \
>	(v1,v2) t1 v1; t2 v2;
>	...
>
>#endif
>
>The example then becomes
>    int tkline A2(
>	char*,       bufP,		/* Buffer pointer */
>	AREG1 char*, tbufP)		/* Token buffer pointer */
>	{ ... }
>
>All done by kindness.

A problem arises in the following case:

	int insert( char a[MAX_A], int (*func)( char a[] ) )
	{
	...
	}

So perhaps you should extend the macro to be

#ifdef	__STDC__

#define	A2( t1, v1, x1, t2, v2, x2 ) \
	(t1 v1 x1, t2 v2 x2 )

#else

#define	A2( t1, v1, x1, t2, v2, x2 ) \
	(v1, x1 ) t1 v1 x1; t2 v2 x2;

#endif

Then my example becomes:

	int insert( char,a,[MAX_A], int (*, func, ) PROTO(( char a[] )) )
	----------------------------------^^^^^^^

I don't think this will work: the commas marked above are supposed to
separate the macro arguments, but since they sit inside parentheses the
preprocessor won't treat them as argument separators.

So perhaps you have to do it with typedefs:

	typedef	int	(*intfunc_char) PROTO(( char a[] ));

	int insert A2( char,a,[MAX_A], intfunc_char, func, )
	{ ... }

which should work. Unfortunately the original example becomes:

	int tkline A2( char*, bufP,, intfunc_char, func, )

The double commas are obnoxious and errors involving them might be a pain to
find. The necessity to typedef function pointers is worse. The best solution
is to use all ANSI tools if possible. I wish we could.

Just thought you'd like to know what you're getting into.

rba iv
dataco!amodeo


***--------------------------------------------------------------***
* DISCLAIMER:                                                      *
* ==========:                                                      *
*    The opinions expressed are solely of the author and do not    *
*    necessarily reflect the opinions of Canadian Marconi Company. *
***--------------------------------------------------------------***