[net.lang.c] Declarations and defaulting in C

boykin@datagen.UUCP (07/04/84)

Re: Hans Albertson's question about declarations and defaults
in C: the following, taken from both K&R and the ANSI draft standards
document, is legal:

	a;
	int b[10];
	c = 4;
	main()
	{
	...
	}

In all cases the declarations are of type 'int' and storage class
'extern', i.e. global variables.  Personally, I think the declarations
of both 'a' and 'c' are terrible programming practice, but it IS legal.

As Hans said in his note, K&R allows defaults for both type-declaration
and storage class.  After those two, there isn't much left to a declaration!

As also pointed out, how do you tell the difference between the
declaration of 'c' and an assignment to 'c' if this appears within
a function body?  Answer: you can't.  Hence this type of declaration
is only allowed outside the scope of a function.

The ANSI committee has ruled that this type of declaration has been
and will continue to be legal even though none of the members of
the committee like that type of declaration.  The point is that
taking it out would break some user code.  One of the chief
concerns of the ANSI committee is that we never break user code,
even if we don't like the 'cracks' in the language that that code
takes advantage of.

I hope that clears up the question(s).

Joe Boykin
Member, ANSI Committee X3J11

Data General Corp.
Distributed Systems Group
(allegra, ihnp4)!datagen!boykin

west@sdcsla.UUCP (07/08/84)

Sure, you may break a few users' programs by disallowing things like:

	{ a; int c; b = 5; foo() /* call or decl? */; ...}

but people who write code in that manner probably haven't given
a thought to portability.   That is, the programs you'll break
by changing the language on this point are not likely to be
large, critical, or widely-used.   And the breakage is easily
fixed, assuming reasonable error messages are produced.

On the other hand, by allowing that sort of garbage to persist,
people who inadvertently make a mistake which ends up looking
like that will (perhaps) spend hours/days trying to figure out
what's wrong.

Why bother to standardize to a poor standard?   Why not, at this
extremely opportune time, push for a minor, easily-implemented improvement
for which everyone will be grateful later on?

---

The other point which bothers me, even more, is the limitation of
six significant characters in external names.   It seems to me that
the cost of converting a few linkers from 6 characters to some
larger number (say, 16 -- even 10 or 12 would be a vast improvement)
is much less than the cost of having programmers figure out
meaningful six-character names to use.   There aren't really that
many informative identifiers with six characters -- maybe a few
hundred at most.   Add to the cost of figuring out a group of
6-character identifiers (also not conflicting with any system
call or subroutine name) the cost of trying to decipher such
things.

And who really has 6-char-max linkers that they plan to support,
unchanged, for the next ten years?   I've never come across any.

---

Finally, a suggestion.   Instead of allowing varargs function
declarations like this:		int oof( char *, );
how about using ellipses:	int oog( char * ... );
or perhaps:			int oog( char *, ... );
This also has the benefit that:	int oog( ... );
looks more natural than:	int oof( void, );

Purely a matter of style.

Thanks	-- Larry West, UC San Diego, Institute for Cognitive Science
	-- decvax!ittvax!dcdwest!sdcsvax!sdcsla!west
	-- ucbvax!sdcsvax!sdcsla!west
	-- west@NPRDC

DBrown@HI-MULTICS.ARPA (07/08/84)

  Not break user code.... hmmn.
  I wonder if the committee might consider publishing a discussion on
how to evolve a language.  Obviously programs with "a =- 3" are broken
by the release of the S5V1 compiler, and I regard this as a good thing.
On the other hand, I'd hate to have to go around rewriting programs
*often* because the language has done a "cobol" and sneaked out from
under my program.  (sneaked?  snuck?  sniggled?  double hmmn).
  I'll volunteer the Multics-ism that if a piece of software is allowed
to upgrade to an incompatible format for its stored data, then that
program must contain a mechanism for upgrading data from the immediately
previous version.
  As a Unix-ism this might be stated in the form of a YACC grammar for a
language-upgrading filter to use.
  As always, comments and attacks invited.
   --dave (unix hack on a bun) brown
     DBrown @ HI-Multics.ARPA
     drbrown at watbun.UUCP
     dave @ brown.TSD1.Honeywell

ka@hou3c.UUCP (Kenneth Almquist) (07/12/84)

> The other point which bothers me, even more, is the limitation of
> six significant characters in external names.   It seems to me that
> the cost of converting a few linkers from 6 characters to some
> larger number (say, 16 -- even 10 or 12 would be a vast improvement)
> is much less than the cost of having programmers figure out
> meaningful six-character names to use.

How spoiled we get running under UN*X.  In much of the rest of the
world, things are not simple.  If you don't have source code, you
have to convince your operating system vendor to convert the loader.
Your vendor has probably committed itself to supporting a particular
object format until the end of time, and is probably not interested
in supporting a second format until the end of time as well.  The
reason that a six character limitation was imposed to begin with
may have been to allow an identifier to fit in a single word; if
so, increasing the identifier length could require rewriting most
of the loader (which was written in assembly language, of course),
and the result would likely run slower.  No manufacturer would even
consider making its system run slower for the benefit of a few
oddballs who want to run C.

A UN*X standard (assuming that one is ever released) should probably
guarantee long global names and the ASCII character set.
				Kenneth Almquist

DBrown@HI-MULTICS.ARPA (07/18/84)

re ken almquist's comment about people's elderly linkers:
  i get even more spoiled on multics, where all disk-resident data
structures like object files have "version numbers".
  because the designers thought of making the files self-identifying, it's
easy to write a "new" linker which supports two formats of file.  then
after a few releases you drop support of the oldest version and count on
the fingers of one hand the number of people who scream.  (they get told
to relink with the -update_it_by_hook_or_crook flag on).
  i've even managed to retrofit version numbers into an old program of
mine, by putting it in a field that never could have a 1 in it in the
old layout.

  as a result i can upgrade such things as old linkers to use better
algorithms, instead of the bubble-sort that the original *"$%&!!!!
author used because it was easy to code in mac...
  --dave (unix hack on a bun) brown