[comp.lang.c] const, volatile, etc

gwyn@smoke.BRL.MIL (Doug Gwyn ) (12/02/88)

In article <10919@ulysses.homer.nj.att.com> ggs@ulysses.homer.nj.att.com (Griff Smith) writes:
>I blew several hours discovering that a flag set by a signal handler
>had been optimized out of existence because it wasn't declared volatile.

>If I were writing new software I would be
>aware of the problem and use the proper declaration, but what am I to
>do about fixing all the old stuff that now has subtle errors caused by
>optimizations that used to be illegal?

Aggressive optimization is the new wave, brought on partly by all those
C compiler races published in hobbyist magazines.  Most of the time,
faster-running code is an advantage, so generally this is a good thing.

>The response I got from a C++ guru around here wasn't encouraging: he
>suggested declaring everything volatile and ignoring the issue.

Maybe he was joking?

>Maybe he's right, but that attitude could easily lead to a habit of
>forcing all declarations to include the volatile qualifier just to
>avoid a `silly' rule.

No, that's not the proper way to use "volatile".  The only time you
need to use it is when there is a violation of the abstract C machine
model of sequential computation (where objects retain the value stored
into them until the next explicit store into them).  The two main
categories of exceptions to this model are variables shared with
interrupt handlers, as you noticed, and device registers that change
in response to outside events not known to the compiler.

>Do any of you have some practical experience, plus suggestions for
>living in the brave new ANSI-C world?

I have a "standard C language assist" file <std.h> that I configure
for each system I port to, and that I include in each source file of
my applications.  A portion of it follows:

#if __STDC__
typedef void	*pointer;		/* generic pointer */
#else
typedef char	*pointer;		/* generic pointer */
#define	const		/* nothing */	/* ANSI C type qualifier */
/* There really is no substitute for the following, but these might work: */
#define	signed		/* nothing */	/* ANSI C type specifier */
#define	volatile	/* nothing */	/* ANSI C type qualifier */
#endif

Then I simply use the appropriate type qualifiers in my code, for
example "volatile" on a "signal_was_seen" flag.  In non-ANSI C
environments the qualifiers in effect vanish and I get whatever the
compiler gives me, which MAY be what I need.  (That assumption was
implicit in your discussion.)

Really, C optimizers always have been hard to second-guess.  At least
ANSI C imposes SOME limits on the degree to which optimization can be
carried out.

ggs@ulysses.homer.nj.att.com (Griff Smith) (12/03/88)

In article <9033@smoke.BRL.MIL>, gwyn@smoke.BRL.MIL (Doug Gwyn ) writes:
> In article <10919@ulysses.homer.nj.att.com> ggs@ulysses.homer.nj.att.com (Griff Smith) writes:
> ...
> >The response I got from a C++ guru around here wasn't encouraging: he
> >suggested declaring everything volatile and ignoring the issue.
> 
> Maybe he was joking?

Could be, but he seemed serious.  This was in response to a question
about bug search and destroy missions directed toward existing working
code.  I think the implication was that fixing old code to work with
new compilers is boring; just sweep the problem under the rug and be
done with it.  Do it right when you write something new.

> >Maybe he's right, but that attitude could easily lead to a habit of
> >forcing all declarations to include the volatile qualifier just to
> >avoid a `silly' rule.
> 
> No, that's not the proper way to use "volatile".

I know that.  The point was that `some' variable needed to be declared
volatile, and it was easier to fix them all than to find the right one
by reasoning.

> >Do any of you have some practical experience, plus suggestions for
> >living in the brave new ANSI-C world?
> 
> I have a "standard C language assist" file <std.h> that I configure
> for each system I port to, and that I include in each source file of
> my applications.

Good start.  Now, what do I do about upgrading a million lines of old
code to the new standard, and finding all the mis-declared variables.
-- 
Griff Smith	AT&T (Bell Laboratories), Murray Hill
Phone:		1-201-582-7736
UUCP:		{most AT&T sites}!ulysses!ggs
Internet:	ggs@ulysses.att.com

gwyn@smoke.BRL.MIL (Doug Gwyn ) (12/03/88)

In article <10929@ulysses.homer.nj.att.com> ggs@ulysses.homer.nj.att.com (Griff Smith) writes:
>Good start.  Now, what do I do about upgrading a million lines of old
>code to the new standard, and finding all the mis-declared variables.

You could do what I've had to do for each release of UNIX from AT&T:
Go through the entire goddamn source code and fix everything the hard
way.  I have a small list of things I look for that catch most poor
practices.  Here are some of them:
	char c;		// look for a following (c = getc())
	sig*		// signal handling is almost always wrong
	unlink(		// add check first for null filename pointer
	argv		// check argc first
	/*VARARGS*/	// usually doesn't use <varargs.h>
	error		// may need to be varargs-ized
	#if		// check that conditionalization is right
	u3b		// usually a 3B-only kludge instead of a fix
	gets(		// usually used unsafely
	strc..(		// often used unsafely
	scanf(		// often used wrong
	BUFSIZ		// usually indicates an unsafe assumption
	execl*(		// terminator usually needs cast
	unsigned	// sometimes mixed expressions need casts
		// look at EVERYthing that "lint -p" squawks about:
	read(		// buffer address usually needs cast
	write(		// buffer address usually needs cast
	malloc(		// declaration often missing
	lseek(		// declaration often missing
	time(		// declaration often missing
	exit(		// declaration often missing
	free(		// declaration often missing
	perror(		// declaration often missing
	return;		// often missing value
What I don't know is why AT&T didn't do all this a long time ago.
For the $80,000 or so we have to pay for a source license, it is
reasonable to expect good quality control to have been applied.

As to changes brought on by a new release of a C compiler, there
is no easy way to be sure what compiler-dependent assumptions
were made in existing code.  The specific problem you mentioned
concerning data shared with an asynchronous signal handler can
be searched for fairly readily by grepping for use of signal(),
then hunting down the handlers and seeing just what they do (not
very much in most cases).  Add "volatile" to shared flags, and
consider using type sig_atomic_t for them (which you need to
typedef yourself for non-__STDC__).  UNIX PCCs have used a
variety of techniques to prevent hyperoptimization of accesses
to potential device registers in compiling device drivers.  For
example, one I know of knew what hardware addresses were used
for device registers and disabled such optimization for them.
Many device drivers are still critically dependent on the way
the code generator works, so they need to be reviewed EVERY
time the code generation changes.

chris@mimsy.UUCP (Chris Torek) (12/04/88)

In article <319@aber-cs.UUCP> pcg@aber-cs.UUCP (Piercarlo Grandi) writes
a great deal, but I will copy just the summary header line:
>Summary: volatile is bad because register is cheaper and safer

I think it is more accurate to say that, in the past, `cheaper' meant
using simpler languages with simpler compilers.  As time goes on, we
find that `cheaper' means using a higher level of abstraction, fancier
languages, fancier compilers.  The pattern repeats; the wheel goes
round and round: you can see it everywhere, not just in the history of
computers, but in the history of every technology.

`Everything should be made as simple as possible, and no simpler.'  C
was that.  Is it any more?  For some time to come, I think so.  But I
think its days are numbered, as those of Fortran IV were---and clearly
so---years ago, and now F77.  There comes a time when an overhaul is
insufficient.  C is getting away with an overhaul, but it will not
last.  (Contrast the change from F77 to F8X, which is like putting the
old steamship in the swimming pool of a luxury super-liner: the old
boat is still there, but it is largely just for show.)

But I see I am getting philosophical in a technical group again.  (Must
be the roach poison.  My apartment building sprayed recently, and the
place needs airing out again....)
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris

ray@micomvax.UUCP (Ray Dunn) (12/08/88)

In article <10929@ulysses.homer.nj.att.com> ggs@ulysses.homer.nj.att.com (Griff Smith) writes:
>....
>Good start.  Now, what do I do about upgrading a million lines of old
>code to the new standard, and finding all the mis-declared variables.

You don't.  See my previous posting. 

The standard, as far as this goes, hasn't changed.  If your program is wrong
now, it was wrong before!  The only change is that *now* you have a way of
using the optimizer with variables that are volatile.  You didn't before.
-- 
Ray Dunn.                      |   UUCP: ..!philabs!micomvax!ray
Philips Electronics Ltd.       |   TEL : (514) 744-8200   Ext: 2347
600 Dr Frederik Philips Blvd   |   FAX : (514) 744-6455
St Laurent. Quebec.  H4M 2S9   |   TLX : 05-824090

ray@micomvax.UUCP (Ray Dunn) (12/10/88)

In article <319@aber-cs.UUCP> pcg@cs.aber.ac.uk (Piercarlo Grandi) writes:
>  [long argument against "volatile"]
>First of all, in a sense, volatile and register exclude each other; in
>traditional C all variables are volatile save for those declared to be
>register (and this explains why there is a restriction on &...).

Unfortunately, the whole foundation of your argument is incorrect.  All
variables in traditional 'C' are *non-volatile*.

Nowhere is there any guarantee that each and every variable reference made
in a 'C' source will translate to a memory reference in the run code.

That is what "volatile" gives you.

> [continued long argument totally confusing "volatile" with
>  *performance* issues and the use of "register"]

There is no relationship between "volatile" and "register", except that
conceptually, "volatile" variables can *NOT* be declared register and cannot
be arbitrarily *kept* in a register in the compiled code.  In addition to
the performance issue, "register" implies the exact opposite of "volatile".

>Arguments for volatile, as opposed to register, are essentially:
>
>    [1] you cannot trust programmers to be competent and understand
>    what they are writing, or at least not enough to place appropriate
>    register declarations; in that case an automatic optimizer will
>    possibly improve things, even if it does a purely static analysis.

Volatile is not *just* needed as an adjunct to optimizing compilers.

Volatile is needed in *all* cases when the programmer wishes to convey to
the compiler that a variable has environmental constraints normally not the
concern of the compiler.

It is in fact only luck that most traditional compilers handle machine-
specific volatile variables correctly when no optimization is called for,
and without the use of the "volatile" keyword.

>As to point [1], unfortunately you need damn competent programmers that
>understand a lot of the subtleties of their program to place volatile
>where it is needed, and only there, otherwise they will be inclined,
>for safety, to declare everything in sight to be volatile...

and produce "vapourware" I presume.... (:-)

The argument that programmers can be expected to be competent enough to
design the algorithms of the program, and to analyse their code sufficiently
to duplicate the effects of a good optimizing pass, but that somehow some
esoteric higher level of understanding is required to place "volatile"
appropriately, I find most unconvincing.

If the programmer does not understand which variables are referencing
"locations" which can side effect or be side effected by other software or
hardware, then should (s)he be let loose in that environment?

Classical examples of the need for "volatile" are when handling memory
mapped I/O, or inter-process signals.  In these cases, amongst many others,
we need to ensure that the compiler is not treating an I/O write or a signal
update as an intermediate value that does not *yet* need to be written into
the actual memory location.

Does the traditional definition of 'C' allow the programmer to be sure that
the compiler is treating these cases correctly?  Nope!
-- 
Ray Dunn.                      |   UUCP: ..!philabs!micomvax!ray
Philips Electronics Ltd.       |   TEL : (514) 744-8200   Ext: 2347
600 Dr Frederik Philips Blvd   |   FAX : (514) 744-6455
St Laurent. Quebec.  H4M 2S9   |   TLX : 05-824090

cjc@ulysses.homer.nj.att.com (Chris Calabrese[mav]) (12/11/88)

In article <1526@micomvax.UUCP>, ray@micomvax.UUCP (Ray Dunn) writes:
> In article <319@aber-cs.UUCP> pcg@cs.aber.ac.uk (Piercarlo Grandi) writes:
> >  [long argument against "volatile"]
> >First of all, in a sense, volatile and register exclude each other; in
> >traditional C all variables are volatile save for those declared to be
> >register (and this explains why there is a restriction on &...).
> 
> Unfortunately, the whole foundation of your argument is incorrect.  All
> variables in traditional 'C' are *non-volatile*.
> 
> Nowhere is there any guarantee that each and every variable reference made
> in a 'C' source will translate to a memory reference in the run code.
> 
> That is what "volatile" gives you.
> 
> > [continued long argument totally confusing "volatile" with
> >  *performance* issues and the use of "register"]
> [...]
> 
> Classical examples of the need for "volatile" are when handling memory
> mapped I/O, or inter-process signals.  In these cases, amongst many others,
> we need to ensure that the compiler is not treating an I/O write or a signal
> update as an intermediate value that does not *yet* need to be written into
> the actual memory location.
> 
> Does the traditional definition of 'C' allow the programmer to be sure that
> the compiler is treating these cases correctly?  Nope!

#define FlameOn 1
This is an interesting view of this volatile issue (pun intended).
C (B, actually) was first created to write a file system handler
on PDPs.  Given this fact, how could the variables _possibly_
have been non-volatile?  Indeed, 99.9% of UNIX is written in C,
including _all_ the memory mapped I/O and signal handling
routines which may be in the kernel.  If the variables
can all be optimized out of loops, etc, how could the machine
I'm working on, which has a memory bit-mapped screen,
memory mapped keyboard, etc - with all the drivers written
in Classic C - possibly work?
#define FlameOn 0

I happen to like the volatile keyword.  I have two compilers
on my machine, a PCC and a GNU optimizing compiler.
The GNU compiler produces code which is about 30% faster
by placing small functions in-line, optimizing access
to variables, etc - all without giving up readability or
cleanliness of design to hand-done micro-optimization.

I also think ANSI-compatible compilers should
have a flag which tells them to treat all variables
as volatile, so that old device driver and kernel
code can be compiled with them.  Luckily, the GNU
compiler has several flags of this type for turning
on/off various ANSIisms.

The only issue remaining now is whether the 'register'
keyword is still needed.  Since a compiler can't
infer what the data passed to a function will be,
the answer is definitely yes.  For instance:

MyFun(int x, register int y)
	{
	for(; x>=0; x--)
		{
		...
		}
	for(; y>=0; y--)
		{
		...
		}
	}

If I know that y is usually going to be _much_ greater than
x, the inside of the loops are about the same, the compiler
can't do register caching, and there's
only one free register, I want y to be in it!
-- 
	Christopher J. Calabrese
	AT&T Bell Laboratories
	att!ulysses!cjc		cjc@ulysses.att.com

gwyn@smoke.BRL.MIL (Doug Gwyn ) (12/11/88)

In article <10988@ulysses.homer.nj.att.com> cjc@ulysses.homer.nj.att.com (Chris Calabrese[mav]) writes:
>If the variables
>can all be optimized out of loops, etc, how could the machine
>I'm working on, which has a memory bit-mapped screen,
>memory mapped keyboard, etc - with all the drivers written
>in Classic C - possibly work?

Well, first of all you're probably using a compiler that doesn't
optimize very highly, and possibly when compiling the kernel the
cc -O option is NOT used, to further restrict the amount of
optimization.

If it's a PDP-11 or VAX PCC, then there is an explicit check for
*(pointer_cast)(device_space_address) usage, and certain optimizations
are disabled if it can be determined that the I/O page may be accessed.
I don't know whether something similar is done for other architectures.

"volatile" would have been a much better solution.

henry@utzoo.uucp (Henry Spencer) (12/14/88)

In article <10988@ulysses.homer.nj.att.com> cjc@ulysses.homer.nj.att.com (Chris Calabrese[mav]) writes:
> ...
>C (B, actually) was first created to write a file system handler
>on PDPs.  Given this fact, how could the variables _possibly_
>have been non-volatile?  Indeed, 99.9% of UNIX is written in C,
>including _all_ the memory mapped I/O and signal handling
>routines which may be in the kernel.  If the variables
>can all be optimized out of loops, etc, how could the machine
>I'm working on, which has a memory bit-mapped screen,
>memory mapped keyboard, etc - with all the drivers written
>in Classic C - possibly work?

A combination of unambitious compilers, kludges in the compilers to try
to avoid problem areas, and judicious non-use of the -O flag when compiling
the kernel.  Don't confuse AT&T compilers with the definition of C.
-- 
SunOSish, adj:  requiring      |     Henry Spencer at U of Toronto Zoology
32-bit bug numbers.            | uunet!attcan!utzoo!henry henry@zoo.toronto.edu

gwyn@smoke.BRL.MIL (Doug Gwyn ) (12/15/88)

In article <377@aber-cs.UUCP> pcg@cs.aber.ac.uk (Piercarlo Grandi) writes:
>[Gwyn thinks:] if you want signed characters, the only way is to add
>a signed keyword, if you want efficient code the only way is a complex
>optimizer that relies on volatile.

You really have not been listening at all, have you?  X3J11 operates
under certain rules and constraints to accomplish a particular task
having particular scope.  "signed" was not invented by X3J11; it
was introduced in several commercial C compilers many years ago as
one way -- the most obvious of several possibilities -- to provide the
C programmer some way to obtain guaranteed signedness for char-sized
objects.  X3J11 had to determine first whether this was a problem in
the base document (K&R 1st Ed. Appendix A) that needed fixing, and if
so, how to fix it.  They agreed with what appears to be your premise,
that this lack in C's integral types needed to be provided in the new
standard.  Then the question became, how best to do that.  In such
cases, whenever there was widespread existing practice, unless it had
some obvious problem it served as a guide for the standard.  In cases
where existing practice was ambiguous (multiple different solutions
had been devised), X3J11 was at liberty to choose one over the other.
The ONLY widespread existing practice for obtaining guaranteed signed
char-sized objects in C was use of the "signed" specifier; since the
only possible technical problem with adopting this (conflict of a new
keyword with existing use as an identifier) was judged to not be
likely to cause much practical problem, it was adopted.  X3J11 could
hardly have done otherwise; ignoring existing practice would run
counter to the committee's charter.

>Again, it has never been a usual rule that an optimizer is allowed to turn a
>correct program into an incorrect one.

The proposed standard is quite specific about the virtual machine
semantics that the C implementation must correctly support.

>But in any case I still must volatilize all other variables, otherwise the
>compiler will do funny things behind my back.

Wrong!!!

>Volatile encourages people to think that a complex optimizer will clean up
>their sloppy C code. Too bad that the cleanup cannot be as good as
>competently written code, ...

You don't seem to appreciate what modern optimization technology can
accomplish.  A good optimizer can significantly speed up well-written
C code.  In fact it often cannot do very much for sloppy code, such
as code that follows your recommendation of declaring all variables
volatile (thereby imposing a constraint that is totally unwarranted
under nearly all circumstances).

>    The use of "volatile" does not require much analysis.  If you were paying
>    attention, I already explained in response to Griff Smith the cases where
>    you need to use it.

>My dear Doug Gwin, rest assured that your paternalistic innuendo is not
>necessary.  I know perfectly well the rationale for "volatile" (enable
>optimizations where side effects may occur); I dispute that the only solution
>to the problem is "volatile", as I dispute the rationale itself.

My name is Gwyn (which as you should know is Welsh), not Gwin!

Again you attempt to put false words in others' mouths.  Please cease
these sleazy debating ploys, as I will not let you get away with them
and I'm sure others are also annoyed by them.

I did NOT say that "volatile" was the only solution to a particular
problem.  However, the solution you suggest (in effect, requiring
what is now known as the volatile attribute for ALL data) is not
acceptable to a large fraction of the current C marketplace.  The
intention is to have a standard that will actually be used, not to
make it unacceptable to a large segment of the C user population.
The determination was made that as high a degree of optimization as
possible should be permitted while remaining consistent with a
reasonable C virtual machine model.  Note that "noalias" arose
out of the insistence of a sufficient number of committee members
on this principle; the attempt to have "pointer to const" function
parameters permit optimization that ignored possible aliasing was
NOT considered reasonable.  We deliberately prohibited the extra,
sometimes quite significant, optimizations that this would have
permitted.  The "optimizer people" on the committee settled for
having explicit programmer use of "noalias" required to permit such
optimization.  ("noalias" has unfortunately been dropped from the
final standard, so this optimization is completely disallowed for
conforming programs compiled by conforming implementations.)  So,
as you should be able to see from this example, X3J11 did NOT wish
to trade off reasonable program semantics for the sake of better
optimization.  The thing you don't seem to agree with is that we
didn't think the volatile property should be part of the
reasonable C virtual machine model; in fact quite the contrary --
volatile behavior is quite UNreasonable, and it is appropriate to
require that the programmer explicitly flag those few instances
where it really is expected.

>I would also like to emphasize that the real big problem will be multi
>threaded C code, not signal handlers or device drivers. As any Ada programmer
>knows, if you use tasking and fail to stick pragmas volatile/shared wherever
>they are needed, maybe in dozens of modules, and that may be a lot of places,
>you get into BIG trouble.

But the C standard does not attempt to define semantics for multitasking
or multiple threads.  This too was deliberate.  If someone (for example,
IEEE 1003) wants to establish additional C constraints for such an
environment, they are free to do so.  For example, it could be required
that the volatile attribute automatically apply to every data object in
such an environment.  However, I guarantee that you have much worse
problems than that to contend with in using any flavor of C in such an
environment.

>    In pre-ANSI C the same problem existed, but there was no standard
>    solution for it.
>The problem did not exist, because it was unthinkable that the compiler would
>do funny things behind your back!

Wrong.  Even Ritchie's PDP-11 C compiler would occasionally do things
when the peephole optimizer was enabled that caused trouble in device
drivers, etc.  This was not generally considered a bug; one merely
revised the code to outwit the optimizer or else turned off the
optimizer.  Many of these problems could have been circumvented through
the use of "volatile", if it had existed.

I explained elsewhere about a horrible kludge that some PCCs had to
selectively disable certain optimizations in a portion of virtual
address space; again "volatile" could have rendered that unnecessary.

>    I would like to be able (maybe in C++ ?) to declare a procedure as
>    "register", to signify that the values it returns only depend on the
>    arguments and not on any globals, and thus it will always return the same
>    values given the same arguments.

This is known as a "pure" function.  It has been suggested before, but
no existing C practice seems to be available to use to judge its merit.

P.S.  It's about time for another reminder that I'm not speaking
officially for X3J11 here, although at least I WAS involved in the
deliberations leading to the final form of the proposed standard,
so there is SOME chance that I'm in a position to appreciate the
trade-offs and decision criteria.

ray@micomvax.UUCP (Ray Dunn) (12/17/88)

In article <10988@ulysses.homer.nj.att.com> cjc@ulysses.homer.nj.att.com (Chris Calabrese[mav]) writes:
 >In article <1526@micomvax.UUCP>, ray@micomvax.UUCP (Ray Dunn) writes:
 >> Nowhere is there any guarantee that each and every variable reference made
 >> in a 'C' source will translate to a memory reference in the run code.
 >> 
 >> That is what "volatile" gives you......
 >
 >#define FlameOn 1
 >This is an interesting view of this volatile issue (pun intended).
 >C (B, actually) was first created to write a file system handler
 >on PDPs.  Given this fact, how could the variables _possibly_
 >have been non-volatile?  Indeed, 99.9% of UNIX is written in C,
 >including _all_ the memory mapped I/O and signal handling
 >routines which may be in the kernel......

Your argument just evaporated away (pun intended).

Maybe you didn't realize it, but unless your compiler documentation states
quite clearly that this is all safe, then you have *no* guarantee that the
code you describe *has* been compiled correctly.

The solution to this argument is really quite simple, and lies in your hands.

Unless this guarantee of non-volatility is clearly stated in one of the
bibles (K&R etc), *or* in the implementation documentation, you have no
justification for relying on it (and if it is specified in the latter, then
the argument still holds good for 'C' in general).

Sooo....quote chapter and verse.....

-- 
Ray Dunn.                      |   UUCP: ..!philabs!micomvax!ray
Philips Electronics Ltd.       |   TEL : (514) 744-8200   Ext: 2347
600 Dr Frederik Philips Blvd   |   FAX : (514) 744-6455
St Laurent. Quebec.  H4M 2S9   |   TLX : 05-824090

gwyn@smoke.BRL.MIL (Doug Gwyn ) (01/04/89)

In article <475@aber-cs.UUCP> pcg@cs.aber.ac.uk writes:
>I understand very well, as Doug Gwyn said, that X3J11 is nearly as
>political a body as a House public works committee (:->), ...

Note that I did NOT say that.  [Sleazy ploy #45.]

I said that I think that the "Common Extensions" section does not belong
in the Standard and exists there primarily for political reasons.  The
same, I think, is true of the deprecated [] parameter non-aliasing in
"Future Language Directions".  However, the ENFORCEABLE sections of the
Standard are remarkably clean, when one considers the variety of
technical and practical factors that had to be balanced; their
compromises are nearly always technically, not politically, motivated.
(Even the EXIT_* macros, originally driven by what many of us would
consider to have been a mistake in a certain existing implementation of
exit(), have intrinsic technical merit.)

>Does not dpANS C have the taste of an omnibus appropriations bill?  :->).

No, it does not.  Literally thousands of proposals for inclusion of a
variety of "nice features" were rejected.  Extensions beyond the base
documents were made only where there were perceived deficiencies and
viable solutions (usually based on existing practice) could be found.
The one major invention was the support for "internationalization";
this was mandated by ISO and is also commercially quite important.
(Such commercial concerns are valid, because they reflect genuine user
needs.)  I backed what I think was a technically superior solution to
the "large character set" issue, but at least I understand the several
arguments for all the proposed alternatives; since the decision was
made rationally after proper deliberation, I support the outcome even
though I "lost" on this issue.  In fact, in several committee
discussions of issues, I argued on multiple "sides" of the issues,
simply to make sure that all relevant factors were being considered in
arriving at each decision.  I maintain that this is the correct way
to make such decisions.

What ticked me off in the first place concerning Grandi's postings
is his presumption that he is fit to pass judgement on X3J11's work
when he is obviously unfamiliar with it.  I have no record of his
contributing to the public review process, which would have been the
appropriate channel for his suggestions about what "should" have been
done.  His descriptions of X3J11's methods, goals, and results have in
most instances shown his ignorance of these matters.  In particular he
shows no sign of recognizing that the few valid "issues" he raises have
in fact been discussed in one form or another, and after discussion
and consideration of alternatives and trade-offs measured against the
goals of the Standard, the Committee deliberately decided on doing
things the way they are now specified (for good reasons of which Grandi
is apparently unaware).  Why does he continue to proclaim what is "wrong"
with the pANS without first having the relevant information at hand?
It sure is annoying.  (Aha! Maybe that's the reason.  How sick.)

If you want to see Grandi's qualifications for discussing what C should
be like, check out his rewrite of the "getchar loop" example in another
recent posting to see what he thinks C is like now!  Pretty amazing.