[comp.lang.c] compiler chokes on 1986th #define

ceide@bbn.com (Chantal Eide) (05/03/89)

When compiling a file with a large number of '#define's on our AT&T 3B2
machine I encountered the following error message:

too much defining

Each '#define' after that had the same error message.  We tried to
isolate the problem with the following test program:

xx.c:
	#include "hh.h"
	main() { printf("hi there\n"); }

hh.h:
	#define DEF1 1
	#define DEF2 2
	 ...
	#define DEF10000 10000

The 3B2 chokes at the 1986th '#define', printing 'too many defines'.
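
A header like hh.h can be generated with a throwaway program along these
lines (just a sketch; nothing beyond the DEFn naming above is assumed):

	#include <stdio.h>

	/* Write "#define DEF1 1" ... "#define DEF10000 10000" into hh.h. */
	int main(void)
	{
		FILE *fp = fopen("hh.h", "w");
		int i;

		if (fp == NULL)
			return 1;
		for (i = 1; i <= 10000; i++)
			fprintf(fp, "#define DEF%d %d\n", i, i);
		fclose(fp);
		return 0;
	}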

Does anyone know how to get around this problem?  Please email me, as
I do not read this group.

Thank you,

Chantal Eide
BBN Systems and Technologies Corp.
ceide@bbn.com

jas@ernie.Berkeley.EDU (Jim Shankland) (05/04/89)

In article <1710@currant.bbn.com> ceide@bbn.com (Chantal Eide) writes:
>
>When compiling a file with a large number of '#define's on our AT&T 3B2
>machine I encountered the following error message:
>
>too much defining

This error message adds the insult of a rather scolding tone (as though
"defining" were a bad habit in which you are overindulging) to the injury
of refusing to compile your source file.  The problem is a static buffer
in cpp (30000 bytes, if I recall correctly).  Quadruple the size of the
buffer, recompile cpp, and you're all set.

What's that?  You don't have the source to cpp?  You can:

	* try to remove some defines from your program;
	* try to hack up some way to run cpp iteratively over your program,
		expanding some of the defines each time;
	* port GNU's cpp to your 3B2;
	* file a bug report with AT&T;
	* buy a different computer.

None of these options should be sounding very attractive to you.

This business of small, static buffers in the days of virtual memory
is a blight on various UNIX implementations; in my experience, AT&T is
one of the worst offenders.

Jim Shankland
jas@ernie.berkeley.edu

"Blame it on the lies that killed us, blame it on the truth that ran us down"

cdaf@iuvax.cs.indiana.edu (Charles Daffinger) (05/04/89)

In article <29028@ucbvax.BERKELEY.EDU> jas@ernie.Berkeley.EDU (Jim Shankland) writes:
>In article <1710@currant.bbn.com> ceide@bbn.com (Chantal Eide) writes:
>>
>>When compiling a file with a large number of '#define's on our AT&T 3B2
>>machine I encountered the following error message:
>>
>>too much defining
>
I've had analogous problems while abusing cpp by creating some huge
#defines (a different application, not a C program).  There the message
was "too much pushdown", at which point, instead of bombing, cpp just
decides it really didn't want to do any substitutions anyway and
continues on its merry way...   This is on the Ultrix cpp.

The solution?  Make a smaller #define with other defines nested inside
it, which in this case was kind of unwieldy (but worked), and #include
a file of the smaller #defines wherever I needed the larger one.  Gak.
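
In C terms the trick looks roughly like this (a sketch only; BIG, the
PIECEn macros, and the stepN() calls are made-up names, and my real case
wasn't a C program anyway):

	/* pieces.h -- the big macro's body, broken into smaller #defines */
	#define PIECE1	step1(); step2(); step3();
	#define PIECE2	step4(); step5(); step6();
	#define PIECE3	step7(); step8(); step9();

	/* wherever the big macro is needed, pull in the pieces and compose them */
	#include "pieces.h"
	#define BIG	PIECE1 PIECE2 PIECE3

Each individual #define stays small; only the final expansion of BIG is big.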

or:
>	* port GNU's cpp to your 3B2; (or vax)

	Works wonders.

-charles



-- 
Charles Daffinger  >Take me to the river, Drop me in the water<  (812) 339-7354
cdaf@iuvax.cs.indiana.edu              {pur-ee,rutgers,pyramid,ames}!iuvax!cdaf
Home of the Whitewater mailing list:    whitewater-request@iuvax.cs.indiana.edu

steve@oakhill.UUCP (steve) (05/04/89)

In article <1710@currant.bbn.com>, ceide@bbn.com (Chantal Eide) writes:
> 
> When compiling a file with a large number of '#define's on our AT&T 3B2
> machine I encountered the following error message:
> 
> too much defining
> 
 .
 .
Since I just started looking in the SGS compiler for the origin of another
bug (yes, there are other bugs in the code :-{), I thought I'd look up
the origin of this one.

The C preprocessor contains a large scratch area where all defines
(and other information) are stored.  Its size is compiled in as 'SBSIZE'.
So the answer is: no, this limit is compiled in; it cannot be changed
from outside.  The only solution is to use less buffer space in your
defines.  (Of course, if you had the source to this compiler you could
bump up SBSIZE, but then you would probably not be posting this
question.)
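
One practical way to use less of that scratch space (a sketch of my own,
not something out of the SGS sources; the COLOR names are made up) is to
fold long runs of integer '#define's into enums, which the compiler proper
handles and cpp never has to store:

	/* The old way -- each of these eats preprocessor scratch space:
	 *	#define COLOR_RED	1
	 *	#define COLOR_GREEN	2
	 *	#define COLOR_BLUE	3
	 */

	/* The replacement -- cpp never sees these as macros at all: */
	enum color { COLOR_RED = 1, COLOR_GREEN = 2, COLOR_BLUE = 3 };

That only helps for integer constants, of course; string and code-fragment
macros still have to live in cpp.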

                   enough from this mooncalf - Steven
----------------------------------------------------------------------------
These opinions aren't necessarily Motorola's or Remora's - but I'd like to
think we share some common views.
----------------------------------------------------------------------------
Steven R Weintraub                        cs.utexas.edu!oakhill!devsys!steve
Motorola Inc.  Austin, Texas 
(512) 891-3023 (office) (512) 453-6953 (home)
NOTE: NEW PHONE NUMBER!!
----------------------------------------------------------------------------

peno@kps.UUCP (Pekka Nousiainen /DP) (05/05/89)

>This business of small, static buffers in the days of virtual memory
>is a blight on various UNIX implementations; in my experience, AT&T is
>one of the worst offenders.

The scene is the ATT universe of our production Pyramid.  The machine
has about 250 meg physical+virtual memory.  We want to make some final
checks before printing 90000 invoices:

% make lint
....
lint pass2 error: too many names defined

--
peno@kps

friedl@vsi.COM (Stephen J. Friedl) (05/07/89)

Chantal Eide writes:
<
< When compiling a file with a large number of '#define's on our AT&T
< 3B2 machine I encountered the following error message:
<
<	too much defining

Jim Shankland writes:
<
< This business of small, static buffers in the days of virtual memory
< is a blight on various UNIX implementations; in my experience, AT&T is
< one of the worst offenders.

AT&T has made a lot of effort to remove these static limits in
various pieces of software, and it would not surprise me if they
have an official policy on this somewhere.

I believe that this specific cpp limit will be fixed in the C
compiler provided in System V Release 4.0.

     Steve

-- 
Stephen J. Friedl / V-Systems, Inc. / Santa Ana, CA / +1 714 545 6442 
3B2-kind-of-guy   / friedl@vsi.com  / {attmail, uunet, etc}!vsi!friedl

Breaking a collarbone is a great way to cut down typing speed :-(