apple@nprdc.navy.mil (Jim Apple) (08/07/90)
I have run into a problem with cpp on a 600G ( 3.2.2 ): "too many defines".
I created some test files, and it appears that there is a hard limit on the
number of defines. The size of both the name and the value affect the number
of defines that I can get: simple names and a value of 1 give me around 1980
defines; if I change the value to a string, I'm down to 1200.

We are using a windowing package that has 1100 defines in it, so we don't
have much room left over. Is there a way to change this limit? Any help
would be great.

Jim Apple
apple@nprdc.navy.mil
WB1DOG  ...!ucsd!nprdc!apple
ram@attcan.UUCP (Richard Meesters) (08/07/90)
In article <8953@arctic.nprdc.arpa>, apple@nprdc.navy.mil (Jim Apple) writes:
>
> I have run into a problem with cpp on a 600G ( 3.2.2 )
> "too many defines". I created some test files and it appears
> that there is a hard limit on the number of defines. The size of
> both the name and value affect the number of defines that I can
> get. Simple names and a value of 1 gives me around 1980 defines
> if I change the value to a string I'm down to 1200.

I also ran into this problem compiling, of all things, the nethack sources.
There is, however, a fix, which should be available to you by calling the
AT&T Support Hotline (NSSC) at 1-800-245-2480. The cause/symptom is as
follows:

Symptom: too many defines

Cause: the number of #defines exceeds the symbol table size:

    STATIC struct symtab stab[ symsiz ];

The fix is a new cpp which contains an increased symsiz to handle a larger
number of defines. Unfortunately, unless you have source, you can't change
it yourself.

Hope this helps,

Regards,
------------------------------------------------------------------------------
Richard A Meesters             |
Technical Support Specialist   | Insert std.logo here
AT&T Canada                    | "Waste is a terrible thing
ATTMAIL: ....attmail!rmeesters |  to mind...clean up your act"
UUCP: ...att!attcan!ram        |
------------------------------------------------------------------------------
kevin@cfctech.cfc.com (Kevin Darcy) (08/08/90)
In article <12157@attcan.UUCP> ram@attcan.UUCP (Richard Meesters) writes:
>In article <8953@arctic.nprdc.arpa>, apple@nprdc.navy.mil (Jim Apple) writes:
>>
>> I have run into a problem with cpp on a 600G ( 3.2.2 )
>> "too many defines". I created some test files and it appears
>> that there is a hard limit on the number of defines. The size of
>> both the name and value affect the number of defines that I can
>> get. Simple names and a value of 1 gives me around 1980 defines
>> if I change the value to a string I'm down to 1200.
>
>I also ran into this problem compiling, of all things, nethack sources.

(Side note: that's why the STUPID_CPP option exists in NetHack.)

Actually, the "hard limit" you're running into is *only* on total symbol name
space if you're getting "Too much defining." (which is what NetHack has
trouble with on a 3B2). Richard was running into the hard limit on the
*number* of defines. Two slightly separate but interrelated problems.

>There is, however a fix, which should be available to you by calling the
>AT&T Support Hotline (NSSC) at 1-800-245-2480.

Er, this isn't the normal number U.S. AT&T customers use for the NSSC in New
Jersey (1-800-922-0354). Or is that number the one for Lisle? As an AT&T
person, you may be able to call Lisle directly on problems, but I've found
that Lisle will rarely talk to mere mortals unless they already have an
escalated ticket from New Jersey.

>The cause/symptom is as follows:
>
>Symptom: too many defines
>
>Cause: the number of #defines exceed the symbol table size:
>    STATIC struct symtab stab[ symsiz ];
>
>The fix is a new cpp which contains an increased symsiz to handle a larger
>number of defines.
>
>Unfortunately, unless you have source, you can't change it yourself.

No, but fortunately there are public domain alternatives to AT&T's cpp. Ones
that malloc() instead of using static arrays, for example.
Richard, if you can't beat a real cpp out of Lisle, drop me a line, and I'll
arrange to send you a decent PD cpp.
------------------------------------------------------------------------------
kevin@cfctech.cfc.com    | Kevin Darcy, Unix Systems Administrator
...sharkey!cfctech!kevin | Technical Services (CFC)
Voice: (313) 948-4863    | Chrysler Corporation
Fax:   (313) 948-4975    | 27777 Franklin, Southfield, MI 48034
------------------------------------------------------------------------------
ram@attcan.UUCP (Richard Meesters) (08/08/90)
In article <1990Aug8.003316.12414@cfctech.cfc.com>, kevin@cfctech.cfc.com (Kevin Darcy) writes:
| In article <12157@attcan.UUCP> ram@attcan.UUCP (Richard Meesters) writes:
| >
| >I also ran into this problem compiling, of all things, nethack sources.
|
| (Side note: that's why the STUPID_CPP option exists in NetHack).
|
| Actually, the "hard limit" you're running into is *only* on total symbol name
| space if you're getting "Too much defining." (which is what NetHack has
| trouble with on a 3B2). Richard was running into the hard limit on *number*
| of defines. Two slightly separate but interrelated problems.

I'll agree here: there are two slightly separate but interrelated problems.
The fix I have, however, fixes both. Having to use the STUPID_CPP option
when compiling nethack goes away after you apply the fix that I received.

| >There is, however a fix, which should be available to you by calling the
| >AT&T Support Hotline (NSSC) at 1-800-245-2480.
|
| Er, this isn't the normal number U.S. AT&T customers use for the NSSC in New
| Jersey (1-800-922-0354).

Sorry; being in Canada, that's the number I call to get through to the NSSC.
Your mileage may vary...

| Or is that number the one for Lisle? As an AT&T person, you may be able to
| call Lisle directly on problems, but I've found that Lisle will rarely talk
| to mere mortals unless they already have an escalated ticket from New Jersey.

Agreed, but given the knowledge of what the problem is, you should be able to
get the patch directly from the NSSC.
| >The cause/symptom is as follows:
| >
| >Symptom: too many defines
| >
| >Cause: the number of #defines exceed the symbol table size:
| >    STATIC struct symtab stab[ symsiz ];

That's problem one. The second cause/symptom fixed by the patch is as
follows:

Symptom: too much defining

Cause: the amount of text for defines and includes exceeds the buffer:

    if ( savch > sbf + SBSIZE - BUFSIZ )

| >The fix is a new cpp which contains an increased symsiz to handle a larger
| >number of defines.
| >
| >Unfortunately, unless you have source, you can't change it yourself.
|
| No, but fortunately there are public domain alternatives to AT&T's cpp. Ones
| that malloc() instead of using static arrays, for example.
|
| Richard, if you can't beat a real cpp out of Lisle, drop me a line, and I'll
| arrange to send you a decent PD cpp.

Actually, I'd love to see it.

Regards,
------------------------------------------------------------------------------
Richard A Meesters             |
Technical Support Specialist   | Insert std.logo here
AT&T Canada                    | "Waste is a terrible thing
ATTMAIL: ....attmail!rmeesters |  to mind...clean up your act"
UUCP: ...att!attcan!ram        |
------------------------------------------------------------------------------
debra@alice.UUCP (Paul De Bra) (08/08/90)
In article <12157@attcan.UUCP> ram@attcan.UUCP (Richard Meesters) writes:
>In article <8953@arctic.nprdc.arpa>, apple@nprdc.navy.mil (Jim Apple) writes:
>>
>> I have run into a problem with cpp on a 600G ( 3.2.2 )
>> "too many defines"...
>
>I also ran into this problem compiling, of all things, nethack sources. There
>is, however a fix, which should be available to you by calling the AT&T
>Support Hotline (NSSC) at 1-800-245-2480.

Well, the programmers sure know their ways to keep the support line busy...

Does such a fix also exist for the DMD630 software? The limit there is also
ridiculously low, as are a number of other limits, not just in the
preprocessor but also in the compiler, assembler, optimizer, etc. It's almost
as if they thought the compiler should be able to run inside the terminal
instead of on the host...

Paul.
--
------------------------------------------------------
|debra@research.att.com   | uunet!research!debra     |
------------------------------------------------------
mtd@mtunf.ATT.COM (Mario T DeFazio) (08/09/90)
In article <1990Aug8.003316.12414@cfctech.cfc.com>, kevin@cfctech.cfc.com (Kevin Darcy) writes:
> In article <12157@attcan.UUCP> ram@attcan.UUCP (Richard Meesters) writes:
> >In article <8953@arctic.nprdc.arpa>, apple@nprdc.navy.mil (Jim Apple) writes:
> >>
> >> I have run into a problem with cpp on a 600G ( 3.2.2 )
> >> "too many defines". I created some test files and it appears
> >> that there is a hard limit on the number of defines. The size of
> >> both the name and value affect the number of defines that I can
> >> get. Simple names and a value of 1 gives me around 1980 defines
> >> if I change the value to a string I'm down to 1200.
>
> Actually, the "hard limit" you're running into is *only* on total symbol name
> space if you're getting "Too much defining." (which is what NetHack has
> trouble with on a 3B2). Richard was running into the hard limit on *number*
> of defines. Two slightly separate but interrelated problems.
[...]
> >The cause/symptom is as follows:
> >
> >Symptom: too many defines
> >
> >Cause: the number of #defines exceed the symbol table size:
> >    STATIC struct symtab stab[ symsiz ];
> >
> >The fix is a new cpp which contains an increased symsiz to handle a larger
> >number of defines.
> >
> >Unfortunately, unless you have source, you can't change it yourself.

If the application is using curses, it doesn't surprise me that there are too
many #define's. There are nearly 800 #define's in term.h and curses.h alone.
(I think Jim had indicated that his application does do screen manipulation.)

First, I hope you are using C Programming Language Utilities (CPLU) Version
4.2. The cpp in there is supposed to be able to handle up to 20,000
#define's. This alone might fix the problem.

If it's still blowing out, another way of solving the problem is to isolate
the curses function calls (and thus the inclusion of term.h and curses.h) in
a separate C source module.
You might have to do further module breakup (if your own header files are
also large) until you get under the limit.

Yes, I work for AT&T, but I'm not trying to make excuses for the hard limit
in cpp. I just thought you could use a possibly quicker solution to your
problem, and a solution over which you have much more control.

Hope this helps,

Mario T DeFazio        AT&T Mail: mdefazio
AT&T Bell Labs         UUCP:      att!mtunf!mtd
Lincroft, New Jersey   Internet:  mtd@mtunf.att.com