barnes@infinet.UUCP (Jim Barnes) (04/23/86)
I posted a question about lint to net.lang.c and net.unix but have
received only one response so far.  Can anyone in this group offer any
suggestions?  The original posting follows.

>Recently I tried running lint on some code that I was modifying.
>I started getting an error message of the following type,
>repeated over and over again:
>
>	./inc.h (251): too much defining
>
>The line number changes in each error message, but the end result
>seems to be that the lint symbol table that stores #define'd symbols
>is overflowing.  Is there any way of expanding the symbol table size?
>(I think we have the 4.2 lint sources hanging around.)
>
>Please e-mail any suggestions that you might have.

To add a little information: I have had no problems compiling the
program; the problem only occurs when I use lint.  Secondly, several
large #include files referenced by the source are part of a dbms
package that we use.  I consider hacking those #include files a last
resort, since I would have to modify them every time we get a new
release of the dbms package.
--
-------------------------
decvax!wanginst!infinet!barnes 	Jim Barnes
barnes@infinet.UUCP (Jim Barnes) (04/23/86)
I have apparently solved my problem with lint.  I increased the value
of SBSIZE in the cpp.c file and rebuilt the C preprocessor.  After
that, lint ran fine on my program.

I still do not know why cpp did not fail when I compiled my program,
only when I ran lint.  Does lint itself have a lot more #defines than
the C compiler?

Thanks to Bill Masek and Chris Torek for suggesting that I look at cpp.
--
-------------------------
decvax!wanginst!infinet!barnes 	Jim Barnes