[comp.unix.xenix] XENIX Compiler Limitations

jkimble@crash.CTS.COM (Jim Kimble) (10/08/87)

For the past few days I have been porting a UNIX (System V) program over to a
XENIX machine (it's a TI-1100 with 2MB of RAM and nothing real special under
the hood).

I have been having a problem with running out of heap space. The error occurs
when the compiler hits a whole load of #define statements (in C, obviously).

The book recommends:

	o	Add more memory
	o 	Break the program into small segments

	I added 3 additional megs of RAM to no avail, and broke the
	program down into a whole bunch of little programs. Same error:
	"Out of heap space."

	My thinking now is that the compiler can only handle a preset
	number of defines and tables. Is this thinking along the right
	track, or am I way out in left field?

	Has anyone else had a similar problem?

	Any and all help is greatly appreciated...


--Jim Kimble
UUCP: {hplabs!hp-sdd, sdcsvax, nosc}!crash!jkimble
ARPA: crash!jkimble@nosc
INET: jkimble@crash.CTS.COM

"Yesterday I picked up some instant water but I have no idea what to add..."

dyer@spdcc.COM (Steve Dyer) (10/08/87)

Use the -LARGE flag to cc.  That tells the cc command to use the large-model
passes of the compiler rather than the standard ones (which are medium or
small model, I can't remember which).  It's a crock that you have to do this;
ideally, the small-model pass should exit with a unique error status which cc
could interpret as meaning "I've run out of memory; reinvoke the large-model
version of this pass."
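
For illustration, an invocation looks something like this (the file and
program names here are placeholders; check the cc entry in your XENIX
manual for the exact spelling of the flag on your release):

	cc -LARGE -c bigdefs.c
	cc -LARGE -c parser.c
	cc -LARGE -o myprog bigdefs.o parser.o

The flag only changes which copies of the compiler passes get run, so the
compiler itself has more data space to work with; it doesn't change the
memory model of the program being compiled.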
-- 
Steve Dyer
dyer@harvard.harvard.edu
dyer@spdcc.COM aka {ihnp4,harvard,linus,ima,bbn,m2c}!spdcc!dyer

dgb@minnow.UUCP (Dennis Bach) (10/12/87)

I wrote a ddcmp device driver for XENIX a few months back and had the same
problem.  The driver is made up of several files, all about the same length.
The problem occurred in the files that included lots of system .h files.  I
finally made private copies of these and deleted the stuff I didn't need.
Also, in one of the files that didn't include system .h files, I had to change
my macros to procedure calls (I could have inserted the actual code, but ...).
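
For what it's worth, the kind of change involved is nothing fancy.  A sketch
of the idea (the names and the queue structure here are made up, not the
actual driver code):

	/* before: a function-like macro that cpp has to remember and
	 * expand at every use */
	#define QMASK		0x3f
	#define QUEUE_PUT(q, c)	((q)->buf[(q)->tail++ & QMASK] = (c))

	struct queue {
		char	buf[QMASK + 1];
		int	tail;
	};

	/* after: the same operation as a real procedure; one copy of the
	 * code, and one less macro for the preprocessor to hang on to */
	int
	queue_put(q, c)
	struct queue *q;
	int c;
	{
		q->buf[q->tail++ & QMASK] = c;
		return c;
	}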

I can't use the Large model.

The problem occurred before linking, but I can't remember if it occurred during
preprocessing or not.  I think it did.  Anyway, everything works fine now, and I
only had to sacrifice future compatibility and speed :-).
-- 
Dennis Bach               UUCP: ...!amdahl!ems!minnow!dgb
Unisys Corporation               ...!ihnp4!meccts!ems!minnow!dgb
Phone: +1 612 635 6334    CSNET: dgb@minnow.SP.Unisys.Com

dgb@minnow.UUCP (Dennis Bach) (10/12/87)

Sorry, I just posted an article describing a problem with heap space overflow
during a compile, but forgot to mention that the problem is heap space
overflow.

-- 
Dennis Bach               UUCP: ...!amdahl!ems!minnow!dgb
Unisys Corporation               ...!ihnp4!meccts!ems!minnow!dgb
Phone: +1 612 635 6334    CSNET: dgb@minnow.SP.Unisys.Com

mdf@tut.cis.ohio-state.edu (Mark D. Freeman) (10/13/87)

In <1835@crash.CTS.COM> jkimble@crash.CTS.COM (Jim Kimble) writes:
>For the past few days I have been porting a UNIX (System V) program over to a
>XENIX machine (it's a TI-1100 with 2MB of RAM and nothing real special under
>the hood).
>
>I have been having a problem with running out of heap space. The error occurs
>when the compiler hits a whole load of #define statements (in C, obviously).

Try the -LARGE option to cc.  This causes cc to run compiler passes that were
themselves compiled in large model to process your code, so they can deal with
more than 64K worth of heap.  Without -LARGE, you can add all the memory you
want, but the compiler can only use 64K of data space.
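
If the port has a makefile, the easiest thing is probably to put the flag in
CFLAGS so every compile picks it up.  A hypothetical fragment (the program
and object names are just placeholders):

	CFLAGS = -LARGE

	myprog: main.o tables.o
		$(CC) $(CFLAGS) -o myprog main.o tables.o

Make's built-in .c.o rule should already pass $(CFLAGS) to cc, so the
individual compiles pick up -LARGE without any extra rules.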
-- 
Mark D. Freeman							(614) 262-3703
StrongPoint Systems, Inc.			    mdf@tut.cis.ohio-state.edu
2440 Medary Avenue		 ...!cbosgd!osu-cis!tut.cis.ohio-state.edu!mdf
Columbus, OH  43202		    Guest account at The Ohio State University