[comp.lang.c++] Turbo C++ & Zortech C++ memory problems

waynet@tekred.CNA.TEK.COM (Wayne Turner) (01/08/91)

We are developing a commercial application under DOS using
Turbo C++ Version 1.01 and are consistently running into "Out
of memory", "Not enough memory" errors from the command-line
versions of the compiler and linker. The application itself is
not huge--the source consists of about 5-6K lines and the
current .EXE takes up about 360K bytes when loaded.

Typically "tcc" will give an out of memory error when the .c
file is at least 200-500 lines in length and the amount of code
loaded from header files is about 1000-2000 lines. It would be
difficult (if not impossible) for us to reduce the amount of
header code included since we are using C++ class hierarchies
that are up to 5 or 6 levels deep, and also need to define
enumerators, consts, structs and so on.

The development systems are Compaq 386/20e's with 640K base
memory running DOS 3.31. One system has 2 Megs extended memory
and the other has 4 Megs. The TDMEM utility recognizes the
presence of the stated amount of extended memory. We are using
the -Qx option with tcc and /yx with tlink yet both run out of
memory in certain instances.

The only things Borland tech support could suggest were to (1)
break our source modules down into smaller ones and (2) unload
any TSRs or unnecessary drivers. (1) is not really a solution,
for the reasons stated above. (2) is only a temporary
workaround. Sometimes the out of memory errors occur when there
are 520K bytes free (again as reported by TCC) but a compile
succeeds when 540K bytes are free.  We are still in the early
to middle stages of development and are concerned that at some
point even 540K bytes will not be enough.  Also note that
sometimes it is necessary not only to unload any
memory-resident utilities but to use the -S option with "make"
to cause make to swap itself out of memory. In other instances
we must actually invoke tcc or tlink from the DOS command line
instead of using make at all.

As an alternative, we are currently evaluating Zortech's C++
development system and are finding that we must seriously hack
up some medium-size source modules just to get a working
executable for our application. We are using the version of the
compiler that uses Rational Systems DOS Extender technology
("ztc -br -e ...") and with larger modules get an error from
DOS 16/M complaining about "involuntary switch to real mode".
Zortech tech support is currently looking into this for us and
appears to be serious about finding a solution.

Is anyone out there developing applications with TC++ or
Zortech C++ that take full advantage of C++ classes and are as
large or larger than our application? If so, have you run into
similar memory problems? We would be interested in hearing
about your experiences and any possible solutions or
workarounds. We are even considering cross-development systems
(that run on Sun workstations), if any exist, so we can get on
with development and stop fighting the development tools.

Other pertinent info:
Using large memory model 
Using debugging option for all modules (we don't know in 
advance what modules will need debugging :-)

Please respond by email to

waynet@kit.CNA.TEK.COM

and I will summarize responses and post them to the net.

Thanks, 
Wayne Turner 
Tektronix, Inc.

emigh@ncsugn.ncsu.edu (Ted H. Emigh) (01/09/91)

In article <6818@tekred.CNA.TEK.COM> waynet@tekred.CNA.TEK.COM (Wayne Turner) writes:
>We are developing a commercial application under DOS using
>Turbo C++ Version 1.01 and are consistently running into "Out
>of memory", "Not enough memory" errors from the command-line
>versions of the compiler and linker. The application itself is
>not huge--the source consists of about 5-6K lines and the
>current .EXE takes up about 360K bytes when loaded.

....

>Other pertinent info:
>Using large memory model 
>Using debugging option for all modules (we don't know in 
>advance what modules will need debugging :-)

My comment has to do with his last statement.  I have a large project which
I maintain (sounds about the same size as his).  I turn on debugging ONLY
for the module which contains main() and any others which I am currently
debugging.  A well-designed project should allow you to be able to turn on
debugging for a couple of modules at a time.  I have three makefiles -- one
which turns on debugging info for ONLY the main() module; one which turns
on debugging info for ALL modules; and one which turns off debugging info
for ALL modules.  When I change the headers or want to rebuild the program,
I use the first.  When I want to turn on debugging for a module (test.c),
I delete test.obj, then use the second makefile.  When I want to turn off
debugging for a module (test.c), I delete test.obj, then use the last
makefile.

(Note: all makefiles have link set up to include debugging information.  I
also have a "production" makefile which doesn't link debugging information.)
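The three-makefile scheme described above can be sketched roughly as follows.
This is a hypothetical fragment, not Ted's actual makefile: it assumes
Borland MAKE syntax, made-up file names (main.c, other.c), and Turbo C++'s
documented flags (-ml for the large memory model, -v for debug info).

```
# debug-main.mak -- debug info for ONLY the module containing main()
# (hypothetical sketch; file names and macro names are invented)
CC     = tcc
CFLAGS = -ml              # large memory model, no debug info by default

main.obj: main.c
        $(CC) -c $(CFLAGS) -v main.c    # -v adds debug info here only

other.obj: other.c
        $(CC) -c $(CFLAGS) other.c      # compiled without debug info
```

The all-debug and no-debug variants differ only in whether -v appears on
every compile line, so keeping three copies (or one makefile with a macro
you override on the command line) is cheap.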

bourd@galaxy.cps.msu.edu (Robert Bourdeau) (01/10/91)

In article <6818@tekred.CNA.TEK.COM>, waynet@tekred.CNA.TEK.COM (Wayne
Turner) writes:
|> We are developing a commercial application under DOS using
|> Turbo C++ Version 1.01 and are consistently running into "Out
|> of memory", "Not enough memory" errors from the command-line
|> versions of the compiler and linker. The application itself is
|> not huge--the source consists of about 5-6K lines and the
|> current .EXE takes up about 360K bytes when loaded.
|> 

Following up on your problems with Turbo C++: I have had the same problems.
The application I have been involved in developing is currently
about 10K lines of my own source code, not counting the class library
and include files.  The class hierarchy I have developed
is approximately 7 levels deep at its deepest.

I was using a lot of inline code in my low level class definitions
to improve performance, and it was this that was causing me
a great deal of memory problems.  When I reduced the inline code to
only the most minimal situations, the memory problems went away.

Apparently the compiler needs a significant amount of RAM to
hang on to inline code, and it does not take a whole lot of it
to use up all available space.

Hope this helps some.

Robert Bourdeau
bourd@buster.cps.msu.edu
Michigan State University

keffert@jacobs.CS.ORST.EDU (Thomas Keffer) (01/11/91)

In article <1991Jan10.143902.23969@msuinfo.cl.msu.edu> bourd@galaxy.cps.msu.edu (Robert Bourdeau) writes:
>In article <6818@tekred.CNA.TEK.COM>, waynet@tekred.CNA.TEK.COM (Wayne
>Turner) writes:
>|> We are developing a commercial application under DOS using
>|> Turbo C++ Version 1.01 and are consistently running into "Out
>|> of memory", "Not enough memory" errors from the command-line
>|> versions of the compiler and linker. The application itself is
>|> not huge--the source consists of about 5-6K lines and the
>|> current .EXE takes up about 360K bytes when loaded.
>|> 
>
>Following up on your problems with TurboC++, I too have had the same problems.
>The application which I have been involved in developing is currently
>about 10K lines of my own source code not considering the class library
>and include files.   The class hierarchy which I have developed
>is approximately 7 levels deep at the greatest.

We encountered this problem while checking out Tools.h++ on Turbo C++.
It's a bug being tickled by your code --- the memory 
limitation is not real.  Try commenting out various parts of 
your code, looking for the offending statements.

-tk