waynet@tekred.CNA.TEK.COM (Wayne Turner) (01/08/91)
We are developing a commercial application under DOS using Turbo C++ version 1.01 and are consistently running into "Out of memory" / "Not enough memory" errors from the command-line versions of the compiler and linker. The application itself is not huge: the source consists of about 5-6K lines, and the current .EXE occupies about 360K bytes when loaded. Typically "tcc" gives an out-of-memory error when the .c file is 200-500 lines long and the amount of code pulled in from header files is about 1000-2000 lines. It would be difficult (if not impossible) for us to reduce the amount of header code included, since we are using C++ class hierarchies that are up to 5 or 6 levels deep and also need to define enumerators, consts, structs, and so on.

The development systems are Compaq 386/20e's with 640K base memory running DOS 3.31; one has 2 megabytes of extended memory and the other has 4. The TDMEM utility recognizes the stated amount of extended memory on each. We are using the -Qx option with tcc and /yx with tlink, yet both still run out of memory in certain instances.

The only things Borland tech support could suggest were (1) breaking our source modules into smaller modules and (2) unloading any TSRs or unnecessary drivers. (1) is not really a solution, for the reasons stated above, and (2) is only a temporary workaround. Sometimes the out-of-memory errors occur when there are 520K bytes free (again, as reported by tcc), while a compile succeeds when 540K bytes are free. We are still in the early-to-middle stages of development and are concerned that at some point even 540K bytes will not be enough. Note also that sometimes it is necessary not only to unload any memory-resident utilities but to use the -S option with "make" so that make swaps itself out of memory. In other instances we must invoke tcc or tlink directly from the DOS command line instead of using make at all.
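For reference, a typical build step looks roughly like the following (module and response-file names are made up for illustration; the flags are the ones mentioned above: -ml for the large memory model, -v for debug information, -Qx so tcc uses extended memory, /yx so tlink does, and -S so make swaps itself out before running each command):

    rem Compile one module (large model, debug info, use extended memory)
    tcc -ml -v -Qx -c module1.cpp

    rem Link, letting tlink swap to extended memory; app.rsp lists the .objs
    tlink /v /yx @app.rsp

    rem Or drive the whole build through make, with make swapped out of RAM
    make -S

Even with all three memory-related options in effect, we see the failures described above.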
As an alternative, we are currently evaluating Zortech's C++ development system and are finding that we must seriously hack up some medium-size source modules just to get a working executable. We are using the version of the compiler built on Rational Systems' DOS extender technology ("ztc -br -e ..."), and with larger modules we get an error from DOS 16/M complaining about an "involuntary switch to real mode". Zortech tech support is currently looking into this for us and appears to be serious about finding a solution.

Is anyone out there developing applications with TC++ or Zortech C++ that take full advantage of C++ classes and are as large as or larger than our application? If so, have you run into similar memory problems? We would be interested in hearing about your experiences and any possible solutions or workarounds. We are even considering cross-development systems (that run on Sun workstations), if any exist, so we can get on with development and stop fighting the development tools.

Other pertinent info:
  - Using the large memory model.
  - Using the debugging option for all modules (we don't know in advance which modules will need debugging :-).

Please respond by email to waynet@kit.CNA.TEK.COM and I will summarize responses and post them to the net.

Thanks,
Wayne Turner
Tektronix, Inc.