chang@premise.ZONE1.COM (John Chang) (04/07/88)

Does anyone know what the MSC compiler (ver 5.0) uses the near and far
heaps for?

The problem: I'm getting "Out of near heap space" and "Out of far heap
space" errors during my compiles.  My individual source files are not
that big (25K max), although I do run nested makes, and just about every
file includes one large header (Microsoft Windows' windows.h).  I'm using
ndmake 5.0; Microsoft's make just doesn't make it :-)

My environment: I have 640K in a 386, but only 470K free after network
drivers and TSRs are loaded.  I have lots of extra memory, but the
compiler isn't using it.  386-to-the-max can move the TSRs out of low
memory, but the program's own overhead is nearly equal to the memory it
saves.  The network drivers eat up ~100K and can't be moved out of low
memory.  Also, 386-to-the-max conflicts with other 386 programs such as
Windows 386, the Compaq memory cache, and some of the new debuggers that
relocate themselves into extended memory.

Some observations: The near heap seems to hold the symbol table and
parse tree, since I can lessen the "near heap space" problem by
hand-copying declarations out of include files and by writing shorter
expressions (a sketch of what I mean is at the end of this post).  I can
increase the far heap by removing my network drivers, but I'd like a
more elegant solution than rebooting every time I get the error.  Also,
the problem is flaky: sometimes I can rerun my make and the compiler
accepts a file it just rejected.

Suggestions: Perhaps setting 'appropriate' stack sizes on the compiler
passes with EXEMOD would be the answer??  (My guess at the command is
at the end of this post.)  I realize I'm asking about tweaking the
compiler so that I can squeeze just a little more memory out of it, but
I'd also like to know what I can do to my source or my environment to
minimize the inconvenience.  Please don't suggest that I switch to
compiler brand X, unless that brand supports Windows.

When is MS coming out with a compiler that uses extended/expanded
memory?  Does the OS/2 development environment already have such a
tool?
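
Here's roughly what I mean by hand-copying declarations.  The typedefs
and prototypes below are illustrative stand-ins, not a verbatim excerpt
of windows.h; the point is just that each module pulls in only the
handful of names it actually uses, so the compiler builds a much smaller
symbol table:

    /* mywin.h -- hand-copied subset of windows.h for one module */
    /* (names and signatures illustrative, not a verbatim excerpt) */
    #ifndef MYWIN_H
    #define MYWIN_H

    typedef unsigned int   WORD;
    typedef unsigned long  DWORD;
    typedef unsigned int   HWND;     /* window handle */
    typedef char far *     LPSTR;

    /* only the prototypes this module actually calls */
    int   far pascal MessageBox(HWND, LPSTR, LPSTR, WORD);
    DWORD far pascal SendMessage(HWND, WORD, WORD, DWORD);

    #endif

Splitting long expressions into a few temporaries seems to help for the
same reason: the parse tree for any one statement stays small.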
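
And here is my guess at the EXEMOD experiment, which I have not actually
tried yet.  The idea is to shrink the stack each compiler pass reserves,
leaving more room in its data segment for the near heap.  I'm going from
memory on the EXEMOD switch and the pass names, so treat this as a
sketch rather than a recipe (the value is in hex):

    exemod c1.exe /stack 2000
    exemod c2.exe /stack 2000
    exemod c3.exe /stack 2000

I don't know whether the passes will tolerate a smaller stack, which is
part of why I'm asking.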
Does anyone know what the MSC compiler (ver 5.0) uses the near and far heaps for? The problem: I'm getting "Out of near heap space" and "Out of far heap space" errors during my compiles. My individual source files are not that big (max 25 K), although I do run nested makes, and have one large include file in just about every file (Microsoft Window's windows.h). I'm using ndmake 5.0; Microsoft's make just doesn't make it :-) My environment: I have 640 K in a 386, but only 470K after network drivers and TSR's are loaded. I have lots of extra memory, but it isn't being used by the compiler. 386-to-the-max can move the TSR's out, but the overhead of the program is nearly equal to the savings in memory. The network driver eat up ~100K, and can't be moved out of lower memory. Also, 386-to-the-max conflicts with other 386 programs like Windows 386, or the Compaq memory cache, or some of the new debuggers that relocate themselves in extended memory. Some observations: It seems the near heap is used for a symbol table, and parse tree, since I can lessen the 'near heap space' problem by hand copying declarations from include files, and writing shorter expressions. I can increase the far heap size by removing my network drivers, but I'd like to find a more elegant solution than rebooting every time I get the error. Also, the problem is flaky: sometimes I can rerun my make, and get the compiler to accept a file it just rejected. Suggestions: Perhaps using 'appropriate' values of stack size with exemod would be the answer?? I realize that I'm asking about tweaking the compiler so that I can squeeze just a little more memory out of it. But I'd also like to know what I can do to the source or to my environment to minimize my inconvenience. Don't suggest that I switch compilers to brand X, unless that brand supports Windows. When is MS coming out with a compiler that uses extended/expanded memory? Does the OS/2 development environment have such a tool already?