jb@CSUStan.EDU (John Birchfield) (05/12/89)
I have a program which does directory scanning in a big way and I am
finding that the program corrupts memory and behaves in a more or less
irrational manner when the Large Memory Model is used.
The errors reported are either that the program fails because it
cannot allocate memory, or that DOS reports memory as corrupted and
can't load COMMAND.COM when the program terminates.
The problems disappear when I use the Small Memory Model - I know that this
sounds like the old pointer-size problem, but I have looked over the places
the program does mallocs and frees and they seem to be correct.
The question then is IS THERE A KNOWN BUG IN MALLOC USING THE LARGE MEMORY
MODEL???
thanks
--
=== ===
John Birchfield 1575 Quail Ct.
jb@koko.csustan.edu Turlock, CA 95380
(209) 634-6243

cramer@optilink.UUCP (Clayton Cramer) (05/16/89)
In article <1071@koko.CSUStan.EDU>, jb@CSUStan.EDU (John Birchfield) writes:
> I have a program which does directory scanning in a big way and I am
> finding that the program corrupts memory and behaves in a more or less
> irrational manner when the Large Memory Model is used.
>
> The problems disappear when I use the Small Memory Model - I know that
> this sounds like the old pointer-size problem but I have looked over
> the places the program does mallocs and frees and they seem to be
> correct.
>
> The question then is IS THERE A KNOWN BUG IN MALLOC USING THE LARGE
> MEMORY MODEL???

There may be a bug, but it is almost certainly a bug in your code
somewhere -- probably a place where you free something you don't own,
or use heap you haven't allocated.

Here's something that I use for debugging.  The #defines go in some
high-level .h file that all modules include, and cause all your mallocs
and frees to actually go to MyMalloc and MyFree, which record every
malloc and free.  Go through the output file when your program
finishes, and remove all the matching pairs of malloc and free records;
you will frequently find blocks of memory you didn't malloc being
freed.  That screws up the heap something awful, with all sorts of
unpleasant side effects.

This doesn't do you any good for references through uninitialized
pointers, but it will solve some problems, as well as giving you an
idea how much heap you are mallocing, and how long you are keeping it.

/* Special purpose definitions intended for use in debugging memory
   allocation and freeing errors.  These go in a header that every
   module includes. */
#define malloc(A)  MyMalloc(__FILE__, __LINE__, A)
#define free(A)    MyFree(__FILE__, __LINE__, A)

/* In the file that defines MyMalloc and MyFree, undo the macros so
   the real malloc and free get called (otherwise the wrappers would
   call themselves). */
#undef malloc
#undef free

FILE *MallocOut = NULL;

void *MyMalloc (FileName, LineNbr, BytesToGet)
char   *FileName;
int     LineNbr;
size_t  BytesToGet;
{
    void *Mem;

#if 0
    Mem = (void *)halloc((long)BytesToGet, sizeof(char));
#else
    Mem = malloc(BytesToGet);
#endif
    if (MallocOut)
    {
        if (Mem)
            fprintf(MallocOut, "M %04x:%04x %s:%d %5d\n",
                    FP_SEG(Mem), FP_OFF(Mem), FileName, LineNbr, BytesToGet);
        else
            fprintf(MallocOut, "* %s:%d %5d\n",
                    FileName, LineNbr, BytesToGet);
        fflush(MallocOut);
    }
    return (Mem);
}

void MyFree (FileName, LineNbr, Mem)
char *FileName;
int   LineNbr;
void *Mem;
{
    if (MallocOut)
    {
        if (Mem)
            fprintf(MallocOut, "F %04x:%04x %s:%d\n",
                    FP_SEG(Mem), FP_OFF(Mem), FileName, LineNbr);
        fflush(MallocOut);
    }
#if 0
    if (Mem)
        hfree(Mem);
#else
    if (Mem)
        free(Mem);
#endif
}

void StartMallocRecord ()
{
    MallocOut = fopen("malloc", "w+");
}

void EndMallocRecord ()
{
    fclose(MallocOut);
    MallocOut = NULL;
}
--
Clayton E. Cramer  {pyramid,pixar,tekbspa}!optilink!cramer
Assault rifle possession is a victimless crime.
----------------------------------------------------------------------------
Disclaimer?  You must be kidding!  No company would hold opinions like mine!
wesw@ozvax.WV.TEK.COM (Wes Whitnah) (05/17/89)
There are bugs in MSC and/or DOS (verified VERBALLY to me by
some Microsoft engineers a few years back -- see description
below for details...) when heavily manipulating memory using
the Large Memory Model -- by heavily manipulating memory I
mean allocating and freeing memory VERY often. Most memory
bugs I have seen though are in the application itself.
(Description of my personal experiences to follow... )
I worked on an application which uses a LARGE amount of memory
for windows, data structures, file information, etc. and of
course uses the large model with MSC. After many minutes of
intense use (usually ~30 mins) the system would come up with
either an "out-of-memory" or "memory-corrupted" error. Either
way after exiting the application a "memory-corrupted" message
would be displayed and the system would halt.
I double-checked all use and manipulation of memory, and set
traces on the allocation and freeing of the memory. Everything
looked O.K.  I suspected some fragmentation and made sure
memory allocation and freeing were in the proper order. No
change in the results.
After many days working through the problem it began to look like
either a library or system problem (What! question MSC or DOS? :-)
I created a test program which allocated random-sized chunks of
memory up to a maximum limit and then freed them in reverse order,
repeating this sequence a given number of times. Using either
malloc() or direct DOS calls produced the same results -- system
memory corruption at the third pass (or less). Before the fatal
error condition memory was *VERY* fragmented. Freeing blocks
in the same order of allocation, or in random order did not
significantly change the results.
Now that I didn't trust MSC or DOS to manipulate many blocks of
memory correctly, I wrote my own memory manager: it called malloc()
for LARGE blocks of memory and managed the many requests for memory
from this large-block pool.  I then
ran my above test with this memory manager and have had no
troubles to this day (after debugging the memory manager of
course! :-).
I confronted some Microsoft engineers with my findings, and
after a long discussion ("Did you check this?", "Yes";...)
they said, "Oh, you found *THAT* memory problem! Yes we know
that DOS (and malloc() which uses DOS calls) has problems when
memory is heavily used, but not many people come across it,
so we don't intend to fix it!" You might guess my reaction to
that! When I asked what developers and users should do when they
come across this problem they told me, "They should write their own
memory manager!".
The above events transpired nearly two years ago, so I don't know
if Microsoft has really put much effort into solving these problems.
From what I have seen of recent DOS versions and MSC 5.0 the old
memory bugs are still there. I've not come across them very often
unless many blocks of memory are allocated and freed with FAR
pointers, and the first noticeable problem is memory fragmentation.
The best bet, after *THOROUGHLY* verifying your program's memory
usage, is to write your own memory manager and avoid confusing the
DOS manager.
Wes Whitnah
wesw@ozvax.WV.TEK.COM

bill@zycor.UUCP (bill) (05/18/89)
In article <1659@ozvax.WV.TEK.COM> wesw@ozvax.UUCP (Wes Whitnah) writes:
> There are bugs in MSC and/or DOS (verified VERBALLY to me by
> some Microsoft engineers a few years back -- see description
> below for details...) when heavily manipulating memory using
> the Large Memory Model -- by heavily manipulating memory I
> mean allocating and freeing memory VERY often.  Most memory
> bugs I have seen though are in the application itself.

Does this imply that a program compiled in the compact model (small
code, big data) does the same thing?  Also, does the size of the
blocks make a difference?  I have an application that uses large-model
memory and does malloc/free all over the place on blocks that are
about 40 bytes long.  Should I look harder and see if this problem is
happening, or is it only when there are large "chunks" of memory being
malloc'd?  Enquiring minds want to know.

--
Bill Mahoney
bill@zycor.UUCP
Warning: Driver breaks for 2400 baud.