nielsen@opus.ee.QueensU.CA (Robert Nielsen) (05/30/91)
I am a brand-new Windows programmer about to begin a fairly large project. The program I will be writing uses several linked lists in which nodes are allocated as the need arises. This seems to be a problem under Windows, as there are a limited number of memory handles and a large amount of overhead for each block. The SDK recommends grouping small memory objects together, but the problem with this is that you cannot know in advance how many nodes will be required. Guessing at the number of nodes needed at the different stages of data-structure generation would be wasteful and would probably use much more memory than necessary. What is the best way to allocate memory in many (thousands of) small chunks without exhausting Windows's memory-handling resources?
venkat@spdeast.East.Sun.COM (Desikan Venkatrangan - Sun BOS Software CONTRACTOR) (06/01/91)
In article <486@opus.ee.QueensU.CA> nielsen@opus.ee.QueensU.CA (Robert Nielsen) writes:
>
>What is the best way to allocate memory in many small (thousands)
>chunks without exhausting Windows memory handling resources?

The Microsoft Systems Journal, Jan 1991 issue has the stuff you need. The article is 'Improve Windows Application Memory Use with Subsegment Allocation and Custom Resources', an excellent article by Paul Yao.

-venkat