[comp.lang.c] Problems allocating 300k data structure using Turbo C

kemps@prism.CS.ORST.EDU (Scott Micheal Kemp) (03/16/91)

Machine: IBM 386
Compiler: Turbo C++
Problem: Need to make an array larger than 64k.
 
Specifically, I need to create a dynamic array of a 10 byte
structure which will need over 300k.
 
First, can you tell me if this is the correct declaration of my
array:
 
     POINTS huge *points;
 
What about this?
 
     POINTS huge (*points)[];
     
Now I need to create memory for this array.  I assume I need to use
farmalloc.  How do I type cast the pointer returned by farmalloc
to my pointer to the dynamic array?
 
Lastly, what memory model do I need to use with Turbo C?  My code is
"small", and the rest of the data (except the 300k structure) is
"small", so do I use "small", or do I need to use "huge"?
 
Thanks for your help.
 
                                     |-------------------------------]
================ Scott Michael Kemp  |  Midician by Profession...    ]
============ kemps@prism.cs.orst.edu |      The only way to die!     ]  
=== Computer Sc., Oregon St. Univ.  /|\------------------------------]

grimlok@hubcap.clemson.edu (Mike Percy) (03/19/91)

Try

#include <alloc.h>                     /* for farmalloc() */

POINTS huge *p;
long i;
static POINTS zero;                    /* static, so it starts out all-bits-zero */

p = (POINTS huge *) farmalloc(30000L * sizeof(POINTS));   /* 30000 * 10 bytes = ~300K */
if (p == NULL) {
     /* error ... */
}

for (i = 0; i < 30000L; i++)
     p[i] = zero;

/* the above might not work -- I can't remember whether the [] operator
   takes only ints with huge pointers.  If so, the following should work */

for (i = 0; i < 30000L; i++)
     *(p + i) = zero;
 
You can use any model except the tiny model.

Check out the section on pointer arithmetic in the Programmer's Guide
(p. 57 in my copy).
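
Here is a quick sketch of why the pointer wants to be `huge' rather than just
`far' (this assumes Turbo C's <alloc.h> for farmalloc()/farfree(); the POINTS
layout below is made up just to get a 10-byte element):

#include <alloc.h>                 /* farmalloc(), farfree() */

typedef struct {
    short x, y, z;                 /* 6 bytes          */
    char  tag[4];                  /* + 4 = 10 bytes   */
} POINTS;

void demo(void)
{
    POINTS huge *hp;
    POINTS far  *fp;

    hp = (POINTS huge *) farmalloc(30000L * sizeof(POINTS));   /* ~300K */
    if (hp == NULL)
        return;                    /* allocation failed */

    hp[20000L].x = 1;   /* fine: huge arithmetic adjusts the segment part   */
    fp = (POINTS far *) hp;
    /* fp[20000] would be wrong: far arithmetic only changes the 16-bit
       offset, so the address wraps back into the first 64K of the block */

    farfree((void far *) hp);
}

A far pointer is cheaper per access, but its arithmetic never leaves one 64K
chunk; a huge pointer is normalized on every add, which is what a 300K array
needs.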

"I don't know about your brain, but mine is really...bossy."
Mike Percy                    grimlok@hubcap.clemson.edu
ISD, Clemson University       mspercy@clemson.BITNET
(803)656-3780                 mspercy@clemson.clemson.edu

campbell@dev8o.mdcbbs.com (Tim Campbell) (03/19/91)

In article <1991Mar15.230133.16073@lynx.CS.ORST.EDU>, kemps@prism.CS.ORST.EDU (Scott Micheal Kemp) writes:
> Machine: IBM 386
> Compiler: Turbo C++
> Problem: Need to make an array larger than 64k.
>  
> Specifically, I need to create a dynamic array of a 10 byte
> structure which will need over 300k.

A friend of mine just went through this problem.  Here's what we know.
Your problem exists *only* because you are using a segmented memory
environment.  Specifically this problem was identified in Borland's Turbo
C - I imagine that the problem is a carry-over into C++.  We also tried
it using a Microsoft compiler and had the same problem - I suspect you'll
find most C compilers on a PC (segmented memory) will have this problem.
The address (offset) of the *beginning* of any member of a structure must
be within 64k bytes of the base address of the structure.  Basically, the
compiler assumes that all members of a structure live in the same memory
segment as the base of the structure (they share a segment address).  It
does not matter what memory model you use, and it does not matter whether
you allocate the memory statically or dynamically - you will have this
problem no matter what (we can find no way to fix it other than writing
"voodoo code").

Here's an example.

	struct {
	    char array_1[65534];
	    char array_2[5];
	    char array_3[10];
	} stuff;

The machine will have no problem accessing stuff.array_1 - this member lies
entirely within 64k of the base address of "stuff".
The machine will also have no problem accessing stuff.array_2 - although this
member crosses the 64k boundary, it *begins* before the boundary, and the
beginning is all that matters: if the compiler can find where the data
begins, it will find everything.
The machine *WILL* have a problem with stuff.array_3, because the base of
array_3 is more than 65535 bytes from the start of "stuff".  Basically the
16-bit offset "wraps around", so it will attempt to locate array_3 only
three bytes (65539 mod 65536 = 3) from the base of "stuff".
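
If it helps to see the arithmetic, here is a tiny standalone sketch of that
wrap (plain C, nothing Turbo-specific; it only assumes the usual 16-bit
unsigned short, which is what a PC offset register holds):

	#include <stdio.h>

	int main(void)
	{
	    unsigned long  real_offset = 65534UL + 5UL;   /* where array_3 really begins */
	    unsigned short seg_offset  = (unsigned short) real_offset;  /* 16-bit offset */

	    printf("array_3 really starts %lu bytes into the struct,\n", real_offset);
	    printf("but a 16-bit offset register sees it as %u.\n", (unsigned) seg_offset);
	    return 0;
	}

This prints 65539 and 3, which is exactly the wrap described above.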

We maintain that this is a "bug" - Borland seems to maintain that their
compiler simply doesn't support this "feature".  So I guess if you consider
proper compilation and execution of code to be a "feature", you might be
inclined to agree with Borland.  Other compilers (those not dependent on
segmented memory) don't seem to have this problem.  And of course I don't
believe the language spec limits structures or arrays (aggregates) to 64k.

	-Tim

  ---------------------------------------------------------------------------
	  In real life:  Tim Campbell - Electronic Data Systems Corp.
     Usenet:  campbell@dev8.mdcbbs.com   @ McDonnell Douglas M&E - Cypress, CA
       also:  tcampbel@einstein.eds.com  @ EDS - Troy, MI
 CompuServe:  71631,654
 P.S.  If anyone asks, just remember, you never saw any of this -- in fact, I 
       wasn't even here.

torek@elf.ee.lbl.gov (Chris Torek) (03/20/91)

In article <1991Mar15.230133.16073@lynx.CS.ORST.EDU> kemps@prism.CS.ORST.EDU
(Scott Micheal Kemp) writes:
>     POINTS huge (*points)[];

Ignoring the `huge' modifier, this declares `points' as a

	pointer to array ? of POINTS

where `?' means `I have no idea how big this array is'.  Such a type
is nearly useless; I claim it should not exist at all, but for the fact
that it has to, to give meaning to

	extern char foo[];
	... &foo

There is nothing you can do with a `pointer to array ? of T' that you
cannot do with just a `pointer to T', so you might as well ignore the
existence of this type.
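
For instance (ignoring `huge', and with made-up names alongside that foo):

	extern char foo[];          /* array of unknown size, defined elsewhere  */

	char (*bar)[] = &foo;       /* `pointer to array ? of char' -- legal...  */
	char *cp      = foo;        /* ...but a plain char * already does it all */

	/* With bar you cannot write sizeof *bar, bar + 1, or bar[1]: each of
	   those needs the array's size, and the type does not carry it.  With
	   cp you index, increment, and pass the pointer around as usual. */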

See the Frequently Asked Questions for more details.
-- 
In-Real-Life: Chris Torek, Lawrence Berkeley Lab CSE/EE (+1 415 486 5427)
Berkeley, CA		Domain:	torek@ee.lbl.gov