[comp.lang.c] Large 2D array woes...

fechuny@aix01.aix.rpi.edu (felix chung yau) (03/14/91)

    I am a new programmer in C and was wondering if someone could help me with
a problem I have in using a two dimensional array.  I am currently using
Microsoft Quick C and my problem is in dimensioning an integer array like
test[400][400].  When I compile the program I get the message "allocation
exceeds 64k".  I have tried using the "far" keyword to define the variable
as a far pointer but that does not seem to work.  I was wondering if there
is a better way to define and handle large arrays.
    Thank you in advance.

marwk@levels.sait.edu.au (03/14/91)

In article <!+%=PZ$@rpi.edu>, fechuny@aix01.aix.rpi.edu (felix chung yau) writes:
>
>     I am a new programmer in C and was wondering if someone could help me with
> a problem I have in using a two dimensional array.  I am currently using
> Microsoft Quick C and my problem is in dimensioning an integer array like
> test[400][400].  When I compile the program I get the message "allocation
> exceeds 64k".  I have tried using the "far" keyword to define the variable
> as a far pointer but that does not seem to work.  I was wondering if there
> is a better way to define and handle large arrays.
>     Thank you in advance.

In Turbo C one has to use huge pointers, as in the following example. A huge
pointer is a normalised 32-bit pointer, so arithmetic on it can cross segment
boundaries. Far pointers do not work here because far pointer arithmetic only
changes the 16-bit offset, so they still wrap around at the 64K mark.

#include <stdio.h>
#include <alloc.h>      /* farmalloc/farfree (Turbo C) */

typedef char huge *MYLARGE;

#define SIZE 65538L     /* just past the 64K mark */

long i;
MYLARGE lots;

int main(void)
    {
    lots = (MYLARGE) farmalloc(SIZE);
    if (lots == NULL)
        {
        printf("farmalloc failed\n");
        return 1;
        }

    /* huge pointer arithmetic is normalised after every operation,
       so indexing past 64K works where a far pointer would wrap */
    for (i = 0; i < SIZE; ++i)
        lots[i] = (char) (i % 10L + 1);

    for (i = 0; i < 10L; ++i)
        printf("%ld: %2d\n", i, lots[i]);

    farfree(lots);
    return 0;
    }


For a 400 * 400 array this is not actually necessary (just realised this, sorry):
allocate the array one row at a time, so that no single block exceeds 64K:

char far *p_rows[400];
int i;

for (i = 0; i < 400; ++i)
    p_rows[i] = (char far *) farmalloc(400 * sizeof(char));

This allocates 400 columns for each row; each row is a separate 400-byte
block, comfortably under the 64K limit.

farmalloc() is used rather than malloc() because there is not enough room on
the near heap (at most 64K) to hold all the data; farmalloc() draws each row
from the far heap and returns a far pointer.

Now the element in the r'th row and c'th column is p_rows[r][c], just as
you would expect.
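
Adapting this to the original int test[400][400] gives something like the
following. This is only a sketch under the same assumptions (Turbo C, with
farmalloc/farfree from <alloc.h>; ROWS and COLS are just illustrative names):

#include <stdio.h>
#include <alloc.h>      /* farmalloc/farfree (Turbo C) */

#define ROWS 400
#define COLS 400

int far *p_rows[ROWS];  /* one far pointer per row */

int main(void)
    {
    int r, c;

    /* allocate each 800-byte row separately; no block comes near 64K */
    for (r = 0; r < ROWS; ++r)
        if ((p_rows[r] = (int far *) farmalloc(COLS * sizeof(int))) == NULL)
            {
            printf("out of memory at row %d\n", r);
            return 1;
            }

    for (r = 0; r < ROWS; ++r)
        for (c = 0; c < COLS; ++c)
            p_rows[r][c] = r + c;       /* indexes like a real 2D array */

    printf("p_rows[399][399] = %d\n", p_rows[399][399]);

    for (r = 0; r < ROWS; ++r)
        farfree(p_rows[r]);
    return 0;
    }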

Ray
--
University of South Australia   | Plus ca change, plus c'est la meme chose.
P.O. Box 1                      | Ghing thien me how, ming thien gung me how.
Ingle Farm                      | Knobs, knobs everywhere,
South Australia                 |              just vary a knob to think!

craig@bacchus.esa.oz.au (Craig Macbride) (03/15/91)

Are you using the small, medium, compact, large or huge memory model libraries?
If the answer is anything other than "huge", you're doing it wrong. To exceed
64k of data in total, you must use at least the compact or large model. To
exceed 64k in one variable/array/structure, you _must_ use huge. This is the
legacy of the disgusting segmented 8086 architecture that MessyDOS programmers
must put up with.

Using the "far" keyword will not help in this case. Without huge model
implementation, single data items just can't exceed 64k in length.
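
Since the original poster is on Microsoft Quick C rather than Turbo C, the
counterpart there is halloc()/hfree() from <malloc.h>, which allocate a
single block larger than 64K and hand back a huge pointer without switching
memory model. A minimal sketch, assuming halloc() is available as it is in
Microsoft C (the flat indexing stands in for the two-dimensional subscript):

#include <stdio.h>
#include <malloc.h>     /* halloc/hfree (Microsoft C) */

#define ROWS 400L
#define COLS 400L

int main(void)
    {
    /* one flat 320000-byte block; halloc returns a huge pointer, so
       indexing past 64K is safe.  For blocks over 128K the element
       size must be a power of two -- sizeof(int) is 2 here. */
    int huge *test = (int huge *) halloc(ROWS * COLS, sizeof(int));
    long r, c;

    if (test == NULL)
        {
        printf("halloc failed\n");
        return 1;
        }

    for (r = 0; r < ROWS; ++r)
        for (c = 0; c < COLS; ++c)
            test[r * COLS + c] = (int) (r + c);    /* element [r][c] */

    printf("test[399][399] = %d\n", test[399L * COLS + 399L]);
    hfree(test);
    return 0;
    }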

-- 
 _____________________________________________________________________________
| Craig Macbride, craig@bacchus.esa.oz.au      | Hardware:                    |
|                                              |      The parts of a computer |
|   Expert Solutions Australia                 |        which you can kick!   |