[comp.lang.c] See below...

simon.ewins@f664.n250.z1.fidonet.org (simon ewins) (05/25/90)

I have an interesting problem that I have been playing with that has me stumped!
 
Suppose we're working on an IBM-PC architecture and are limited to a SMALL memory model (for the sake of argument; I realize that using a large memory model makes the problem go away!).  The following does not work:
 
char huge *files;
int i;
 
files = (char huge *) farmalloc(2000L * 82L);   /* 2000 slots of 82 bytes each */

for (i = 0; i < 2000; i++)
   strcpy(&files[i * 82L], "Some kind of character string");

for (i = 0; i < 2000; i++)
   puts(&files[i * 82L]);
 
The result I get is either garbage or a total system crash!  This, with both Turbo-C and (a code equivalent in) MSC ...
 
All I really want to do is find a way, in the small memory model, to access an array of 2000 strings, each 81 bytes long!
 
As far as I can tell, in the small memory model one can allocate data arrays that are bigger than the model's 64K data segment, but the library functions one is likely to use to access them are all mapped to NEAR pointers and throw up when given pointer arguments longer than 16 bits.  Is this true??
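If that diagnosis is right, one workaround (assuming Turbo C) is to avoid handing the big block's address to any library routine and hand-roll the copy instead.  A sketch -- under the small model the parameters would be declared char huge * so the full 32-bit pointer survives the call; the qualifier is dropped here so it compiles as plain C:

```c
#include <string.h>

/* Hand-rolled string copy that never goes through a near-pointer
   library routine.  In Turbo C's small model, declare the parameters
   "char huge *dst, const char huge *src" so nothing is truncated
   to a 16-bit near pointer. */
void hugecpy(char *dst, const char *src)
{
    while ((*dst++ = *src++) != '\0')
        ;  /* copy up to and including the NUL */
}
```

For output, one would similarly walk the string by hand, or copy each 82-byte slot into a small near buffer on the stack and hand *that* buffer to puts().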
 
As I said, my real-world solution was to use a large memory model.  However, I still feel that somehow one must be able to use the small model (which executes much faster due to the smaller code pointers).  Anyone who can get the above code to work in a small-model environment would be thought of quite highly if they shared their success with me!  Thanks...
 
Simon Ewins .... Confused and Depressed :-)
 

--- D'Bridge 1.30/002506
 * Origin: A_X_A_X_A  [ FactBase/qDos <> 416-483-2821 ] (1:250/664)