[comp.sys.ibm.pc.programmer] Porting software to the PC, 64K data structure barrier.

avinash@felix.contex.com (Avinash Chopde) (06/30/91)

I need to port a program I wrote for UNIX machines onto the IBM PC.
I have the Turbo C++ compiler, and at first, I just tried to
recompile all the sources, but the compiler complained about
a struct being greater than 64K in size!
Now, I'm sure I'm doing something wrong, surely there must be some
way of defining large arrays.
Is there ?
(If not, will malloc() support allocation of large arrays ?)

Looks like it is not going to be an easy job, porting programs to the PC,
and I would be glad for any help anybody can offer regarding what I could
do, and what I should watch out for.
Thanks for any help.
-- 
---------------------------
Avinash Chopde            home :   508 470 1190     office : 617 224 5582
avinash@contex.com       (if that fails, use: contex!avinash@uunet.uu.net)

cpcahil@virtech.uucp (Conor P. Cahill) (06/30/91)

avinash@felix.contex.com (Avinash Chopde) writes:

>Now, I'm sure I'm doing something wrong, surely there must be some
>way of defining large arrays.

Most DOS compilers have a "huge" model that allows definition of data
elements that are larger than 64k. 
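For example, with Borland's command-line compiler the model is picked with a
switch (a minimal sketch; -mh is the huge-model switch for tcc):

    tcc -mh bigprog.c

In the huge model each module gets its own data segment, so static data can
total more than 64K across the program.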

>(If not, will malloc() support allocation of large arrays ?)

malloc() is usually limited by the memory model you choose, and by the size of
its argument (a 16-bit unsigned int, i.e. at most 64K).
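A common workaround on Turbo C is to allocate past that limit with farmalloc(),
which takes an unsigned long byte count, and to walk the block through a huge
pointer so the offset arithmetic is done in 32 bits.  A minimal sketch
(assuming <alloc.h> and the Borland far-heap routines):

    #include <stdio.h>
    #include <alloc.h>       /* farmalloc(), farfree() - Turbo C far heap */

    int main(void)
    {
        /* 12000 doubles = 96000 bytes: too big for malloc()'s 16-bit size */
        double huge *big = (double huge *) farmalloc(12000L * sizeof(double));
        long i;

        if (big == NULL) {
            puts("out of far memory");
            return 1;
        }
        for (i = 0; i < 12000L; i++)   /* huge pointers are normalized, so  */
            big[i] = 0.0;              /* indexing may cross 64K boundaries */
        farfree((void far *) big);
        return 0;
    }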

>Looks like it is not going to be an easy job, porting programs to the PC,
>and I would be glad for any help anybody can offer regarding what I could
>do, and what I should watch out for.

This is an understatement.  One of the hardest things to get used to 
is that you can easily run out of memory.  On a totally bare-bones system
you start out with 640K.  Take off 70K or so for DOS and command.com 
and that leaves you with a max size of around 570K - not much in 
today's world. 

I would look at redesigning your application so that it doesn't need 
super-large data elements (although you can have multiple data segments
that total more than 64K).  The nice thing about this is that you
could then use LIM EMS to page segments out to expanded memory.
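A sketch of the multiple-segment idea (hypothetical names; each chunk stays
under 64K, so ordinary far pointers are enough, and the chunks become the
natural unit to page out to LIM EMS later):

    #include <alloc.h>                  /* farmalloc()/farfree() - Turbo C */

    #define NCHUNKS  4
    #define CHUNKLEN 6000L              /* 6000 doubles = 48000 bytes < 64K */

    static double far *chunk[NCHUNKS];  /* 4 x 48000 = 192000 bytes in all */

    /* hypothetical accessor: maps one large logical index to chunk+offset */
    double get_elem(long i)
    {
        return chunk[(int)(i / CHUNKLEN)][(int)(i % CHUNKLEN)];
    }

    int alloc_chunks(void)
    {
        int c;
        for (c = 0; c < NCHUNKS; c++) {
            chunk[c] = (double far *) farmalloc(CHUNKLEN * sizeof(double));
            if (chunk[c] == NULL)
                return 0;               /* caller frees whatever succeeded */
        }
        return 1;
    }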
-- 
I guess these are the views of VTI - since it is my consulting company.

Conor P. Cahill              (703)430-9247              uunet!virtech!cpcahil 
Virtual Technologies, Inc.  46030 Manekin Plaza            Sterling, VA 22170 

tmurphy%peruvian.utah.edu@cs.utah.edu (Thomas Murphy) (07/01/91)

In article <1972@contex.contex.com> avinash@felix.contex.com (Avinash Chopde) writes:
>I need to port a program I wrote for UNIX machines onto the IBM PC.
>I have the Turbo C++ compiler, and at first, I just tried to
>recompile all the sources, but the compiler complained about
>a struct being greater than 64K in size!
>Now, I'm sure I'm doing something wrong, surely there must be some
>way of defining large arrays.
>Is there ?

You need to read the README that came with the compiler for the FAQs.  You need
to use the huge memory model, and if you have an array over 64K, declare it
as huge ---> huge double big_array[12000];
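Taking that advice at face value, a minimal sketch (assumption: the compiler
accepts a single static object over 64K once it is declared huge; if it does
not, farmalloc() plus a huge pointer gives the same effect):

    /* compile with the huge memory model, e.g.  tcc -mh big.c  */
    #include <stdio.h>

    huge double big_array[12000];   /* 12000 * 8 = 96000 bytes, over 64K */

    int main(void)
    {
        long i;                     /* long index keeps offset math 32-bit */
        for (i = 0; i < 12000L; i++)
            big_array[i] = (double) i;
        printf("%g\n", big_array[11999]);
        return 0;
    }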

-------------------------------------------------------------------------------
      Murph                     "Government that governs the least
   Thomas Murphy                   Governs the best."  Jefferson
   many depts. UofU          all drabble is dregged up from my own mind.