[comp.lang.c++] NIH class libraries for Turbo C++

kaufman@delta.eecs.nwu.edu (Michael L. Kaufman) (11/08/90)

Is there a version of the NIH libraries for Turbo C++?  If so, where do I find
it?  If possible, I would rather have it in a ZIP, ARC, LZH or ZOO archive, but
I will take what I can get. Any information would be appreciated.

Thanks,  Michael


===============================================================================
Michael L. Kaufman  \\  "I painted her with tar and touched her off and watched
kaufman@eecs.nwu.edu \\ her blaze away...How love's old embers burn!" D Marquis
===============================================================================

av@kielo.uta.fi (Arto V. Viitanen) (11/14/90)

>>>>> On 8 Nov 90 08:09:01 GMT, kaufman@delta.eecs.nwu.edu (Michael L. Kaufman) said:

Michael> Is there a version of the NIH libraries for Turbo C++?

I asked a similar question, and received the following letter from Sandy Orlow
(sandy@alw.nih.gov):

=============================================================================
In response to those who made inquiries about my news article in which I 
reported an attempt to port the NIH Class Library to Turbo-C++, I am
reporting some more information here. In my previous report I said:

> I made an effort to port NIHCL to Turbo C++ on a COMPAQ-386
> with many mega-bytes of memory. There were many problems -- a few 
> portability problems, a few Turbo compiler bugs, but the upshot of my 
> effort was that Turbo C++ cannot currently compile a large enough module 
> to support the NIH Class Library. Turbo-C++ appears to adhere to a 640K 
> memory limit in its own execution.
> 
> I went to some lengths to get many of the classes compiled and, Turbo bugs
> aside, had some success. However, I ended my efforts when I came to the
> following conclusion:  The NIH Class Library is designed for Object-Oriented
> programming. In my experience, object-oriented programs are large beasts with
> deep nesting of header files, and modules for classes that are coupled to 
> many other classes can become large. If a compiler has trouble with NIHCL as
> it is, I suspect its ability to compile useful software built with NIHCL, and
> useful object-oriented software in general.
> 

Because of the interest, and against my better judgement, I went ahead
and made another effort to port NIHCL to DOS using Turbo-C++. I have
succeeded in creating a version of the library that compiles. However,
Turbo-C++ can't compile the (relatively small) test program for class 
OrderedCltn because it includes both 'OrderedCltn.h` and 'Set.h`. To 
generalize slightly, the use of two complex object-oriented data structures
together is more than Turbo-C++ can handle at this time. This confirms my
previous judgement about Turbo-C++ and object-oriented software.

Some particulars are listed below:

===========================================================================
This report describes work done attempting to port the NIHCL class library
to DOS using Turbo-C++ on a COMPAQ-386 with 10 megabytes of memory.

1. Memory Limits

The Turbo-C++ compiler uses a maximum of 640 Kbytes during compilation and 
this is not enough memory to compile many of the NIHCL modules. For example,
compiling the module for class Object as provided by NIHCL produces:

	$make object.obj
	MAKE version 3.0 Copyright(c) 1987, 1990 Borland International
	
	Available memory 414426 bytes
	    sh -c date > object.log
	    -tcc -I\tc\include -I\tc\nihcl\include -c object.cpp>>object.log
	    sh -c date >> object.log
	$ more < object.log
	Sun Apr 22 21:28:27 GMT 1990
	Turbo C++ Version 1.0 Copyright (c) 1990 Borland International
	object.cpp:
	Error \tc\nihcl\include\IdentSet.h 30: Out of memory
	*** 1 errors in Compile ***

		Available memory 0
	Sun Apr 22 21:29:48 GMT 1990	

Individual workarounds were attempted in each such case.

2. File Names

File names under DOS are limited to 8 characters (not including extension)
and are case insensitive. The simplest way to handle this is to ignore it. 
When files with long names are copied into a DOS directory (I did it via FTP) 
the names are simply truncated to 8 characters plus extension. The Turbo 
compiler accepts the '#include` directives with long names and is able to
find the file OK.

Most NIHCL file names are already unique in their first 8 characters, even
ignoring case. The exception is 'nihclerrsx.h`, which you should rename to
'xnihclerrs.h` to distinguish it from 'nihclerrs.h`.

You have to change all the '.c` extensions to '.cpp`; otherwise the files are
treated as C language source.

3. System header files

string.h
--------
The name 'String.h` clashes with the system header file name 'string.h`
under DOS because of case insensitivity. Since 'String.h` is included in many
files and 'string.h` in only a few, it seemed easiest to replace
'#include <string.h>` with '#include <sys/string.h>` and then make a copy of
'string.h` in a sub-directory 'sys`. You should make this replacement in
every file using 'string.h`, including 'String.h` and 'OIOfd.h`.

alloc.h
-------
Turbo uses 'alloc.h` whereas UNIX uses 'malloc.h`. To work around this,
write a file 'malloc.h` containing the line: '#include <alloc.h>`.

libc.h
------
Make a dummy header file 'libc.h` with the line:

#include <stdlib.h>

osfcn.h
-------
Make a dummy header file 'osfcn.h` with the line:

int _Cdecl write(int,const char*,int);
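
Pulled together, the dummy headers described above amount to no more than the
following sketch (put them wherever your include path will find them):

/* malloc.h -- dummy header: forward to Turbo's alloc.h */
#include <alloc.h>

/* libc.h -- dummy header */
#include <stdlib.h>

/* osfcn.h -- dummy header: declare the one OS call NIHCL needs */
int _Cdecl write(int, const char*, int);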

4. The Error Facility

The simplest approach is to use the files 'nihclerrs.[hc]` generated on a
UNIX machine. The only reason to build the error facility is if you really
want an error facility.

5. Default stream arguments

Turbo C++ cannot accept default arguments for parameters of type 'ostream&` or
'istream&`.

Turbo C++  Version 1.00 Copyright (c) 1990 Borland International
nil.cpp:
Error nil.cpp 42: Cannot initialize 'ostream near&' with 'ostream_withassign'
Error nil.cpp 45: Cannot initialize 'ostream near&' with 'ostream_withassign'
*** 2 errors in Compile ***

This is because the stream library does something (silly) with the
derived classes 'ostream_withassign` and 'istream_withassign`. A workaround
is to comment out all such default stream arguments in NIHCL code.
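
To illustrate with a made-up class (not the actual NIHCL declarations), the
change amounts to this; adding a no-argument overload is one possible way to
keep existing call sites working:

#include <iostream.h>

class Thing {
public:
    // Original style -- rejected by Turbo C++ 1.0:
    //     void printOn(ostream& strm = cout) const;

    // Workaround: no default argument on the stream parameter...
    void printOn(ostream& strm) const { strm << "a Thing\n"; }

    // ...and, if desired, an overload that supplies cout explicitly.
    void printOn() const { printOn(cout); }
};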

6. Class Object

To compile 'object.cpp` I had to break it up into four parts:

	objectco.cpp -- all members for the copy utility
	objectde.cpp -- all members for the dependency utility
	objectio.cpp -- all members for the object I/O utility
	object.cpp   -- all members for a basic object

Actually, this in itself is not a bad idea because not all applications
need to use these utilities.
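
The idea behind the split is simply that a class is declared once in its
header while its member functions may be defined across several source files.
A minimal sketch (illustrative names, not the real NIHCL sources):

// object.h (sketch): one declaration of the class
class ObjectSketch {
public:
    void basicOp();
    void copyOp();
};

// object.cpp (sketch): basic members only
#include "object.h"
void ObjectSketch::basicOp() { /* ... */ }

// objectco.cpp (sketch): copy-related members, compiled separately
#include "object.h"
void ObjectSketch::copyOp() { /* ... */ }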

7. OIO classes

Files 'oionih.cpp`, 'oiofd.cpp`, 'oiostream.cpp`, 'oiotbl.cpp` are all too
large for Turbo C++. Breaking them up by separating input and output was
not good enough. I haven't pursued this further.

8. Abstract Classes

Class files 'collecti.cpp` and 'seqcltn.cpp` compiled with some warnings like:

Warning collecti.cpp 70: Possibly incorrect assignment in function Collection::addContentsTo(Collection near&) const

9. Non-abstract classes

Turbo C++ has a bug that prevents compilation of any non-abstract class.
The problem seems to be with Turbo's implementation of the implicit use of
the copy constructor when initializing objects and when returning an object
from a function. For example,

Turbo C++  Version 1.00 Copyright (c) 1990 Borland International
integer.cpp:
Warning integer.cpp 41: Temporary used for parameter 'class' in call to 'Integer::Integer(Integer near&)' in function Integer::shallowCopy() const
Error integer.cpp 41: '{' is not a member of 'NIHCL' in function Object::Object(const Object near&)
*** 1 errors in Compile ***

In this case Turbo's goofy error diagnostic is a symptom that something is
seriously wrong in the compiler. The only available work-around is to
explicitly declare and implement a copy constructor for each class that
needs one.
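
With a made-up class (not NIHCL's Integer), the work-around is simply to
write out what the compiler should have generated on its own:

class Int {                       // hypothetical stand-in, not NIHCL's Integer
    int value;
public:
    Int(int v) : value(v) {}

    // Explicit copy constructor: identical to the member-wise copy the
    // compiler would normally generate, spelled out to avoid the Turbo bug.
    Int(const Int& i) : value(i.value) {}

    // Returning an object by value uses the copy constructor.
    Int shallowCopy() const { return Int(*this); }
};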

10. Implicit cast bug in Turbo

This comes up in a number of classes. I first noticed it in 
'class OrderedCltn`:

Turbo C++  Version 1.00 Copyright (c) 1990 Borland International
orderedc.cpp:
Error orderedc.cpp 73: Cannot initialize 'const Object near* near*' with 'Object near* near*' in function OrderedCltn::operator ==(const OrderedCltn near&) const
Error orderedc.cpp 74: Cannot initialize 'const Object near* near*' with 'const Object near* const  near*' in function OrderedCltn::operator ==(const OrderedCltn near&) const
Error orderedc.cpp 195: Cannot initialize 'const Object near* near*' with 'Object near* near*' in function OrderedCltn::hash() const

The work-around is to make the cast explicit. An isolated case of this bug is 
shown in the following code:

class Object {
public:
	Object() {}
};

class foo {
	Object** v;
public:
	foo() {}
	Object*& operator[](int i) { return v[i]; }
	const Object *const& operator[](int i) const { return v[i]; }
};

class Foo {
	foo contents;
public:
	Foo() {}
	Object*& operator[](int i) { return contents[i]; }
	const Object *const& operator[](int i) const { return contents[i]; }
/*
Turbo C++  Version 1.00 Copyright (c) 1990 Borland International
bug1.cpp:
Error bug1.cpp 19: Expression type does not match the return type in function Foo::operator [](int) const
*** 1 errors in Compile ***
*/
/* workaround -- explicitly cast return value 
        const Object *const& operator[](int i) const
             { return (const Object *const&)contents[i]; } 
*/
};

11. class OrderedCltn

Class file 'orderedc.cpp` was too big for Turbo C++. I separated it into
two parts and was able to compile:

	ordercio.cpp -- all members for object I/O
	orderedc.cpp -- other members

12. class Class

The module for class Class is too big for Turbo. I separated member functions
into these modules in order to compile:

	classvio.cpp -- object I/O for virtual base classes
	classio.cpp  -- object I/O for all other situations
	classdic.cpp -- members using the class Dictionary
	class.cpp    -- all other members	

13. _main is a no-op

NIHCL is set up to initialize the class dictionary via static construction.
This is done by linking in an alternate _main module, which doesn't work
under Turbo-C++. NIHCL programs thus must include, as the first statement of
the main() function:

NIHCL::initialize();
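
A minimal program skeleton under this convention would therefore look
something like the following (the header name is illustrative; include
whichever NIHCL headers the program actually uses):

#include "Object.h"     // illustrative -- whatever NIHCL headers you need

int main()
{
    NIHCL::initialize();    // must be the first statement of main()

    // ... the rest of the program, using NIHCL classes ...

    return 0;
}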

Unfortunately, this is not the end of the story. The class dictionary is
still not getting built; we suspect this is because NIHCL code uses a
constructor ClassList::ClassList(char*, ...) whose variable argument list
consists of Class** pointers, a usage that does not appear to be portable
to Turbo-C++.
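
For the curious, the general shape of such a constructor is sketched below.
This is only a generic illustration of the varargs idiom, not NIHCL's actual
ClassList code, and the null-terminated argument convention is an assumption:

#include <stdarg.h>

class Class;                          // stand-in for NIHCL's class descriptor

class ClassListSketch {               // hypothetical, not NIHCL's ClassList
public:
    ClassListSketch(char* name, ...);
};

ClassListSketch::ClassListSketch(char* name, ...)
{
    va_list ap;
    va_start(ap, name);

    // Assumed convention: each extra argument is a Class**, with a null
    // pointer marking the end.  It is this kind of pointer traffic through
    // '...' that seems to misbehave under Turbo-C++.
    Class** p;
    while ((p = va_arg(ap, Class**)) != 0) {
        // ... register *p with the class dictionary ...
    }

    va_end(ap);
}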

Sandy Orlow, Systex, Inc.
Computer Systems Laboratory 
National Institutes of Health
Bethesda MD


--
Arto V. Viitanen				         email: av@kielo.uta.fi
University Of Tampere,				   	    av@ohdake.cs.uta.fi
Finland

minar@reed.bitnet (Nelson Minar,L08,x640,7776519) (11/15/90)

Brave person, getting NIHCL to do anything in TC++. I would never have
attempted it myself. Some of the problems you have described (for example,
the necessity of renaming .c to .cpp files) can be solved easily (in this
case, with a flag to the compiler).  Did you try this with TC++ 1.01? Some bugs
were fixed.  In any case, let Borland know what you have done, in the hopes
of a better compiler.

On to more philosophical points.  The monolithic class hierarchy is
conceptually elegant.  It might even be useful on a small machine,
when one is working on a huge project (where huge >= compiler).  However,
a lot of programs just don't need that much structure.

I've been using C++ for 'soft OOP' - the forest model of OOP: lots of little
classes, inheritance when it is needed. I have found this to be very good
for me, as it organizes my thinking much better than C does.  And, when I
need the power of OOP, it's there for me: I CAN do inheritance.  If/when
templates come around to the DOS world, TC++ will become a very useful
thing for some purposes.

I don't think one can write off TC++ entirely.  It's true that it won't be
capable of doing much with NIHCL.  Then again, can a DOS box effectively run
the kind of programs for which something as big as NIHCL is needed?

jamiller@hpcupt1.cup.hp.com (Jim Miller) (11/17/90)

>                                   Then again, can a DOS box effectively run
>the kind of programs for which something as big as NIHCL is needed?

A 16Meg 386 or 486 should be able to handle most anything.
Under the present MS-DOS you need extenders, but my guess is that
sometime in the next couple of years Microsoft will break the 640K
barrier and allow a multi-megabyte linear address space.

In short, MS-DOS on the new mini machines (the 486's are not micros) is not
dead now, and will probably stay with or ahead of the needs of MOST
customers.  The program developers may always complain, but they have been
able to deliver amazing products in what in 1982 was a large address space
and now is considered "tiny".  (In 1982, anyone who said that 640K was
"small" would have been thought out of their head.)

   jim - why in the world am I defending MS-DOS ?!? - miller

muyanja@hpdmd48.boi.hp.com (bill muyanja) (11/18/90)

>  jim - why in the world am I defending MS-DOS ?!? - miller

Maybe because 40 million users can't all be wrong?

Frankly, I'm starting to wonder about the benefits of the unlimited
linear address space of Un*x, which allows humongous, monolithic software
modules, both in classic and OO "c".  I have yet to see a piece of software
in the Unix world with the price/performance/utility point provided by
Lotus 123 2.01/WordPerfect 4.2/dBase III+ on a 12 MHz AT-clone circa 1987.
This combo is what appealed to those 40 million DOS users.

My (admittedly limited) experience with  OO systems (Actor, C++) has      
convinced me that any benefit from the object metaphor can easily be
outweighed by the cost of learning zillions of new classes/objects.

Whatever happened to the elegant philosophy of Kernighan & Plauger, as
embodied in "Software Tools"?  I feel that the ideas espoused in that
classic apply equally well to C++.


bill - ok, I'm off the soapbox now - muyanja

jimad@microsoft.UUCP (Jim ADCOCK) (11/27/90)

In article <15150004@hpdmd48.boi.hp.com> muyanja@hpdmd48.boi.hp.com (bill muyanja) writes:
>Frankly, I'm starting to wonder about the benefits of the unlimited
>linear address space of Un*x, which allows humongous, monolithic software
>modules, both in classic and OO "c".  I have yet to see a piece of software
>in the Unix world with the price/performance/utility point provided by
>Lotus 123 2.01/WordPerfect 4.2/dBase III+ on a 12 MHz AT-clone circa 1987.
>This combo is what appealed to those 40 million DOS users.

Having used both Unix-style huge linear addresses, and Intel 80x86 segments,
I believe neither has much in common with OOP.  In either case one has to copy
objects around, or do manual clustering of objects via heuristics, etc.  
Maybe one needs hardware based on an obid+offset, with automatic support of
clustering?

The "huge linear address" of Unix-style machines is a farce in the 
first place, given that that "huge linear address" is built of 4K typical
pages, which are mapped in unique ways to disks, and programmers have
to reverse engineer all these aspects to get good performance in serious
applications.

jimad@microsoft.UUCP (Jim ADCOCK) (11/27/90)

In article <15696@reed.UUCP> minar@reed.bitnet (Nelson Minar) writes:
>
>On to more philosophical points.  The monolithic class hierarchy is
>conceptually elegant.  It might even be useful on a small machine,
>when one is working on a huge project (where huge >= compiler).  However,
>a lot of programs just don't need that much structure.

Seems to me that many of the objections to a big, monolithic library go
away if one has good support for shared libraries on one's system, and
if there are relatively few, big, monolithic OO libraries that the
world standardizes on.  Then one can have multiple applications running
simultaneously sharing the same [big] library code.  This also implies
that someone takes the time to factor the libraries well so that less
than the total library need be loaded at a given time.

Until we get such emerging library standards, one can still do well using
C++ and the "forest" approach to make stand-alone, well-encapsulated units
of functionality.