[comp.lang.misc] software ICs vs. libraries

bnfb@uw-june.UUCP (Bjorn Freeman-Benson) (10/22/87)

First, a disclaimer: I don't use Objective-C or C++, I don't know their
terminology, and I don't own any stock in either company.

>...exciting, like "Software-IC".  Libraries have been around at least as
>long as programming languages; ...

I think that there is a difference between libraries and something
more general, say "software-ICs".  The libraries I have seen are
collections of routines that someone found useful and then put
together for mass consumption.  A good example of this is the
Macintosh Toolbox routines: they are powerful, but they are
hopelessly complex, and the improvement they offer the programmer is
questionable.

A "software-IC" would seem to be a more thourghly thought out, better
designed, unit of software that could be used for multiple purposes.
In other words, a design team would set out to design a software-IC,
rather than extracting little pieces of code from their final
product to place in a library.  The subclassing ability of Smalltalk
and its kin are a start.

>...7000 series ICs has all but ended and almost all serious design now is
>being done with semicustom or custom components.

However, one must consider that current software technology is at
the level of individual transistors and resistors, and that we could
use the step up to "7000 series ICs".  After that, custom and
semi-custom would be great.

				Bjorn N. Freeman-Benson

farren@gethen.UUCP (10/22/87)

In article <3349@uw-june.UUCP> bnfb@uw-june.UUCP (Bjorn Freeman-Benson) writes:
>A "software-IC" would seem to be a more thourghly thought out, better
>designed, unit of software that could be used for multiple purposes.

I must confess to a failure to understand what it is that you are proposing
here.  You have said that the Macintosh Toolbox routines aren't what you
are talking about, and it seems to me that they, as well as the similar
libraries available on most complex machines today, are about as close
as you would want to come to a "Software IC".  If the routines are going
to be flexible enough to be used for different purposes, they are also
going to be complex enough to be difficult to use correctly for any of 
them.  If they are simple enough to be used correctly by "the masses", as
you put it, then it seems to me that they will also be much less
flexible.  How do you propose to solve this dilemma?  Examples, please.


-- 
----------------
Michael J. Farren      "... if the church put in half the time on covetousness
unisoft!gethen!farren   that it does on lust, this would be a better world ..."
gethen!farren@lll-winken.arpa             Garrison Keillor, "Lake Wobegon Days"

ruffwork@orstcs.CS.ORST.EDU (Ritchey Ruff) (10/23/87)

In article <244@gethen.UUCP> farren@gethen.UUCP (Michael J. Farren) writes:
>In article <3349@uw-june.UUCP> bnfb@uw-june.UUCP (Bjorn Freeman-Benson) writes:
>>A "software-IC" would seem to be a more thoroughly thought out, better
>>designed, unit of software that could be used for multiple purposes.
>
>	You have said that the Macintosh Toolbox routines aren't what you
>are talking about, and it seems to me that they, as well as the similar
>libraries available on most complex machines today...
>...If the routines are going
>to be flexible enough to be used for different purposes, they are also
>going to be complex enough to be difficult to use correctly for any of 
>them.  If they are simple enough to be used correctly by "the masses", as
>you put it, then it seems to me that they will also be much less
>flexible.  How do you propose to solve this dilemma?  Examples, please.

You asked for it!  I have used both the Macintosh Toolbox routines
to work with windows and the Interlisp-D windowing system.  Both will
do almost exactly the same types of things, but I can write a program
to do windowing in Interlisp-D about 10 times faster!  The keys here
are - 
	(1) Graceful defaults - if I don't want to do anything nonstandard
		then all I do is make a simple call or two and there
		it is.  Scroll bars, selection, shrinking to an icon,
		everything.
	(2) Message-based paradigm - when I do want to do something
		nonstandard I attach a function to a property
		on a window.  Let's say I want the mouse pointer
		to change when it enters window A: I just putprop
		a routine to do that on the "enter-window" property
		of window A, and now every time the mouse cursor enters
		window A that function is called and the cursor changes.
		That's all there is to it!!!
Example - I like the "zoom" feature for windows (moving quickly between
	the last two window sizes).  I wrote one for my Mac (after I
	was an experienced Mac hacker) and it took me about 2 days.  I wrote
	one as a first windowing program for Interlisp-D and it took
	about 3 hours.  The Mac one is buggy and the Interlisp-D
	one has not been touched since that 3 hours of work.
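
In C terms, the shape of that message-based trick is roughly the sketch
below (a hedged illustration; the names are invented and this is not any
real windowing API):

	#include <stdio.h>

	/* Hypothetical window record: behaviour is attached as data,
	 * much like putprop'ing a handler onto a window property. */
	struct window {
		const char *title;
		void (*enter_window)(struct window *w);
	};

	static void change_cursor(struct window *w)
	{
		printf("mouse entered %s: switching cursor shape\n", w->title);
	}

	/* Called by the (imaginary) window system whenever the mouse
	 * crosses into a window; the default is to do nothing. */
	static void mouse_entered(struct window *w)
	{
		if (w->enter_window)
			w->enter_window(w);
	}

	int main(void)
	{
		struct window a = { "window A", 0 };

		a.enter_window = change_cursor;	/* the "putprop" step */
		mouse_entered(&a);		/* event dispatch by the system */
		return 0;
	}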

I'm ramblin' on so will wrap it up saying "what you want is not JUST
a powerful interface, but a WELL THOUGHT OUT powerful interface.  This
does not mean one that does everything you want: it means one that does
everything you want with the LEAST amount of programming effort."  (That's
the hard part.)

--ritchey ruff

	ruffwork@cs.orst.edu
	{ hp-pcd | tektronix }!orstcs!ruffwork

	It's against my programming to impersonate a deity!
		-- C3PO

drw@culdev1.UUCP (Dale Worley) (10/23/87)

Another feature of "software ICs" comes from the fact that they are
part of an object-oriented system.  One can actually write, say, a
linked list manager that will work on objects of *any* type.  In most
languages, this is impossible to do in a library routine.

I suspect that I could write considerably less code if I could write
in one statement all the trivial, but extremely stereotyped, bits of
code.  After all, what fraction of your lines of code are
loop-and-search, and other completely standard stuff?

Dale

adamj@thoth17.berkeley.edu.BERKELEY.EDU (10/24/87)

In article <1691@culdev1.UUCP> drw@culdev1.UUCP (Dale Worley) writes:
>Another feature of "software ICs" comes from the fact that they are
>part of an object-oriented system.  One can actually write, say, a
>linked list manager that will work on objects of *any* type.  In most
>languages, this is impossible to do in a library routine.

	"Object oriented?"  What you are describing involves
parameteized typing and polymorphic routines.  Objected oriented
programming (in my book) involves data hiding and polymorphic routines.
Parameterized typing is an independent concept from data hiding.

	Parameterized typing should be a tremendous win because it
then becomes possible to change the lower levels of a hierarchical
type without affecting the routines that only need to know some overall
information about that type.
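
	(A crude C approximation, purely for illustration -- C has no such
feature built in, so this is only textual expansion -- of what parameterized
typing buys: the stack code below is written once against a type parameter T,
and no data hiding is involved at all.)

	#include <stdio.h>

	/* Crude, compile-time "parameterized type": the same stack code
	 * is expanded for whatever element type T you name. */
	#define DEFINE_STACK(T) \
		typedef struct { T items[100]; int top; } Stack_##T; \
		static void push_##T(Stack_##T *s, T v) \
			{ s->items[s->top++] = v; } \
		static T pop_##T(Stack_##T *s) \
			{ return s->items[--s->top]; }

	DEFINE_STACK(int)
	DEFINE_STACK(double)

	int main(void)
	{
		Stack_int si = { {0}, 0 };

		push_int(&si, 42);
		printf("%d\n", pop_int(&si));
		return 0;
	}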

	I'd like to drop the term "software IC", as an IC is noticeably
deficient in this low-level parameterization.  E.g., I can't buy a TTL IC
from Radio Shack, buy a GaAs description module, apply one to the other,
and get my favorite chip design in GaAs.  The chip design has to be
reworked by the manufacturer, just like present-day software has to be
recompiled and probably edited by an owner of the source code.

>I suspect that I could write considerably less code if I could write
>in one statement all the trivial, but extremely stereotyped, bits of
>code.  After all, what fraction of your lines of code are
>loop-and-search, and other completely standard stuff?
>Dale

	Right on!

		--Adam J. Richter
		  adamj@widow.berkeley.edu
		  ...!ucbvax!widow!adamj
		  (415) 642-7762

jwhitnel@csi.UUCP (Jerry Whitnell) (10/24/87)

In article <3349@uw-june.UUCP> bnfb@uw-june.UUCP (Bjorn Freeman-Benson) writes:
|>...exciting, like "Software-IC".  Libraries have been around at least as
|>long as programming languages; ...
|
|I think that there is a difference between libraries and something
|more general, say "software-ICs".  The libraries I have seen are
|collections of routines that someone found useful, and then put
|together for mass consumption.  A good example of this is the
|Macintosh Toolbox routines: they are powerful, but they are
|hopelessly complex, and have a questionable programmer improvement.

Having programmed the Macintosh, I'll have to disagree with the last two
comments.  The toolbox is complicated, but not hopelessly so, since there are
a large number of programmers using it to develop software.  As far as
programmer improvement is concerned, 99.9% of the programs on the Mac use
some portion of the interface, and 99.8% of them use the Toolbox rather
than write it themselves (there's always one who wants to write it himself
anyway :-)).  Compared to the cost of doing it all themselves, there is
certainly a large improvement in programmer productivity.  One might argue
that the Mac interface itself is detrimental to programmer productivity,
but since it greatly enhances user productivity, we programmers will have to
bite the bullet and use it.

|
|A "software-IC" would seem to be a more thourghly thought out, better
|designed, unit of software that could be used for multiple purposes.
|In other words, a design team would set out to design a software-IC,
|rather than extracting little pieces of code from their final
|product to place in a library.  The subclassing ability of Smalltalk
|and its kin are a start.

My definition of a "software-IC" is as follows:

1) Simplicity of function:  An IC should have one and only one function.  This
function could be as simple as sorting data or as complicated as converting
3-D objects to a 2-D display.  This relates directly to the hardware IC, which
implements a single function (whether it is an AND gate or a multiplexer).

2) Layered.  Not only should the calling function's interface be defined, but
also the interfaces of the functions it calls.  The programmer can then
mix and match pieces to achieve the desired set of features.  An example of
where this concept is not used, but should be, is the UNIX malloc function
(see the sketch after this list).  Malloc is implemented using several queues
and OS calls.  There is no reason why the programmer couldn't replace these
with other equivalent functions if needed, except that unless he has access
to the source code, he has no idea what the called functions do.

3) Well documented.  If you look at a 7400 data book, you can see the wealth of
information provided about ICs.  Everything from pin-outs to timing charts to
temperature specs.  Look at any software library spec and you'll usually
get half a page of incomplete information, usually wrong :-).  Complete
documentation in a standard format is a necessity.

4) Machine/OS/Language independent.  For a hardware engineer, all of the
interfaces are standard (TTL), as are the ICs (7400).  This means that
an engineer can mix and match with no worry about the environment the final
product will end up in.  Different choices can be made for reasons of cost,
speed, or power consumption, all without changing the base design.  For
software ICs the same should be true.  If you need a sort, you should be able
to go to your "42-series" catalog, look up sort routines, and pick the one that
meets your needs for speed/memory consumption/... independent of whether
you're working on a Mac, a PC, a UNIX box or an IBM mainframe (well, maybe not
there :-)).
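
(As promised under point 2, here is a rough sketch in C of what a "layered"
allocator interface might look like.  The names and the hook are hypothetical
-- real UNIX malloc publishes no such interface -- the point is only that the
lower layer is documented and replaceable.)

	#include <stddef.h>
	#include <unistd.h>

	/* Hypothetical lower layer: how the allocator obtains raw memory.
	 * A caller who wants different behaviour (a fixed arena, shared
	 * memory, instrumentation) replaces this one function. */
	static void *(*get_core)(size_t nbytes);

	static void *default_get_core(size_t nbytes)
	{
		void *p = sbrk(nbytes);	/* classic UNIX: grow the data segment */
		return p == (void *)-1 ? NULL : p;
	}

	void set_core_allocator(void *(*fn)(size_t))
	{
		get_core = fn;
	}

	/* Upper layer: a (very) simplified malloc that only knows it can
	 * ask the lower layer for more core.  No free list here; this is
	 * a sketch of the layering, not a real allocator. */
	void *my_malloc(size_t nbytes)
	{
		if (get_core == NULL)
			get_core = default_get_core;
		return get_core(nbytes);
	}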


|
|>...7000 series ICs has all but ended and almost all serious design now is
|>being done with semicustom or custom components.
|
|However, one must consider that current software technology is at
|the level of individual transistors and resistors, and that we could
|use the step up to "7000 series ICs".  After that, custom and
|semi-custom would be great.

I disagree; semi-custom and custom are really a step backwards on the hardware
side, as far as engineers are concerned.  Whereas before one could depend on
each IC being a black box (chip?) whose implementation you didn't care about
(mostly), now the implementations will be right in front of you with nothing
to prevent you from aliasing this function to that global over there...  Just
as bad as software is now!

|
|				Bjorn N. Freeman-Benson


Jerry Whitnell                           It's a damn poor mind that can only
Communication Solutions, Inc.            think of one way to spell a word.
						-- Andrew Jackson

farren@gethen.UUCP (Michael J. Farren) (10/25/87)

Several people have made statements that attempt to correlate hardware
with software, usually starting with the resistor and transistor as a
basic block.  I don't feel that this analogy is appropriate.  The closest
equivalent in software to a 7408, to take a very low-level IC for an
example, is three or four AND statements in assembly language, NOT a small
library function (unless you have logical AND implemented as a library
function, in which case you have my pity).  The largest piece of software
that has a hardware equivalent is probably something like BitBLT.

The current state of the hardware design art has shifted somewhat away
from general-purpose VLSI implementations of design.  More and more,
hardware designers are recognizing that a general-purpose implementation,
while possibly somewhat simpler to implement, carries a high price in
circuit efficiency, economy, and cost.  One of the hottest design
techniques right now is ASICs (Application-Specific Integrated Circuits),
which commonly take many low- and medium-level components and bind them
together in a form suitable for the specific job at hand - rather like a
high-level routine would use library functions in a specific way.

I believe that the "software IC" concept fails on just those kinds of
issues.  A routine which is simple to use, but complex in function, will
be more costly, both in storage and in speed, than one that is simply
complex, as the simple interface will, in itself, have a cost in code
and execution time.  The example has been mentioned of the difference in
implementation of window functions on the Macintosh and (I believe - I'm
not a Lisp hacker) a Lisp machine.  It was pointed out that changing
window parameters on the Lisp machine was much easier than the same
operation on the Mac.  However, what was not noted, probably because it
was fairly transparent to the end-programmer, was the cost, in bytes and
in nanoseconds, of making those parameters easy to change.  Most machines
now available just don't have enough resources to allow that level of
flexibility and still have an acceptable response for the user.  Smalltalk
is an instructive example; although its concepts made things a lot easier
for many tasks, it took considerable advances in technology before an
actual system could be constructed which wasn't painfully slow from the
user's standpoint.

There will always be a tradeoff between speed and ease of use/programming.
Currently, I strongly believe that the tradeoff favors the use of smaller,
less sophisticated routines designed for the maximum efficiency in
operation, rather than construction.  This situation may well change, but
I think that the use of more primitive constructs in order to obtain the
maximum in efficiency and machine usage will always be a preferred
alternative, whenever it is possible to make the choice.  What will change
will, more than likely, be the level of complexity considered "primitive".


-- 
----------------
Michael J. Farren      "... if the church put in half the time on covetousness
unisoft!gethen!farren   that it does on lust, this would be a better world ..."
gethen!farren@lll-winken.arpa             Garrison Keillor, "Lake Wobegon Days"

chip@ateng.UUCP (Chip Salzenberg) (10/26/87)

In article <5606@jade.BERKELEY.EDU> adamj@widow.berkeley.edu (Adam J. Richter) writes:
>In article <1691@culdev1.UUCP> drw@culdev1.UUCP (Dale Worley) writes:
>>Another feature of "software ICs" comes from the fact that they are
>>part of an object-oriented system.  One can actually write, say, a
>>linked list manager that will work on objects of *any* type.
>
>	"Object oriented?"  What you are describing involves
>parameteized typing and polymorphic routines.

Not so.  Objective-C supports _heterogeneous_ collections.  For example, a
single Set can hold another Set, a Dictionary, an Array, and any number of
other types -- _simultaneously_.

-- 
Chip Salzenberg         "chip@ateng.UUCP"  or  "{uunet,usfvax2}!ateng!chip"
A.T. Engineering        My employer's opinions are not mine, but these are.
   "Gentlemen, your work today has been outstanding.  I intend to recommend
   you all for promotion -- in whatever fleet we end up serving."   - JTK

chris@mimsy.UUCP (10/28/87)

In article <1270@csib.csi.UUCP> jwhitnel@csi.UUCP (Jerry Whitnell) writes:
>1) Simplicity of function:  [A software] IC should have one and only
>one function. ... This relates directly to the hardware IC which implements
>a single function (whether it is an AND or a Multiplexer).

Like the 74158?  If I recall correctly, this is a dual decade counter
with latches and BCD-to-seven-segment decoder-and-drivers.  In other
words, it is like two 7490s, two latches (numbers forgotten), and two 
744[78]s.

(The point is that subroutines and hardware both come in simple and
complex versions.  There are a number of special-purpose 74xx ICs.)

>2) Layered.  Not only should the calling function interface be defined, but
>also the called functions interfaces should be defined. ... A example of
>where this concept is not used but should be is the UNIX malloc function.  

7400 series chips are not layered, they are self-contained.  I am not
sure there is any suitable hardware analogue for subroutine layering.

>3) Well documented.  If you look at a 7400 book, you can see the wealth of
>information provided about ICs.

On the other hand, much of the information is assumed (from the
class of the hardware).  This sounds familiar....

>4) Machine/OS/Language independent.  For a hardware engineer, all of the
>interfaces are standard (TTL),

... and DTL and ECL and ...

>as are the ICs (7400).

... and 5400 and (now I have forgotten the CMOS series numbers, 4800?)

>This means that an engineer can mix and match with no worry about
>the environment the final product will end up in.  Different choices
>can be made for reasons of costs, speed, power consumption all
>without changing the base design.

Not at all.  Different classes of logic families with the same
functions (such as changing all your TTL to CMOS so you can run
off 3V or 15V) have different characteristics (e.g., fan-out).
You cannot just renumber your design: you have to check everything
all over again.  I think that we can in fact do *better* with
software than with hardware.
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain:	chris@mimsy.umd.edu	Path:	uunet!mimsy!chris

drw@culdev1.UUCP (10/28/87)

chip@ateng.UUCP (Chip Salzenberg) writes:
| Objective-C supports _heterogeneous_ collections.  For example, a
| single Set can hold another Set, a Dictionary, an Array, and any number of
| other types -- _simultaneously_.

The other important feature that object-oriented systems need
(Objective-C has it; I don't know if C++ does) is that one can build
two objects that present the same external interface, but for which
the method-routines are different.  I.e., I can have two different
sorts of "dictionary", implemented differently, and the code that uses
them can't tell the difference between them.  (This is why you need
run-time mapping from method-names to routines.)
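
A rough C rendering of that run-time mapping (hypothetical code, not
Objective-C or C++): each "dictionary" carries a table of routine pointers,
so the caller goes through one interface without knowing which
implementation it is holding.

	#include <stdio.h>

	struct dict_ops {			/* the external interface */
		const char *(*lookup)(void *self, const char *key);
	};
	struct dict {
		const struct dict_ops *ops;	/* method-name -> routine map */
		void *self;			/* private representation */
	};

	/* Implementation 1: pretend hash table. */
	static const char *hash_lookup(void *self, const char *key)
	{
		(void)self; (void)key;
		return "found by hashing";
	}
	static const struct dict_ops hash_ops = { hash_lookup };

	/* Implementation 2: pretend sorted array. */
	static const char *sorted_lookup(void *self, const char *key)
	{
		(void)self; (void)key;
		return "found by binary search";
	}
	static const struct dict_ops sorted_ops = { sorted_lookup };

	/* Client code cannot tell the two apart. */
	static void client(struct dict *d)
	{
		printf("%s\n", d->ops->lookup(d->self, "some key"));
	}

	int main(void)
	{
		struct dict a = { &hash_ops, NULL };
		struct dict b = { &sorted_ops, NULL };

		client(&a);
		client(&b);
		return 0;
	}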

Dale
-- 
Dale Worley    Cullinet Software      ARPA: culdev1!drw@eddie.mit.edu
UUCP: ...!seismo!harvard!mit-eddie!culdev1!drw
If you get fed twice a day, how bad can life be?

peter@sugar.UUCP (10/29/87)

In article <1691@culdev1.UUCP>, drw@culdev1.UUCP (Dale Worley) writes:
> Another feature of "software ICs" comes from the fact that they are
> part of an object-oriented system.  One can actually write, say, a
> linked list manager that will work on objects of *any* type.  In most
> languages, this is impossible to do in a library routine.

You can do it in 'C'. In fact the standard 'C' library for the Amiga comes
with exactly this tool.

	Insert(ListHead, ListNode, PredNode)

	 - Insert a node in a list.

		AddHead(ListHead, ListNode)

		 - Insert at the head of the list:

			Insert(ListHead, ListNode, NULL);

		AddTail(ListHead, ListNode)

		 - Insert at the tail of the list:

			Insert(ListHead, ListNode, ListHead->lh_Tail);

	Remove(ListNode)

	 - Removes it from whatever List it's in.

		RemHead(ListHead)

		 - Removes and returns the head of a list.

			return_val = ListHead->lh_Head;
			Remove(return_val);
			return return_val;

		RemTail(ListHead)

		 - Removes and returns the tail of a list.
		   As Joe Isuzu says "well, you know..."

	Enqueue(ListHead, ListNode)

	 - Adds it on a priority basis. RemHead will return the highpri
	   node, RemTail the lowpri node. FIFO ordering if priorities match.

	FindName(ListHead, "Name")

	 - Finds the first node matching Name.

	FindName(ListNode, "Name")

	 - Finds the next node matching Name.

This deals with objects called "nodes". But, of course, you can pass any struct
to these functions so long as the first element of the struct is a list or a
node. This is one place 'C' outshines more modern and more heavily typed
languages like Modula.
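
A minimal, compilable illustration of the trick (loosely modelled on the
Amiga Exec conventions, not the real declarations): any structure whose
first member is the generic node can be threaded onto the same list.

	#include <stdio.h>

	struct Node {				/* generic link */
		struct Node *ln_Succ;
		struct Node *ln_Pred;
	};

	struct Task {				/* any user struct: Node first */
		struct Node tc_Node;
		int tc_Priority;
	};

	/* A list routine that knows nothing about Task. */
	static void InsertAfter(struct Node *pred, struct Node *node)
	{
		node->ln_Succ = pred->ln_Succ;
		node->ln_Pred = pred;
		if (pred->ln_Succ)
			pred->ln_Succ->ln_Pred = node;
		pred->ln_Succ = node;
	}

	int main(void)
	{
		struct Node head = { 0, 0 };
		struct Task t = { { 0, 0 }, 5 };

		/* Because the Node is the first member, the Task pointer
		 * and its Node pointer coincide. */
		InsertAfter(&head, (struct Node *)&t);
		printf("priority of first node: %d\n",
		       ((struct Task *)head.ln_Succ)->tc_Priority);
		return 0;
	}
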
-- 
-- Peter da Silva  `-_-'  ...!hoptoad!academ!uhnix1!sugar!peter
-- Disclaimer: These U aren't mere opinions... these are *values*.

sommar@enea.UUCP (10/31/87)

peter@sugar.UUCP (Peter da Silva) writes:
>In article <1691@culdev1.UUCP>, drw@culdev1.UUCP (Dale Worley) writes:
>> Another feature of "software ICs" comes from the fact that they are
>> part of an object-oriented system.  One can actually write, say, a
>> linked list manager that will work on objects of *any* type.  In most
>> languages, this is impossible to do in a library routine.
>
>You can do it in 'C'. In fact the standard 'C' library for the Amiga comes
>with exactly this tool.
>   <description of interface deleted>
>This deals with objects called "nodes". But, of course, you can pass any struct
>to these functions so long as the first element of the struct is a list or a
>node. This is one place 'C' outshines more modern and more heavily typed
>languages like Modula.

Of course you could do this in assembler too, n'est-ce pas? Just push
the address of a block on the stack. The first word(s) in the block are
the pointer to the next block. And damn you if you forget them. The problem
with C and assembler is that they leave the user responsible for
correctness. If he forgets the pointers in the struct, he may have
a hard time finding out why the program crashes.
  Now, I don't speak Modula-2, so I can't speak for it. (But I can 
believe da Silva is right. My impression of Modula-2 has always been
that it is just another Pascal dialect :-)
  Finally, some high-level languages in which you can write the
linked-list manager *with* type checking are Simula and Ada. (But you
don't write it in Simula; the manager is already there as a part of the
language.)
-- 
Erland Sommarskog       
ENEA Data, Stockholm    
sommar@enea.UUCP        
                    It could have been worse; it could have been Pepsi.

peter@sugar.UUCP (10/31/87)

In article <1270@csib.csi.UUCP>, jwhitnel@csi.UUCP (Jerry Whitnell) writes:
> Having programmed the Macintosh, I'll have to disagree with the last two
> comments.  The toolbox is complicated, but not hopelessly so, since there are
> a large number of programmers using it to develop software.

I'll have to disagree with that. The Mac Toolbox violates a fundamental law
of O/S design: FIRST, make the easy things easy. You should be able to
operate like this (enter fantasy mode):

	Window = OpenWindow(x0, y0, x1, y1, xmax, ymax, 0);

This gives you a minimal window. It gives you a window of size (x1-x0) by
(y1-y0) into a virtual screen of size xmax by ymax. Scroll bars are provided,
so the user can pan around the area. A sizing button is provided, so you
can resize the window. You can then twiddle your thumbs for all the
user cares. The window is up there. Other programs' windows may cover it up
and uncover it again. You don't care... it's all handled transparently
by the operating system. You can draw into the window, write text, and the
operating system will handle clipping and rendering into offscreen portions
of the bitmap. This is the default.

	Window = OpenWindow(x0, y0, x1, y1, 0, 0, FIXEDSIZE);

This gives you a window of fixed size, like a Mac dialogue box.

	AddButton(Window, x0, y0, x1, y1, function, 0);

This adds a poke point to the window, size and co-ordinates (relative to window)
as above. Imagery is handled by drawing it in beforehand. When the button is
poked, the named function is called asynchronously (as in a UNIX signal).

You get the idea. None of this business of calling SystemTask as often as
possible. None of this business of having to redraw the screen yourself. The
operating system has to figure out WHEN to redraw the screen... why not let it
do the work as well?

Any idiot can put together an oscillator circuit with a 555. Any idiot should
be able to put together a well-behaved windowing application.

I don't know of any system that's this sophisticated, though the Amiga
SuperBitmap windows are fairly close. Maybe X Windows or NeWS.

> As far as
> programmer improvement is concerned, 99.9% of the programs on the Mac use
> some portion of the interface, and 99.8% of them use the Toolbox rather
> than write it themselves (there's always one who wants to write it himself
> anyway :-)).  Compared to the cost of doing it all themselves, there is
> certainly a large improvement in programmer productivity.  One might argue
> that the Mac interface itself is detrimental to programmer productivity,
> but since it greatly enhances user productivity, we programmers will have to
> bite the bullet and use it.

I'm not gonna. There's no good reason to make any system more complex than UNIX
for the programmer in this day and age.

> My definition of a "software-IC" is as follows:
> 
> 1) Simplicity of function:  An IC should have one and only one function.  This
> function could be just to sort data or as complicated as converting 3d-objects
> to 2d-display.  This relates directly to the hardware IC which implements a
> single function (whether it is an AND or a Multiplexer).

Yes.

> 2) Layered.  Not only should the calling function interface be defined, but
> also the called functions interfaces should be defined.  The programmer can
> mix and match pieces to achieve the desired set of features.  An example of
> where this concept is not used but should be is the UNIX malloc function.  
> Malloc is implemented using several queues and OS calls.  There is no reason
> why the programmer couldn't replace these with other equivalent functions
> if needed, except that unless he has access to the source code, he has no
> idea what the called functions do.

Maybe. The programmer can certainly replace malloc, though. Besides, the
complete source to malloc is listed in the C bible (K&R).

> 3) Well documented.  If you look at a 7400 book, you can see the wealth of
> information provided about ICs.  Everything from pin-outs to time charts to
> temperature specs.  You look at any software library spec and you'll usually
> get half a page of incomplete information, usually wrong :-).  Complete
> documentation in a standard format is a necessity.

Software is easier than hardware. You need more info than IBM gives you, but
I don't see how (for example) malloc needs more info than is provided in the
UNIX Programmer's Manual.

> 4) Machine/OS/Language independent.  For a hardware engineer, all of the
> interfaces are standard (TTL), as are the ICs (7400).  This means that
> an engineer can mix and match with no worry about the environment the final
> product will end up in.  Different choices can be made for reasons of costs,
> speed, power consumption all without changing the base design.  For software
> ICs the same should be true.  If you need a sort, you should be able to go
> to your "42-series" catalog, look up sort routines and pick the one that
> meets your needs for speed/memory consumption/... independent of whether
> you're working on a Mac, a PC, a UNIX box or an IBM mainframe (well, maybe not
> there :-)).  

Nope. If you're working with CMOS, you don't expect to be able to go to your
TTL catalog. Same for ECL. These correspond to different languages, if you like.
Some are positive logic, some are negative logic. Some can handle a lot of
fan-in, some a lot of fan-out. Some are fast, some are slow.

> I disagree, semi-custom and custom are really a step backwards on the hardware
> side, as far as engineers are concerned.  Whereas before one could depend on
> each IC being a black box (chip?) whose implementation you didn't care about
> (mostly), now the implementations will be right in front of you with nothing
> to prevent you from aliasing this function to that global over there...  Just
> as bad as software is now!

But, like the Mac Toolbox, it does make things easier on the user (more power
and functionality).  (Notice how I came 180 degrees on that one, folks?)  But
you don't expect your home hobbyist to work with a silicon foundry...
-- 
-- Peter da Silva  `-_-'  ...!hoptoad!academ!uhnix1!sugar!peter
-- Disclaimer: These U aren't mere opinions... these are *values*.

lsr@apple.UUCP (11/05/87)

In article <951@sugar.UUCP> peter@sugar.UUCP (Peter da Silva) writes:
>
>I'll have to disagree with that. The Mac Toolbox violates a fundamental law
>of O/S design: FIRST, make the easy things easy. You should be able to
>operate like this (enter fantasy mode):
>
>	Window = OpenWindow(x0, y0, x1, y1, xmax, ymax, 0);
>
>This gives you a minimal window. It gives you a window of size (x1-x0) by

You are absolutely right that the Macintosh does not provide BUILT-IN higher
level calls such as this.  

One problem you run into is: what if you don't want a window with one "pane",
but a window with two panes, each with its own pair of scroll bars?  So
you need the lower-level calls as well as higher-level calls which handle
the common cases.

This leads to the idea of software libraries, which provide the higher level
calls.  There are a couple such libraries for the Macintosh.

Libraries, however, only move the problem one step further out.  If you want
to do something not provided in the library, you have to program it from
scratch.  Perhaps you can take advantage of some of the other subroutines
in the library, but often you can't.

That's where object-oriented programming starts helping.  When a program is
structured using objects and methods, it is possible to override one
particular method and inherit the rest unchanged.  With a library of
routines, you either call the routine or you don't; you can't modify how it
works, except by varying the parameters.

>I don't know of any system that's this sophisticated, though the Amiga
>SuperBitmap windows are fairly close. Maybe X Windows or NeWS.

That's the approach we took with MacApp.  MacApp handles all the common
details of a Mac program, while providing the flexibility to customize the
way it works.  

Once you define a method to draw your window, for example, MacApp can use
that method to implement window updates, scrolling, and printing.  To save a
document to disk, you only have to define the method which does the I/O;
MacApp takes care of creating the disk file, opening it, making sure there
is disk space, etc.  You don't have to worry about moving or resizing
windows, scrolling, printing, etc.; these details are handled automatically.
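
The idea, crudely rendered in C (MacApp itself is written in Object Pascal;
the names below are invented for illustration, not MacApp's interface): a
"subclass" copies the default behaviour record and overrides only the one
method it cares about.

	#include <stdio.h>

	struct view_class {			/* record of default behaviours */
		void (*draw)(void);
		void (*scroll)(int dy);
	};

	static void default_draw(void)     { printf("default drawing\n"); }
	static void default_scroll(int dy) { printf("scroll by %d\n", dy); }

	static const struct view_class default_view =
		{ default_draw, default_scroll };

	/* Override draw only; scrolling (and anything else in the record)
	 * is inherited unchanged. */
	static void my_draw(void) { printf("my document's drawing\n"); }

	int main(void)
	{
		struct view_class my_view = default_view;

		my_view.draw = my_draw;
		my_view.draw();		/* overridden */
		my_view.scroll(10);	/* inherited default */
		return 0;
	}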




-- 
Larry Rosenstein

Object Specialist
Apple Computer

AppleLink: Rosenstein1
UUCP:  {sun, voder, nsc, mtxinu, dual}!apple!lsr
CSNET: lsr@Apple.com

peter@sugar.UUCP (Peter da Silva) (11/09/87)

In article <6644@apple.UUCP>, lsr@apple.UUCP (Larry Rosenstein) writes:
> In article <951@sugar.UUCP> peter@sugar.UUCP (Peter da Silva) writes:
> >
> >I'll have to disagree with that. The Mac Toolbox violates a fundamental law
> >of O/S design: FIRST, make the easy things easy. You should be able to
> >operate like this (enter fantasy mode):
> >
> >	Window = OpenWindow(x0, y0, x1, y1, xmax, ymax, 0);
> >
> >This gives you a minimal window. It gives you a window of size (x1-x0) by
> 
> You are absolutely right that the Macintosh does not provide BUILT-IN higher
> level calls such as this.  

It should. Even better, for some purposes, is

	fp = fopen("CON:0/0/320/200/Console Window", "r+");

> One problem you run into is what if you don't want a window with 1 "pane";
> you want a window with 2 panes each with its own pair of scroll bars.

Open two windows. This is a flip answer, but for the most part this is a
perfectly acceptable way of doing things. It gives the user more flexibility,
too. I've seen painting programs on the Mac that do this that are wonderful.

> So
> you need the lower level calls as well as higher level calls which handle
> common cases.

Sure, but you shouldn't need the lower level calls until you're getting into
guru territory. The high level calls should do 99% of the job.

> This leads to the idea of software libraries, which provide the higher level
> calls.  There are a couple such libraries for the Macintosh...

How big is MacApp?

How many times is it going to be sitting on the disk or in memory taking up
room?

The typical "real" Mac application is at least 200K these days. 200K is a big
application on the Amiga. Part of this is because these libraries are part
of the system on the Amiga (and you can add your own shared libraries pretty
easily).

> That's where object-oriented programming starts helping.  When a program is
> structured using objects and methods, then it is possible to override 1
> particular method and inherit the rest unchanged.  With a library of
> routines, you either call the routine or not, you can't modify how it works,
> except by varying the parameters.

Not true. You can also replace it. Object oriented stuff is considerably
easier than libraries, but is also a lot less efficient. Why isn't the toolbox
written this way? Rhetorical question: assembler is faster. Shared libraries
are a decent compromise.

In any case, my point is still that something like the Mac Toolbox isn't
the answer. You just described something that might be, if it weren't carried
around innumerable times.
-- 
-- Peter da Silva  `-_-'  ...!hoptoad!academ!uhnix1!sugar!peter
-- Disclaimer: These U aren't mere opinions... these are *values*.

lsr@apple.UUCP (Larry Rosenstein) (11/11/87)

In article <1039@sugar.UUCP> peter@sugar.UUCP (Peter da Silva) writes:
>
>How big is MacApp?
>
>How many times is it going to be sitting on the disk or in memory taking up
>room?

MacApp is about 50K of object code, of which 15K is resident.  Right now it
has to be linked with every application, but that is an implementation
detail.  There is nothing that says a "Software IC" has to be linked with
every program.

>Not true. You can also replace it [a library routine]. Object oriented
>stuff is considerably easier than libraries, but is also a lot less
>efficient. Why isn't the toolbox written this way? Rhetorical question:
>assembler is faster. Shared libraries are a decent compromise.

Replacing a library routine generally means rewriting more code than you
might have to do in an object-oriented system.  Often you don't want to
replace the whole routine, but only a part of it.  

Even if your library is structured so that you don't have to rewrite much,
one advantage of object-oriented programming is that you can override a
method in a different way in each class.  You could have a single Window
class and implement subclasses that draw the window in different ways, all
in the same application.


Object-oriented systems are not necessarily "a lot less efficient" than
subroutine libraries.  Method calls can be made very close to procedure
calls in speed.  In Object Pascal, the linker completely eliminates the
run-time binding overhead of a method call, except in the cases where the
programmer actually uses it.

Whether you write your code in assembler or a high-level language has no
bearing on the issue of libraries vs. object-oriented systems.  Sure,
assembler is likely to be faster and smaller than the output of a compiler.
(One can even write a MacApp program in object-oriented assembler.)

>In any case, my point is still that something like the Mac Toolbox isn't
>the answer. You just described something that might be, if it weren't carried
>around innumerable times.

The Amiga shared libraries are not the answer either.  They are
fundamentally the same as the Toolbox, although they provide some
higher-level services.  Sharing code is always desirable, but this is
independent of the issue of libraries vs. object-oriented frameworks.

-- 
Larry Rosenstein

Object Specialist
Apple Computer

AppleLink: Rosenstein1
UUCP:  {sun, voder, nsc, mtxinu, dual}!apple!lsr
CSNET: lsr@Apple.com

drw@culdev1.UUCP (Dale Worley) (11/12/87)

I may be talking through my hat, but...

peter@sugar.UUCP (Peter da Silva) writes:
| In article <1691@culdev1.UUCP>, drw@culdev1.UUCP (Dale Worley) writes:
| > Another feature of "software ICs" comes from the fact that they are
| > part of an object-oriented system.  One can actually write, say, a
| > linked list manager that will work on objects of *any* type.  In most
| > languages, this is impossible to do in a library routine.
| 
| You can do it in 'C'. In fact the standard 'C' library for the Amiga comes
| with exactly this tool.
| 	Insert(ListHead, ListNode, PredNode)
|       [etc.]

Well, I'll bet not.  You can do a lot with macros, and if you're only
dealing with pointers to things, you can cast to (char *), but
consider a somewhat messier example where these two tricks don't work.
Say, a priority queue system, where the priorities aren't of a type
fixed in advance.

Of course you can *do* any of these in C by representing anything
messy as a (char *) pointing at it, but this is just implementing an
object-oriented system...
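
For what it's worth, the usual C escape hatch for the priority-queue case is
to hand the queue a comparison routine along with opaque pointers, the way
qsort(3) does -- which, as above, is really just hand-rolling the object
machinery.  A hedged sketch (nothing here is from the Amiga library under
discussion):

	#include <stdio.h>
	#include <string.h>

	struct pqnode {
		struct pqnode *next;
		void *priority;		/* opaque: only cmp knows its type */
	};

	/* Keep the list sorted, highest priority first. */
	static void pq_insert(struct pqnode **head, struct pqnode *n,
	                      int (*cmp)(const void *, const void *))
	{
		while (*head && cmp((*head)->priority, n->priority) >= 0)
			head = &(*head)->next;
		n->next = *head;
		*head = n;
	}

	/* This time the priorities happen to be strings; the queue code
	 * above never knew that. */
	static int cmp_str(const void *a, const void *b)
	{
		return strcmp((const char *)a, (const char *)b);
	}

	int main(void)
	{
		struct pqnode a = { NULL, "apple" };
		struct pqnode b = { NULL, "banana" };
		struct pqnode *q = NULL;

		pq_insert(&q, &a, cmp_str);
		pq_insert(&q, &b, cmp_str);
		printf("highest priority: %s\n", (char *)q->priority);
		return 0;
	}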

Dale
-- 
Dale Worley    Cullinet Software      ARPA: culdev1!drw@eddie.mit.edu
UUCP: ...!seismo!harvard!mit-eddie!culdev1!drw
If you get fed twice a day, how bad can life be?

arny@wayback.UUCP (Arny B. Engelson) (11/12/87)

In article <1716@culdev1.UUCP>, drw@culdev1.UUCP (Dale Worley) writes:
> The other important feature that object-oriented systems need
> (Objective-C has it; I don't know if C++ does) is that one can build
> two objects that present the same external interface, but for which
> the method-routines are different.  I.e., I can have two different
> sorts of "dictionary", implemented differently, and the code that uses
> them can't tell the difference between them.  (This is why you need
> run-time mapping from method-names to routines.)


This concept is also present in Ada.  The implementation of an object
(such as a queue, stack, or anything else) is separated from its external
interface.  Combine this with generics and overloading and you can create
a pretty flexible library of "objects".

drw@culdev1.UUCP (Dale Worley) (11/17/87)

arny@wayback.UUCP (Arny B. Engelson) writes:
| In article <1716@culdev1.UUCP>, drw@culdev1.UUCP (Dale Worley) writes:
| > The other important feature that object-oriented systems need
| > (Objective-C has it; I don't know if C++ does) is that one can build
| > two objects that present the same external interface, but for which
| > the method-routines are different.
| This concept is also present in Ada.  The implementation of an object
| (such as a queue, stack, or anything else) is separated from its external
| interface.  Combine this with generics and overloading and you can create
| a pretty flexible library of "objects".

True, but in Ada, the compiler has to be able to figure out which
implementation of a given abstract object is being dealt with.  In
object-oriented languages, this only has to be determinable at
runtime.

Dale
-- 
Dale Worley    Cullinet Software      ARPA: culdev1!drw@eddie.mit.edu
UUCP: ...!seismo!harvard!mit-eddie!culdev1!drw
If you get fed twice a day, how bad can life be?