[comp.object] Inheritance IS NOT Delagation!!!!! [sic]

tomlic@yoda.ACA.MCC.COM (Chris Tomlinson) (10/13/89)

From article <6219@jpl-devvax.JPL.NASA.GOV>, by david@jpl-devvax.JPL.NASA.GOV (David E. Smyth):
> In article <17653@brunix.UUCP> sdm@norton.UUCP (Scott Meyers) writes:
>>In article <125984@sun.Eng.Sun.COM> grover%brahmand@Sun.COM (Vinod Grover) writes:
>>>I once read that the inheritance mechanism is a special case of a more
> 
> Delegation is better than Inheritance, because 
> 1) Inheritance violates encapsulation by the superclasses, therefore

Yes and no.  There is a bit of confusion concerning this point.  Inheritance
as used in Smalltalk or C++ necessarily introduces some dependencies between
a subclass and a superclass.  This is, of course, just a consequence of using
inheritance as a code-reuse technique.  It can be viewed as a violation of
encapsulation of specifications, but that is not at all the same thing as a
violation of encapsulation of instances, which is the usual sense of the term.
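A minimal C++ sketch of the distinction (the classes are made up purely for
illustration): the subclass's source text depends on the superclass's
specification, but no outside object can reach into an instance's state.

    // Hypothetical classes, for illustration only.
    class Account {
    protected:
        long balance;                  // subclass code depends on this name
    public:
        Account() : balance(0) {}
        void deposit(long amount) { balance += amount; }
        long current() const { return balance; }
    };

    class SavingsAccount : public Account {
    public:
        // Relies on the superclass exposing 'balance' to subclasses: a
        // change to Account's representation can break this code.  That is
        // the "encapsulation of specifications" dependency.
        void addInterest() { balance += balance / 20; }
    };

    int main() {
        SavingsAccount a;
        a.deposit(100);
        a.addInterest();
        // a.balance = 0;   // error: clients still cannot touch the
        //                  // instance's state, so encapsulation of
        //                  // instances remains intact
        return 0;
    }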

> 2) Inheritance makes concurrency more difficult in the resultant system.

This simply isn't true.  If inheritance is resolved at compile time, as in
C++, then each object carries with it a virtual function table that identifies
the code objects that must be resident at the site where the object is
implemented.  The vtbl becomes a focus for distributing code throughout a
parallel or distributed system.  In a more dynamic situation the parents from
which code objects are inherited simply represent sites from which the code is
fetched and cached at the site where an object resides.  Understanding
inheritance in the actor model via reflection makes it particularly easy to
describe the system support for inheritance in terms of distributing code
throughout a system.
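For the compile-time case, a small C++ fragment (hypothetical classes) shows
the kind of dispatch the vtbl supports; the table's entries are exactly the
code objects that have to be present wherever the object lives.

    #include <stdio.h>

    class Shape {
    public:
        // Each Shape-derived object carries a pointer to its class's
        // vtbl; the vtbl entry for draw() names the code to run.
        virtual void draw() const { printf("generic shape\n"); }
        virtual ~Shape() {}
    };

    class Circle : public Shape {
    public:
        void draw() const { printf("circle\n"); }   // fills the vtbl slot
    };

    int main() {
        Shape* s = new Circle;
        s->draw();       // resolved through the vtbl: prints "circle"
        delete s;
        return 0;
    }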

> 
> Note: "Actors" which is the most widely accepted language concept for
> concurrent object programming, DOES NOT support concurrency.

Assuming that the last use of "concurrency" was supposed to be "inheritance",
it turns out that the actor model presented in Gul Agha's book can be used
to give a reflective definition of a system which in fact does support
inheritance.  The work of Watanabe and Yonezawa in OOPSLA '88 gives an
indication of this direction.

> 
> Using delegation, one builds a complex "object" through grouping
> several different small objects together.  Each small object maintains
> its own address space and thread.  The complex object is only
> conceptual.
> 
> Delegation reflects reality better than inheritance:  your car does not
> inherit from door, tire, and engine: it is a composite made up of
> doors, tires, and an engine.  The fact that the window on the left door
> is open should have no bearing on the state of other windows or the
> fuel injection or tire pressure.

Others have commented on the difference between 'part-of' and sharing
relations such as inheritance.  I would argue that both delegation and
inheritance are relevant to concurrent programming.  Inheritance is a means
of controlling the distribution of code throughout a system while delegation
is a means of partitioning the work to be performed in an algorithm.  It is
of course true that passing work on to the parts of an object concurrently
is itself a form of delegation in the common sense usage of the term.
The simple characterization of inheritance that I like to use is that
"Inheritance is the process of an active agent looking up the method for
 performing some task and then performing the task."
On the other hand:
"Delegation is the process by which one active agent gives responsibility
 for performing a task to one or more other agents (which in general are
 more competent to perform the task."
Stated this way it is clear that both are applicable in terms of organizing
concurrent computations on finite resources.
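A rough C++ sketch of the two characterizations (the classes here are
hypothetical): the inheriting object looks up the method and does the work
itself, while the delegating object hands the task to another agent it holds.

    #include <stdio.h>

    // Inheritance: the agent looks up the method for the task
    // (ordinary member-function lookup here) and then performs it.
    class Worker {
    public:
        virtual void perform() { printf("worker does the task\n"); }
        virtual ~Worker() {}
    };

    class SkilledWorker : public Worker {
    public:
        void perform() { printf("skilled worker does the task\n"); }
    };

    // Delegation: the agent gives responsibility for the task to
    // another agent, presumably more competent to perform it.
    class Manager {
        Worker* helper;
    public:
        Manager(Worker* h) : helper(h) {}
        void perform() { helper->perform(); }   // pass the work on
    };

    int main() {
        SkilledWorker w;
        w.perform();        // inheritance: look it up, do it yourself

        Manager m(&w);
        m.perform();        // delegation: hand the task to the helper
        return 0;
    }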

> 
> Concurrency is more cleanly supported with Delegation than with
> Inheritance.  Imagine a Car class built using multiple-inheritance.
> Building the car would require the constructors to be invoked serially,
> OR the programmer would have to be concerned about critical sections
> and the like.  As we all know, critical sections are notoriously hard
> to find and debug.  The reason time sharing systems are so easy to use
> is this: all of the processes get their own address space.  The
> encapsulation provided by objects is exactly the same, but the
> granularity is finer.  This leads to more effective use of
> multi-processors and multi-threaded processes (ala Mach).
> 
> In fact, one can think of the object universe which represents the
> typical object-oriented application as being a bunch of delegating
> objects.
> 
> The average depth of inheritance in most systems is about 3.  The
> number of object types in many systems is very large, in the dozens to
> thousands.  Therefore, people tend to favor delegation over inheritance
> by a large margin, even when they don't consider it to be an issue.
> 
> Can you imagine the nightmare which would result if all the objects in
> YOUR application were cobbled together via inheritance into a single
> object type?  Barfo dude!  Sure, a graduate student can "prove" that
> "Inheritance IS Delagation" but it is obviously the inferior form of
> the duality.
> 
> Lynn Stein would have met with a lot of disagreement if that paper had
> been presented at OOPSLA '89.

No!  It is a fact that, when viewed as pure protocols, inheritance and
delegation can be used to describe exactly the same sharing relations.  It is
also true that classes versus prototypes have nothing to do with this (despite
some of the comments in Lieberman's paper, which referred to the language
systems extant at the time).  It also turns out that if one analyzes these two
protocols from the point of view of communication cost, inheritance will in
general be more cost-effective, with or without caching.
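To make the first claim concrete, here is a rough sketch (again, hypothetical
classes) of the same sharing relation expressed both ways; a client cannot
tell from the result whether the behaviour was found by inheritance lookup or
obtained by forwarding the request to a parent object.

    #include <stdio.h>

    // Sharing by inheritance: Square gets describe() from Figure.
    class Figure {
    public:
        virtual void describe() { printf("I am a figure\n"); }
        virtual ~Figure() {}
    };
    class Square : public Figure {};

    // Sharing by delegation: an object forwards any request it does
    // not handle itself to its parent object.
    class Proto {
        Proto* parent;
    public:
        Proto(Proto* p = 0) : parent(p) {}
        virtual void describe() {
            if (parent) parent->describe();     // forward up the chain
            else        printf("I am a figure\n");
        }
        virtual ~Proto() {}
    };

    int main() {
        Square s;
        s.describe();                 // found by inheritance lookup

        Proto figureProto;            // plays the parent's role
        Proto squareProto(&figureProto);
        squareProto.describe();       // same behaviour, by forwarding
        return 0;
    }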