[net.lang] goto/tail-recursion &c.

mac@uvacs.UUCP (06/02/84)

pseudorandom thoughts on the
    Expensive Procedure Call / LAMBDA as Ultimate GOTO
	myth.

I'm curious what mit-eddi!alan (mit-eddi.1979) learned as his first
programming language.  Is he perhaps also brain-damaged?  I started with
FORTRAN on an 1130, but feel that I've recovered since then.  Or is this an
exercise in Dijkstra-like arrogance and ridiculous rhetoric?

I agree with mit-eddi!gumby (mit.eddi.1986) that LISP isn't normally
object-oriented, though it's likely that those clever folks at MIT have
made it so.  I'm not sure about CLU.  Is overloading an adequate substitute
for late binding?  Maybe it's better.

On the subject of CLU, in reading the manual I was interested in the
treatment of exits/conditions.  CLU seems to make it possible to statically
trace the possible control flow for these, unlike e.g. Ada, where
exception handlers are dynamically bound.  Is this adequate?  It looks like
a better idea.  mit-eddi!gumby (mit.eddi.1986) seems to agree.

Tail-recursion elimination does remove a lot of trace information, making
programs somewhat harder to debug.  On the other hand, gotos (& whiles, &
selects, &c.) don't leave any trace information, and these are what tail
recursion usually replaces.  Maybe interpreters/compilers could be
instructed to retain call trace information in the same way that some
debuggers retain jump trace information.
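A minimal sketch of the trade-off, in Python standing in for the Lisps under discussion (the function names are invented): the tail call in the first version becomes the loop in the second, so no chain of stack frames, and hence no call trace, ever accumulates.

```python
def countdown_recursive(n, acc=0):
    # Each recursive call would normally push a stack frame,
    # which is exactly the trace information a debugger shows.
    if n == 0:
        return acc
    return countdown_recursive(n - 1, acc + n)  # tail position

def countdown_eliminated(n, acc=0):
    # What tail-recursion elimination produces: the call becomes
    # a jump (here, a loop), so no frames and no trace accumulate.
    while n != 0:
        n, acc = n - 1, acc + n
    return acc

print(countdown_recursive(5))   # 15
print(countdown_eliminated(5))  # 15
```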

					    no longer

 ARPA: mac%uvacs@csnet-relay CS: mac@virginia USE: ...uvacs!mac

nessus@mit-eddie.UUCP (Doug Alan) (06/06/84)

>	From: mac@uvacs.UUCP

>	I'm curious what mit-eddi!alan (mit-eddi.1979) learned as his
>	first programming language.  Is he perhaps also brain-damaged?
>	I started with FORTRAN on an 1130, but feel that I've recovered
>	since then.  Or is this an exercise in Dijkstra-like arrogance
>	and ridiculous rhetoric?

I used to be brain damaged.  But I got better!  I learned a number of
languages before getting to good ones.  And it took me a lot of time to
undo the brain damage that had accumulated.  And I lost much more time
than if I had learned a reasonable language first.  Many people who are
brain damaged by being taught bad languages will probably never recover.
Languages I learned in chronological order (stars after ones I've
programmed in significantly): BASIC*, APL*, Macro-11*, Lisp*
(dynamically scoped), C*, Lisp* (lexically scoped), Algol 60*,
Fortrash*, CLU*, Macro-10*, Prolog (Prolong), Modula 2, Pascal, Algol
68, Euclid, Smalltalk, Ada, FP.  I didn't become un-brain-damaged until
after I learned Lisp for the second time, and no amount of learning poor
languages since (like Fortrash, Pascal, and Ada) has been able to revert
me.

>	I agree with mit-eddi!gumby (mit.eddi.1986) that LISP isn't
>	normally object-oriented, though it's likely that those clever
>	folks at MIT have made it so.  I'm not sure about CLU.  Is
>	overloading an adequate substitute for late binding?  Maybe it's
>	better.

I haven't been using "object-oriented" in the Smalltalk sense of the
word.  I've been using it in the sense of the property that Lisp, CLU,
and Smalltalk all have in common: there aren't any second class objects.
This seems like a more natural definition of the term, and there are
others who use the word this way.  In terms of implementation, this
usually means heap-based allocation.  In the Smalltalk sense of the
word "object-oriented", MIT has invented the Flavors system for Lisp.  It
provides for generic operations, and thus the system can be said to be
"object-oriented".

Overloading is not an adequate substitute for late binding, but if you
want your code to run moderately efficiently on normal hardware it is
the best you can do.  There is something called Extended CLU that allows
the proper operation on an object whose type is unknown at compile time
to be selected at run time.  But this feature was never implemented.
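The distinction can be sketched in Python, which (unlike CLU) binds late by default; Square and Circle are invented names for illustration only.

```python
class Square:
    def area(self):
        return 4

class Circle:
    def area(self):
        return 3

def total_area(shapes):
    # Late binding: which 'area' runs is chosen by each object's
    # run-time type.  Overloading would instead resolve the call
    # from types declared in the source, one case per declared
    # type, fixed at compile time.
    return sum(s.area() for s in shapes)

print(total_area([Square(), Circle()]))  # 7
```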

>	On the subject of CLU, in reading the manual I was interested in
>	the treatment of exits/conditions.  CLU seems to make it
>	possible to statically trace the possible control flow for these,
>	unlike e.g. Ada, where exception handlers are dynamically bound.
>	Is this adequate?  It looks like a better idea.  mit-eddi!gumby
>	(mit.eddi.1986) seems to agree.

You are right about CLU's error handling system.  It is very nice,
unlike Ada's which rots!  Some people tell me that CLU's error system is
not sufficient: If procedure A calls procedure B and receives a signal,
it cannot resume B at the point of the signal with a patch for the
error.  I, though, have never missed this feature.  In my opinion, CLU
has the nicest error system I have ever seen.  Its biggest problem is
that there is no way (that I know of) to handle stack or garbage
collector overflows -- the program just dies.
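Python's exceptions terminate the signalling frame just as CLU's signals do, so the missing "resume" can be shown there; a and b below are hypothetical stand-ins for the procedures in the text.

```python
def b():
    raise ValueError("bad input")      # the "signal"

def a():
    try:
        b()                            # A calls B
    except ValueError as e:
        # Control arrives here only after B's frame is discarded,
        # so A cannot patch the error and resume B where it stopped.
        return "handled: " + str(e)

print(a())  # handled: bad input
```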

>	Maybe interpreter/compilers could be instructed to retain call
>	trace information in the same way that some debuggers retain
>	jump trace information.

They certainly should!
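One way an implementation might honor that request, sketched in Python under invented names: keep a small ring buffer of eliminated tail calls, just as some debuggers keep a jump trace.

```python
from collections import deque

TRACE = deque(maxlen=8)   # bounded "jump trace" of eliminated calls

def countdown(n, acc=0):
    # The tail calls have been turned into a loop, but each
    # iteration still logs what the call would have been.
    while True:
        TRACE.append(("countdown", n, acc))
        if n == 0:
            return acc
        n, acc = n - 1, acc + n

countdown(3)
print(list(TRACE))  # the retained call trace, newest last
```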
-- 
				-Doug Alan
				 mit-eddie!nessus
				 Nessus@MIT-MC

				"What does 'I' mean"?