[comp.edu] Recursion Summary

rrwood@contact.uucp (roy wood) (10/24/90)

Hi again.  I posted a question about recursion and its uses a while back.
After reading through all the replies from the numerous people who were
generous enough to offer their thoughts, the consensus seems to be that the
most elegant demonstration of the power of recursion is in traversing
binary (or other types of) trees.  Too bad the person I was trying to convince
doesn't appreciate the usefulness of binary trees!  (what's the equivalent
smiley-face for someone pulling out their hair in frustration?)
 
As many people warned me, recursion is just a useful tool.  Almost anything
that can be done recursively can be done iteratively.  As well, business
folks in general don't often come up with much need for recursion, so they
have trouble believing that it has "real world" applications.  So it goes.
 
And as an interesting side note, I'd like to point out that my original
message made no mention of the sex of the person I was trying to convince,
yet *everyone* assumed that this person was male.  In fact, she is very
much female.  I suppose this is an interesting demonstration of a strong
bias or stereotyping among us.  Does anyone want to do a thesis on the
subject?
 
Thanks to all who replied....
 
-Roy Wood.

iho@cac.washington.edu (Il Oh) (10/25/90)

In article <1990Oct23.211651.10227@contact.uucp> rrwood@contact.uucp (roy wood) writes:
>
> 
>As many people warned me, recursion is just a useful tool.  Almost anything
>that can be done recursively can be done iteratively.  As well, business
>folks in general don't often come up with much need for recursion, so they
>have trouble believing that it has "real world" applications.  So it goes.
> 
Correction.  _Anything_ (not almost) that can be done recursively can be done
iteratively.  At the machine level, there is no such thing as recursion, so
when you write recursive code, the compiler produces code that can be executed
by a non-recursing CPU.  I'm pretty sure I'm right about this.

Still, recursive solutions are more elegant, and I understand them better.
--
  "Gosh!  You've really got          |    Il Hwan Oh
     some nice toys in here."        |    University of Washington, Tacoma
         -- Roy Batty, Bladerunner   |    iho@cac.washington.edu
                                     |

mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) (10/25/90)

In article <9868@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
>Correction.  _Anything_ (not almost) that can be done recursively can be done
>iteratively.  At the machine level, there is no such thing as recursion, so
>when you write recursive code, the compiler produces code that can be executed
>by a non-recursing CPU.  I'm pretty sure I'm right about this.

Sorry, this is not at all correct!

Recursion is nothing more than a reentrant subroutine call.  Most modern
CPU's have support for a hardware stack, which is the most common way
of implementing recursion.  That is, subroutine calls are done by a
"push current PC on stack and jump" instruction, subroutine returns by
a "pop PC from stack" instruction, and local variable creation by an
"allocate space on stack" instruction.  Many CPU's also have a "push
value on stack" and "pop value from stack" instruction, but these are
less commonly used today than the space-allocation instructions.

Machines without hardware stack instructions can emulate them.

What I think you are thinking of are compilers which are smart enough
to recognize tail-recursion and to generate iterative instead of
recursive code.

 _____   | ____ ___|___   /__ Mark ("Gaijin") Crispin  "Gaijin! Gaijin!"
 _|_|_  -|- ||   __|__   /  / R90/6 pilot, DoD #0105   "Gaijin ha doko?"
|_|_|_|  |\-++-  |===|  /  /  Atheist & Proud          "Niichan ha gaijin."
 --|--  /| ||||  |___|    /\  (206) 842-2385/543-5762  "Chigau.  Gaijin ja nai.
  /|\    | |/\| _______  /  \ MRC@CAC.Washington.EDU    Omae ha gaijin darou"
 / | \   | |__|  /   \  /    \"Iie, boku ha nihonjin." "Souka. Yappari gaijin!"
Hee, dakedo UNIX nanka wo tsukatte, umaku ikanaku temo shiranai yo.

eafournier@lion.uwaterloo.ca (Wade Richards) (10/25/90)

In article <1990Oct23.211651.10227@contact.uucp> rrwood@contact.uucp (roy wood) writes:
>And as an interesting side note, I'd like to point out that my original
>message made no mention of the sex of the person I was trying to convince,
>yet *everyone* assumed that this person was male.  In fact, she is very
>much female.  I suppose this is an interesting demonstration of a strong
>bias or stereotyping among us.  Does anyone want to do a thesis on the
>subject?

Does this have to be a bias or a stereotype?  Why can't we just notice
that there are more males in the CS field than there are females, and
make the statistically most likely correct guess? 

It's far too awkward to use (s)he, and for some reason both genders seem
to object to the gender-neutral `it'. 

Sexual stereotyping seems to be such a popular catch-phrase these days
that everyone wants to hang that label on anything.  Personally, I think
that stereotyping is slightly less common than many people would claim. 

By the way, here are some more "sexual stereotypes" that fall into the
same category: when I refer to a nurse whose gender I don't know, I
assume female.  If I don't know the gender of a sumo-wrestler, I assume
male. 

And no, I don't have any statistical data to back up my assumption that
there are many more males working in CS than females, just my own
perception and informal head-counts. 


	--- Wade

wiml@milton.u.washington.edu (William Lewis) (10/25/90)

In article <9882@milton.u.washington.edu> mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) writes:
|In article <9868@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
|>Correction.  _Anything_ (not almost) that can be done recursively can be done
|>iteratively.  At the machine level, there is no such thing as recursion, so
|>when you write recursive code, the compiler produces code that can be executed
|>by a non-recursing CPU.  I'm pretty sure I'm right about this.
|
|Sorry, this is not at all correct!
|
|Recursion is nothing more than a reentrant subroutine call.  Most modern
|CPU's have support for a hardware stack, which is the most common way
|of implementing recursion.  That is, subroutine calls are done by a
|"push current PC on stack and jump" instruction, subroutine returns by
|a "pop PC from stack" instruction, ...

   But the CPU itself is iterative, not recursive. The CALL instruction
finishes when the subroutine begins, not when it ends.

|Machines without hardware stack instructions can emulate them.

   Precisely. Or the compiler could generate the emulating instructions
itself. As you said, the call is just a "push PC and jump" shorthand.
In either case, the recursive algorithm is being rewritten
non-recursively.


-- 
 wiml@milton.acs.washington.edu       Seattle, Washington   
     (William Lewis)   |  47 41' 15" N   122 42' 58" W  
"These 2 cents will cost the net thousands upon thousands of 
dollars to send everywhere. Are you sure you want to do this?"

iho@cac.washington.edu (Il Oh) (10/25/90)

In article <9882@milton.u.washington.edu> mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) writes:
>
[stuff about recursion]
>
>Machines without hardware stack instructions can emulate them.
>
Doesn't this essentially prove my point?  That any recursive solution can
also be done in a nonrecursive algorithm without exception?  If I remember
my computing theory class, Turing machines are not capable of recursion.

As for your other information, I was unaware of the construction of modern
microprocessors, and I stand corrected.
--
  "Gosh!  You've really got          |    Il Hwan Oh
     some nice toys in here."        |    University of Washington, Tacoma
         -- Roy Batty, Bladerunner   |    iho@cac.washington.edu
                                     |

mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) (10/25/90)

In article <9892@milton.u.washington.edu> wiml@milton.u.washington.edu (William Lewis) writes:
>   But the CPU itself is iterative, not recursive. The CALL instruction
>finishes when the subroutine begins, not when it ends.

That is a weird definition of iteration vs. recursion, although it may
be what Computer Science classes are teaching these days.  What if the
"subroutine call" is a trap instruction, or an instruction that is
implemented in hardware on some CPU models, in microcode on others,
and in macrocode on others?

By your argument there is no such thing as recursion.

The notion that high-level languages are any different from machine
language can be disproved by the following C program:

	#include <stdio.h>
	#include <unistd.h>

	int foo (int n);

	main ()
	{
	  printf ("%d\n",foo (10));
	  return 0;
	}

	int foo (int n)
	{
	  _exit (0);
	  return n;
	}

The call to each of main(), printf(), and foo() is finished as soon as
the stack frame is established.  A separate return, whether explicit
or implicit, is needed to unwind the stack and resume processing at
the caller.

 _____   | ____ ___|___   /__ Mark ("Gaijin") Crispin  "Gaijin! Gaijin!"
 _|_|_  -|- ||   __|__   /  / R90/6 pilot, DoD #0105   "Gaijin ha doko?"
|_|_|_|  |\-++-  |===|  /  /  Atheist & Proud          "Niichan ha gaijin."
 --|--  /| ||||  |___|    /\  (206) 842-2385/543-5762  "Chigau.  Gaijin ja nai.
  /|\    | |/\| _______  /  \ MRC@CAC.Washington.EDU    Omae ha gaijin darou"
 / | \   | |__|  /   \  /    \"Iie, boku ha nihonjin." "Souka. Yappari gaijin!"
Hee, dakedo UNIX nanka wo tsukatte, umaku ikanaku temo shiranai yo.

mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) (10/25/90)

In article <9893@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
>Doesn't this essentially prove my point?  That any recursive solution can
>also be done in a nonrecursive algorithm without exception?  If I remember
>my computing theory class, Turing machines are not capable of recursion.

Too bad that so much CS is "theory" and not much "practice".

It is misleading to claim that "any recursive solution can also be
done in a nonrecursive algorithm without exception."  Tail recursion
in a single routine translates neatly into a nonrecursive algorithm.

However, in some cases the only nonrecursive way to implement an
algorithm involves, in effect, emulating recursion.  These are
algorithms which have significant intermediate state which is needed
after the recursive call completes.  This requires a stack, or at
least a vector, and control logic for two iterations.  Let's not
forget algorithms in which the recursion takes place several routines
deep (foo->bar->zap->rag->foo).

Matters are further complicated in a multi-tasking environment, where
a particular routine must be reentrant and hence cannot use any global
variables.  Here, a stack is mandatory.  You don't even need multiple
processes; co-routines will do just fine.

The bottom line is: you can do without recursion, but in some cases
the iterative equivalent cannot be practically implemented.

 _____   | ____ ___|___   /__ Mark ("Gaijin") Crispin  "Gaijin! Gaijin!"
 _|_|_  -|- ||   __|__   /  / R90/6 pilot, DoD #0105   "Gaijin ha doko?"
|_|_|_|  |\-++-  |===|  /  /  Atheist & Proud          "Niichan ha gaijin."
 --|--  /| ||||  |___|    /\  (206) 842-2385/543-5762  "Chigau.  Gaijin ja nai.
  /|\    | |/\| _______  /  \ MRC@CAC.Washington.EDU    Omae ha gaijin darou"
 / | \   | |__|  /   \  /    \"Iie, boku ha nihonjin." "Souka. Yappari gaijin!"
Hee, dakedo UNIX nanka wo tsukatte, umaku ikanaku temo shiranai yo.

ccplumb@spurge.uwaterloo.ca (Colin Plumb) (10/25/90)

In article <9893@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
> Doesn't this essentially prove my point?  That any recursive solution can
> also be done in a nonrecursive algorithm without exception?  If I remember
> my computing theory class, Turing machines are not capable of recursion.

Turing machines are fully capable of recursion.  Nobody has found a buildable
machine which is fundamentally more capable than a Turing machine, i.e. to the
best of my knowledge, my own brain is Turing-equivalent (although it would take
a *damn* fast Turing machine and very clever software!).

The difference between recursion and iteration is the LIFO stack nature
of recursion.  Both are a loop, repeatedly executing the same
instructions in various contexts (i.e. variable bindings).  Iteration
uses only one context, overwriting the old with the new.  Recursion
remembers old contexts, arbitrarily many (although usually not, in
practice, infinitely), to get back to them later.

For example, Quicksort:

void quicksort(int array[], int low, int high)
{
	int mid;

	if (low >= high)
		return;
	mid = partition(array, low, high);
	quicksort(array, low, mid-1);
	quicksort(array, mid+1, high);
}

During the first call to quicksort, you must remember the values of
mid and high for the second call.  Thus, you must ensure that the first
call does not store its mid and high in the same place as the parent.
This can be done by storing the parent's mid and high somewhere and
keeping the child's in the same absolute location (shallow binding), or
by keeping the child's variables somewhere else (deep binding, more or
less).

It is often useful to write problems in recursive form even when
iteration could be used.  The second call to quicksort above is an
example.  It could be replaced by "low = mid+1; goto start;", but
that's messy.  Some languages provide tail recursion optimisation,
where they notice that no further references to the caller's context
will be made, so there is no need to save it.

It is awkward to express recursion in terms of iteration, and error-prone
(the details of managing a stack are usually best left hidden), while
it is quite easy to implement iteration in terms of recursion, and if
tail recursion optimisation is available, it is just as efficient.

Thus there is a tendency in recent languages to not implement iteration
at all.  In Scheme, for (i = 1; i < 10; i++) printf("Hello, world!\n");
would be written (directly, not using the convenient shorthands) as:

(let foo ((i 1))
     (if (< i 10)
         (begin
           (display "Hello, world!\n")
           (foo (+ i 1)))))

(I'm doing this from memory... there may be syntax errors.)

In other words, let foo be a function with initial value i = 1.
If i < 10, print "Hello, world!\n" and call foo with i+1.
Otherwise, return.

Recursion is so useful that most computers use reentrant procedure-calling
conventions, so it works in the straightforward way, but some old computers
did things like patch the final jump in a routine, so it took more care.
-- 
	-Colin

rbutterworth@watmath.waterloo.edu (Ray Butterworth) (10/25/90)

In article <1990Oct23.211651.10227@contact.uucp> rrwood@contact.uucp (roy wood) writes:
>As well, business
>folks in general don't often come up with much need for recursion, so they
>have trouble believing that it has "real world" applications.  So it goes.

In business, recursion is called "delegation".

Your boss says to you "I need blah1 blah2 blah3 for Friday!".
You think "I can do blah1, but blah2 and blah3 I'll have to give
to my subordinates", so you go back to your desk and phone Fred
and say "I need blah2.1 blah2.2 blah2.3 for Thursday!", and phone
Joe and say "I need blah3.1 blah3.2 blah3.3 for Thursday!", and
then proceed with blah1 assuming they will get blah2 and blah3
done for you.  Joe and Fred are now in the same situation you were
in and will either do the work themselves or delegate it to their
subordinates.  etc.

If you think of the tree structure of a standard business hierarchy,
some requirement will start at a high node and it will divide and
spread out down the tree, with the decisions at each node being a
recursive process.  The actual work gets done at the lowest nodes
in the tree where the terminating conditions occur (i.e. if you
aren't a boss you've got to do it yourself).

Similarly, things can go up the tree too.  You tell your boss "it
would really be good for the company if such and such were done".
He'll either say "shut up stupid", "sounds good, let's do it", or
"I'll have to think about it".  The last one probably means he'll
go to his boss and say "it would really be good for the company if
such and such were done".  etc.

rbutterworth@watmath.waterloo.edu (Ray Butterworth) (10/25/90)

It's interesting to note that most of the responses are coming
from either Waterloo or Washington.

  ccplumb@spurge.uwaterloo.ca (Colin Plumb)
  eafournier@lion.uwaterloo.ca (Wade Richards)
  plragde@maytag.waterloo.edu (Prabhakar Ragde)
  rbutterworth@watmath.waterloo.edu (Ray Butterworth)

  mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin)
  wiml@milton.u.washington.edu (William Lewis)
  iho@cac.washington.edu (Il Oh)

Just in case someone is considering doing their thesis studying
why these two universities, thousands of miles apart and in
different countries, would have so many people interested in
this topic, I will refrain from mentioning the real reason.

giguere@csg.uwaterloo.ca (Eric Giguere) (10/25/90)

In article <1990Oct25.152058.14393@watmath.waterloo.edu> rbutterworth@watmath.waterloo.edu (Ray Butterworth) writes:
>It's interesting to note that most of the responses are coming
>from either Waterloo or Washington.

I've always wondered about this:  do both UWs (Waterloo and Washington) get
all copies of what is posted in their respective uw.generals or is this
random behavior?

--
Eric Giguere                                       giguere@csg.UWaterloo.CA

bpdickson@rose.uwaterloo.ca (Brian Dickson) (10/25/90)

In article <9882@milton.u.washington.edu> mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) writes:
>In article <9868@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
>>Correction.  _Anything_ (not almost) that can be done recursively can be done
>>iteratively.
>
>Sorry, this is not at all correct!
>
>Recursion is nothing more than a reentrant subroutine call.

Error #1: reentrant subroutine call == recursion

Example: an inexperienced FORTRAN programmer writes a recursive program and
	wonders why weird things happen. (I observed this one.) :-(

>What I think you are thinking of are compilers which are smart enough
>to recognize tail-recursion and to generate iterative instead of
>recursive code.

No, the discussion is about the equivalence of two programs performing the
same task, one via recursion, the other via iteration.  The equivalence can
be demonstrated by program-controlled stack-equivalent code, whereby the
user's program performs all of the saving and restoration of variables'
states, as well as tracking the depth of "recursion" being emulated.
This can be shown to be equivalent on all machines with space allocation
of some sort, independent of the physical stack representation and
OS/compiler-controlled state preservation.  This equivalence should be obvious.
The question of whether the programs do the same thing reduces to whether
the programs will, at a given level of "recursion", have the same state of
variables and execute the same set of instructions (excluding the "recursion"
step, obviously), and the answer is, by construction of our emulation, yes.

Brian Dickson

nmouawad@water.waterloo.edu (Naji Mouawad) (10/25/90)

How about a non-recursive algorithm for the Ackermann function?

heh heh ...

--Naji.
-- 
         ---------------+-------------------------------------------
        | Naji Mouawad  |       nmouawad@water.waterloo.edu         |
        |  University   |-------------------------------------------|
        | Of Waterloo   | "Thanks God, we cannot prove He Exists."  |

nmouawad@water.waterloo.edu (Naji Mouawad) (10/25/90)

In article <1990Oct25.145621.13691@watdragon.waterloo.edu>
ccplumb@spurge.uwaterloo.ca (Colin Plumb) writes:

[Part deleted]

>Turing machines are fully capable of recursion.  Nobody has found a
>buildable machine which is fundamentally more capable than a Turing
>machine, i.e. to the best of my knowledge, my own brain is
                                            ~~~~~~~~~~~~~~~~   
>Turing-equivalent (although it would take a *damn* fast Turing
 ~~~~~~~~~~~~~~~~~
>machine and very clever software!).

[Part deleted]

--Colin

As a side note, the underlined sentence implies that Cartesian
dualism is true and that the Kantian boundary between physics
and metaphysics can be removed.

That the current perception of the brain can be modeled with Turing
machines is true, but that does not mean that the various boundaries
imposed by the digitisation of the brain (in order to represent it
as a Turing machine with a countable _not necessarily finite_ number
of states) are 'accurate'.  It may be that the continuous processes of
brain activity can be better represented by some mechanism other than
the Turing machine (an undecidable question?).  More importantly, mathematics
has taught us that there is just no way to represent all the reals by
rationals.  Another way of saying the same thing is that reality (an analog
process) cannot be entirely 'grabbed' by reason (a digital process).

Thus, if Cartesian dualism is false (meaning that the thinking
process depends intimately on the supporting hardware _ the brain _)
then Turing machines are a pale comparison (in power and flexibility ;) )
to the brain.

--Naji.

P.S. Could someone come up with a Pumping Lemma for the brain ?

-- 
         ---------------+-------------------------------------------
        | Naji Mouawad  |       nmouawad@water.waterloo.edu         |
        |  University   |-------------------------------------------|
        | Of Waterloo   | "Thanks God, we cannot prove He Exists."  |

mikew@cs.washington.edu (Mike Williamson) (10/26/90)

In article <9898@milton.u.washington.edu> mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) writes:
>In article <9893@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
>>Doesn't this essentially prove my point?  That any recursive solution can
>>also be done in a nonrecursive algorithm without exception?  If I remember
>>my computing theory class, Turing machines are not capable of recursion.
>
>Too bad that so much CS is "theory" and not much "practice".
>
>It is misleading to claim that "any recursive solution can also be
>done in a nonrecursive algorithm without exception."  Tail recursion
>in a single routine translates neatly into a nonrecursive algorithm.
>
> [...]
>

The more Mr. Crispin posts, the more clear it becomes that he's a
little bit confused.  It is absolutely, positively, NOT "misleading to
claim that 'any recursive solution can also be done in a nonrecursive
algorithm without exception'".  This is, in fact, exactly the case for
any "recursive" program that is being executed on a digital computer.

Recursive function theory is a mathematical formalism for describing a
certain class of functions, that was introduced in the early part
of this century.  (In fact, the use of the term "recursive" predates
the existence of computers.)  The class of general recursive functions
is very large, and does indeed include functions that cannot be
computed by a Turing machine.  However, for these functions there is
no known effective decision procedure, so I am sure that these are not
what Mr. Crispin had in mind.  It was subsequently proven that an
important subclass of recursive functions, the "partial recursive
functions", are exactly equivalent to those functions that can be
computed by a Turing machine.  Within this class of functions lies
every program that is written for, and executed on, modern digital
computers. 

As several others have pointed out (including Mr. Crispin himself), a
modern digital computer does not actually compute recursively; it
merely emulates recursion through clever use of its internal store.
Through the magic of high-level language compilers, we may use a
recursive syntax to denote our algorithms, but at the machine language
level, it's all done sequentially.

>The bottom line is: you can do without recursion, but in some cases
>the iterative equivalent can not be practically implemented.

The bottom line is: you do do without recursion, because in all cases
the iterative equivalent is what is actually performed by the hardware.

-Mike

iho@cac.washington.edu (Il Oh) (10/26/90)

In article <9898@milton.u.washington.edu> mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) writes:
>
>It is misleading to claim that "any recursive solution can also be
>done in a nonrecursive algorithm without exception."  Tail recursion
>in a single routine translates neatly into a nonrecursive algorithm.
>
If I understand the construction of modern CISC processors correctly, 
most of it is implemented at the microcode level.  The hardware level
still consists of a collection of gates implementing combinatorial and 
sequential logic.  At the level of this paradigm, I'd think there is
absolutely no recursion going on.

Microcode is still a programming language, and can't be considered the
most _basic_ level of the computer.  I'm still not completely convinced
that you can do something recursively that you can't iteratively since
at the most fundamental level of logic (combinatorial and sequential),
there is no recursion possible.

If you accept my claim that there's no recursion at the fundamental
logic level, you must agree that, theoretically, a recursive solution
can never be more efficient than an iterative one.

Even though I believe my last statement, I also believe that recursive
solutions are more elegant and prefer to use them.  They just make more
logical sense to me.
--
  "Gosh!  You've really got          |    Il Hwan Oh
     some nice toys in here."        |    University of Washington, Tacoma
         -- Roy Batty, Bladerunner   |    iho@cac.washington.edu
                                     |

mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) (10/26/90)

The problem is that there are two definitions of "recursion".

Let's leave the realm of mathematical formalism and talk about the real
world.  This may be difficult for some CS students, judging from the
results of a typical systems qual.  [I can't believe so many CS
students fail systems quals -- the ones I've seen have been *trivial*
modulo the obligatory
	Define: A is `strongly hyperhyperimmune' if A is infinite and there is
	no recursive f such that (u)[W(f(u)) ∩ A ≠ empty] & (u)(v)
	[u ≠ v => W(f(u)) ∩ W(f(v)) = empty].
	A. show that if A is strongly hyperhyperimmune then A has no
	   infinite retraceable subset.
	B. show that if A is strongly cohesive then A is strongly 
	   hyperhyperimmune.
type of problem.]

In the pure mathematical definition of "recursion", nothing on
today's typical stored-program digital computers is recursive.  This
*includes* the C and Lisp programming languages.  Yet both C and Lisp
are known for their "recursive" properties.

It is meaningless to talk about the pure mathematical definition of
recursion in the same breath with C or Lisp algorithms or the
recursive programming facilities of those languages.  Remember, the
original question was something like "why do I need to use recursion
when programming?"

So, what other definition of "recursion" is there?  The definition
used in C and Lisp.  The one that programmers actually use.  The one
that's served me well in 17 years of operating system/multi-tasking
applications programming.  It looks to the programmer a lot like
mathematical recursion, but (as has been pointed out) it is in fact
iterative.

In a multi-tasking application -- an operating system, or one which
uses daemons (common in object-oriented languages such as Lisp or the
various object-oriented extensions to C) -- a recursive algorithm is
often the only algorithm which can be practically implemented.  Not
because an iterative algorithm is impossible, but because it is too
complex.  It gets worse when the algorithm must interact with other
algorithms and co-routining enters the picture.

In fact, I'll go further, and say that iteration is nothing more than
an optimized form of tail recursion.  Recursion in the programming
sense, not the mathematical sense.

Yes, for simple algorithms, recursion is a handy shorthand and you can
use iteration.  But I could give you some recursive algorithms for
which I doubt you could write a working iterative version (much less
one that can be readily modified).  So what if the computer executes
the instructions serially?  Some machine in the future may not.  For
the most part, the details are hidden from the programmer as it should
be.  Computers are to make the job easier, not more complicated.

 _____   | ____ ___|___   /__ Mark ("Gaijin") Crispin  "Gaijin! Gaijin!"
 _|_|_  -|- ||   __|__   /  / R90/6 pilot, DoD #0105   "Gaijin ha doko?"
|_|_|_|  |\-++-  |===|  /  /  Atheist & Proud          "Niichan ha gaijin."
 --|--  /| ||||  |___|    /\  (206) 842-2385/543-5762  "Chigau.  Gaijin ja nai.
  /|\    | |/\| _______  /  \ MRC@CAC.Washington.EDU    Omae ha gaijin darou"
 / | \   | |__|  /   \  /    \"Iie, boku ha nihonjin." "Souka. Yappari gaijin!"
Hee, dakedo UNIX nanka wo tsukatte, umaku ikanaku temo shiranai yo.

manis@cs.ubc.ca (Vincent Manis) (10/26/90)

I was talking with a colleague of mine who knows a lot about numerical
analysis. He was saying that the standard method of solving non-sparse
linear systems is Gaussian elimination with pivoting (much to my
surprise, thinking back to courses I took 20 years ago which claimed
that one ought to use Gram-Schmidt, or the like). This lends itself to 
an extremely clean recursive implementation, he says, which actually
runs very efficiently. 

Your colleague may not be interested in sorting, searching, compiling,
or solving linear systems. She may even consider B-trees uninteresting
(though of course VSAM files, as well as the Macintosh HFS directory
structure, are represented as B-trees). However, I sincerely hope she
wouldn't deny that these areas are important to some people. 

With the recursion elimination technology pioneered by Guy Steele's
RABBIT compiler for Scheme, now some 12 years old, recursive code can
run as efficiently as iterative code. There certainly are stupid
recursions (the naive implementation of Fibonacci numbers is the
best-known example), but stupid code is stupid code. I know I couldn't
really survive, in the work I do, without recursion.

Incidentally, since your colleague is a Cobol programmer, I hope she's
getting herself geared up for Object Cobol, which Codasyl has struck a
committee to design. Programming languages *are* becoming more alike. 

--
\    Vincent Manis <manis@cs.ubc.ca>      "There is no law that vulgarity and
 \   Department of Computer Science      literary excellence cannot coexist."
 /\  University of British Columbia                        -- A. Trevor Hodge
/  \ Vancouver, BC, Canada V6T 1W5 (604) 228-2394

mdhutton@theory.toronto.edu (Mike Hutton) (10/27/90)

In article <9917@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
>If you accept my claim that there's no recursion at the fundamental
>logic level, you must agree that, theoretically, a recursive solution
>can never be more efficient than an iterative one.

Why does efficiency always seem to imply run-time cost?  The 
dominant cost in most (educated guess) applications is the programmer
time, not computer time---not everybody is doing combinatorial problems
or number crunching.

Why not consider space efficiency?
1. How long is the program?  The recently posted 5- or 6-line quicksort
is a perfect example of how concise and explicit a recursive program can be.
For big applications, a smaller solution can be much faster time-wise too.
Somebody tell this to the X Windows people; X is probably thrashing
as we speak.
2. What is the run-time space?  Recursion is not space-free in general
(consider, for example, a recursive factorial function), but many uses of
recursion involve searching a data structure or the input, and would add
only a constant factor to the space (since you can only search what you
have available to search, and this bounds the recursion stack).

Or programmer efficiency directly:
3. Recursive programs are easier to write, both conceptually and, per
(1) above, in sheer typing time, and easier to debug (debugging effort
being a function of size and complexity).
4. Some people attribute 90% of application cost to maintenance, and
maintenance is a function of readability and size/complexity.  Recursive
programs should be easier to maintain and understand than the 
corresponding iterative versions.


Iteration and recursion are *both* high-level concepts; they belong to
the programmer, not the machine.  Just as translating iteration into
gotos is the job of the compiler, translating recursion into stack operations
and gotos should also be.  In the rare cases where absolute time efficiency
is required, code is written in a high-level language and then
hand-optimized; recursion should be treated the same way, when necessary.

>Even though I believe my last statement, I also believe that recursive
>solutions are more elegant and prefer to use them.  They just make more
>logical sense to me.

And hence make you a more efficient programmer...

You seem to imply that recursive programs are a tradeoff of elegance
for (time) efficiency.  I would suggest instead they are a tradeoff
of programmer time and effort, maintenance and correctness for a 
small constant factor of time which is probably not needed anyway.

Mike

iho@cac.washington.edu (Il Oh) (10/27/90)

In article <90Oct26.142612edt.6786@neat.cs.toronto.edu> mdhutton@theory.toronto.edu (Mike Hutton) writes:
>
>In article <9917@milton.u.washington.edu> iho@akbar.UUCP (Il Oh) writes:
>>If you accept my claim that there's no recursion at the fundamental
>>logic level, you must agree that, theoretically, a recursive solution
>>can never be more efficient than an iterative one.
>
>Why does efficiency always seem to imply run-time cost?  The 
>dominant cost in most (educated guess) applications is the programmer
>time, not computer time---not everybody is doing combinatorial problems
>or number crunching.

Touche!  I seem to have adopted the narrow-minded assumption that CPU
time is more valuable than my own time.  I now see the error of my ways
and will charge more for my own time accordingly.  :)

>You seem to imply that recursive programs are a tradeoff of elegance
>for (time) efficiency.  I would suggest instead they are a tradeoff
>of programmer time and effort, maintenance and correctness for a 
>small constant factor of time which is probably not needed anyway.

I like your way of looking at things better.  However, the central
point of argument is still the same.  You can (though you may not
want to) do anything iteratively that you can do recursively.
--
  "Gosh!  You've really got          |    Il Hwan Oh
     some nice toys in here."        |    University of Washington, Tacoma
         -- Roy Batty, Bladerunner   |    iho@cac.washington.edu
                                     |

manis@cs.ubc.ca (Vincent Manis) (10/27/90)

Good God! We seem to have a heat vs light argument here. 

Needless to say, it is quite possible: (1) to map all recursive
computations onto iterative ones, and (2) to map all iterative ones onto
recursive ones.  The proof is quite straightforward, with (1) being done
by rewriting procedure call as a push and a goto, and return as a pop,
and (2) being done by rewriting while as a recursion, and then using
the Bohm-Jacopini result that assignment-if-while is universal.

Another proof appeals to quite practical arguments: conventional
computers operate by sequential state dispatch (if you see this
instruction, do this operation, and enter that state), yet recursion can
be implemented on such machines. In the other direction, languages as
various as TRAC (R), Scheme, and Prolog define WHILE as a user
operation.

I would recommend Marvin Minsky's Computation: Finite and Infinite
Machines (Prentice-Hall, 1967, and still in print!) for good discussions
of program equivalence, and Guy Steele's series of Scheme papers
(Lambda: The Ultimate Imperative, Lambda: The Ultimate Declarative, and
so on, available as MIT AI Tech Reports, from the mid '70's), for
specific discussions of this topic.

--
\    Vincent Manis <manis@cs.ubc.ca>      "There is no law that vulgarity and
 \   Department of Computer Science      literary excellence cannot coexist."
 /\  University of British Columbia                        -- A. Trevor Hodge
/  \ Vancouver, BC, Canada V6T 1W5 (604) 228-2394

manis@cs.ubc.ca (Vincent Manis) (10/27/90)

Quicksort's recursion is quite interesting.  Even with the cleverest
partitioning technique, there is a non-zero probability that you will
select pessimal pivots requiring O(n) stack space, so it's worth taking
the time and effort to be careful about which partition you sort
recursively.

This results in a variety of Quicksort which looks like this (in Scheme):

(define Quicksort
  (lambda (a left right)
    (let ((i 0) (j 0))
      (let ((partition (lambda ... the partitioning code, sets i and j)))
         (partition)
         (if (< (- i left) (- right j))
            (begin      ; Left partition is smaller. 
              (Quicksort a left i)
              (Quicksort a j right))
            (begin      ; Else: right partition is smaller. 
              (Quicksort a j right)
              (Quicksort a left i)))))))

In this example, we always sort the larger partition tail-recursively
(i.e., iteratively), and sort the smaller partition
non-tail-recursively. This bounds the stack growth to O(lg n), and is
therefore a desirable improvement, or so I tell my students. 

This is the sort of optimization I would never expect a compiler (even
one designed by Guy Steele) to do, which is why people have to be aware
of tail recursion. 
--
\    Vincent Manis <manis@cs.ubc.ca>      "There is no law that vulgarity and
 \   Department of Computer Science      literary excellence cannot coexist."
 /\  University of British Columbia                        -- A. Trevor Hodge
/  \ Vancouver, BC, Canada V6T 1W5 (604) 228-2394

gillies@m.cs.uiuc.edu (10/28/90)

> How about a non recursive algorithm for the Ackerman function ?

A dumb suggestion, actually, since the "interesting" values of the
Ackermann function can be stored in ten memory cells, while the other
values are too large to hold in main memory and would take until the
end of the universe to compute....

mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) (10/29/90)

In article <1990Oct25.160014.17216@watdragon.waterloo.edu> bpdickson@rose.uwaterloo.ca (Brian Dickson) writes:
>>Recursion is nothing more a reentrant subroutine call.
>Error #1: reentrant subroutine call == recursion
>Example: inexperienced FORTRAN programmer writes recursive program,
>	wonders why weird things happen. (I observed this one.) :-(

Fortran subroutines are non-reentrant, although there is no error
check for subroutine call reentrancy.

Some Fortran compilers generate subroutine calls using a JSR-type
machine instruction that stores the return PC in a static location
(e.g. the first location of the subroutine) and returns by an
indirect jump through that static location.

Other compilers may use a stack subroutine-call instruction, leading
to a false notion that Fortran has "reentrancy".

 _____   | ____ ___|___   /__ Mark ("Gaijin") Crispin  "Gaijin! Gaijin!"
 _|_|_  -|- ||   __|__   /  / R90/6 pilot, DoD #0105   "Gaijin ha doko?"
|_|_|_|  |\-++-  |===|  /  /  Atheist & Proud          "Niichan ha gaijin."
 --|--  /| ||||  |___|    /\  (206) 842-2385/543-5762  "Chigau.  Gaijin ja nai.
  /|\    | |/\| _______  /  \ MRC@CAC.Washington.EDU    Omae ha gaijin darou"
 / | \   | |__|  /   \  /    \"Iie, boku ha nihonjin." "Souka. Yappari gaijin!"
Hee, dakedo UNIX nanka wo tsukatte, umaku ikanaku temo shiranai yo.

sccowan@watmsg.uwaterloo.ca (Crispin Cowan) (10/29/90)

In article <10143@milton.u.washington.edu> mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) writes:
>In article <1990Oct25.160014.17216@watdragon.waterloo.edu> bpdickson@rose.uwaterloo.ca (Brian Dickson) writes:
>>>Recursion is nothing more a reentrant subroutine call.
>>Error #1: reentrant subroutine call == recursion
>>Example: inexperienced FORTRAN programmer writes recursive program,
>>	wonders why weird things happen. (I observed this one.) :-(
>
>Fortran subroutines are non-reentrant, although there is no error
>check for subroutine call reentrancy.
We know, that was the point.  FORTRASH is broken in this way.

Crispin
-----
Crispin Cowan, CS grad student, University of Western Ontario
Work:  MC28-C, x3342 crispin@csd.uwo.ca
890 Elias St., London, Ontario, N5W 3P2,  432-7823
Quote:  "You can have peace.  Or you can have freedom.  Don't ever count
on having both at once."        --Lazarus Long
"You can't separate peace from freedom because no one can be at peace
unless he has his freedom."     --Malcolm X

ctodd@desire.wright.edu (10/30/90)

In article <9898@milton.u.washington.edu>, mrc@Tomobiki-Cho.CAC.Washington.EDU (Mark Crispin) writes:

> 
> However, in some cases the only nonrecursive way to implement an
> algorithm involves, in effect, emulating recursion.  These are
> algorithms which have significant intermediate state which is needed
> after the recursive call completes.  This requires a stack, or at
> least a vector, and control logic for two iterations.  Let's not
> forget algorithms in which the recursion takes place several routines
> deep (foo->bar->zap->rag->foo).
> 
> Matters are further complicated in a multi-tasking environment, where
> a particular routine must be reentrant and hence cannot use any global
> variables.  Here, a stack is mandatory.  You don't even need multiple
> processes; co-routines will do just fine.
> 

Who cares?  The original statement was a simple expression of truth.

Anything that can be done recursively can be implemented in a non-recursive
manner.  Anything!  Of course the implementation is going to take more
code; if it didn't, why would we need recursion?  And isn't anything
which takes the place of recursion "in effect, emulating recursion"?