[comp.theory] Summary, recursive =?=> iterative

glenn@dit.lth.se (Glenn Jennings) (01/30/91)

Summary of responses, recursive =?=> iterative   (LONG) 455 lines
  THANK YOU to everyone who has contributed.
  I would have condensed this message but I am not qualified to
sift out "more significant" from "less significant" responses :-(   GJ
======================================================================
From: dkyoon%pollux.usc.edu@usc.edu (Dae-Kyun Yoon)

You might be interested in the following book/article, though they are more
theoretical than practical.

- Jacques Arsac, "Foundations of Programming", Academic Press, 1985
- J. Arsac and Y. Kodratoff, "Some Techniques for Recursion Removal from 
	Recursive Functions", ACM Transactions on Programming
        Languages and Systems, Vol. 4, No. 2, April 1982
======================================================================
From: raman@cs.rochester.edu

I don't know if this is what you are looking for, but Ackermann's 
function is an example of a function that is not computable by
any "loop" program,  but is easily expressed in a recursive format. 
A reference is "Computability, Complexity and Languages", Martin 
Davis and Elaine Weyuker, Academic Press. In some sense, this 
implies that the conversion cannot always be done.

Rajeev

ps: A loop program looks like the following: 

some variable := input value;
<loop program body>
print value of output variable

Inside the loop program body, the only control structures
allowed are "for" loops, with the restriction that the
loop index cannot be changed inside the loop. As you can
see, this restriction guarantees termination of
any loop program. Of course, this restriction might
make it uninteresting for you, since it rules out
more exotic iterative programs.
======================================================================
From: smith@cs.UMD.EDU (Carl H. Smith)

If by "iterative" you mean fortran like `for' loops, then the primitive
recursive functions are the largest class computable iteratively.  All
the recursive functions can be computed recursively (that is where they
get their name from). Look up the Herbrand-Gödel model of computability
to see how to express any partial recursive function recursively.  Hence,
in this case, it is impossible to uniformly and effectively transform an
arbitrary recursive algorithm into an iterative one.

On the other hand, if by "iterative" you mean some sort of search as in
a pascal like while statement, then Kleene's normal form theorem for the
partial recursive functions gives a uniform and effective way to transform
any description of a partial recursive function into one with a single while
loop and (perhaps several) fortran like `for' loops.  In this case, all one
has to do is to transform the given recursive algorithm into a partial
recursive function definition.  Depending on how the algorithm is expressed,
this can usually be done in polynomial time.
			Carl Smith
======================================================================
From: Keith Rogers <rogers@ficc.ferranti.com>

Fundamentals of Data Structures, by Ellis Horowitz & Sartaz Sahni, gives
examples of tree traversal algorithms which are converted from recursive
to iterative.  I'm not sure what you mean by "automatic," but the examples 
in this book use a standard procedure, at least on the 1st pass, which
can be generalized; i.e., identify the variables which are passed as
parameters, and use a local array (or arrays) to simulate the stack.  The
loop exit conditions generally are the same as the test which ends the 
recursion.
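The recipe can be sketched as follows (my code, not the book's): the values that would have been passed as parameters go on an explicit stack instead. A Python list grows dynamically, which sidesteps the array-sizing problem discussed below; a fixed-size array would need the maximum depth known in advance.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def inorder_recursive(node, visit):
    if node is not None:
        inorder_recursive(node.left, visit)
        visit(node.value)
        inorder_recursive(node.right, visit)

def inorder_iterative(node, visit):
    stack = []                        # simulates the call stack
    while stack or node is not None:
        while node is not None:       # descend left, saving ancestors
            stack.append(node)
            node = node.left
        node = stack.pop()            # "return" to the saved parameter
        visit(node.value)
        node = node.right
```

The outer loop's exit condition (empty stack, no node) plays the role of the test that ends the recursion.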

The problem, of course, is knowing in advance how large to make the
array(s) which simulate the stack.  My experience was with some
tree traversal algorithm conversions (a PL/1 compiler was generating
some unbelievably inefficient code).  PL/1 supports dynamic array
storage allocation, so that feature was used where the worst case 
allocation would have required excessive memory, and the tree depth 
could be passed as a parameter.

The reference I used also has iterative versions of the recursive tree
traversals which operate in real, fixed space, and do not require the
use of an array to simulate the stack.  I do not know if there is any
way this approach can be applied to general recursion, since the 
algorithm essentially "hid" the stack within the tree structure during
the traversal.  This has the disadvantage of requiring that the 
traversal complete, i.e., it was unusable for searches which returned
on the first find, since the tree would be left corrupted.  It also
has a longer execution path, since the tree manipulations must be
undone, whereas a pseudo-stack has minimal overhead.  It was useful
in the place I used it because the possibility of a degenerate tree
(list) meant the arrays had to be exceedingly large, and the tree
structure was not known at the time of the procedure call.  The
performance loss was worth the space gain. 
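A related, well-known constant-space technique is Morris (threaded) inorder traversal. This sketch is my illustration, not the algorithm from Horowitz & Sahni, but it shares the properties described above: the "stack" is hidden in temporarily rewritten right links, so the traversal must run to completion to leave the tree uncorrupted.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def morris_inorder(root, visit):
    node = root
    while node is not None:
        if node.left is None:
            visit(node.value)
            node = node.right
        else:
            pred = node.left          # find the inorder predecessor
            while pred.right is not None and pred.right is not node:
                pred = pred.right
            if pred.right is None:
                pred.right = node     # plant a thread back to node
                node = node.left
            else:
                pred.right = None     # unthread: tree restored
                visit(node.value)
                node = node.right
```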

Keith Rogers 
======================================================================
From: sabry@rice.edu (Amr Sabry)

This is of course possible and is widely used in compilers for
Scheme and ML. The technique is called "continuation passing style".
The main idea is to pass around an extra parameter -- the continuation
(think of the continuation as the control stack).

The process can be automated, but it is harder for languages like
C and Pascal than for languages like LISP and Scheme.
The best reference I know of is the following

%A Daniel P. Friedman
%A Mitchell Wand
%A Christopher T. Haynes
%A Eugene E. Kohlbecker
%B Programming Languages: Their Abstractions, Representations,
and Implementations
%I MIT Press and McGraw-Hill
%D 1988-1989
%O in progress

It explains the conversion step by step and goes the extra mile for languages
like C and Pascal. The problem with these languages is that they do not
support closures (returning procedures as values). The algorithms are in
Scheme though.
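A minimal sketch of the idea (my example, not from the book): the extra parameter k, the continuation, plays the role of the control stack. Note that after the transformation every call is a tail call, which a Scheme or ML compiler turns into a jump; Python, lacking tail-call optimization, still consumes stack here.

```python
# Factorial in continuation passing style: instead of returning a
# value, each call passes its result to the continuation k.
def fact_cps(n, k):
    if n == 0:
        return k(1)                   # "return" = invoke continuation
    # package "what to do with the result" as a closure
    return fact_cps(n - 1, lambda r: k(n * r))

print(fact_cps(5, lambda r: r))  # prints 120
```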

Amr
======================================================================
From: softrue!kearns@uunet.UU.NET (Steven Kearns)

Hi. The CIP project, recounted in Springer-Verlag Lecture Notes in
Computer Science number ????, has found a number of useful transformations
for this purpose.  While they are not applied automatically, it is
easy to apply them with a little human intervention.
-steve
======================================================================
From: ld231782@longs.LANCE.ColoState.Edu

This of course is my own bias, but a good book that addresses the issue
is Structure and Interpretation of Computer Programs.  Recursion and
iteration are becoming more and more interchangeable.  A good example is
"tail recursion".  When a routine calls itself recursively from the end,
so that no other operations are performed after the returning call, the
`jump to subroutine' can be replaced with a `jump' and the stack need
not be adjusted.

Once you are familiar with what a compiler does, the definitions of
`recursive' and `iterative' tend to blur a lot.
ld231782@longs.LANCE.ColoState.EDU
======================================================================
From: stark@sbstark (Eugene Stark)

Isn't this something that is done by nearly every compiler?
							- Gene Stark
======================================================================
From: Lars Lofgren <lofgren@dit.lth.se>

   (my translation GJ): More about Ackermann's function can be found in
(for example) Ann Yasuhara /.../ Ann is a student of Martin Davis, and her
book (which lies on the border between mathematical and programming
language aspects) is quite good as far as clear definitions are concerned.
======================================================================
From: Ari_Pekka Laakkonen <@sunic.sunet.se:ari@kcl-cs.UUCP>

Hi,
I've looked into the subject recently, and there are some results that
you might find useful.

-Luddy Harrison's thesis, CSRD Report No. 860 from the Center for
 Supercomputing Research and Development, Univ. of Illinois
 He describes tail recursion elimination and a technique he has named
 'Recursion splitting'. Recursion splitting is combined with interprocedural
 analysis in his work. His work is in Lisp and Scheme.

-P.G. Harrison's work. I can't remember the reference outright, but he
 has published work on the Linear Expansion Theorem (LET) in his book
 with someone called Field, and several followup articles in (I think)
 The Science of Computer Programming, mainly to do with nonlinear functions
 and "degenerate multilinear" functions. If you want to review the subject,
 his book is excellent.

-Of course, Backus's '78 article: Can programming be liberated from the
 von Neumann.. etc. Classic. I think it was in CACM.

-Have a look at recent POPL editions.

I hope that helps.
-Ari             e-mail: ari@uk.ac.kcl.cs, ari@arpa.kcl-cs.ac.uk
======================================================================
From: eerke@cs.kun.nl (Eerke Boiten)

I was going to respond to this, Gene Stark was ahead of me:
>Isn't this something that is done by nearly every compiler?

Yes, of course it is done by nearly every compiler. So, generally it is
easy: just use the standard stack implementation of recursion.

The interesting question is not whether it is possible to transform
recursive algorithms into iterative ones, but whether it is possible to
do this *without using stacks*.

Tail-recursive functions, i.e. functions of the form
	f(x) = if P(x) then H(x) else f(K(x)) fi
can be immediately translated into
	y:= x;
	WHILE NOT P(y) DO y:= K(y);
	RETURN H(y)
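A concrete instance of this schema (my example) is Euclid's gcd, with x = (a, b), P(x) = (b = 0), H(x) = a and K(x) = (b, a mod b):

```python
def gcd_recursive(a, b):
    if b == 0:                        # P(x)
        return a                      # H(x)
    return gcd_recursive(b, a % b)    # f(K(x))

def gcd_iterative(a, b):
    y = (a, b)
    while y[1] != 0:                  # WHILE NOT P(y)
        y = (y[1], y[0] % y[1])       # y := K(y)
    return y[0]                       # RETURN H(y)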
General linear recursive functions, of the form
	f(x) = if P(x) then H(x) else E(x,f(K(x))) fi
cannot, in general, be translated into iterative form without using
stacks. This is proved by Paterson and Hewitt [Pat70].
In the literature on transformational programming (cf. the book by my 
"boss" H. Partsch [Par90a], or [Fea87]), many examples of transformations from
linear recursion to tail recursion are treated (in particular, use of an
accumulator and inversion of computation).
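The accumulator transformation can be sketched on factorial (my example, not taken from the cited books). Note that it relies on an extra algebraic property, here associativity of multiplication, which is why it does not contradict the stack-necessity result for linear recursion in general.

```python
def fact_linear(n):                   # linear: E(x, r) = x * r
    if n == 0:
        return 1
    return n * fact_linear(n - 1)

def fact_accum(n, acc=1):             # tail recursive, via accumulator
    if n == 0:
        return acc
    return fact_accum(n - 1, n * acc)

def fact_loop(n):                     # ... and hence a stackless loop
    acc = 1
    while n != 0:
        acc, n = n * acc, n - 1
    return acc
```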

Follows a small excerpt from my BibTeX file on transformational
programming etc. :
@ARTICLE{
		Ars82,
		Author= {J. Arsac and Y. Kodratoff},
		Title= {Some Techniques for Recursion Removal from Recursive
			Functions},
		Journal= toplas,
		Volume= 4, Number= 2,
		Month= Apr, Year= 1982,
		Pages= {295--322}}
% Ars82: Transformational, many techniques.
@ARTICLE{
		Aus78,
		Author= {M.A. Auslander and H.R. Strong},
		Title= {Systematic Recursion Removal},
		Journal= cacm,
		Volume= 21, Number= 2,
		Pages= {127--133},
		Year= 1978}
% Aus78 and other work by Strong: classical.
@BOOK{
		Bau82a,
		Author= {Bauer, F.L. and W{\"o}ssner, H.},
		Title= {Algorithmic Language and Program Development},
		Publisher= sv,
		Address= {Berlin},
		Year= 1982}
% Early textbook on transformational programming;
% contains large section on recursion simplification and -removal.
@CONFERENCE{
		Fea87,
		Author= {Feather,  M.S.},
		Title= {A Survey and Classification of some Program
			Transformation Approaches and Techniques},
		Crossref= {Mee87},
		Pages= {165--196}}
@PROCEEDINGS{
		Mee87,
		Booktitle= {Program Specification and Transformation. Proceedings of
			the {IFIP} {TC2}/{WG2.1} Working Conference on Program
			Specification and Transformation},
		Title= {Program Specification and Transformation. Proceedings of
			the {IFIP} {TC2}/{WG2.1} Working Conference on Program
			Specification and Transformation},
		Editor= {Meertens, L.G.L.T.},
		Publisher= nhpc,
		Address= {Amsterdam},
		Year= 1987}
% General survey in IFIP TC2 conference proceedings.
@BOOK{
		Par90a,
		Author= {Partsch, H.},
		Title= {Specification and Transformation of Programs - a Formal
			Approach to Software Development},
		Address= {Berlin},
		Publisher= sv,
		Year= 1990}
% Recent textbook on transformational programming.
@CONFERENCE{
		Pat70,
		Author= {Paterson, M.S. and Hewitt, C.E.},
		Title= {Comparative Schematology},
		Booktitle= {Record of the Project MAC Conf. on Conc.Syst.
			and Par.Comp., {W}oods Hole, {M}ass.},
		Year= 1970,
		Address= {New York},
		Publisher= {ACM},
		Pages= {119--127}}
% Contains proof that stacks are sometimes necessary.
@ARTICLE{
		Str71,
		Author= {Strong, Jr., H.R.},
		Title= {Translating recursion equations into flowcharts},
		Journal= jcss,
		Volume= 5,
		Pages= {254--285},
		Year= 1971}
@ARTICLE{
		Str75,
		Author= {H.R. Strong and A. Maggiolo-Schettini and B.A. Rosen},
		Title= {Recursion Structure Simplification},
		Journal= sicomp,
		Volume= 4, Number= 3,
		Year= 1975,
		Pages= {307--320}}
@ARTICLE{
		Wal73,
		Author= {Walker, S.A. and Strong, H.R.},
		Title= {Characterizations of flowchartable recursions},
		Journal= jcss,
		Volume= 7,
		Pages= {404--447},
		Year= 1973}
Eerke Boiten
Department of Informatics (STOP Project), K.U.Nijmegen
Toernooiveld, 6525 ED Nijmegen, The Netherlands
Tel. +31-80-652236.	Email: eerke@cs.kun.nl
======================================================================
From: kt@juicy-juice.lcs.mit.edu (Kenneth R. Traub)

This is not necessarily a definitive paper, but just one I've read:
J. Arsac and Y. Kodratoff, "Some techniques for recursion removal from
recursive functions.",  ACM TOPLAS, 4(2):295-322, April 1982.
				--- Ken Traub
======================================================================
From: 122SCOTT.WITSVMA@f4.n494.z5.fidonet.org (122SCOTT WITSVMA)

Try appendix B of R.L. Kruse's Data Structures and Program Design,
Prentice-Hall, 1987, ISBN 0-13-197146-8
======================================================================
From: sjohnson@iuvax.cs.indiana.edu (Steve Johnson)

In article <2665@wn1.sci.kun.nl> eerke@cs.kun.nl (Eerke Boiten) writes:
>General linear recursive functions, of the form
>	f(x) = if P(x) then H(x) else E(x,f(K(x))) fi
>cannot, in general, be translated into iterative form without using
>stacks. This is proved by Paterson and Hewitt [Pat70].

As I'm sure many will point out, the result in Paterson&Hewitt states
that NON-linear recursion schemes cannot be translated to iterative form
(i.e. expressed as a flow chart).  The proof is done by showing the scheme

  f(x) = if P(x) then x else H(f(L(x)), f(R(x)))

cannot be so translated.  The tree-pebbling argument is reproduced
in several places, including Manna's book [1].  Theorem 2 of [Pat70]
states that linear schemes are translatable and gives a construction
(in the form of a flow chart).  Strong developed a stronger notion
of translatability (between recursion schemes and flowcharts)--- I
think he called it "operational translatability"---which isolates the
iterative schemes from the linear schemes.

For the case of linear schemes, it is helpful to look at the simpler
case:
   f(x) = if P(x) then x else E(f(K(x)))

The difference is that the value of `x' is not saved in the call to E.
An iterative system of two functions computes this value, which
is f(a) = E^n(K^n(a)) for some n.

   g(x, y) = if P(x) then h(x, y) else g(K(x), y)

   h(x, y) = if P(y) then x else h(E(x), K(y))

g(a, a) first computes K^n(a) while saving the original value of a for later.
h(K^n(a), a) computes E^n(K^n(a)).  Applications of P serve as
a counter.  Intuitively, g and h take twice as long to compute their result
as f takes (if you count defined-function calls, say, or applications of
P).  The more general linear scheme given earlier is trickier to write
and, by the intuitive measure, takes quadratic time.
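The two-function system runs as straight-line loops, i.e. a flowchart. A concrete instance (my choice of interpretation: P(x) = (x <= 0), K(x) = x - 1, E(v) = v + 2, so f(a) = 2a for a >= 0):

```python
def f_recursive(x):
    if x <= 0:                        # P(x)
        return x
    return f_recursive(x - 1) + 2     # E(f(K(x)))

def f_iterative(a):
    # g-phase: drive x down with K, keeping the start value a in y
    x, y = a, a
    while x > 0:                      # NOT P(x)
        x = x - 1                     # x := K(x)
    # h-phase: apply E once per step while K on y counts down --
    # the tests of P on y serve as the counter
    while y > 0:                      # NOT P(y)
        x = x + 2                     # x := E(x)
        y = y - 1                     # y := K(y)
    return x
```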

A couple of other good references to these topics are Greibach's lecture
notes [2] and Cohen's doctoral thesis [3].  I should also mention an
article by Wand [4] in which he explores using analysis of continuations
to determine whether control could be represented in something simpler
than a stack (e.g. an accumulator, a bit, a counter, etc.).

I hope that the originator of this note will post the collection of
references accumulated.

[Pat70] {from the original note}
   Paterson, M.S. and Hewitt, C.E., Comparative Schematology,
   Record of the Project MAC Conf. on Conc.Syst. Par.Comp.,
   Woods Hole, Mass., 1970, ACM, New York, 119--127

[1] Zohar Manna, Theory of Computation. McGraw Hill, New York, 1974

[2] Sheila A. Greibach. Theory of Program Structures: Schemes, Semantics,
    Verification.  Springer LNCS vol. 36, Springer, Berlin, 1975.

[3] Norman H. Cohen. Source-to-source Improvement of Recursive Programs,
    Ph.D. Dissertation, Harvard Univ., Cambridge, Mass. (USA), 1980

[4] Mitchell Wand, Continuation based program transformations strategies,
    JACM 27(1):164-180 (January 1980)
======================================================================
From: Håkan Millroth <HAKAN-MILLROTH@AIDA.CSD.UU.SE>

In my Ph.D. thesis I do not "convert an _arbitrary_ recursive program
to a program having only loops". However, I convert any recursive
program using recursion over recursive data structures, in a rather
structural way, to a program having only loops.

The main application for this transformation in my thesis is to be
able to apply standard loop parallelization techniques, such as those
successfully exploited by the Fortran community for a long period, to
modern declarative languages.

My technique can be used for both linear and nonlinear recursion.

I have only carried out this transformation (or compilation, as I see
it) for logic programs. However, it should be possible to do it for
functional programs as well.
======================================================================
From: hakanm@emil.CSD.UU.SE (Håkan Millroth)

In my PhD thesis I developed a method to convert not "an arbitrary
recursive program to a program having only loops" but to convert
a program using recursion over recursive data structures to a program
having only loops.

It can currently handle linear recursion over lists and integers
and nonlinear recursion over binary trees. It should be quite
easy to incorporate other data structures given type information.

The main motivation for my conversion (or compilation, as I see
it) is to be able to exploit the loop parallelization techniques
successfully used by the Fortran community for many years.

I have developed my method for logic programs; however, it should
be possible to do it for functional programs as well.
======================================================================
From: mattias@brigitte.csd.uu.se (Mattias Waldau)

Hakan Millroth (hakanm@emil.csd.uu.se) asked me to submit this response...

My PhD thesis describes a technique for compiling a logic program using
structural linear recursion to the following iterative form. The head
of a recursive clause is compiled to iterative code (WHILE- and FOR-loops)
for one large unification that replaces the N smaller unifications in an
SLD-resolution system. The body of the clause is compiled to iterative
code that runs the left-body (the calls to the left of the recursive call)
and right-body in separate FOR-loops.

Structural nonlinear recursion can be compiled in a similar way by obtaining
a temporary linear representation of the recursion tree during head unification
(this gives practically no overhead, since the entire tree is traversed anyway
in the single large unification resulting from invoking the program).

The significance of compiling recursion to FOR-loops in this way (rather than
to WHILE-loops as in traditional tail-recursion optimization) is that the
compiled code can be parallelized by standard techniques for parallelizing
Fortran DO-loops. This is a form of parallel processing that, when applicable,
is inherently more efficient than traditional AND-parallelism, since the
computation can be directly mapped to an array of parallel processors without
runtime task scheduling and load balancing.
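The left-body/right-body split can be sketched functionally (my example; the thesis itself compiles logic programs): a structurally linear recursion with work before and after the recursive call unrolls into two FOR loops with known trip counts, one running the left bodies top-down and one running the right bodies bottom-up.

```python
def process_recursive(xs, left, right):
    if not xs:
        return
    left(xs[0])                       # left body, on the way down
    process_recursive(xs[1:], left, right)
    right(xs[0])                      # right body, on the way up

def process_loops(xs, left, right):
    for x in xs:                      # all left bodies, in order
        left(x)
    for x in reversed(xs):            # all right bodies, in reverse
        right(x)
```

Because each loop has a trip count fixed on entry, standard DO-loop parallelization applies whenever the iterations are independent.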

Mattias Waldau                              
Computing Science Department                mattias@emil.csd.uu.se
P.O. Box 520, S-751 20  Uppsala, Sweden     Phone: +46-18-181055
======================================================================
end of "summary"