[comp.object] Continuations -- warning: *long*

kend@tekchips.LABS.TEK.COM (Ken Dickey) (12/11/89)

In article <6527@pbhyf.PacBell.COM> jste@PacBell.COM (Joshua Stein) writes:
>Can someone post a non-implementation dependent definition of continuations;
>what they are, what they do, and how they fit into the object-oriented
>paradigm.

A continuation represents the default future for a computation.
Continuations are used to describe generalized `jumps' in Denotational
Semantics {see refs, below}.  One way to think of continuations is that
functions never return, but hand their results to an extra argument, the
continuation, which consumes the result.  Let's say that there is a magic
way to make procedures out of arbitrary chunks of code in C.  Then we might
be able to do something like the following:

int gcd(int a, int b) {
   if (b == 0) { 
	return( a ) ;
   } else {  
	return( gcd( b, remainder( a, b ) ) ) ;
   } 		 /* Boy, what a lot of parens! :^) */
}

could be transformed into:

gcd(void (*cont)(int), int a, int b) {
  if (b == 0) {
	cont( a );
  } else {
	remainder( make-proc( void new-cont(int remainders_result)
			      { gcd(cont, b, remainders_result ); } ),
		   a, 
		   b );
  }
}

gcd now never `returns a result'.  It hands the result to the continuation,
which it now takes as an explicit argument.  The remainder routine now takes
a continuation also.  So what magically happens above in the else clause is
that remainder is called with a new continuation argument which `does the
rest of the computation'.  [You can think of make-proc as the compiler being
called in-line.  Don't think of how C's stack would deal with this.  It
doesn't, which is why there have been so many messages on this topic].

You can think of getting hold of the rest of the computation in the
source language.  Now you can ignore it!  To escape from a deeply nested
procedure, just ignore the continuation passed in and give your result to a
previously saved continuation.  This is what happens with catch/throw in
Lisps or with setjmp/longjmp in C.  But with first class continuations, you
can do more than this.  You can invoke a continuation multiple times.  You
can capture a continuation at every clock interrupt and switch to another
one (multi-tasking).  You can search down a computational path, then go
back to a previous decision and try another path, then save that and go
back to the first path.
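As a small sketch of the escape case in Scheme (find-first is a hypothetical
helper, not from the text above; call-with-current-continuation is the
standard way to capture a first class continuation):

```scheme
; Escape via a first class continuation: `return' below is the
; continuation of the whole call to find-first, captured by call/cc.
; Invoking it abandons the rest of the loop immediately.
(define (find-first pred lst)
  (call-with-current-continuation
    (lambda (return)
      (for-each (lambda (x)
                  (if (pred x)
                      (return x)))   ; ignore the rest -- jump out
                lst)
      #f)))                          ; fell off the end: not found

(find-first even? '(1 3 4 5 6))  ; => 4
(find-first even? '(1 3 5))      ; => #f
```

Note that nothing special is declared at the call site; the continuation is
captured inside find-first and simply ignored on the normal path.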

It is easy to get confused thinking about this.  Let me try to give a more
operational model.  I have included the following to give some idea of the
style of code generation this type of transformation leads to...

===================

The following is excerpted from "Evaluating Software Language Technologies"
[Pacific Northwest Software Quality Conference, 1989--you can guess who the
author is].


8.  TECHNICAL APPENDIX: Scheme Samples


It is difficult to convey the flavor of programming in a highly paradigmatic
language in a small example.  Here the Scheme programming language
is used to give a small taste.  See [AbelSuss85] or [Dybvig] for more
extensive examples.


8.1 A BRIEF INTRODUCTION TO THE PROGRAMMING LANGUAGE SCHEME

Scheme inherits lexical scoping from Algol and syntax from Lisp.  The
basic rule for evaluation is to evaluate all sub-expressions between
the balanced parentheses and apply the value of the first to the
values of the others.  There are a few "special forms" which do not
evaluate all their arguments.  These are noted as they occur.

Examples: 
		Scheme			  ~C
    (<= lo-bound x hi-bound)  ; ((lo-bound <= x) AND (x <= hi-bound))
    (+ 23 31 4)		      ; (23 + 31 + 4)
    ((if (foo x) * +) 2 5 3)  ; (foo(x) ? (2 * 5 * 3) : (2 + 5 + 3))
			      ; I.e. the IF construct returns the function to
			      ; be applied to the arguments 2, 5, and 3.

Scheme has very few fundamental constructs, e.g.:

abstraction: (LAMBDA (x) (* x x))    ; `lambda' indicates an unnamed function
	     (DEFINE (sq x) (* x x))

application: ((lambda (x) (* x x)) 2)	; should return 4
             (sq 2) 			; should return 4

conditional: (IF test consequent alternative) 
				; IF evaluates consequent XOR alternative
assignment:  (SET! x 3)

sequencing:  (BEGIN (set! x 3) (sq x)) 
			; BEGIN returns the value of the last expression

;  A COMMENT begins with a semicolon and runs to the end of the line.

Scheme is extended via syntax transformations.  For example LET, which
declares local names, can be implemented via lambda abstraction:

    (let ((a 1) (b 2)) (+ a b)) <==> ((lambda (a b) (+ a b)) 1 2)

Scheme has excellent abstraction features and supports first class
procedures and continuations.

{Continuations represent the default future of a computation and their use
allows the creation of advanced forms of control structures used for example
to implement coroutines, backtracking and multiprocessing.  They have
importance in program transformation strategies as well [Clinger87,
FelFriKohlDu, FriHaKohl84, FriHa85, HaFri84, HaFri87, HaFriWa84, HaFriWa86,
Wand80a, Wand80b].}
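The key to backtracking and multiprocessing is that a captured continuation
can be entered more than once.  A tiny sketch (the names redo, record, and
ask are hypothetical, chosen only for this illustration):

```scheme
; Sketch: one continuation, entered twice.
(define redo #f)      ; will hold a saved continuation
(define results '())  ; a record of each pass

(define (record v) (set! results (cons v results)))

(define (ask)
  (+ 1 (call-with-current-continuation
         (lambda (k)
           (set! redo k)   ; capture `the rest of the computation'
           0))))

(record (ask))            ; first pass: records (+ 1 0) = 1
(if redo
    (let ((k redo))
      (set! redo #f)      ; clear first, so we come back only once
      (k 9)))             ; re-enter: records (+ 1 9) = 10

; results is now (10 1) -- the code after the capture point ran twice
```

Invoking k a second time resumes `add 1, hand the sum to record' with a new
value, exactly as if ask had returned again.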


8.2 EXAMPLE: Variations on the Greatest Common Divisor (GCD) algorithm

Here are 4 variations on the GCD algorithm.  The main point of this series
of examples is to show how a functional algorithm (without assignment) can
be transformed into an imperative one (without function calls).  The
secondary objective is to show a number of styles [functional, imperative,
continuation-passing, object-oriented] in the same programming language and
demonstrate a (trivial) use of first class functions.

The first example is functional.  The only real function call inside
GCD-1 is to "remainder".  The other calls are in the `tail' position.
What this means in practice is that there is no more work to be done in
the function after the last call, so no extra return address has to be
passed.  The "tail call" can use the same return address that was passed
in to the main function and so tail recursive loops can be compiled into
simple jumps.  Recursive tail calls are just loops.


8.2.1 Functional GCD algorithm  -- no assignment


(define (GCD-1 a b)
   (if (= b 0)
       a
       (gcd-1 b (remainder a b))
)  )

(define (REMAINDER n d)      ; n=numerator, d=denominator
   (if (< n d)
       n
       (remainder (- n d) d) ; This is a `tail-recursive' call. This
)  )			     ; code compiles into a simple loop with
			     ; no recursion.

;------------------------------------------------------

In this second example, [1] has been transformed into Continuation Passing
Style (CPS).  What is happening here is that, instead of returning a result,
each function gets an auxiliary argument, the continuation, which is a
function which gets called with the result.  The top-level continuation
(identity in this case) is the only one that ever returns a result.
Essentially, the continuation holds the values which would be on the stack,
but in a form which can be manipulated directly.  Note that CPS-REMAINDER
just passes along the same continuation in its loop.  It "never has to push
a new return address."

The advantage of CPS is that it makes all temporary variables explicit.  It
is done as part of the compilation process by some compilers [AppelJim,
Kranz].

8.2.2 Continuation Passing Style (CPS)


(define (IDENTITY value) value)  ; just take a value and return it.

(define (GCD-2 a b) (cps-gcd a b identity))

(define (CPS-GCD a b k) ; k=continuation
   (if (= b 0)
       (k a)  ; result is passed to the continuation
       (cps-remainder a b (lambda (v) (cps-gcd b v k)))
)  )

(define (CPS-REMAINDER n d k)
   (if (< n d)
       (k n)
       (cps-remainder (- n d) d k)
)  )


;------------------------------------------------------

The third example is worth some study.  It was derived from [2] but is
basically code for a 2 register machine with a stack.  The function calls
are now jumps.  A function call is seen as a goto which passes arguments,
one of which is the continuation or return address.  Note that the `extra'
parentheses around "((pop STACK))" mean that the result of popping the
stack is a function which is then called.  The only real trick between [2]
and [3] is that the arguments are passed implicitly in registers.  [You
may have to `hand execute' the code for a few values to convince yourself
that it really works]. 

The stack abstraction for (8.2.3) is just an example of a simple object in
the object-oriented sense.  Executing "(make-stack)" is roughly equivalent to
sending a stack `class' the `new' message.  What is returned is an object (a
functional closure in this case) with its own local state (the `a-stack'
variable).  Executing "(push STACK 5)" is implemented by sending the `push'
message to the STACK object.  What is returned is a function which is
applied to the argument, 5, which is added to the stack.  There are several
object systems available for Scheme (see e.g. [AdamsRees] for implementation
details).


8.2.3 Register Machine with Stack


; REGISTERS

(define R0 0) ; result and temporary register
(define R1 0) ; temporary register
(define STACK (make-stack)) ; stack abstraction


; ALGORITHM

(define (GCD-3 a b)
   ; setup initial state & run

   (set! R0 a) ; set up registers
   (set! R1 b)
   (push STACK ; set R0 as result of computation
	 (lambda () R0)) ; just return its value

   (register-gcd) ; jump to initial pc
)

(define (REGISTER-GCD)
   (if (= R1 0)
       ((pop STACK)) ; set PC to addr on stack
       (begin
          (push STACK R1)
	  (push STACK gcd-continuation)
	  (register-remainder) ; jump to label "REGISTER-REMAINDER"
       )
)  )

(define (REGISTER-REMAINDER)
   (if (< R0 R1)
       ((pop STACK)) ; execute the saved continuation
       (begin
          (set! R0 (- R0 R1))
	  (register-remainder) ; jump to label "REGISTER-REMAINDER"
       )
)  )

(define (GCD-CONTINUATION) ; value in R0
   (set! R1 R0)
   (set! R0 (pop STACK))
   (register-gcd) ; jump to label "register-gcd"
)


; STACK ABSTRACTION

(define (MAKE-STACK)  ; --returns a new stack object each time it is called
		      ; Code is shared by all returned "instances".

 (let ( (a-stack '()) ) ; This list implementation can be replaced by an 
		        ; array for speed, with no change in interface.
   (define (stack-push value) 
      (set! a-stack (cons value a-stack))
   )

   (define (stack-pop) 
      (if (null? a-stack) 
          (error "Attempt to pop from empty stack!" '())
          (let ( (result (car a-stack)) )
	     (set! a-stack (cdr a-stack))
	     result)
   )  )

   (define (dispatch message) ; message passing style of dispatch
      (case message
         ((push) stack-push)  ; return function matching message tag
	 ((pop)  stack-pop)
	 (else (error "STACK: Unrecognized message:" message))
   )  )

   dispatch) ; Return the dispatch function as a new `stack object' instance.
)

; convert message passing interface to function call interface

(define (PUSH stack value)
   ((stack 'push) value))

;; "(stack 'push)" returns a function which is applied to "value"
   
(define (POP stack) 
   ((stack 'pop))) ; execute the stack's internal pop routine
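To see the stack `object' in use, here is a condensed copy of the above
(repeated only so the snippet stands alone) followed by a sample
interaction; the instance name S is hypothetical:

```scheme
; Condensed stack object -- same technique as MAKE-STACK above.
(define (make-stack)
  (let ( (a-stack '()) )
    (define (stack-push value)
       (set! a-stack (cons value a-stack)))
    (define (stack-pop)
       (let ( (result (car a-stack)) )
          (set! a-stack (cdr a-stack))
          result))
    (lambda (message)            ; the dispatch function IS the instance
       (case message
          ((push) stack-push)
          ((pop)  stack-pop)
          (else (error "STACK: Unrecognized message:" message))))))

(define (push stack value) ((stack 'push) value))
(define (pop stack) ((stack 'pop)))

; Sample interaction: each call to make-stack yields an independent
; instance with its own private a-stack.
(define S (make-stack))
(push S 5)
(push S 7)
(pop S)   ; => 7, the most recently pushed value
(pop S)   ; => 5
```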
   

;------------------------------------------------------
The fourth and final variant is what a compiler might emit as a result
of compiling the functional definition [1] and inlining remainder.  It
is a straightforward transliteration of [3] into pseudo-code (and the
only "code" shown which has not been executed on a computer).


8.2.4 The above (8.2.3) as pseudo machine-code:
*
* START
*
GCD:		* ENTRY SETUP -- assume R0 has A and R1 has B at call
	store	@END-GCD,++(SP)	* push code address to come back to
	jump	REGISTER-GCD	* go do the work
END-GCD:
	rts			* return-from-subroutine (comes back indirect)
*				* R0 has result
*
REGISTER-GCD:
	zero?	R1
	brFalse L1
	load	PC,(SP)--		* pop from stack to PC -- indirect branch
L1:
	store	R1,++(SP)		* push R1
	store	@GCD-CONTINUATION,++(SP)* push label address (where to continue)
*	jump	REGISTER-REMAINDER	* fallthrough
*
REGISTER-REMAINDER:
	less?	R0,R1
	brFalse	L2
	load	PC,(SP)--	* pop from stack to PC -- indirect branch
L2:
	sub	R0,R1		* result in R0
	jump	REGISTER-REMAINDER
*
GCD-CONTINUATION:
	regmov	R1,R0		* R1 <- R0
	load	R0,(SP)-- 	* stack value pop'ed to R0
	jump	REGISTER-GCD	
*
* END
*


7. REFERENCES

[AbelSuss85] {Scheme-usage}
Abelson & Sussman: "Structure and Interpretation of Computer Programs"
MIT Press, 1985.

[AbelSuss86] {Scheme-usage}
H. Abelson & G. Sussman: "Computation, An Introduction to Engineering
Design", MIT Technical report, 1986.

[AbelSuss88] {Scheme-usage}
H. Abelson & G. Sussman: "Lisp: A Language for Stratified Design", Byte
Magazine (McGraw Hill), February 1988.

[AdamsRees] {Scheme-implementation}
N. Adams & J. Rees: "Object-Oriented Programming in Scheme", Proceedings of
the 1988 ACM Conference on Lisp and Functional Programming.

[Appel] {Automatic Storage Management}
Appel, Andrew: "Garbage Collection Can Be Faster Than Stack Allocation",
Information Processing Letters 25 (North Holland), 1987.

[AppelJim] {CPS}
A. Appel & T. Jim: "Continuation-Passing, Closure-Passing Style",
Proceedings 16th Annual Symposium on Principles Of Programming Languages
(ACM), January 1989.

[Barendregt] {Lambda Calculus}
H. Barendregt: "The Lambda Calculus: Its Syntax and Semantics", Studies in
logic and the foundations of mathematics, Volume 103, North-Holland 1984.
...

[Clinger84] {Scheme-compiler: proof of correctness}
W. Clinger: "The Scheme 311 Compiler, An Exercise in Denotational
Semantics", 1984 Symposium on Lisp and Functional Programming.

[Clinger87] {continuations}
W. Clinger: "The Scheme Environment: Continuations" Lisp Pointers, Vol 1
#2, June-July 1987.

[CliFreWa] {denotational semantics}
Clinger, Friedman & Wand: "A scheme for a higher-level semantic algebra"
in "Algebraic methods in Semantics", Ed: Nivat & Reynolds
Cambridge U Press, 1985.
...

[Dybvig] {Scheme-language}
K. Dybvig: "The Scheme Programming Language", Prentice-Hall, 1987.

[FelFriKohlDu] {continuations}
Felleisen, Friedman, Kohlbecker, & Duba: "Reasoning with Continuations",
Proceedings of the Symposium on Logic in Computer Science, IEEE Press, 1986.

[FriHaKohl84] {continuations}
Friedman, Haynes, & Kohlbecker: "Programming with Continuations", in "Program
Transformations and Programming Environments" Ed: P. Pepper, Springer-Verlag
1984.

[FriHa85] {continuations}
Friedman, Haynes: "Constraining Control".  Proceedings 12th Annual
Symposium on Principles Of Programming Languages (ACM), January 1985.

[Gabriel89] {Rapid Prototyping-requirements}
R. Gabriel: "Draft Report on Requirements for a Common Prototyping System",
SIGPLAN Notices, Vol 24 #3, March 1989.
...

[HaFri84] {continuations}
Haynes & Friedman: "Engines Build Process Abstractions", 1984 Symposium
on Lisp and Functional Programming.

[HaFri87] {continuations}
Haynes & Friedman: "Abstracting Timed Preemption with Engines", Computer
Languages, Vol 20 #2, 1987 (pub Great Britain).

[HaFriWa84] {continuations}
Haynes, Friedman, & Wand: "Continuations & Coroutines", 1984 Symposium on
Lisp and Functional Programming.

[HaFriWa86] {continuations}
Haynes, Friedman, & Wand: "Obtaining Coroutines with Continuations",
Computer Languages: V11, #3/4 1986.

[HalSus89] {Scheme-usage}
M. Halfant & G. Sussman: "Abstraction in Numerical Methods", Proceedings of
the 1988 ACM Conference on Lisp and Functional Programming.
...

[Jones]
S. Peyton Jones: "Functional programming languages as a software engineering
tool", Lecture Notes in Computer Science, Vol 287, Springer-Verlag, 1987.

[Kranz] {Scheme-implementation}
Kranz et al: "Orbit, an Optimizing Compiler for Scheme", SIGPLAN Notices,
Vol 21 #7, July 1986.
...

[PohlEdelson]
I. Pohl & D. Edelson: "A to Z: C Language Shortcomings", Computer
Language, Vol 13 #2, 1988 [Pergamon Press, Great Britain].

[Rees84] {Scheme, T}
Rees, Adams, & Meehan: "The T Manual", 4th Ed.
Yale U. January, 1984.

[ReesCling86] {Scheme-language}
J. Rees & W. Clinger (eds): "Revised^3 Report on the Algorithmic
Language Scheme", SIGPLAN Notices, Vol 21 #12, December 1986.

[Roylance89] {Scheme-usage}
G. Roylance: "Expressing Mathematical Subroutines Constructively", Proceedings of
the 1988 ACM Conference on Lisp and Functional Programming.

[Schmidt] {denotational semantics}
Schmidt, David: "Denotational Semantics:  A Methodology for Language
Development", Allyn & Bacon, 1986.

[Scott] {denotational semantics}
Scott, Dana: "Domains for Denotational Semantics" ICALP '82, Aarhus,
Denmark, July 1982.
...

[Stoy] {denotational semantics}
Stoy, Joseph: "Denotational Semantics:  The Scott-Strachey Approach to
Programming Language Theory", MIT Press, 1977.
...

[Wand80a] {continuations}
Wand, Mitchell: "Continuation-Based Program Transformation Strategies",
JACM V 27, #1, January 1980.

[Wand80b] {continuations}
Wand: "Continuation-Based Multiprocessing", Proceedings of the 1980 Lisp
Conference.