noekel@uklirb.UUCP (07/28/87)
Thanks to everyone who cared to mail or post responses to my Prolog riddle. Although I have been told that I merely duplicated the Goode Olde Prolog Riddle, known since the Dark Ages or so, the answers have been surprisingly controversial. This is a brief summary.

The problem was to construct a Prolog program that (1) *generates* the natural numbers in ascending order, (2) does not make use of side effects (assert, retract etc.), and (3) generates the first n numbers in time O(n).

The obvious solution to (1) and (2) is, of course,

	nat(0).
	nat(X) :- nat(Y), X is Y+1.

which unfortunately runs in O(n**2). A number of people were convinced that this is the best one can manage given the restriction in (2). Some sketched an infeasibility proof, which revealed that they had made one more assumption not included in my list: that the numbers would be generated by *successive calls* to nat. In that case one does need side effects, hence I had eliminated the one crucial prerequisite.

All other contributors had thought up isomorphic versions of the following solution, which generates the numbers on *backtracking* (my intention in the first place):

	nat(X) :- nat1(X,0).

	nat1(X,X).
	nat1(X,Y) :- Z is Y+1, nat1(X,Z).

Well, what about its runtime behaviour? Somebody conjectured that "it should run in O(n) on any decent Prolog implementation". True, but what is decent? When I fiddled around looking for a solution I used an interpreter without any optimization. All my attempts were O(n**2), and the above solution was no exception! In fact it is a good example of the problem of nested calls that I mentioned in the original posting. With each number generated there is one more pending call to nat1, and none of these calls is ever exited for good. This is primarily a space problem (and not my concern in this riddle). But it also means that with each number generated one more variable renaming is needed to arrive at the answer substitution in terms of the variables of the query, and that is a time problem.
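The complexity claims above can be checked empirically. The following is a sketch of mine, not part of the original posting: both generators are instrumented with a non-backtrackable counter so that the number of additions performed while backtracking up to a given value can be observed. The predicates bump/0 and adds_to/3 and the nb_setval/nb_getval global variables are my own additions and assume SWI-Prolog, which of course did not exist in 1987.

```prolog
% Global counter that survives backtracking (SWI-Prolog specific).
:- nb_setval(adds, 0).

bump :- nb_getval(adds, N), N1 is N + 1, nb_setval(adds, N1).

% Naive generator: when a deeper pending call yields a new number,
% the additions in all pending frames above it are re-executed.
nat_naive(0).
nat_naive(X) :- nat_naive(Y), bump, X is Y + 1.

% Accumulator (backtracking) generator from the summary:
% exactly one addition per new number.
nat(X) :- nat1(X, 0).

nat1(X, X).
nat1(X, Y) :- bump, Z is Y + 1, nat1(X, Z).

% adds_to(Gen, N, A): A is the number of additions performed
% until Gen first yields the value N.
adds_to(Gen, N, A) :-
    nb_setval(adds, 0),
    call(Gen, X),
    X =:= N,
    !,
    nb_getval(adds, A).
```

Driving the naive generator up to n costs n(n+1)/2 additions in total (quadratic), while the accumulator version costs exactly n (linear), matching the behaviour described above.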
Seeing this dilemma, I decided to post the question. Frustrated by the experience with the interpreter, I had completely overlooked the second important property of the proposed solution: the definition of nat1 is tail-recursive and can be optimized by almost any Prolog *compiler* in existence. The optimization consists of reusing the local variables of nat1 in the recursive call instead of allocating a fresh set; this removes the overhead of the repeated renamings and, voila, everything runs in linear time!

You see, whether or not a solution to my problem exists hinges on the question of what you want to call a decent Prolog implementation. Funny indeed!

Klaus Noekel
UUCP: ...!mcvax!unido!uklirb!noekel
lee@mulga.oz (Lee Naish) (07/31/87)
In article <29400002@uklirb.UUCP> noekel@uklirb.UUCP writes:
> nat(X) :- nat1(X,0).
>
> nat1(X,X).
> nat1(X,Y) :- Z is Y+1, nat1(X,Z).
>
>All my attempts were O(n**2) and the above solution was no
>exception! In fact it is a good example for the problem of nesting calls
>that I mentioned in the original posting. With each number generated there
>is one more pending call to nat1 and none of these calls are ever exited for
>good. This primarily a space problem

True, without tail recursion optimization there will be N stack frames in use after the Nth number has been returned, i.e. O(N) space.

>It also means that with each number generated one needs one more variable
>renaming to arrive at the answer substitution in terms of the variables of
>the query, and this is a time problem.

It's true that in simple implementations there will be N copies of X around, but unless the implementation is extremely brain damaged, all the copies will point directly to the original variable, not be arranged in a chain of length N. To get the next solution, only the top stack frame is changed. That is, space is O(N), but the time (dereferencing and manipulating the stack) is O(1) to get the next solution and therefore O(N) to get N solutions.

>nat1 is tail-recursive and can be optimized by almost any Prolog *compiler*

That makes the space O(1) but leaves the time complexity the same. Offhand, I can't think of how compilation techniques could improve the time complexity, though there is a constant factor worth a few hardware-years.

Lee Naish
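To round off the thread, here is one way to actually drive the generator under discussion and collect its first N solutions. This is a sketch of mine, not from either posting; the helper first_n/2 relies on the cut being local to the goal of findall/3 (as in ISO and SWI-Prolog).

```prolog
% The backtracking generator from the thread.
nat(X) :- nat1(X, 0).

nat1(X, X).
nat1(X, Y) :- Z is Y + 1, nat1(X, Z).

% Hypothetical helper: collect 0 .. N-1 by backtracking into nat/1 and
% cutting away its remaining choice points once the bound is reached.
% The cut is local to findall/3's goal, so it only prunes nat/1.
first_n(N, Xs) :-
    findall(X, (nat(X), (X < N -> true ; !, fail)), Xs).
```

For example, first_n(5, Xs) binds Xs to the list [0, 1, 2, 3, 4], and by the analysis above does so in time linear in N on an implementation with tail recursion optimization.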