[comp.lang.c] Optimization in a better world

chase@Ozona.orc.olivetti.com (David Chase) (06/01/88)

In article <15422@brl-adm.ARPA> dsill@nswc-oas.arpa (Dave Sill) writes:
>From: Steven Ryan <smryan@garth.uucp>
>>The point being, do not expect magic from a compiler. It can provide at best
>>linear improvement.
>
>Since when?  [examples of better than linear improvement, some silly]

Agreed, but a much more plausible possible optimization is the
introduction of memo-functions.  This is not a completely trivial
transformation (figuring out when to apply it is the hard part,
really), but use of memo-functions will convert exponential-time
algorithms (for example, f(x) = if x < 2 then x else f(x-1) + f(x-2))
into polynomial-time algorithms.  For many structured problems you can
do this by hand and call it "dynamic programming", but that is only a
special case.
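
To make the idea concrete, here is a rough sketch in C of what a
memo-function for that recurrence might look like.  The table size,
the zero-as-"not computed" convention, and the names are my own
assumptions, not anything a compiler would necessarily generate:

    #include <stdio.h>

    #define MAXN 64

    static long memo[MAXN];        /* memo[i] == 0 means "not computed yet" */

    long f(int x)
    {
        if (x < 2)
            return x;              /* base cases */
        if (memo[x] != 0)
            return memo[x];        /* answer already tabulated: reuse it */
        memo[x] = f(x - 1) + f(x - 2);
        return memo[x];
    }

    int main(void)
    {
        printf("%ld\n", f(40));    /* quick with the table; exponential without */
        return 0;
    }

The hard part for a compiler, as noted above, is proving that f has no
side effects and always returns the same result for the same argument,
so that the table lookup is safe.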

Other forms of this optimization have been discussed for functional
languages--see John Hughes, "Lazy Memo Functions" in S-V LNCS #201,
Kieburtz and Schultis, "Transformations of FP program schemes" in the
1981 Conf. on Functional Programming and Computer Architectures, and
(I think) a paper by Norman Cohen that appeared in the last couple of
years in TOPLAS.

(Proposed) APL compilers used pattern matching to discover common
"idioms" (Perlis's term) that could be efficiently compiled.  The
expected speedup obtained was "only" linear, but the factor was
typically 10 or 100.  There was a technical report on this out of Iowa
State University in 1982 by Cheng, Omdahl, and Strawn--there were
still problems remaining to be solved back then, but some of those
problems have since been solved.

David Chase
Olivetti Research Center, Menlo Park

smryan@garth.UUCP (Steven Ryan) (06/01/88)

>Agreed, but a much more plausible possible optimization is the
>introduction of memo-functions.  This is not a completely trivial

Unfortunately, we are talking about C with side effects and aliases and
all those other nasty-nasties that give optimisers fits.
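
For instance (a made-up fragment, names mine), before memoizing the call
below a compiler would have to prove that neither the global nor the
object behind the pointer changes between calls:

    int scale;                     /* hidden input: global state the function reads */

    int f(int *p)
    {
        *p += 1;                   /* side effect on the caller's object */
        return *p * scale;         /* result also depends on the global */
    }

Two calls f(&n) with the same n need not return the same value, so
caching the result is only legal if the function can be shown to be
pure--which aliases and side effects make very hard in C.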