[comp.parallel] Forwarding of a post: subject: APL

eugene@eos.arc.nasa.gov (Eugene Miya) (05/23/89)

I received a kind of neat posting on languages for parallelism
and vectorism.  Unfortunately, the author isn't able to post.
I think he makes interesting comments on APL (which perhaps
deserves a second look).

--eugene
[complete: (please reply to author or follow ups to comp.parallel)]

Subject:      Parallelism/Vectorism
To: eos.arc.nasa.gov!eugene@cs.utexas.edu
From: uunet.UU.NET!tmsoft!loc@cs.utexas.edu (Leigh Clayton)
X-To:         Eugene Miya
Organization: I. P. Sharp Associates
X-Mail:       1900/2 First Canadian Place,
              Toronto, Canada, M5X 1E3
X-Telephone:  +1 (416) 364-5361
X-Fax:        +1 (416) 364-2910
X-Telex:      0622259
Date:         17 May 89 16:14:59 UT

 I think I may be fairly accused of prejudice in favour of APL, since I've
spent the last 16 years prodding an implementation of it along (Sharp APL,
in case you can't guess from my header).

 No language that I know of is perfect (except perhaps CDC 6400 COMPASS :-)
but there are certainly several features of 'modern' APL that I believe at
least point the way towards effective use of scalable architectures. By
'modern' APL I mean APL as Ken (Iverson) now sees it, which is to say *not*
IBM/APL2, but the 'Dictionary' APL as the APL community usually calls it.


 - Generalised operator/function syntax. This is something APL has always
   had, but its value and generality have not been fully used or recognised
   until quite recently. The number of 'operators' (that is, meta-functions
   whose arguments and results are functions, the archetypal one being
   "/": in "+/" the operator "/" takes "+" as its argument and produces
   "sum reduction" (usually called "summation" by non-APLers) as its
   result) [pardon the long parenthesis] has increased quite radically in
   the last 5 years or so, partly to deal with the other points below -
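
   (To sketch that for non-APLers, here is a rough Python analogy -- this
   is Python, not APL, and the names "slash" and "reduction" are my own
   invention. The point is that an operator is a function whose argument
   and result are both functions.)

```python
# Rough Python analogy (not APL) for the "/" operator: a higher-order
# function that takes a dyadic function such as "+" and returns a new
# derived function, "sum reduction", applied over a whole vector.
from functools import reduce
import operator

def slash(f):
    """Plays the role of the APL operator "/": f -> f-reduction."""
    def reduction(vector):
        return reduce(f, vector)
    return reduction

plus_reduce = slash(operator.add)   # the analogue of "+/"
print(plus_reduce([1, 2, 3, 4]))    # 10, as in +/1 2 3 4
```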

 - Boxed arrays. Since the structure of APL is embodied much more within
   its data structures than within its programs (i.e. it is more like LISP
   than PASCAL), it made sense to extend the levels of structure for APL
   objects, and general arrays essentially allow an arbitrary level of
   data hierarchy orthogonal to APL's traditional n-dimensional structuring.
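
   (Again a rough Python analogy, with names of my own choosing: nested
   lists give the same kind of hierarchy that is independent of shape.)

```python
# Rough Python analogy (not APL) for boxed arrays: each element of the
# outer list is a "box" that may hold an array of any shape or depth,
# a level of hierarchy independent of rectangular dimensions.
boxed = [
    [1, 2, 3],            # a box holding a 3-element vector
    [[1, 2], [3, 4]],     # a box holding a 2x2 matrix
    42,                   # a box holding a scalar
]
# Working "through" the boxes means mapping over the outer level:
lengths = [len(b) if isinstance(b, list) else 0 for b in boxed]
print(lengths)  # [3, 2, 0]
```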

 - Function rank. This is probably the most important change to APL formalism
   in the last ten or fifteen years, and it complements and completes the
   notion of operators. Basically, each function is defined to have an
   implicit rank, which describes the shape of the arguments it works on.
   Thus the rank of "+" is 0, meaning it works on rank-0 operands (scalars).
   Iota is defined to have a right rank of 0 (i.e. its right argument is a
   scalar) and a left rank of 1 (its left argument is a vector/list). The
   power of this simple notion comes from two things:

     - If the rank of the argument is larger than the function rank, the
       function is applied (potentially in parallel) to *each* properly-
       ranked structure (which we call a cell) within the argument (i.e., as
       Iota's left argument, a matrix is a collection of vector cells).

     - There is an operator to explicitly state the rank of a function,
       which allows explicit control of the application of a function
       across a structure.
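
 (Here is a small Python sketch of the cell idea -- the helper names
 "rank_of" and "apply_rank" are mine, and nested lists stand in for APL
 arrays. With rank 1, a matrix is treated as a collection of vector
 cells, and each per-cell application is independent of the others.)

```python
# Python sketch (not APL) of applying a function to each rank-k cell.
def rank_of(a):
    """Depth of nesting: 0 for a scalar, 1 for a vector, 2 for a matrix."""
    r = 0
    while isinstance(a, list):
        r += 1
        a = a[0]
    return r

def apply_rank(f, k, a):
    """Apply f to each rank-k cell of a (the explicit rank operator)."""
    if rank_of(a) <= k:
        return f(a)
    return [apply_rank(f, k, cell) for cell in a]

matrix = [[3, 1, 2], [9, 7, 8]]
print(apply_rank(sum, 1, matrix))              # [6, 24]: one sum per vector cell
print(apply_rank(lambda x: x + 1, 0, matrix))  # applied to each scalar cell
```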

 All implementations of APL that I am currently aware of are (essentially)
single-processor implementations (Sharp APL runs on large IBM multi-CPU
systems, but a given interpreter only uses a single CPU at a time). But
APL could be effectively used to exploit massive parallelism (at least of
the MIMD sort) by at least the two following strategies:

 - Each cell within any single function invocation can be done separately

 - An expression such as   foo"dual">x   ("dual" is written as a dieresis)
   requests that the function "foo" be evaluated for each sub-box of the
   array "x"; these evaluations can be done independently.
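
 (A hedged Python sketch of that second strategy: a thread pool stands in
 for the MIMD machine, and "foo" and "x" are just illustrative names.)

```python
# Python sketch (not APL): evaluating "foo" for each sub-box of "x"
# independently, with a thread pool playing the MIMD machine.
from concurrent.futures import ThreadPoolExecutor

def foo(box):
    return sum(box)            # any per-box computation

x = [[1, 2, 3], [4, 5], [6]]   # a "boxed" array: three sub-boxes

with ThreadPoolExecutor() as pool:
    results = list(pool.map(foo, x))  # each box evaluated independently
print(results)  # [6, 9, 6]
```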

 And I'm sure that, once that problem is seriously studied, there will
turn out to be lots of other opportunities.

 As for APL itself, it is alive and well in a few places (investment banking
is its current stronghold, believe it or not) where actual programming is
going on. Why it isn't used by more research scientists is something I've
never understood, since the criticism most commonly heard outside of Comp
Sci departments is that it looks too mathematical, something that should not
frighten (say) a physicist. Of course, it is greatly disliked by all disciples
of Dijkstra, apparently because its structure is hidden rather than being
spelled out in English (or Dutch) words. This means that few Computer
Scientists (I have two C.Sc. degrees, by the way) will give it the time of
day, and it isn't taught in many University classes. Sigh.

 Sorry for the long posting; I don't have a news poster, so I sent it just to
you. It is a bit long for general posting anyway, though you can repost it or
parts of it if you feel inclined to do so.


-------------------------------------------------------------------
- My employers have even less idea what I mean than I do; neither -
- they nor I should be held accountable for statements made here. -
-------------------------------------------------------------------
  Leigh Clayton,                        loc@tmsoft.UUCP