[comp.parallel] hold off on posting this a bit, but you can respond yourself

eugene@wilbur.nas.nasa.gov (Eugene N. Miya) (01/04/91)

My earlier list of suggested readings on parallelism is obsolete.
It's a new decade, so I am throwing out all the old responses
and starting anew; past recommendations have been discarded.
It is time to make pages 598-599 of Hennessy and Patterson obsolete.

I want to ask all colleagues for their recommendations, not for me,
but for all the eager students and researchers who need some guidance
getting started.

The yearly question is:
Suggest up to ten required readings for a 1st or 2nd year grad student
for an introductory class (up to a year in length) on parallel processing.
Incomplete references (senior author and date, journal) are sufficient.
[Yeah, you can submit 11 or 12 or more, but they should be a good start;
prioritize if you like.]

Results tallied across all "required" lists will yield a "top ten"
followed by a recommended 100.  Books or articles are acceptable.

Send any specific annotations on likes or dislikes, if you wish
to include them.

END REQUEST.

Right now, you can search my biblio for the keywords "grequired" and
"grecommended"; real words tend to get used in titles.  Last time
I counted anonymously; this time I will use a separate key field for
voter initials.

When I began this list, I was more a software engineer than a
parallel processor.  I've learned a few things since that time.
In coming up with my own list, I am dissatisfied: ten is too few
for this difficult topic.  At the risk of introducing a bias,
I present my ten and why.  As editor, I reserve the right
to note additional good references on certain topics.

NOTE: I am skeptical of "Me too" lists unless I get explanations.

My suggested refs:

Russell's Cray-1 paper: classic historical material, the importance of
balance, and vectorization as an important form of parallelism.  Needs
supplemental material.  Distinct from those VAXen, PCs, and IBM machines.
But the X-MP and Y-MP are "cleaned up architectures."

Hillis's CACM Connection Machine paper ("massive parallelism" is an
important concept, but the paper lacks Ivan Sutherland's input).  The
book is okay, too.

Seitz's hypercube paper (important for local-memory connectivity; has an
"interesting" (difficult?) message-passing C example).  This software
example is necessary to expose people to simple message passing.
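
[For anyone who has never seen message passing at all, here is a minimal
sketch in C.  It is not Seitz's example or the Cosmic Cube library: a
Unix pipe stands in for the node-to-node channel, and the node numbering
is invented purely for illustration.]

/*
 * Minimal message-passing sketch: the parent acts as "node 0" (sender),
 * the child as "node 1" (receiver), and a pipe plays the channel.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

int main(void)
{
    int link[2];                 /* link[0]: read end, link[1]: write end */
    char msg[] = "hello from node 0";
    char buf[64];
    ssize_t n;

    if (pipe(link) == -1) {
        perror("pipe");
        exit(1);
    }

    if (fork() == 0) {           /* child = "node 1": receive and print */
        close(link[1]);
        n = read(link[0], buf, sizeof(buf));
        if (n <= 0) {
            perror("read");
            exit(1);
        }
        printf("node 1 received: %s\n", buf);
        exit(0);
    }

    /* parent = "node 0": send the message, then wait for the receiver */
    close(link[0]);
    write(link[1], msg, strlen(msg) + 1);    /* send the trailing '\0' too */
    close(link[1]);
    wait(NULL);
    return 0;
}

[The real thing adds routing, buffering, and many nodes, but the
send/receive pattern is the same.]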

Almasi and Gottlieb, Highly Parallel Computing,
can be light-hearted.  I think this covers Ultracomputers well enough.

Treleaven's Computing Surveys paper on dataflow.  A few people need to
have their von Neumann thinking stirred up.  This is a reasonable start.

Lamport's CACM paper on time, clocks, and the ordering of events in a
distributed system.
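
[For context, the core mechanism of that paper, the logical clock, fits
in a few lines.  The sketch below is mine, not Lamport's code; the
process structure and function names are invented for illustration.
Each process ticks a counter on every local event, stamps outgoing
messages with it, and on receipt jumps to max(local, received) + 1.]

/*
 * Minimal sketch of Lamport logical clocks.
 */
#include <stdio.h>

struct process {
    int id;
    int clock;                        /* Lamport logical clock */
};

static int local_event(struct process *p)
{
    return ++p->clock;                /* tick on every local event */
}

static int send_event(struct process *p)
{
    return ++p->clock;                /* timestamp carried by the message */
}

static void recv_event(struct process *p, int msg_timestamp)
{
    if (msg_timestamp > p->clock)     /* jump ahead if the sender is ahead */
        p->clock = msg_timestamp;
    ++p->clock;                       /* the receive itself is an event */
}

int main(void)
{
    struct process a = { 0, 0 }, b = { 1, 0 };
    int ts;

    local_event(&a);                  /* a: clock = 1 */
    ts = send_event(&a);              /* a: clock = 2, message stamped 2 */
    local_event(&b);                  /* b: clock = 1 */
    recv_event(&b, ts);               /* b: clock = max(1, 2) + 1 = 3 */

    printf("a = %d, b = %d\n", a.clock, b.clock);
    return 0;
}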

The IEEE collection of old papers on parallel processing (already
on the existing list).

Akl or Ortega and Voigt as a book on parallel algorithms.

Babb's book on Programming Parallel Processors.

Compilers: the work of Kuck or Kennedy.  I want to mull over the
suggested references here first.

NOTES: My list is too hardware-oriented; it needs more software and
algorithms.  No grand OS papers: sigh!

What I wish I had:
Parallel debugging [lack of expertise]
Interconnection networks: Siegel, 1979 [lack of space on my own list]
Papers critiquing past or existing architectures (ILLIAC IV, MPP, etc.)

So this is an example of the kind of suggestions I am looking for.

I will post the tallied results after 2 weeks or so.