[comp.parallel] fine->medium->coarse grain

eugene@wilbur.nas.nasa.gov (Eugene N. Miya) (06/05/90)

There is no accepted definition of grain.

It's a loosely (as opposed to tightly) coupled concept (yet another
vague concept).
A feeling, like parallelism.  It's yet another elephant.  I've seen Bell's
papers and was at the talk he gave on this at the History of Workstations
conference in Palo Alto.  Grain existed well before that; see Rumbaugh's
Coarse Grain Dataflow paper in IEEE TOC.  There's also Robbie Babb's
LGDF (Large Grained Data Flow), and so on.  From my programming language
[PL] background, we would say that a large grain takes in a lot of "scope".
I do not think synchronization is the only issue.  Dataflow is usually
regarded as fine grain, and work at the procedure "level" is coarse grain.
With dataflow you have the concept of "throttling" execution (see the
Manchester and SISAL papers).  It gets into scheduling and the OS rather
than traditional PL topics.  This in turn gets into data consistency
and other issues, and we swim around in circles...
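
To make the fine/coarse distinction a little more concrete, here is a
minimal sketch in C with OpenMP (the array size and the function name are
invented purely for illustration; this is one way to draw the line, not an
agreed definition):

    /* Same sum, two grains: fine grain parallelizes (and implicitly
     * synchronizes) per loop iteration; coarse grain runs two
     * procedure-sized tasks and synchronizes once at the end. */
    #include <stdio.h>

    #define N 1000000

    /* Coarse grain: a whole procedure's worth of work per task. */
    static double sum_range(const double *a, int lo, int hi)
    {
        double s = 0.0;
        for (int i = lo; i < hi; i++)
            s += a[i];
        return s;
    }

    int main(void)
    {
        static double a[N];
        for (int i = 0; i < N; i++)
            a[i] = 1.0;

        /* Fine grain: parallelism expressed per loop iteration. */
        double fine = 0.0;
    #pragma omp parallel for reduction(+:fine)
        for (int i = 0; i < N; i++)
            fine += a[i];

        /* Coarse grain: two procedure-level tasks, one join. */
        double s0 = 0.0, s1 = 0.0;
    #pragma omp parallel sections
        {
    #pragma omp section
            s0 = sum_range(a, 0, N / 2);
    #pragma omp section
            s1 = sum_range(a, N / 2, N);
        }

        printf("fine = %.0f  coarse = %.0f\n", fine, s0 + s1);
        return 0;
    }

Both versions compute the same thing; the only difference is how much work
sits between synchronization points, which is roughly what people seem to
mean (loosely) by grain.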

Just another blindman.

--e. nobuo miya, NASA Ames Research Center, eugene@orville.nas.nasa.gov
  {uunet,mailrus,other gateways}!ames!eugene

art@cs.bu.edu (Al Thompson) (06/06/90)

In article <9210@hubcap.clemson.edu> eugene@wilbur.nas.nasa.gov (Eugene N. Miya) writes:
|There is no accepted definition of grain.
|
|It's a loosely (as opposed to tightly) coupled concept (yet another
|vague concept).
|A feeling, like parallelism.  It's yet another elephant.  I've seen Bell's
|papers and was at the talk he gave on this at the History of Workstations
|conference in Palo Alto.  Grain existed well before that; see Rumbaugh's
|Coarse Grain Dataflow paper in IEEE TOC.

But that's the point.  There is no "natural" thing called "grain".  It's
an arbitrary thing.  That's why I cited the Bell and Stone definitions.
They are arbitrary, but they make a certain amount of sense.  Therefore
they are as good as anything else.  Using Bell's numbers, I would not (and I
hope no other serious scientist would) consider a machine that synched at 19
instructions to be fine grained while one that synched at 21 is medium
grained.  It doesn't make any sense except to the anally pedantic.  If you
are going to have terms like "grain" floating around, it's a good idea to
have AGREED
(not natural, whatever that means) definitions.  Agreed, simply so that
when we encounter the terms in isolation from other scientists we will
"know" what they mean.

|There's also Robbie Babb's LGDF (Large Grained Data Flow), and so on.  From
|my programming language [PL] background, we would say that a large grain
|takes in a lot of "scope".
|I do not think synchronization is the only issue.  Dataflow is usually
|regarded as fine grain, and work at the procedure "level" is coarse grain.
|With dataflow you have the concept of "throttling" execution (see the
|Manchester and SISAL papers).  It gets into scheduling and the OS rather
|than traditional PL topics.  This in turn gets into data consistency
|and other issues, and we swim around in circles...

This paragraph is one of the best arguments for some sort of arbitrary
definition that I have ever read.  Again, we all have a sense of what the
term means, so it's a good idea to agree on some definition.  If we don't,
we'll all be out there, like Humpty Dumpty, using words to mean whatever
we want them to mean whenever we want them to mean it.

One of the problems in computing is this lack of well-defined terms.  As a
scientist trained in another discipline (biophysics), I am continually
appalled at the lexical anarchy in the computing "sciences".  In fact I
was so startled by this that I undertook a study of just what it is that
constitutes "knowledge" from the point of view of an established science.
What comes through clearly is the newness of computing.  This is
particularly so if you consider the historical need for what computing can
do for us (think of those medieval scribes calculating tables of logs).
Probably the best reference on this is T. S. Kuhn's "The Structure of
Scientific Revolutions".  He makes the important point that you know a
true paradigm has been defined when the basic definitions appear in the
"standard" textbooks.  This has the effect of schooling a generation of
scientists who think that the words they use are somehow invariant.  This
is not true, but it does start them off that way.  Only later when they
are faced with paradigm crisis and shift do they truly confront this
issue.  I suggest that that is exactly where "computer science", "computer
engineering", and "computational science" (the new kid on the block) are
today.  If you don't believe me, look at the textbooks; in particular, look
for agreement from author to author.  The more agreement you see, the
closer we are.  Now, as an exercise, look at the books of twenty years
ago and repeat the comparison.

|
|Just another blindman.

Then put on glasses.  Any glasses will do as long as they are ground to
the same prescription rules as everybody else's.