[comp.lang.misc] Coverage of multitasking

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/09/89)

From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
> What do you define as an overview?  Just stating the fact that something 
> like that exists and what it is?  

    The CACM article listed parallel programming as one of the 11
    major topics; I would therefore assume that parallel programming
    would be covered for at least a week.


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/10/89)

In article <6217@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
>From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
>> What do you define as an overview?  Just stating the fact that something 
>> like that exists and what it is?  
>
>    The CACM article listed parallel programming as one of the 11
>    major topics; I would therefore assume that parallel programming
>    would be covered for at least a week.

	Agreed, but not at the beginning level; maybe sophomore year. And then,
	junior and senior level courses can be used for those who are seriously
	interested in it. How are you going to explain to a student who can
	barely understand how to write a decent algorithm, etc., about parallel
	programming?  As somebody has stated, first you crawl, then you walk,
	then you run.  I would classify what you stated as a jog :-)
	Granted, now most of the crawling is done in H.S. :-), but you would
	not want to penalize those who didn't have computer education in H.S.

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not those of my employer.

pardo@june.cs.washington.edu (David Keppel) (08/10/89)

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) writes:
>[Teach parallelism, but not at the beginning; maybe sophomore year.]

I disagree, but I don't know the reasoning behind parallelism as one
of the 11 CACM topics.

Parallelism can be quite complicated.  But it can also be quite
simple.  Consider filters in Un*x.  It is pretty easy to understand
what's going on in `eqn foo | tbl | troff'.  I've seen (and written!)
a fair number of programs where some kind of threads and pipes would
have made writing the program *simpler* by removing a level of
intermediate data structures.
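
To make that last point concrete, here is a minimal sketch (in Go, chosen
purely as an assumption since no language is given above) of the same idea:
each filter stage runs as its own thread of control and streams its output
through a channel, so no intermediate data structure has to be built up
between the stages.

    // Sketch only: a filter pipeline in the spirit of `eqn foo | tbl | troff',
    // using goroutines and channels in place of an intermediate data structure
    // between the stages.
    package main

    import (
        "fmt"
        "strings"
    )

    // produce streams each input line downstream, closing the channel when done.
    func produce(lines []string) <-chan string {
        out := make(chan string)
        go func() {
            defer close(out)
            for _, l := range lines {
                out <- l
            }
        }()
        return out
    }

    // upcase is one "filter": it transforms lines as they arrive,
    // never holding more than one line at a time.
    func upcase(in <-chan string) <-chan string {
        out := make(chan string)
        go func() {
            defer close(out)
            for l := range in {
                out <- strings.ToUpper(l)
            }
        }()
        return out
    }

    func main() {
        // Compose the stages exactly as a shell pipeline composes processes.
        for l := range upcase(produce([]string{"first line", "second line"})) {
            fmt.Println(l)
        }
    }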

	;-D on  ( Opinions move faster than light )  Pardo
-- 
		    pardo@cs.washington.edu
    {rutgers,cornell,ucsd,ubc-cs,tektronix}!uw-beaver!june!pardo

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/10/89)

From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
>>    The CACM article listed parallel programming as one of the 11
>>    major topics; I would therefore assume that parallel programming
>>    would be covered for at least a week.
> 
> Agreed, but not at the beginning level; maybe sophomore year. And then,
> junior and senior level courses can be used for those who are seriously
> interested in it. How are you going to explain to a student who can
> barely understand how to write a decent algorithm, etc., about parallel 
> programming?  

   CACM referred to the *introductory* course, without regard for
   the year in which a student takes it.  You can explain it very
   easily via analogy to human activities; the Producer sends lines
   to the Disassembler, who converts them into characters and passes
   them along to the Formatter, etc...
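
   A rough sketch of that analogy (in Go, again only as an assumed
   illustration language; the stage names are just the ones from the
   analogy above): each worker is its own process, and the channels play
   the role of the hand-offs between them.

    // Sketch of the Producer -> Disassembler -> Formatter analogy above.
    // Each worker is its own goroutine; the channels are the hand-offs.
    package main

    import "fmt"

    func main() {
        lines := make(chan string)
        chars := make(chan rune)

        // Producer: sends whole lines along to the Disassembler.
        go func() {
            defer close(lines)
            for _, l := range []string{"hello", "world"} {
                lines <- l
            }
        }()

        // Disassembler: converts each line into characters for the Formatter.
        go func() {
            defer close(chars)
            for l := range lines {
                for _, c := range l {
                    chars <- c
                }
            }
        }()

        // Formatter: consumes characters and lays them out (here, one per line).
        for c := range chars {
            fmt.Printf("%c\n", c)
        }
    }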


   Bill Wolfe, wtwolfe@hubcap.clemson.edu

jont@cs.qmc.ac.uk (Jon Taylor) (08/12/89)

gene schwartzman writes
>       interested in it. How are you going to explain to a student who can
>       barely understand how to write a decent algorithm, etc., about parallel
>       programming?  As somebody has stated, first you crawl, then you walk,
>       then you run.  I would classify what you stated as a jog :-)

Arguing along the lines of Backus's attack on the von Neumann
intellectual bottleneck, it could be said that you are too fixed in
your ways. Couldn't it be the case that teaching parallel programming
second to sequential programming is putting blinkers on students? Is it
not possible that by teaching them to think of parallel algorithms first
you may get a whole new set of novel answers to programming problems?

Slap me around the face if you think I am dreaming. It's happened
before. I can take it.

Jon Taylor
Dept. of Computer Science,
Queen Mary College,
University of London.
email : jont@uk.ac.qmc.cs

rudolf@neptune.uucp (Jim Rudolf) (08/19/89)

In article <1186@sequent.cs.qmc.ac.uk> jont@cs.qmc.ac.uk (Jon Taylor) writes:
>gene schwartzman writes
>>       ...How are you going to explain to a student who can
>>       barely understand how to write a decent algorithm, etc., about parallel
>>       programming?
>
>...Couldn't it be the case that teaching parallel programming
>second to sequential programming is putting blinkers on students? Is it not
>possible that by teaching them to think of parallel algorithms first you
>may get a whole new set of novel answers to programming problems?

I agree with Jon in concept but not with this particular example.  The
way people learn things best is to compare them to things they already know
or have experienced.  So in the case of teaching parallel algorithms, I
tend to believe that most people think sequentially, and have a difficult
time with parallel concepts.  On the other hand, object-oriented design
can more closely model many real-life problems than imperative languages
can, so yes, I agree with Jon that maybe there is a better approach when
it comes to teaching fundamental concepts.

>Slap me around the face if you think I am dreaming.

Relax!  No slaps are forthcoming.

  -- Jim Rudolf

----------------------------------------------------------------------------
College of Oceanography                              Oregon State University
rudolf@oce.orst.edu    -or-    {tektronix,hp-pcd}!orstcs!oce.orst.edu!rudolf
                      "All opinions herein are mine" 

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/20/89)

From article <12126@orstcs.CS.ORST.EDU>, by rudolf@neptune.uucp (Jim Rudolf):
>>...Couldn't it be the case that teaching parallel programming
>>second to sequential programming is putting blinkers on students? Is it not
>>possible that by teaching them to think of parallel algorithms first you
>>may get a whole new set of novel answers to programming problems?
> 
> I agree with Jon in concept but not with this particular example.  The
> way people learn things best is to compare them to things they already know
> or have experienced.  So in the case of teaching parallel algorithms, I
> tend to believe that most people think sequentially, and have a difficult
> time with parallel concepts.  

    Which they can compare to things they already know or have experienced;
    practically all social activity is an example of parallel computation.
    Corporations get the job done by hiring lots of little sequential
    processors and getting them to perform parallel computation.  On what
    basis do you believe that parallel concepts cannot be (as opposed to
    "haven't been in the past") explained in a natural manner?


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

roger@wraxall.inmos.co.uk (Roger Shepherd) (08/20/89)

In article <12126@orstcs.CS.ORST.EDU> rudolf@oce.orst.edu (Jim Rudolf)
writes: 

>In article <1186@sequent.cs.qmc.ac.uk> jont@cs.qmc.ac.uk (Jon Taylor) writes: 

>> ...Couldn't it be the case that teaching parallel programming 
>> second to sequential programming is putting blinkers on students? Is it not 
>> possible that by teaching them to think of parallel algorithms first you 
>> may get a whole new set of novel answers to programming problems? 

> I tend to believe that most people think sequentially, and have a difficult 
> time with parallel concepts.  On the other hand, object-oriented design 
> can more closely model many real-life problems than imperative languages 
> can, so yes, I agree with Jon that maybe there is a better approach when 
> it comes to teaching fundamental concepts. 

At Inmos we have had quite a lot of experience in trying to convince
people  that it is sensible to use concurrent programming to implement
their systems on our (transputer) products. It has seemed to me that
the people who find the concepts of concurrency hardest to understand
are those who have had a  computer science education; electronic
engineers seem to grasp the concepts very easily. I don't think this is
because one group is cleverer than the other, I think it stems from two
things:

  Firstly, electronic engineers deal with massively concurrent systems
  (32 wires changing at once on a bus!) every day, and know of methods by
  which the design of such systems can be  contemplated - why should
  concurrent programming be difficult?  Many CS graduates seem to think
  that it is hard (mainly because they've been taught that it's hard).

  Secondly, one large part of the skill of the computer scientist is
  taking a problem specification and writing a program to implement it.
  Often in the problems given to CS students there is little ``inherent''
  parallelism, and where there is, most programming languages do not
  allow parallelism to be expressed, so a programmer is trained to become
  very skilled at finding sequential solutions to parallel problems.
  Sometimes programmers become so skilled that they are almost blind to
  the fact that there is parallelism in their problem (``Where's the
  parallelism in a word processor?'').

I think that concurrency/parallel programming should be taught very
early in a CS course. The concepts of process, synchronisation, and
non-determinacy are very simple and should be taught alongside other
concepts such as procedural abstraction, iteration, and conditional
behaviour. This raises the question of which programming language should
be used to teach these concepts, and the only one I know of which
represents them clearly and cheaply enough is occam. Of course, there
are problems, such as the lack of availability of occam implementations
and the fact that occam lacks various facilities (such as recursion)
which also need to be taught to CS students.
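
As a hedged illustration only (in Go rather than occam, since no occam
source appears in this thread), those three concepts might be sketched as
follows; in occam the corresponding constructs would be PAR, channel
communication, and ALT.

    // Sketch only: process, synchronisation, and non-determinacy in Go.
    package main

    import "fmt"

    func main() {
        a := make(chan string) // unbuffered: a send synchronises with its receive
        b := make(chan string)

        // Two concurrent processes (goroutines) running alongside main.
        go func() { a <- "message from process A" }()
        go func() { b <- "message from process B" }()

        // Non-determinacy: select accepts whichever communication is ready;
        // either interleaving of the two messages is a correct execution.
        for i := 0; i < 2; i++ {
            select {
            case m := <-a:
                fmt.Println(m)
            case m := <-b:
                fmt.Println(m)
            }
        }
    }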



Roger Shepherd, INMOS Ltd   JANET:    roger@uk.co.inmos 
1000 Aztec West             UUCP:     ukc!inmos!roger or uunet!inmos-c!roger
Almondsbury                 INTERNET: @col.hp.com:roger@inmos-c
+44 454 616616              ROW:      roger@inmos.co.uk

crowl@cs.rochester.edu (Lawrence Crowl) (08/22/89)

In article <6291@hubcap.clemson.edu>
billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>From article <12126@orstcs.CS.ORST.EDU>, by rudolf@neptune.uucp (Jim Rudolf):
>> The way people learn things best is to compare them to things they already
>> know or have experienced.  So in the case of teaching parallel algorithms,
>> I tend to believe that most people think sequentially, and have a difficult
>> time with parallel concepts.  
>
> Which they can compare to things they already know or have experienced;
> practically all social activity is an example of parallel computation.
> Corporations get the job done by hiring lots of little sequential processors
> and getting them to perform parallel computation.  On what basis do you
> believe that parallel concepts cannot be (as opposed to "haven't been in the
> past") explained in a natural manner?

The concept of parallel work is easy to explain.  The difficult part is
teaching people to manage parallel work correctly.  There are endless
management tools (PERT charts, etc.) that aid managers in coordinating and
scheduling all the parallel activities of their employees.  If managing
parallelism were in any way natural, we would not need the management tools
and would not need to pay managers so much.

While people often manage limited parallelism routinely (cooking two dishes at
once), they need considerable training and experience to manage more.  In
addition, most such tasks have a considerable amount of slop in them.  For
example, turning the heat down on the potatoes and up on the green beans can
be done in any order.  Not as much slop exists in computation.

Also, note that almost all programming is done at the cognitive, or linguistic,
level of thought.  At this level, thinking is almost entirely sequential.
-- 
  Lawrence Crowl		716-275-9499	University of Rochester
		      crowl@cs.rochester.edu	Computer Science Department
...!{allegra,decvax,rutgers}!rochester!crowl	Rochester, New York,  14627

pardo@uw-june.cs.washington.edu (David Keppel) (08/23/89)

>[Ongoing discussion of introducing parallel programming early]

From rec.humor.funny, and reputedly from the New York Times.  N-joy.

	;-D on  ( Ridicule parallelism )  Pardo



Article 1296 of rec.humor.funny:
From: mbr@larch.LCS.MIT.EDU (Mark Reinhold)
Subject: Concurrency in the real world
Date: 21 Aug 89 10:30:04 GMT
Organization: MIT Laboratory for Computer Science
Lines: 13

New York Times, 25 April 1989, in an article on new operating systems for the
IBM PC:

    Real concurrency---in which one program actually continues to function
    while you call up and use another---is more amazing but of small use to the
    average person.  How many programs do you have that take more than a few
    seconds to perform any task?

--
Edited by Brad Templeton.  MAIL, yes MAIL your jokes to funny@looking.ON.CA
Attribute the joke's source if at all possible.  I will reply, mailers willing.

-- 
		    pardo@cs.washington.edu
    {rutgers,cornell,ucsd,ubc-cs,tektronix}!uw-beaver!june!pardo