[comp.lang.misc] Which language to teach first?

lacey@batcomputer.tn.cornell.edu (John Lacey) (07/29/89)

My school is currently using an old VAX (11/750) and VAX Pascal in its
CS courses.  In the last 2 years, one of the professors (the best one
:-) ) has offered a course using Abelson & Sussman^2 as the text, with
TI's PC Scheme on 8088/MS-DOS machines.  This course, however, is offered
as an upper level elective.

At this year's SIGCSE conference, there was a talk about using Lisp, and in
particular Scheme, as a first programming language, that is, in the 
CS1 and CS2 courses.

I am a senior, a member of the university's computer policy committee,
chair of the student math/cs board, and founder of the TeX Users Group.
The math/cs board, together with the department faculty, is looking at
replacing Pascal (and perhaps the VAX) as the main programming language.
I would be interested in hearing from everyone about what they think of
such a move, and what language they feel is the best to use.

My own prejudices are to use Scheme.  Another choice, considerably more
conservative, would be Modula-2 or Oberon.  Ada, to my own taste, is 
completely out of the picture.

What say all of you?

P.S.   What about comp.lang.paradigms?   I would be very interested in a
discussion about the usefulness of particular paradigms, especially as 
related to their effectiveness as teaching instruments. (Professionals
are usually adept enough to fit the correct (or one of the better) paradigms
(and a particular language associated with it) to the job at hand.)

-- 
John Lacey                      |     cornell!batcomputer!lacey
lacey@tcgould.tn.cornell.edu    |     lacey@crnlthry.bitnet

manis@grads.cs.ubc.ca (Vincent Manis) (07/29/89)

In article <8514@batcomputer.tn.cornell.edu>
lacey@tcgould.tn.cornell.edu (John Lacey) writes:

>My own prejudices are to use Scheme.  Another choice, considerably more
>conservative, would be Modula-2 or Oberon.  Ada, to my own taste, is 
>completely out of the picture.

I think that deciding upon the language to use first is most definitely
the wrong paradigm. Imagine deciding first whether to use alternating or
direct current when teaching physics! You should look at what you want
your course to accomplish before deciding upon the particular tools to
use. 

A first-year major-level science course ought to try to present the
discipline as an integrated whole, covering not just the current
consensus, but also the historical process by which that consensus was
developed, and the questions which researchers in the field are
currently investigating. It should also try to lure outstanding students
into that field.

Teaching programming is clearly going to address none of these issues,
as the report of the ACM Task Force on the Core of Computer Science (Jan
CACM, I believe) states forcefully. Rather, we should concentrate upon
characterising what computer science is, which in my mind means treating
such concepts as abstraction and automata fully, and demonstrating how
these concepts intertwine. Applications are important, but they should
be significant ones, not just producing paycheques (mea culpa! mea culpa
maxima!). 

Not surprisingly, I consider Abelson and Sussman a seminal book in the
field, because it does exactly what I have stated. Abelson and Sussman
happen to use Scheme, and therefore, if one is using their book, the
laboratory work (*not* programming assignments!) would be done in
Scheme. I can imagine using Prolog or Oberon, but Scheme is already such
a good candidate that there seems no point. 

We at UBC expect to switch over to Abelson and Sussman fully in the fall
of 1990 for our major students. The final negotiations for getting
University approval are at present just getting started. 

There is, however, the non-major population, who clearly neither want
nor need Abelson and Sussman. With this group, programming is clearly
not the major issue, and Pascal is quite suitable. We're looking at the
UC Berkeley computer literacy course, `Computing Unbound', to serve
this group. 
____________  Vincent Manis                    | manis@cs.ubc.ca
___ \  _____  The Invisible City of Kitezh     | manis@cs.ubc.cdn
____ \  ____  Department of Computer Science   | manis%cs.ubc@relay.cs.net
___  /\  ___  University of British Columbia   | uunet!ubc-cs!manis
__  /  \  __  Vancouver, BC, Canada V6T 1W5    | (604) 228-2394
_  / __ \  _  "There is no law that vulgarity and literary excellence cannot
____________   coexist."               -- A. Trevor Hodge
              

ron@woan.austin.IBM.COM (07/29/89)

Cal Berkeley switched over to Scheme a few years ago for its intro
class, and I found that it was a better first language than the
ALGOL derivatives (Pascal, Fortran, C) because it teaches
functional programming and recursion, as well as data structures,
at least if the instructor follows Abelson and Sussman.

Then again, any language is probably fine if your instructors
are willing to make the extra effort to introduce those topics
early on, so students will be less confused later.
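
To make "early on" concrete, here is the flavor of it in Scheme (a
minimal sketch of my own, not taken from the Berkeley materials):

    ; Recursion first: sum a list by taking it apart.
    (define (sum-list lst)
      (if (null? lst)
          0
          (+ (car lst) (sum-list (cdr lst)))))

    ; Then the same idea abstracted into a higher-order procedure.
    (define (fold combine initial lst)
      (if (null? lst)
          initial
          (combine (car lst) (fold combine initial (cdr lst)))))

    (sum-list '(1 2 3 4))    ; => 10
    (fold + 0 '(1 2 3 4))    ; => 10, the same computation
    (fold * 1 '(1 2 3 4))    ; => 24, a different one for free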

--

*********************************************************************************
*                         Ronald S. Woan     at IBM-Austin                      *
*                 ron@woan.austin.ibm.COM    WOAN AT AUSVMV                     *
*                             Division 75    Department E63                     *
*********************************************************************************

tka4092@athena.mit.edu (Terry Alkasab) (07/31/89)

In article <8514@batcomputer.tn.cornell.edu> lacey@tcgould.tn.cornell.edu (John Lacey) writes:
>My school is currently using an old VAX (11/750) and VAX Pascal in its
>CS courses.  In the last 2 years, one of the professors (the best one
>:-) ) has offered a course using Abelson & Sussman^2 as the text, with
>TI's PC Scheme on 8088/MS-DOS machines.  This course, however, is offered
>as an upper level elective.
>
>At this year's SIGCSE conference, there was a talk about using Lisp, and in
>particular Scheme, as a first programming language, that is, in the 
>CS1 and CS2 courses.
>
>What say all of you?
>

        I just finished my freshman year at MIT, and though I am not a
Computer Science major, I took the first class in the Computer Science
sequence (6.001: Structure and Interpretation of Computer Programs,
taught by Hal Abelson :-)!!).  Many freshmen do; perhaps half the
class is non-majors (there are perhaps a couple hundred in the class).
A *huge* percentage of the class already had programming experience
(in whatever language:  BASIC, Pascal, C, or what have you) and thus
were not learning *programming* per se, but programming *theory*.  As
an interesting note, on the first day, the professor asked how many
students had no previous programming experience.  When a smattering of
the class raised its hand, the professor reassured them saying that
having no experience might prove beneficial, since learning the stuff
he was about to teach would require many of their classmates to
unlearn stuff they had long since learned.  This was supported by a
friend of mine who had problems with the class at first because she
said she was trying to translate everything she did into BASIC.

	Naturally, the class used Abelson & Sussman, and it was quite
an interesting experience.  Many of the concepts I was asked to deal
with, many of (what I understand to be) the important concepts in
computer science were introduced in a deft manner through the use of
Scheme.  I could probably translate these ideas into use in other
languages (in fact, I do on a daily basis), but trying to introduce
them in, say, Pascal...I think it would have been a whole lot harder.
Further, Scheme is a language whose syntax is quite simple, and is *very* easy
to pick up.  The idea behind the class was *not* "Let's learn
*another* programming language!" but "Let's learn computer science."
And if a language is chosen which requires more attention than the
concepts it is meant to introduce, then the entire purpose of the
exercise is being defeated.
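
To show how little syntax there is to fight with, a tiny sketch of my
own (not an excerpt from 6.001): essentially everything is either an
atom or a parenthesized application, so a complete program looks like:

    ; One uniform shape: (operator operand ...)
    (define (square x) (* x x))
    (define (sum-of-squares x y) (+ (square x) (square y)))

    (sum-of-squares 3 4)   ; => 25, and that is nearly all the syntax there is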

	My conclusion:  as an introduction to computers, Scheme has
nothing which immediately recommends it, IMHO.  However, as an
introduction to *computer science*, Scheme worked extremely well in my
case.  (Truth to tell, I kind of like the language.  Lots of fun stuff
you can do!)

--Terry

"Know thyself."
     --Socrates
"No thyself."
     --Zen's answer to Socrates

DISCLAIMER:  If MIT doesn't like what I say, it's their own damn fault
             for letting me have an opinion.

   Terry "Dweebie" Alkasab

gentile@horsey.dec.com (Sam Gentile) (08/01/89)

In article <13158@bloom-beacon.MIT.EDU>, tka4092@athena.mit.edu (Terry Alkasab) writes...
>In article <8514@batcomputer.tn.cornell.edu> lacey@tcgould.tn.cornell.edu (John Lacey) writes:
>>My school is currently using an old VAX (11/750) and VAX Pascal in its
>>CS courses.  In the last 2 years, one of the professors (the best one
>>:-) ) has offered a course using Abelson & Sussman^2 as the text, with
>>TI's PC Scheme on 8088/MS-DOS machines.  This course, however, is offered
>>as an upper level elective.
>>
>>At this year's SIGCSE conference, there was a talk about using Lisp, and in 
>>particular Scheme, as a first programming language, that is, in the 
>>CS1 and CS2 courses.
>>
>>What say all of you?
>>
	I have been a Software Engineer for 4 years now, first in the DOD world
at Raytheon and now working on network applications. I have written the
majority of my code (90%) in C, with the remainder being Raytheon assembly,
MACRO-32, DCL and FORTRAN. The first language I was taught as a Computer
Engineering student was FORTRAN, and then I learned DEC MACRO-32.
	Although I love C and do most of my work in C, I don't think it is a
good first language for a student to learn. I found C very difficult at first
and had a lot of problems with it. Also, some C coders write code that is
completely unreadable. I think BASIC should be abolished and certainly not
taught as a first language; it gives people very bad first habits. I don't
think FORTRAN is a good first language either. I think Pascal is still the
ideal first language for a student. It will expose the student to the
concept of pointers in a more friendly way than C and teach structured
programming habits.
	These are just my opinions on the subject. I would like to hear 
what other people have to say on the subject.

Sam Gentile			gentile%horsey.dec@decwrl.dec.com
Software Engineer		decwrl!horsey.dec.com!gentile
Digital Equipment Corp	
Software Services Engineering - Network Engineering
5 Wentworth Drive GSF1-1/G13, Hudson NH 03051
-----------------------------------------------------------------------------
		DEC is Number ONE in Networking!!!
-----------------------------------------------------------------------------
The views expressed are totally my own and do not reflect the views of
Digital Equipment Corp.

djones@megatest.UUCP (Dave Jones) (08/01/89)

From article <8514@batcomputer.tn.cornell.edu>, by lacey@batcomputer.tn.cornell.edu (John Lacey):
...
> The math/cs board, together with the department faculty, is looking at
> replacing Pascal (and perhaps the VAX) as the main programming language.

So far, so good.

> I would be interested in hearing from everyone about what they think of
> such a move, and what language they feel is the best to use.
> 

I have some limited experience in this matter, having taught several
introductory CS courses at the freshman and sophomore level during
a short stint as a visiting prof at a medium-large midwestern university.

It is my considered opinion, arrived at after much thought, that it
depends. [He waits for the laughter to subside...]

I think there should be two courses: one for CS majors, one for
non-majors. Teach the course for non-majors in a relatively high
level language. Teach the course for CS majors beginning with a
toy assembly language, then moving to C, and then (only then)
considering LISP variants, unification languages, etc.

I don't hold with the popular notion that a beginning programmer
will be ruined by understanding how computers really work,
or that his brain will misfire if he finds out what the "high level"
languages really do, and why. The idea that the student will "form
bad habits" is specious. The student will better understand the
higher constructs, having rediscovered why they were invented.
I think teaching intro to CS starting with a language which has
embedded lookup-tables and automatic garbage collection is very
confusing to the student.

> 
> ...
> 
> P.S.   What about comp.lang.paradigms? 
>

I am weary of the way computer jargon changes the meanings of
perfectly good English words.

  paradigm: EXAMPLE, PATTERN; esp. an outstandingly clear
  or typical example or archetype.

	 -- [Merriam] Webster's Ninth New Collegiate Dictionary

Is that what you mean?

reggie@dinsdale.nm.paradyne.com (George W. Leach) (08/01/89)

In article <13158@bloom-beacon.MIT.EDU> tka4092@athena.mit.edu (Terry Alkasab) writes:

>        I just finished my freshman year at MIT.....

>A *huge* percentage of the class already had programming experience
>(in whatever language:  BASIC, Pascal, C, or what have you) and thus
>were not learning *programming* per se, but programming *theory*.

       When I left high school nearly 15 years ago and began my freshman
year, I too had some programming experience.  We had a course in BASIC in
high school that was only in its second year when I took it.  While the
teacher who instituted the course was an excellent math teacher, he was
not qualified to teach this course.  Consequently many bad habits were
picked up that had to be broken in an introductory CS course.  I would
imagine that this is a far more serious problem these days with the
proliferation of PCs and inexpensive compilers.  The availability of
programming courses at the primary and secondary school system level
must have exploded during the past decade.  I know that my younger 
brothers had more hardware and courses available to them than I did in
my high school days.  The large base of self-taught hobbyists also
contributes towards the problem.

       Computing is still a young discipline.  There is a huge demand
in industry for the skill of programming a computer, and it must be filled.
There are also too many programs out there calling themselves Computer
Science that do nothing more than teach programming and different languages.
This is especially true in the two year college programs.  Certainly the
situation has improved over the 70's, but we still have a long way to go.


>As an interesting note, on the first day, the professor asked how many
>students had no previous programming experience.  When a smattering of
>the class raised its hand, the professor reassured them saying that
>having no experience might prove beneficial, since learning the stuff
>he was about to teach would require many of their classmates to
>unlearn stuff they had long since learned.  This was supported by a
>friend of mine who had problems with the class at first because she
>said she was trying to translate everything she did into BASIC.

     Many studies of programmer behavior have indicated that the novice
programmer thinks in terms of a specific language syntax, while with
experience comes more abstract thinking, without worrying about implementation
language details.  The trick is to teach an introductory course that can
avoid this pitfall.  Most intro courses concentrate too much on an introduction
to a specific programming language syntax.  One of the goals of the course is
to enable a student to utilize this language in any of the advanced courses
that will be encountered further down the road.  Often the focus is on the
programming language, with little emphasis on the theory behind computer 
science.


George W. Leach					AT&T Paradyne 
(uunet|att)!pdn!reggie				Mail stop LG-133
Phone: 1-813-530-2376				P.O. Box 2826
FAX: 1-813-530-8224				Largo, FL  USA  34649-2826

alanm@cognos.UUCP (Alan Myrvold) (08/01/89)

In article <8514@batcomputer.tn.cornell.edu> lacey@tcgould.tn.cornell.edu 
 (John Lacey) writes:
>My school is currently using an old VAX (11/750) and VAX Pascal in its
>CS courses.  In the last 2 years, one of the professors (the best one
>:-) ) has offered a course using Abelson & Sussman^2 as the text, with
>TI's PC Scheme on 8088/MS-DOS machines.  This course, however, is offered
>as an upper level elective.

I would say that Scheme is the ideal language to learn first, but
having purchased TI's PC Scheme for my own 8088/MS-DOS machine, I 
might argue that the performance (even at 10MHz with a hard disk)
makes the language nearly unusable --- it really deserves a fast
'386 box -- or a workstation.

I'd personally like to see Pascal disappear as a first programming
language .... omitting support for separate compilation of units
means that it's tough to talk about writing and using subroutine
libraries (I'd rather teach Fortran or C!!!). Writing code that
others can use, and using other people's code, MUST be taught 
early.

I've heard folk who'd like to see APL as the first language learned 
... but I find my own APL code hard to decipher after a few hours
(even with liberal use of lamps!).

                                          - Alan

A subject who is truly loyal to the Chief Magistrate will neither advise
nor submit to arbitrary measures. JUNIUS
---
Alan Myrvold          3755 Riverside Dr.     uunet!mitel!sce!cognos!alanm
Cognos Incorporated   P.O. Box 9707          alanm@cognos.uucp
(613) 738-1440 x5530  Ottawa, Ontario       
                      CANADA  K1G 3Z4       

tneff@bfmny0.UUCP (Tom Neff) (08/01/89)

There is more than one reason to learn a programming language.  Some
will be theoreticians, some will be systems wankers like myself, some
will be applications drones.  What you want for a "cherry" programming
language is something that will give each of these groups something
rewarding and revealing in terms of their later track.  After that, they
should split up and use more specifically appropriate languages.

The most important thing is LEAVING OUT spurious or unhelpful concepts,
like line numbers in BASIC or pointers to functions returning arrays of
structures containing pointers to functions returning... in C, or about
half of PL/I. :-)  The simpler the better for an introductory language.
All you really need to communicate to people is that a computer is
something that does what you tell it to do.
-- 
"We walked on the moon --	((	Tom Neff
	you be polite"		 )) 	tneff@bfmny0.UU.NET

mccalpin@masig3.ocean.fsu.edu (John D. McCalpin) (08/02/89)

In <see above> reggie@dinsdale.nm.paradyne.com (George W. Leach) writes:

>     Many studies of programmer behavior have indicated that the
>novice programmer thinks in terms of a specific language syntax, while
>with experience comes more abstract thinking without worrying about
>implementation language details. [...]

The trouble occurs when the poor students spend too many years with
abstraction and forget that the purpose of the exercise is to solve a
problem --- and that has to be done in a specific language with a
specific syntax.

A sad example of this was the "GOTO" war last year (or maybe the year
before) in the Communications of the ACM.  In a patronizing letter, an
eminent computer scientist (whose name will remain unmentioned) gave
the "correct" solution to a simple problem that had been batted back
and forth as an example of a construct that was made *easier* to read
with a GOTO statement.  The problem was that the "solution" was not
written in an existing programming language, but in a very obscure
pseudo-code.

mattias@emil (Mattias Waldau) (08/02/89)

In our four-year computer science programme (Master's level), given since
1981, we have always had Lisp as the first language. First it was
MacLisp; now it is Common Lisp. The textbook we use is "Anatomy of
Lisp" by Allen.  We tried Abelson and Sussman one year, but the
students didn't like that book. Prolog is the second language; Prolog
and Lisp are used for most programming exercises, except where
low-level languages like assembler and C are needed (e.g. OS).


But to the point: Now and then we discuss using Prolog first; an
algorithmic language isn't actually needed until the students meet the
three books of Knuth. The difference between clean programming in Lisp
and in Pascal is just syntax; the approach to solving a programming task
is the same.

If the students know Lisp, then they can learn Pascal, C, or Ada within weeks.
But they are of course not professional programmers yet; that takes at
least a year.
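
As an illustration of the "just syntax" claim (my own sketch, not one of
our course exercises), the usual Pascal while-loop maps directly onto a
tail-recursive loop in Scheme-flavored Lisp:

    ; Pascal:  while n > 0 do begin sum := sum + n; n := n - 1 end
    ; The structure of the solution is identical; only the notation differs.
    (define (sum-down n sum)
      (if (<= n 0)
          sum
          (sum-down (- n 1) (+ sum n))))

    (sum-down 4 0)   ; => 10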

reggie@dinsdale.nm.paradyne.com (George W. Leach) (08/02/89)

In article <MCCALPIN.89Aug1155412@masig3.ocean.fsu.edu> mccalpin@masig3.ocean.fsu.edu (John D. McCalpin) writes:
>In <see above> reggie@dinsdale.nm.paradyne.com (George W. Leach) writes:

>>     Many studies of programmer behavior have indicated that the
>>novice programmer thinks in terms of a specific language syntax, while
>>with experience comes more abstract thinking without worrying about
>>implementation language details. [...]

>The trouble occurs when the poor students spend too many years with
>abstraction and forget that the purpose of the exercise is to solve a
>problem --- and that has to be done in a specific language with a
>specific syntax.

       The solution to the problem must be expressed
in a specific language for the purpose of implementing it on a machine.
And yes, that is the ultimate goal.  However, one does not use the
implementation language during the steps prior to implementation, e.g.
requirements, specification, design, etc. (unless one is just a hacker :-) ).
The design of the solution need not be written down in
the implementation language.

>A sad example of this was the "GOTO" war last year (or maybe the year
>before) in the Communications of the ACM.  In a patronizing letter, an
>eminent computer scientist (whose name will remain unmentioned) gave
>the "correct" solution to a simple problem that had been batted back
>and forth as an example of a construct that was made *easier* to read
>with a GOTO statement.  The problem was that the "solution" was not
>written in an existing programming language, but in a very obscure
>pseudo-code.

       Sometimes the use of pseudo-code is useful in order to express
a solution in the form of a programming language, but without paying
attention to all of the particular syntactic details of an implementation
language.  This allows one to concentrate on the solution method rather
than the syntax in which that solution is expressed.  For example, if one
wishes to communicate the solution to a file merge problem, the interesting
aspects of the solution revolve around the algorithm.  We are not all that
interested in how files are opened, read from, written to, checked for EOF,
closed, etc.  We can express these concepts in some abstract notation
and translate to the implementation language later.  The attention will be
focused on the actual solution loop.
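
For instance, the heart of the merge takes only a handful of lines
(sketched here in Scheme over plain lists of numbers, with all of the
file handling abstracted away):

    ; Merge two sorted sequences.  "Read a record" is just car/cdr, and
    ; EOF is the empty list -- only the algorithm remains visible.
    (define (merge xs ys)
      (cond ((null? xs) ys)
            ((null? ys) xs)
            ((<= (car xs) (car ys)) (cons (car xs) (merge (cdr xs) ys)))
            (else (cons (car ys) (merge xs (cdr ys))))))

    (merge '(1 3 5) '(2 4 6))   ; => (1 2 3 4 5 6)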

       The idea is that the solution that was provided may be transcribed
into an appropriate implementation language.  While I agree that once one
reaches this point you might as well use the implementation language to
write code, there is a use for pseudo-code.  How often do you refer to
books on algorithms?  Would you prefer one that expresses the algorithms
in whatever language happens to be in vogue today, or one that expresses
the algorithm in a generic manner that transcends language?

George W. Leach					AT&T Paradyne 
(uunet|att)!pdn!reggie				Mail stop LG-133
Phone: 1-813-530-2376				P.O. Box 2826
FAX: 1-813-530-8224				Largo, FL  USA  34649-2826

garym@ulysses.UUCP (Gary Murphy) (08/02/89)

This subject of beginner's computer languages seems to crop up
every few years, yet surprisingly the arguments remain pretty
much constant - you'd think there'd be progress ...

From my own experience, which begins with WATFOR and PL/I, I
would recommend anything but those two, and maybe add in the APL
variants as mindset dead-ends - not to imply that these, in the
hands of a trained professional, cannot be used in amazing ways,
just that the neophyte will spend the bulk of their time
learning alien codewords and symbols; awkward, unnatural and
mindlessly strict grammar; and a host of other irrelevant
details.  To begin programming, you want to start by writing
programs, not cyphers.

In teaching others, I've always found the criteria for choosing
the language to be more social than technical.  Despite whatever
reasons I might have for one over the other, it generally boils
down to knowing what languages their friends and colleagues use,
what source examples & textbooks are available, and what programs
they expect to write.  My father-in-law, now over 70, is
learning QuickBASIC; my good friend Udo, likewise near 70, has
other composer friends who can help him in C.  Whatever I might
say, these two are not about to buck their hinterland.

For children, however (maybe it's my age, maybe it's just my
general disposition), I find them more willing to try my
advice on a first language.  Here is where my hypothesis of
avoiding the noise seems to work; kids (9 to 19) with no
particular math/science bent pick up on declarative languages
much faster than procedural ones.  Using Prolog, where there is only
one grammatical construct and few keywords, or Logo, which is
similarly terse, they can have their first program up and
running within minutes, with no 'railroad diagrams' or keyword
lists to memorize.  I don't know if it's just the particular
kids I've dealt with, but I find they also grasp programming by
goal-reduction much faster than they do the iterative,
sequence-specification approach.  I'm not saying that I'd never
teach them C; it's just that, as their tutor, I want to get them
hooked on the box before they even glimpse its terrors.
----------------------------------------------------------------
-- 
|   Gary Murphy - Cognos Incorporated - (613) 738-1338 x5537    |
|3755 Riverside Dr - P.O. Box 9707 - Ottawa Ont - CANADA K1G 3N3|
|        e-mail: decvax!utzoo!dciem!nrcaer!cognos!garym         |
|Cosmic Irreversibility: 1 pot T -> 1 pot P, 1 pot P /-> 1 pot T|

eugene@eos.UUCP (Eugene Miya) (08/03/89)

Well gee, this debate again......  Time to go, but some observations
since the last time I read this (before .newsrc needed reconstructing)

0) Resolved in the past that multilingual environments were the way to go.

1) It's been interesting talking to a physicist about LISP ["Why
would anyone want to use a language like this?..."].  He does see value
in it now.  But still thinks we're crazy.

2) We took on a HS summer student (interested in fluid dynamics, not CS,
but who knew BASIC).  He said that his BASIC was carried over into his vector
Fortran and C (lots of tight loops).  The point being that hardware can
influence thinking as much as software.  And people use computers for
performance as well as flexibility.

3) I feel sorry for any student who has to learn Pascal as a first
language these days. [considering my old X3J9 days]

Time for someone to start up a cron file to post a resolution on this
frequently asked discussion.

You're damned if you do, and damned if you don't.

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@aurora.arc.nasa.gov
  resident cynic at the Rock of Ages Home for Retired Hackers:
  "You trust the `reply' command with all those different mailers out there?"
  "If my mail does not reach you, please accept my apology."
  {ncar,decwrl,hplabs,uunet}!ames!eugene
  				Live free or die.

siegman@sierra.Stanford.EDU (Anthony E. Siegman) (08/03/89)

From <14501@bfmny0.UUCP> tneff@bfmny0.UUCP:

> ...like line numbers in BASIC...

I already knew that most of the computer types who denigrate BASIC as
a programming language haven't in fact looked at a modern BASIC for
years, if not decades; but it's interesting to see it so clearly
verified.  Just to bring some of these people up to date, modern
versions of BASIC, besides being simple and easy to learn (and read
programs in),

  --Don't use line numbers; haven't for years.
  --Do have labels, permitting easy modular programming (and these
labels, along with variable names, can be of unlimited length).
  --Have subroutines with local as well as global variables, and permit
arguments to be passed by value or by name.
  --Allow recursive subroutines.
  --Contain all the modern recommended structured programming concepts
(WHILE-WEND, IF-THEN-ELSE, CASE, etc.).
  --Have superb programming environments, with excellent editors.
  --Can be compiled (and will then typically run as fast as or faster 
than the same program done in Pascal on the same machine).

and so on.  

[Just for information, with Microsoft QuickBASIC on the Macintosh one
can EASILY AND SIMPLY create Windows, Menus, Alerts and Dialog Boxes,
Mac-editable Text Boxes, Scrolls and scrollable windows, create,
access and use Resources, use all the Toolbox and QuickDraw
capabilities, create clickable Macintosh applications, and so on.
It's an EXCELLENT Macintosh programming environment, and much simpler
to learn than any other competing environment.]

None of the above is to be interpreted as any kind of argument for
BASIC as a first language _for teaching deep fundamental computer
science concepts to people who want to learn those concepts_!  In
fact, I'd even be prepared to concede that learning BASIC may well be
deleterious to later learning of those concepts (bad habits to
unlearn).

The problem -- or the fact -- however is that BASIC IS AND WILL REMAIN
BY FAR THE BEST CURRENTLY AVAILABLE GENERAL-PURPOSE LANGUAGE TO BE
LEARNED AND USED BY ORDINARY WORKING ENGINEERS AND SCIENTISTS WHO WANT
TO GET REAL WORK -- EVEN RATHER SIZABLE COMPUTATIONS -- DONE ON THEIR
DESKTOP COMPUTERS.  If BASIC is supplanted by anything else for those
sorts of users in the future, it won't be by Pascal, or Scheme, or C,
or Modula, or anything similar; it will be Mathematica.

reesd@gtephx.UUCP (David Rees) (08/03/89)

In article <14501@bfmny0.UUCP>, tneff@bfmny0.UUCP (Tom Neff) writes:
> There is more than one reason to learn a programming language.....
> will be applications drones.  What you want ... is something that
> will give each of these groups something
> rewarding and revealing in terms of their later track....
> 
> The most important thing is LEAVING OUT spurious or unhelpful concepts,
> like line numbers in BASIC or pointers to functions returning arrays of
> structures containing pointers to functions returning... in C, or about
> half of PL/I. :-)  The simpler the better for an introductory language.
> All you really need to communicate to people is that a computer is
> something that does what you tell it to do.


It is not necessary to teach an entire language. One could teach C as the
introductory language and barely touch the surface of what can be done
with it. I am still drawn towards Pascal as an introductory language. It seems
to me that Scheme (and Lisp) would both show a slightly distorted view
of programming languages in general. What I mean is that Scheme does not (in
my opinion) show a good cross-section of programming languages.
((((( It also has two many parentheses )))) :)

			-David

tneff@bfmny0.UUCP (Tom Neff) (08/05/89)

In article <244@sierra.stanford.edu> siegman@sierra.UUCP (Anthony E. Siegman) writes:
>From <14501@bfmny0.UUCP> tneff@bfmny0.UUCP:
>
>> ...like line numbers in BASIC...
>
>I already knew that most of the computer types who denigrate BASIC as
>a programming language haven't in fact looked at a modern BASIC for
>years, if not decades; but it's interesting to see it so clearly
>verified ...

...blah, blah -- a total strawman.  Give us a break.  The out-of-context
quote of mine above said that the key was LEAVING OUT unhelpful concepts,
like line numbers in BASIC.  Everyone reading here ought to be aware of
recent "advanced" BASICs, and to imply otherwise is offensive.  The
point was the line number concept, not the choice of language.

Criminey!  How about a topic on which MANNERS to teach first?  :-)

>The problem -- or the fact -- however is that BASIC IS AND WILL REMAIN
>BY FAR THE BEST CURRENTLY AVAILABLE GENERAL-PURPOSE LANGUAGE TO BE
>LEARNED AND USED BY ORDINARY WORKING ENGINEERS AND SCIENTISTS WHO WANT
>TO GET REAL WORK -- EVEN RATHER SIZABLE COMPUTATIONS -- DONE ON THEIR
>DESKTOP COMPUTERS.  If BASIC is supplanted by anything else for those
>sorts of users in the future, it won't be by Pascal, or Scheme, or C,
>or Modula, or anything similar; it will be Mathematica.

Good heavens.

Read the want ads.  (Do they read the want ads at Stanford? ;-) )
Look at the experience required for various programming positions.
A requirement for BASIC is rare as hens' teeth.  Mostly they want
FORTRAN and SPSS and COBOL and 3270 and various other real-world
things.

The original question was what language to teach first, and my answer
was that it depends on what you want to accomplish.  BASIC is fun
for teaching you that the computer does what you tell it to.  It's a
little lax in that it does too much FOR you, and later versions are
no better in that respect.

-- 
"We walked on the moon --	((	Tom Neff
	you be polite"		 )) 	tneff@bfmny0.UU.NET

sommar@enea.se (Erland Sommarskog) (08/06/89)

Marc Sabatella (marc@hpfcdc.HP.COM) writes:
>I could probably make a good argument for a language like Ada as a beginning
>language - if you stick to basics it is as easy as Pascal (easier, really -
>I think in/out is more intuitive than "var"), and when it is time to learn
>more advanced concepts (say, in a second or third course) you don't have to
>switch languages.

I am about to second that. If I were to choose an introductory
language for people that would be like me, a computer consultant
working with industrial real-time applications and information
systems, I would choose a language that supported the most
important concepts for that type of program: safety, modularity
and reusability. I would also choose a real-world language, i.e.
a language that the students will most likely come in touch with
in their professional careers. Safety means a strongly and statically 
typed language, so that leaves out languages like Lisp and Smalltalk,
but also C and Pascal. (And I guess Modula-2.) And with the last
two requirements there are not too many languages left. Finally, 
the real-world requirement would exclude things like Mesa, Turing 
and ML, which to my knowledge are not used much in practice.
  What remains here is Ada and Eiffel (which qualifies as a real-world
language since I believe it's going to spread in use, and it will
take some years until my students are there, out in real life).
My choice for the first year would be Ada, since the modularization
concepts can be kept simple in Ada without flattening the language
too much. And, as Marc says, I wouldn't take all of the language,
only what I need to demonstrate the important issues about typing
and modularization.
  In Eiffel, on the other hand, the inheritance mechanism is not 
trivial, and leaving it out to begin with would give a strange
light on the language. But, of course, I wouldn't let the students
out of my university without a course in object-oriented programming
with Eiffel as the course language.
  They would also have to learn some assembler, and something about
query languages (so they know how powerless they are). But there would
be no C, Lisp, or Scheme. C is something they can learn by themselves
if they really have to. Lisp I have personally used only to
program Emacs, but it seems to me most like a toy language, nothing
for the type of system I work with. Scheme I don't even know what it
is, from which I conclude it's no real-world language.

But as I said, that is for people who would grow up and be like me.
-- 
Erland Sommarskog - ENEA Data, Stockholm - sommar@enea.se
"Hey poor, you don't have to be Jesus!" - Front 242

gateley@m2.csc.ti.com (John Gateley) (08/07/89)

In article <44cb7970.f9df@gtephx.UUCP> reesd@gtephx.UUCP (David Rees) writes:
>... It seems
>to me that Scheme (and Lisp) would both show a slightly distorted view
>of programming languages in general. What I mean is that Scheme does not (in
>my opinion) show a good cross-section of programming languages.
>((((( It also has two many parentheses )))) :)
>			-David

This is not my experience: Scheme (or Lisp) makes it easy to understand
most concepts in programming languages:
  A Prolog interpreter can be written in less than one page.
  All the basic stuff needed for Pascal-style languages is there.
  Even the dreaded GOTO can be done (in Scheme w/ call/cc, in Lisp w/ tagbody).
The expressiveness of the language combined with the simple syntax makes
it a very good language for learning about the techniques used in different
languages.
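
To make the call/cc case concrete, here is a minimal sketch (mine, using
only standard Scheme, nothing TI-specific) of a backward GOTO: capture
the current continuation as a "label", then invoke it to jump back:

    ; (call-with-current-continuation (lambda (c) c)) returns the
    ; continuation of the binding itself, so invoking (label label)
    ; re-enters the let body -- a backward GOTO.
    (define (count-to-three)
      (let ((counter 0))
        (let ((label (call-with-current-continuation (lambda (c) c))))
          (set! counter (+ counter 1))
          (if (< counter 3)
              (label label)      ; GOTO label
              counter))))

    (count-to-three)   ; => 3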

John
gateley@m2.csc.ti.com

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/07/89)

From article <161@enea.se>, by sommar@enea.se (Erland Sommarskog):
> Marc Sabatella (marc@hpfcdc.HP.COM) writes:
>>I could probably make a good argument for a language like Ada as a beginning
>>language - if you stick to basics it is as easy as Pascal (easier, really -
>>I think in/out is more intuitive than "var"), and when it is time to learn
>>more advanced concepts (say, in a second or third course) you don't have to
>>switch languages.
% 
% I am about to second that. [...] I would choose a language that 
% supported the most important concepts for that type of programs: 
% safety, modularity and reusability. 

   I third this; Ada is definitely the way to go.  And add to that
   list "multitasking capabilities"!!!  

  
   Bill Wolfe, wtwolfe@hubcap.clemson.edu

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/07/89)

From mproicou@blackbird.afit.af.mil (Michael C. Proicou):
>>I could probably make a good argument for a language like Ada [...] 
> 
> I mostly agree, EXCEPT you have to do generics to do any kind of output of
> numbers and things!  For that reason, I'd recommend Pascal, with a switch
> to Ada if that's your end goal.  Since Ada is close to Pascal, very little
> would be lost in switching languages.

    All that is necessary is to provide the student with standard
    Integer_IO and Float_IO instantiations, and mention that these
    two packages, plus Text_IO, will serve to handle this class's 
    I/O requirements; it's a lot easier for the student to accept 
    a deferred explanation of generics than it is to switch languages 
    entirely!! 

    Ada provides lots of room for the highly motivated student to
    read ahead and go beyond what the class is doing; Pascal can
    provide nothing more than severe frustration.   
    

    Bill Wolfe, wtwolfe@hubcap.clemson.edu
 

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/08/89)

In article <161@enea.se> sommar@enea.se (Erland Sommarskog) writes:
#Marc Sabatella (marc@hpfcdc.HP.COM) writes:
#>I could probably make a good argument for a language like Ada as a beginning
#>language - if you stick to basics it is as easy as Pascal (easier, really -
#>I think in/out is more intuitive than "var"), and when it is time to learn
#>more advanced concepts (say, in a second or third course) you don't have to
#>switch languages.
#
#I am about to second that. If I were to choose an introductory
#language for people that would be like me, a computer consultant
#working with industrial real-time applications and information
#systems, I would choose a language that supported the most
#important concepts for that type of program: safety, modularity
#and reusability. I would also choose a real-world language, i.e.
#a language that the students will most likely come in touch with
#in their professional careers. Safety means a strongly and statically 
#typed language, so that leaves out languages like Lisp and Smalltalk,
#but also C and Pascal. (And I guess Modula-2.) And with the last

	Why exclude Pascal?  It's just as strongly typed as Ada.

#two requirements there are not too many languages left. Finally, 
#the real-world requirement would exclude things like Mesa, Turing 
#and ML, which to my knowledge are not used much in practice.
#  What remains here is Ada and Eiffel (which qualifies as a real-world
#language since I believe it's going to spread in use, and it will
#take some years until my students are there, out in real life).
#My choice for the first year would be Ada, since the modularization

	AAAAAAAAAHHHHHHHHHHHHHHHH!!!!!!!!

#concepts can be kept simple in Ada without flattening the language
#too much. And, as Marc says, I wouldn't take all of the language,
#only what I need to demonstrate the important issues about typing
#and modularization.
#  In Eiffel, on the other hand, the inheritance mechanism is not 
#trivial, and leaving it out to begin with would give a strange
#light on the language. But, of course, I wouldn't let the students
#out of my university without a course in object-oriented programming
#with Eiffel as the course language.
#  They would also have to learn some assembler, and something about
#query languages (so they know how powerless they are). But there would
#be no C, Lisp, or Scheme. C is something they can learn by themselves
#if they really have to. Lisp I have personally used only to
#program Emacs, but it seems to me most like a toy language, nothing
#for the type of system I work with. Scheme I don't even know what it
#is, from which I conclude it's no real-world language.
#
#But as I said, that is for people who would grow up and be like me.

	It seems to me that you are pushing Ada merely based on the fact that
it is useful for the systems *you* work with.  Everything you have stated about
Ada can also be said about Pascal, except one thing: ease of programming.  I
think it's a lot easier to program in Pascal than in Ada.  Why?  Because Pascal
is not as restrictive.  In Pascal, if you have a problem, more than likely you
will be able to debug it step by step (a syntax problem, not a logical one),
but with Ada it is almost impossible.  Why?  Personal example: I wrote a *very*
small program (2-3 packages, ~10 lines/package) and it took me days to figure
out why I kept getting certain syntax errors (not that the errors themselves
were very helpful).  Call me stupid, inexperienced with Ada, whatever, but
imagine a beginning student running into that.  You know what he'll do: say
"fuck this", drop the class, and switch majors.  I know, because I've seen many
people drop simply because they were having trouble with the language.
Personally, I've never had a syntax error I couldn't fix within a matter of
minutes using Pascal; logic is another story :-)  As far as Lisp being a toy
language, I would like to see you do AI work in Ada....  It might not be right
for you, but I think it's just right depending on what you want to do.  I
wouldn't want to write an OS in it, but to do AI work, it's great.  As far as
Ada being real-world, I'd have to disagree with you very loudly.  The only
"real" world that uses it with any regularity is government and military.  What
if the person doesn't want to work for either?  As far as switching goes, I
learned Pascal first and had absolutely no problem understanding Ada syntax;
the problems I ran into were the stupidity of having all of the I/O packages,
etc., and the fact that it's so restrictive.  If you really want to teach
reusability, restrictiveness, etc., use Pascal; it has all of that, but Pascal
also gives you lots of freedom if needed, something Ada doesn't have.

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not of my employer.

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/08/89)

In article <6193@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
#>From article <161@enea.se>, by sommar@enea.se (Erland Sommarskog):
#> Marc Sabatella (marc@hpfcdc.HP.COM) writes:
#>>I could probably make a good argument for a language like Ada as a beginning
#>>language - if you stick to basics it is as easy as Pascal (easier, really -
#>>I think in/out is more intuitive than "var"), and when it is time to learn
#>>more advanced concepts (say, in a second or third course) you don't have to
#>>switch languages.
#% 
#% I am about to second that. [...] I would choose a language that 
#% supported the most important concepts for that type of programs: 
#% safety, modularity and reusability. 
#
#   I third this; Ada is definitely the way to go.  And add to that
#   list "multitasking capabilities"!!!  

Wonderful!!!  Could you please explain to me what beginning level course will
teach "multitasking capabilities"?  As I have explained in another article,
everything that Erland states can be found in Pascal, and then some.  I am not
harping on Ada (well, actually I am), but a lot of you seem to be forgetting
that we are talking about 18-19 year olds, first year in college, for some
their first ever language, most not capable of adjusting their thinking to
something totally different.  To me it would seem that you first need a
language that can be taught very easily, that students will not have a lot of
trouble with, that every major theory can be explained in, etc., etc. -
i.e. Pascal!!!!  I have seen a lot of people drop out because they could not
handle the theory and/or language.  The theory is bad enough for a beginner;
do you also want them to learn some totally bizarre language?

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not of my employer.

grano@cs.Helsinki.FI (Juhani Grano) (08/08/89)

In article <86646@ti-csl.csc.ti.com> gateley@m2.UUCP (John Gateley) writes:
!This is not my experience: Scheme (or Lisp) makes it easy to understand
!most concepts in programming languages:
!(stuff deleted)
!The expressiveness of the language combined with the simple syntax makes
!it a very good language for learning about the techniques used in different
!languages.

Yes, but trying to understand the syntax issues involved in the _older_
languages after using Scheme can be a pain in the neck :-)

------------------------------
Kari Grano				University of Helsinki, Finland
email to: grano@cs.helsinki.fi		Department of CS
	"I've got a thousand telephones that don't ring" - R.Z.

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/08/89)

In article <6199@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:

#From mproicou@blackbird.afit.af.mil (Michael C. Proicou):
#>>I could probably make a good argument for a language like Ada [...] 
#> 
#> I mostly agree, EXCEPT you have to do generics to do any kind of output of
#> numbers and things!  For that reason, I'd recommend Pascal, with a switch
#> to Ada if that's your end goal.  Since Ada is close to Pascal, very little
#> would be lost in switching languages.
#
#    All that is necessary is to provide the student with standard
#    Integer_IO and Float_IO instantiations, and mention that these
#    two packages, plus Text_IO, will serve to handle this class's 
#    I/O requirements; it's a lot easier for the student to accept 
#    a deferred explanation of generics than it is to switch languages 
#    entirely!! 
#
#    Ada provides lots of room for the highly motivated student to
#    read ahead and go beyond what the class is doing; Pascal can
#    provide nothing more than severe frustration.   

Totally disagree!!!!  You obviously have not worked with Pascal too much.
1) I have worked with it for a long time, and the frustration level I 
   encountered didn't even come close to the one I met when I started working
   with Ada.
2) A highly motivated student can read ahead and go beyond what the class is
   doing in *any* language.  A group of friends and I decided to find a better
   Pascal book than what was available through class.  We did, looked it over,
   and found a lot of neat shortcuts... result: our programs were usually
   20-40% smaller than the rest of the class's, ran faster, and worked.
3) If you are going to provide the standard packages, etc., wouldn't it be
   a lot simpler if you just provided Pascal, which doesn't worry about any
   of that garbage?

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not of my employer.

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/08/89)

From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
># it's a lot easier for the student to accept a deferred explanation 
># of generics than it is to switch languages entirely  [...] Ada 
># provides lots of room for the highly motivated student to
># read ahead and go beyond what the class is doing; Pascal can
># provide nothing more than severe frustration.   
> 
> Totally disagree!!!!  You obviously have not worked with Pascal too much.

   On the contrary, Pascal was my first language; I used it for
   three extremely frustrating years.
    
> 1) I have worked with it for a long time, and the frustration level I 
>    encountered didn't even come close to the one I met when I started working
>    with Ada.

   What a joke... how do you handle abstraction without packages?
   How do you separate specification from implementation?  How do
   you enforce the security of an ADT without limited private types?
   How do you manage exception handling?  What do you do when you get
   sick and tired of not being able to express concurrency?  What about
   the joy of writing code to manipulate a linked list for the 348th time
   because you can't express it cleanly, once and for all, using generics?

   Pascal may not be the ultimate in frustration, but it's WAY up there;
   Ada, on the other hand, is the language I wish I had been started off
   with in the first place.  I wouldn't wish Pascal on anyone, except
   possibly a very hard-core masochist.  


   Bill Wolfe, wtwolfe@hubcap.clemson.edu
 

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/08/89)

From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
> #% I am about to second that. [...] I would choose a language that 
> #% supported the most important concepts for that type of programs: 
> #% safety, modularity and reuseability. 
> #
> #   I third this; Ada is definitely the way to go.  And add to that
> #   list "multitasking capabilities"!!!  
> 
> Wonderful!!!  Could you please explain to me what beginning level course will
> teach "multitasking capabilities"?  

    Sure.  Observe CACM, V32, #1 (January 1989), the article entitled 
    "Computing as a Discipline",  which gives the final report of the 
    Task Force on the Core of Computer Science.  Page 10:

       The task force was given three general charges...

          3. Give a detailed example of an introductory course...

    Their recommended topics for the first course include fundamental
    algorithm concepts, data structures and abstraction, and PARALLEL
    COMPUTATION.  Students obviously will have to do advanced work in
    this area beyond what is found in an introductory course, but an
    early overview of what multitasking is, combined with an indication
    that highly motivated students can use Ada to explore these concepts,
    is highly appropriate.


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/08/89)

In article <6206@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
#From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
#> Wonderful!!!  Could you please explain to me what beginning level course will
#> teach "multitasking capabilities"?  
#
#    Sure.  Observe CACM, V32, #1 (January 1989), the article entitled 
#    "Computing as a Discipline",  which gives the final report of the 
#    Task Force on the Core of Computer Science.  Page 10:
#
#       The task force was given three general charges...
#
#          3. Give a detailed example of an introductory course...
#
#    Their recommended topics for the first course include fundamental
	   ^^^^^^^^^^^
#    algorithm concepts, data structures and abstraction,
     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

	No problem here, you can't write good programs without knowing these.

#    and PARALLEL
#    COMPUTATION.  Students obviously will have to do advanced work in
#    this area beyond what is found in an introductory course, but an
#    early overview of what multitasking is, combined with an indication
#    that highly motivated students can use Ada to explore these concepts,
#    is highly appropriate.

	I agree, but I would like to emphasize that this is only recommended;
	are there any classes now that do it?  Also, what do you define as an
	overview?  Just stating the fact that something like that exists and
	what it is?  Fine, I have no problem with that, but I don't think any
	"deep level" multitasking programming is right for a beginning level
	class.  At my university, it is introduced in soph/junior year, but
	you don't really get into it until junior/senior year (depending on
	when you take the class).  If you agree with me about how much to
	go into multitasking in the beginning, then there is no use for Ada;
	if not, then I would like to know how much you think should be
	introduced, considering all the other information that must be taught.

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not of my employer.

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/08/89)

In article <6204@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
=From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
=># it's a lot easier for the student to accept a deferred explanation 
=># of generics than it is to switch languages entirely  [...] Ada 
=># provides lots of room for the highly motivated student to
=># read ahead and go beyond what the class is doing; Pascal can
=># provide nothing more than severe frustration.   
=> 
=> Totally disagree!!!!  You obviously have not worked with Pascal too much.
=
=   On the contrary, Pascal was my first language; I used it for
=   three extremely frustrating years.

	I would like to apologize for sounding so "sure" of something I didn't
	know, and I didn't mean to sound so sarcastic.
=    
=> 1) I have worked with it for a long time, and the frustration level I 
=>    encountered didn't even come close to the one I met when I started working
=>    with Ada.
=
=   What a joke... how do you handle abstraction without packages?

=   How do you separate specification from implementation? 

	Very easily: write stubs that do absolutely nothing, but have a
	declaration, begin, end, and maybe a write statement.

=   How do you enforce the security of an ADT without limited private types?

	Very easily: put procedures inside procedures, etc., etc.  The
	data types in the inner procedures can't be accessed outside of them.

=   How do you manage exception handling? 

	Put in error checks :-) ?

=   What do you do when you get sick and tired of not being able to express
=   concurrency? 

	Could you give an example?  Then I could answer.

=   What about the joy of writing code to manipulate a linked list for the
=   348th time because you can't express it cleanly, once and for all, using
=   generics?

	Sorry, but I *HAVE* been able to write "generic" code to manipulate
	a linked list in Pascal; you just need to know what to include in
	the code and what to exclude.  As a matter of fact, with almost
	everything I write, I try to find ways to make it as generic as
	possible.  You just need to know the tricks of the language.
=
=   Pascal may not be the ultimate in frustration, but it's WAY up there;
=   Ada, on the other hand, is the language I wish I had been started off
=   with in the first place.  I wouldn't wish Pascal on anyone, except
=   possibly a very hard-core masochist.  

	Fine, so I am a hard-core masochist :-)  I wouldn't wish Ada on my
	worst enemy, and I am sure the people who sit in my office would concur
	(and they are almost constantly working with Ada).

peter@ficc.uu.net (Peter da Silva) (08/08/89)

In article <6204@hubcap.clemson.edu>, billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
>    What a joke... how do you handle abstraction without packages?
>    How do you separate specification from implementation?  How do
>    you enforce the security of an ADT without limited private types?
     [...exception handling, concurrency, generics, ... ]

What is a beginner in CS101 (or whatever your intro to CS course is called)
doing any of that for? First you crawl, then you walk, then you run. ADA
for introductory programming is like Air Jordans for 1-year-olds.
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.
Business: peter@ficc.uu.net, +1 713 274 5180. | "The sentence I am now
Personal: peter@sugar.hackercorp.com.   `-_-' |  writing is the sentence
Quote: Have you hugged your wolf today?  'U`  |  you are now reading"

prp@sei.cmu.edu (Patrick Place) (08/09/89)

In article <2565@aplcen.apl.jhu.edu>, genesch@aplvax.jhuapl.edu (Eugene Schwartzman) writes:
> In article <6204@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
> =   How do you enforce the security of an ADT without limited private types?
> 	Very easily, put procedures inside procedures, etc.. etc...  The
> 	data types in the inside procedures can't be accessed outside of it.
Without entering into any language wars here, it seems that procedures
inside procedures are not sufficient for enforcing the security of
an ADT.  Consider the scenario where I am providing some interface I
to an ADT.  The only visibility I want users to have is that interface, I.
My ADT may well have other routines which are to be shared by the interface
routines.  If I put these inside one of the procedures in I, then they
can't be shared (unless they are duplicated - not a great idea).  So
these shared routines must be accessible at a higher level, and in
standard Pascal the only such level is the entire program.  The same goes
for data: it can almost be guaranteed that there will be data shared
between the interface routines of the ADT.

So what is the solution?
In Pascal, you have to construct one procedure that contains all
the real interface routines and shared data, as well as any supporting
routines, and the body of this procedure must then separate out the
parameters and invoke the correct ADT interface routine.  The difficulty
here lies in the possibly large number of parameters to this one
surrounding procedure, the unused parameters in every call to it, and
the code needed to disentangle the single combined interface into the
appropriate real interface.

Which all goes to show that Pascal was not designed with ADTs in mind.
Languages such as Ada, Modula 2, Euclid ... have the necessary level
of abstraction for supporting ADTs, though it may not be sufficient.
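
To make that concrete, here is roughly what the package structure buys
you in Ada -- a minimal sketch from memory (the names are invented and
I haven't run it through a compiler), not a definitive example:

    PACKAGE Stacks IS
       TYPE Stack IS LIMITED PRIVATE;
       PROCEDURE Push(S : IN OUT Stack; Item : IN  Integer);
       PROCEDURE Pop (S : IN OUT Stack; Item : OUT Integer);
    PRIVATE
       Max : CONSTANT := 100;
       TYPE Int_Array IS ARRAY (1..Max) OF Integer;
       TYPE Stack IS
          RECORD
             Top  : Natural := 0;
             Data : Int_Array;
          END RECORD;
    END Stacks;

    PACKAGE BODY Stacks IS
       -- A helper shared by the interface routines but invisible to
       -- clients.  In standard Pascal there is nowhere to put such a
       -- routine except the top level of the whole program.
       FUNCTION Is_Full(S : Stack) RETURN Boolean IS
       BEGIN
          RETURN S.Top = Max;
       END Is_Full;

       PROCEDURE Push(S : IN OUT Stack; Item : IN Integer) IS
       BEGIN
          IF NOT Is_Full(S) THEN
             S.Top := S.Top + 1;
             S.Data(S.Top) := Item;
          END IF;
       END Push;

       PROCEDURE Pop(S : IN OUT Stack; Item : OUT Integer) IS
       BEGIN
          Item := S.Data(S.Top);
          S.Top := S.Top - 1;
       END Pop;
    END Stacks;

Clients see only the spec; the representation and Is_Full are sealed off.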

Pat Place prp@sei.cmu.edu

sommar@enea.se (Erland Sommarskog) (08/09/89)

Eugene Schwartzman (genesch@aplvax.jhuapl.edu) writes:
>	Why Pascal, it's just as strongly typed as ADA?

Certainly not. Pascal is a weakly typed language. Period.
Look at this:

    TYPE  Apple_type  = 0..Max_apples;
          Orange_type = 0..Max_oranges;
    ...
    PROCEDURE Macedonia(Apple  : Apple_type;
                        Orange : Orange_type);
    BEGIN
       ...
    END;
    ...
    VAR   Apple  : Apple_type;
          Orange : Orange_type;
    BEGIN
       ...
       Macedonia(Orange, Apple);    <--  Probably an error!
       ...
    END.

We forgot the order of the parameters and swapped them.  But does
the compiler complain?  No: integer to integer, perfectly OK.  You
call that strongly typed?  I don't.  In Ada you can say:
    TYPE Apple_type IS NEW integer RANGE 0..Max_Apples;
and then do similarly for Orange_type.  If you really want to mix apples
and oranges, you can do so with an explicit type conversion.  If
you want to do it often, you can still declare the types as in
Pascal.
  Add to this private types and limited private types, and you already
have more devices in Ada than in Pascal to enforce real data abstraction.
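
Spelled out, the whole thing looks like this -- a sketch from memory
(untested, with made-up constants), just to show where the error is
caught:

    Max_apples  : CONSTANT := 100;
    Max_oranges : CONSTANT := 200;

    TYPE Apple_type  IS NEW Integer RANGE 0..Max_apples;
    TYPE Orange_type IS NEW Integer RANGE 0..Max_oranges;

    Apple  : Apple_type;
    Orange : Orange_type;

    PROCEDURE Macedonia(Apple  : Apple_type;
                        Orange : Orange_type) IS
    BEGIN
       NULL;  -- the body is irrelevant to the point
    END Macedonia;

    ...
    Macedonia(Orange, Apple);
       -- rejected at COMPILE TIME: the types don't match
    Macedonia(Apple_type(Orange), Orange_type(Apple));
       -- explicit conversions: you said you meant it, so it compiles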

>#But as I said, that is for people who would grew up and be like me.
>
>	It seems to me that your are pushing ADA merely based on the fact that
>it is useful for the system *you* work with.

Now, you had the bad taste to quote my article in its entirety.  Didn't
you even read it?  Read the line above again.  Your observation is
totally superfluous.

>Why? Personal example - I wrote a *very* small
>program (2 -3 packages, ~10 lines/package) and it took me days to figure out
>why I kept getting certain syntax error (not that the errors themselves were
>very helpful).

One thing is true: it takes longer to get an Ada module free of
compilation errors.  The reason is simple: the compiler finds more
errors for you, so you gain that time back when you debug.  Yes,
correct, that does not apply to syntax errors.  But tell you what:
with a limited knowledge of Pascal and bad compiler messages you
would have exactly the same problem.
  It could also be added that Ada is not at its best for programs of
that size, but that is never a real problem, not even in a beginner's
course.  As a counter, why don't you try to write a 2,000,000-line
artillery control system in standard Pascal.  (One single source file!)

>but imagine a beginning student running into that.  You know what he'll 
>do - say "fuck this", drop the class, and switch majors.  I know, because
>I've seen many people drop simply because they were having trouble 
>with the language.

A freshman would probably have fewer problems than you.  His mind
wouldn't be set up to think in Pascal ways.  And he would be just
as likely to run into those problems in Pascal.  What about this one?
      IF Something THEN
         IF And_this_to THEN
            Do_this;
      ELSE
         Do_that;
First our student gets a syntax error on the ";" before ELSE, which
he doesn't understand.  (Believe me, I have taught Pascal to freshmen.)
When he gets that, he removes the semicolon; the program compiles, but
doesn't run correctly.  And it is not really just a novice error: I have
had colleagues who have made the same error and been fooled by the
indentation.
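  In Ada, by the way, the mistake cannot even be written, because every
IF must be closed with an END IF, so the ELSE can only attach one way.
A sketch of the same program in Ada syntax (from memory):
      IF Something THEN
         IF And_this_to THEN
            Do_this;
         END IF;
      ELSE
         Do_that;
      END IF;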
  As for giving up: if they give up because of the language, that might
be just as well.  With that little motivation I doubt they would ever be
good programmers anyway.

>As far as Lisp being a toy language, I would like to see you do AI work 
>in Ada....

Yes, but people like me don't write AI. :-) Seriously, I don't
doubt for a second that Lisp is good in AI and similar small systems.
But for the stuff I'm involved in, it's not a serious alternative.
And, I might be totally wrong, but I have this idea that this
applies to the main bulk of programmers.

>but do to AI work, it's great.  As far as Ada being real-world,
>I'd have to disagree with you very loudly.  The only "real" world that uses it
>with any regularity is government and military.

Bill Wolfe has already commented on this, but let me just second him.  The
talk that Ada is only used in military applications is just a myth.
It's probably more common there than elsewhere, due to the require-
ments of the customers, but also because military systems are
often big.  And Ada was developed for large systems...

>restrictive.  If you really want to teach reusability, restrictiveness, etc..
>use Pascal, it has all of that, but Pascal also gives you lot's of freedom if
>needed, something Ada doesn't have.

This is where I have some trouble remaining polite.  That must be a joke.
Tell me you forgot the smiley.  How the !"#$%&/() can Pascal be reusable
when there is no module concept in the language?  You cannot reuse a
bloody thing from your one-file program, except by copying code, and,
believe me, that has nothing to do with reusability.
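
For the record, here is what reuse looks like when the language has
modules and generics -- a rough sketch of an Ada generic list package,
written from memory (invented names, not compiled), so take it as an
illustration rather than gospel:

    GENERIC
       TYPE Element IS PRIVATE;
    PACKAGE Lists IS
       TYPE List IS LIMITED PRIVATE;
       PROCEDURE Insert(L : IN OUT List; Item : IN Element);
       FUNCTION  Is_Empty(L : List) RETURN Boolean;
    PRIVATE
       TYPE Node;
       TYPE List IS ACCESS Node;
    END Lists;

    -- One body, written once.  Then, for any element type:
    PACKAGE Integer_Lists IS NEW Lists(Integer);
    PACKAGE Name_Lists    IS NEW Lists(Name_type);

No copying of code, and the compiler still checks every use.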

And before you go on and accuse me, as you did Bill Wolfe, of not
knowing Pascal, I'll tell you: I work with the shit.  Every day.
(VAX Pascal though, which is a good deal better than standard
Pascal, but totally unportable.)
-- 
Erland Sommarskog - ENEA Data, Stockholm - sommar@enea.se
"Hey poor, you don't have to be Jesus!" - Front 242

tarvaine@tukki.jyu.fi (Tapani Tarvainen) (08/09/89)

In Jyvaskyla University Pascal is taught as the first language.
The most important reason (which disqualifies Ada in particular)
is the availability of cheap and easy-to-use compilers for PCs.
I suspect this is the decisive factor in quite a few places.

I can't, however, resist the temptation to quote Edsger Dijkstra here
[describing his idea of an introductory programming course 
in the 1988 SIGCSE Award Lecture (titled "On the cruelty of really
teaching computer science"!)]:

"... we see to it that the programming language in question has _not_
been implemented on campus so that students are protected from the
temptation to test their programs."

-- 
Tapani Tarvainen    (tarvaine@jyu.fi, tarvainen@finjyu.bitnet)

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/09/89)

In article <5594@ficc.uu.net> peter@ficc.uu.net (Peter da Silva) writes:
#In article <6204@hubcap.clemson.edu>, billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
#>    What a joke... how do you handle abstraction without packages?
#>    How do you separate specification from implementation?  How do
#>    you enforce the security of an ADT without limited private types?
#     [...exception handling, concurrency, generics, ... ]
#
#What is a beginner in CS101 (or whatever your intro to CS course is called)
#doing any of that for? First you crawl, then you walk, then you run. ADA
#for introductory programming is like Air Jordans for 1-year-olds.

	Because Ada will force you to put all of that into the beginning level
	course, since it all revolves around itself.  And then you won't have
	time to teach them the stuff you should have taught to begin with :-)

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not of my employer.

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/09/89)

In article <3781@fy.sei.cmu.edu> prp@sei.cmu.edu (Patrick Place) writes:
#In article <2565@aplcen.apl.jhu.edu>, genesch@aplvax.jhuapl.edu (Eugene Schwartzman) writes:
#> In article <6204@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
#> =   How do you enforce the security of an ADT without limited private types?
#> 	Very easily, put procedures inside procedures, etc.. etc...  The
#> 	data types in the inside procedures can't be accessed outside of it.
[discussion of how to use Pascal for ADT deleted]
#
#Which all goes to show that Pascal was not designed with ADT's in mind.
#Languages such as Ada, Modula 2, Euclid ... have the necessary level
#of abstraction for supporting ADTs though it may not be sufficient.

	I agree, but any professor who decides to make *BEGINNING* level
	students write something like that needs his teaching license revoked
	and should be sent to the insane asylum.  Which goes to show that Ada,
	at the beginning level, will cause more problems than it will solve.  I.E.
	STAY AWAY FROM ADA AS THE *BEGINNING LEVEL* LANGUAGE!!!!
				  ^^^^^^^^^^^^^^^^^
				  ~~~~~~~~~~~~~~~~~

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not of my employer.

rleroux1@uvicctr.UVic.ca.UUCP (Roger Leroux) (08/09/89)

Naw, you guys got it *all* wrong. The first language should be...

	FORTRAN...

	on punched cards...

	on an old IBM teletype...

#include smileys.h

Roger
-- 

Roger Leroux                                rleroux1@uvicctr.UVic.CA
User Services Consultant                         BITNET: LEROUX@UVVM
University of Victoria, Box 1700, Victoria BC, Canada, (604) 721-7687
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/10/89)

From genesch@aplvax.jhuapl.edu (Eugene Schwartzman):
> [discussion of how to use Pascal for ADT deleted]
> #Which all goes to show that Pascal was not designed with ADT's in mind.
> #Languages such as Ada, Modula 2, Euclid ... have the necessary level
> #of abstraction for supporting ADTs though it may not be sufficient.
> 
> 	I agree, but any professor who decides to make *BEGINNING* level
> 	students write something like that, need his teaching license revoked
> 	and sent to the insane asilum.  

    No, actually, students need to be exposed to ADTs early.
    Probably the best way to do it is for the professor to
    provide the spec, and have the students write an implementation,
    while the concepts behind the spec are being covered in class.


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

rossj@titan.UUCP (Ross Judson) (08/10/89)

I attend Carleton University in Ottawa, Ontario, Canada.  When I took the
introductory courses, they were taught using Pascal.  This has since changed
to Smalltalk, which I consider to be an excellent choice.  I may be TAing
this year, and it will be very interesting to see what kind of work is done
by students who are exposed to object oriented ideas before functional ones.

I believe the courses use Smalltalk V on 286 clones and/or some kind of
early Smalltalk on the Macintosh.

I have never worked with Ada.  It appears to be an excellent production
language; that is quite likely its best use as well.  Trying to introduce
the concept of packages, declarations, concurrency, exception handling, et
cetera at the introductory level is foolish.  Remember, we are dealing with
students who have difficulty understanding the difference between
pass-by-reference and pass-by-value!

Pascal is not my favourite language, but it is reasonably easy to learn and
is traditional :-).  I find that problems arise because beginning Pascal
programmers think of everything in a concrete way.  They hack away at a
problem until it's done, thinking in terms of arrays and functions.  Very
rarely will a beginning programmer make use of some of the nicer features of
Pascal, such as sets and enumerated types, because they are forced by the
language to think in terms of storage, not concepts.

Smalltalk accomplishes several goals:

o	Ease of use and programming (advanced user interface).  The
	graphical environments of most Smalltalk systems pique student
	interest, and encourage exploration of the environment.  It is very
	easy for them to play with the system, figure out how to add
	classes, and do interesting things quickly.  They generally need
	less impetus and TA time to become familiar with the environment.

o	The teaching of reuse _early_.  The courses are designed so that
	as classes are developed, they are incorporated into, and
	specialized further by, later assignments.  It is valuable
	training for future work.

o	Beginning Smalltalk programmers are isolated from the hard "machine
	realities".  They don't have to worry about storage, data types,
	garbage collection, or any of the other nonsense that their lives
	will be concerned with when they hit the real world :-).  They can
	focus on _concepts_, and think in _abstractions_.  Once they have
	that, the language doesn't matter.

o	Smalltalk is a simple, clean, and consistent language.  It also
	comes with lots of examples built-in.

There are probably some other interesting benefits that I can't think of
right now.  Some of this is assumption; I assume the faculty is planning the
courses with the same techniques used to plan the 2nd year courses.  I took
those courses, and for my classmates and me the benefits were very real.

-- 
uucp       - uunet!mitel!sce!cognos!rossj  |I guess I'm following my heart...
arpa/inter - rossj%cognos.uucp@uunet.uu.net|And that takes me a different path

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/12/89)

From article <6774@titan.UUCP>, by rossj@titan.UUCP (Ross Judson):
> I have never worked with Ada.  It appears to be an excellent production
> language; that is quite likely its best use as well.  Trying to introduce
> the concept of packages, declarations, concurrency, exception handling, et
> cetera at the introductory level is foolish.  Remember, we are dealing with
> students who have difficulty understanding the difference between
> pass-by-reference and pass-by-value!

    In Ada, they don't need to worry about how parameters are passed,
    only whether they are "in", "in out", or "out".  As for the rest,
    nobody is asserting that students who have taken an introductory
    course are going to be Ada experts, any more than a child who has
    had 16 weeks of Interactive English is going to be an English expert.

    But let's look at these Ada concepts: if they can understand what 
    an interface is, then they will have no trouble with the idea of a
    package.  As for declarations, it makes sense to have to obtain
    (or create, if you prefer) an object first.  Exception handling
    is simply "What to do if something goes wrong in this environment";
    nothing un-intuitive about that.  With a properly-written text (and 
    I grant you that there are not many Ada texts which don't assume 
    previous knowledge of another language), these concepts are OK. 
    As you mention above, there are some things about programming
    languages which are difficult to learn, such as the difference
    between pass-by-reference and pass-by-value in Pascal.  The reason
    for this is that pass-by-reference forces you to keep two different
    environments in your head at the same time, whereas pass-by-value
    is clean and modular, thus easily understood.  Similarly, specifying
    "in", "in out", or "out" is clean and modular, thus easily understood. 
     
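    To be concrete about the last two points, here is a sketch of what
    the student actually sees -- from memory, with invented names (and,
    if I recall correctly, division by zero raises NUMERIC_ERROR in
    Ada 83), so treat it as an illustration:

        PROCEDURE Average(Total  : IN  Integer;
                          Count  : IN  Integer;
                          Result : OUT Integer) IS
        BEGIN
           Result := Total / Count;
        EXCEPTION
           WHEN Numeric_Error =>
              -- "what to do if something goes wrong in this environment"
              Result := 0;
        END Average;

    The modes say what crosses the interface and in which direction;
    the handler says what to do when something goes wrong.  Nothing
    here requires knowing how the parameters are passed underneath.
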
> o	Ease of use and programming (advanced user interface).  

    Great for a Hypercard system designed to be used by non-experts;
    but we are training professionals here, not unsophisticated users.

> o	The teaching of reuse _early_.  

    Which I strongly endorse, and which Ada strongly supports.

> o	Beginning Smalltalk programmers are isolated from the hard "machine
> 	realities".  They don't have to worry about storage, data types,
> 	garbage collection, or any of the other nonsense that their lives
> 	will be concerned with when they hit the real world :-).  They can
> 	focus on _concepts_, and think in _abstractions_.  

     Now HERE is where the real problems come in.  The time to focus
     on concepts and think in abstractions is when you are writing
     your package specifications.  But do we really want students 
     to only write specs, with the impression that implementation 
     is just a trivial thing, free from all that nasty mental work?  

     If we want to simplify the student's life, we can provide underlying
     abstractions for the student to use; for example, implementing the
     five basic operations of relational algebra is relatively easy if
     you have a generic B+ tree at your disposal, properly documented
     with respect to time and space requirements.  But PLEASE, let's not
     have students thinking they can just go out and simply write a spec 
     for Ackermann's function!!!  Nor, IMHO, should they harbor ideas
     that will cause them to wonder why anyone would ever worry about 
     any time vs. space tradeoffs!  Indeed, these questions are at the
     very heart of what software people do, and are therefore properly 
     placed right in the center of one's introduction to computer science. 


     Bill Wolfe, wtwolfe@hubcap.clemson.edu
 

peter@ficc.uu.net (Peter da Silva) (08/14/89)

In article <6251@hubcap.clemson.edu>, billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
>     but we are training professionals here, not unsophisticated users.

I beg to disagree. Your students are unsophisticated users. They have a long
way to go to being professionals. Oh yes, that's the goal of the course of
study they're engaged in, and they'll get there soon enough, but you can't
assume that's where they're at now.

And bear in mind that they're not all going to be professional computer
scientists or software engineers. Some of them, maybe the majority, will be
professional physicists, or materials engineers, or statisticians, or even
historians or accountants.
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.
Business: peter@ficc.uu.net, +1 713 274 5180. | "The sentence I am now
Personal: peter@sugar.hackercorp.com.   `-_-' |  writing is the sentence
Quote: Have you hugged your wolf today?  'U`  |  you are now reading"

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/15/89)

From article <5666@ficc.uu.net>, by peter@ficc.uu.net (Peter da Silva):
>>     but we are training professionals here, not unsophisticated users.
> 
> I beg to disagree. Your students are unsophisticated users. They have a long
> way to go to being professionals. Oh yes, that's the goal of the course of
> study they're engaged in, and they'll get there soon enough, but you can't
> assume that's where they're at now.

    The shortest distance between two points is a straight line;
    the more time they spend in a professional frame of mind, the 
    greater their professional capabilities will be when it comes 
    time to put their extremely costly education to practical use.
 
> And bear in mind that they're not all going to be professional computer
> scientists or software engineers. Some of them, maybe the majority, will be
> professional physicists, or materials engineers, or statisticians, or even
> historians or accountants.

    In that case, let them take the introductory course for non-majors.

    If they sign up for the CS major track, they should receive a clear
    picture of what the Real World is all about, not some glossed-over
    pretend world where nobody has to worry about implementing anything.

    Only with a clear understanding of what being a professional computer
    scientist / software engineer is all about will they be in a position
    to ultimately decide to be an accountant instead...   :-)  :-)  :-)


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

scott@shuksan.UUCP (Scott Moody) (08/15/89)

In article <5594@ficc.uu.net>, peter@ficc.uu.net (Peter da Silva) writes:

> What is a beginner in CS101 (or whatever your intro to CS course is called)
> doing any of that for? First you crawl, then you walk, then you run. ADA
> for introductory programming is like Air Jordans for 1-year-olds.

You don't have to run to use Ada (I bike anyway :-)

Remember that the first language you are taught in CS101 is also the 
main language you use throughout undergraduate education (outside of
the many-languages course).  There are a lot of jobs in industry
that need Ada programmers, and it is still the job of the universities
to teach and prepare their students for the real world.  So what good is
it to teach them Pascal if they never use it, other than for the
techniques?  Try explaining to your future employer that you were
taught to 'learn' other languages easily when they are looking
for expert Ada programmers.  The first thing they do is send
you to an Ada course anyway.

Whether beginners should be taught exceptions first
is obviously up for discussion, but that is probably up to the
instructor, who is new to Ada anyway.  They should probably teach
the Pascal-like subset first, then advance to the generics and tasks.
There should be a task force on what parts of Ada to teach at what times,
but I feel they should concentrate on generics, not as an advanced
feature, but as a necessary and easy one.  If all programmers would
think generically, then ...

--scott

paul@batserver.cs.uq.oz (Paul Bailes) (08/15/89)

The answer is quite clear: a modern-style functional language: Miranda (TM),
Haskell (when it appears), or even Hope. This is because

	* intro. courses are about establishing both a vocabulary
	  and a mind-set

	* functional languages are more expressive (in a sense) than
	  procedural languages (ie better for presenting a vocab.)

	* functional languages admit simple formal proofs, allowing
	  the establishment of a pro-formal methods mind-set (such
	  as encouraged by Dijkstra's SIGCSE paper)

	* there is at least one superlative text book: ``Introduction
	  to Functional Programming'' by Richard Bird and Phil Wadler
	  (Prentice-Hall).

	  ANYONE WHO HASN'T READ IT JUST ISN'T SUFFICIENTLY INFORMED
	  TO EVEN BEGIN TO DEBATE THE ISSUE OF WHAT SHOULD BE AN
	  INTRO PROGRAMMING LANGUAGE (seriously!)

Paul Bailes

peter@ficc.uu.net (Peter da Silva) (08/15/89)

In article <6259@hubcap.clemson.edu>, billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) writes:
>     In that case, let them take the introductory course for non-majors.

So, what language should non-majors start with?
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.
Business: peter@ficc.uu.net, +1 713 274 5180. | "The sentence I am now
Personal: peter@sugar.hackercorp.com.   `-_-' |  writing is the sentence
Quote: Have you hugged your wolf today?  'U`  |  you are now reading"

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/16/89)

From paul@batserver.cs.uq.oz (Paul Bailes):
> The answer is quite clear: a modern-style functional language [...]

    Since the ACM Task Force on the Core of Computer Science has
    specified parallel programming as one of the 11 major topics
    to be covered in an introductory course, the language used
    should be one in which parallel programming is supported.

    Quoting David M. Harland (Concurrency and Programming Languages, 1986):

       The history-sensitivity of the imperative languages contrasts
       with the total inability of functional systems to handle
       interactions with the outside world...  It is interesting
       to ponder what would happen if processes were made explicit
       in functional languages.  With the introduction of arbitrary
       interactions, sequentiality would return and this would remove
       the basis of their traditional implementation.  There is far
       more to the issue than the fashionable 'single-assignment'
       approach would have us believe.  Functional languages _rely_ 
       upon the fact that they are not history-sensitive to evaluate 
       expressions in parallel, and thereby are claimed to be inherently 
       parallel languages.  This is not so.  They are not sequential, 
       but this is far from being parallel.  The only parallelism they 
       offer is hidden in their implementation; it is implicit.  A truly 
       parallel language would make its concurrency explicit, and allow 
       arbitrary computation over the agents of parallelism.  Far from the
       applicative languages providing the most natural evolutionary path 
       for future 'good' programming systems, they are, as currently defined,
       fatally flawed.

    But don't worry; Ada will be quite happy to take up the slack while 
    you functional programming types run right back to the drawing board...
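
    For concreteness, explicit concurrency in Ada looks roughly like
    this -- a minimal sketch from memory (invented names, not compiled),
    not a definitive example:

        PROCEDURE Demo IS
           TASK Consumer IS
              ENTRY Deliver(Item : IN Integer);
           END Consumer;

           TASK BODY Consumer IS
              Total : Integer := 0;
           BEGIN
              FOR I IN 1..10 LOOP
                 ACCEPT Deliver(Item : IN Integer) DO
                    Total := Total + Item;  -- runs during the rendezvous
                 END Deliver;
              END LOOP;
           END Consumer;
        BEGIN
           -- The main program acts as the producer; each call is an
           -- explicit interaction between two threads of control.
           FOR I IN 1..10 LOOP
              Consumer.Deliver(I);
           END LOOP;
        END Demo;

    The concurrency is explicit, and the interactions are arbitrary
    computations -- which is what Harland says a truly parallel
    language must offer.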


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/16/89)

From article <5688@ficc.uu.net>, by peter@ficc.uu.net (Peter da Silva):
>>     In that case, let them take the introductory course for non-majors.
> 
> So, what language should non-majors start with?

    It is probably inadvisable to teach them any language at all;
    instead, a survey of user-oriented systems (Hypercard, the
    so-called 4GLs, and so on) would probably be more appropriate. 


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

kolding@june.cs.washington.edu (Eric Koldinger) (08/16/89)

In article <6265@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>> So, what language should non-majors start with?
>
>    It is probably inadvisable to teach them any language at all;
>    instead, a survey of user-oriented systems (Hypercard, the
>    so-called 4GLs, and so on) would probably be more appropriate. 

Why?  Just because someone doesn't want to be a CS major doesn't mean that
he/she doesn't want to learn to program, or have need to program.  What about,
for instance, engineering majors who often have to design large programs, but
aren't CS people, per se?

-- 
	_   /|				Eric Koldinger
	\`o_O'				University of Washington
  	  ( )     "Gag Ack Barf"	Department of Computer Science
       	   U				kolding@cs.washington.edu

smk@dip.eecs.umich.edu (Steve kelley) (08/16/89)

In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
>
> ... Try explaining to your future employer that you were
>taught to 'learn' other languages easily when they are looking
>for expert Ada programmers.  The first thing they do is send
>you to an Ada course anyway.
>

So we have two options presented here : 

	1.  I pay a university to teach me Ada, using time during which
	    I might be doing something I find more interesting.

	2.  My boss pays me to learn Ada, using time that would otherwise
	    be devoted to having to perform real work.

 I know how I'd pick.

 Steve "OJT" Kelley

peter@ficc.uu.net (Peter da Silva) (08/16/89)

In article <1501@shuksan.UUCP>, scott@shuksan.UUCP (Scott Moody) writes:
> Remember that the first language you are taught in CS101 is also the 
> main language you use throughout undergraduate education (outside of
> the many languages course).

That's not my experience. The intro course I skipped, but it was in
Fortran. Then there were a couple of courses in Pascal, an assembly
language course, and then mostly C with occasional forays back into
Pascal and Fortran (for EE courses, mainly). This was some years ago,
of course, but I would hope that things haven't narrowed *too* much
since then.

And if the intro course is too easy for a student, they should test out
of it like I did. You can't assume that just because someone had a computer
before they came to university they knew anything about programming.

Sure, you can call the intro course a remedial course, but you're doing
people a disservice. By the criterion you're applying most of the first-
year courses fit that description, in any discipline... I know I was
horrified by the stuff they were going over in first year Math and Physics.
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.
Business: peter@ficc.uu.net, +1 713 274 5180. | "The sentence I am now
Personal: peter@sugar.hackercorp.com.   `-_-' |  writing is the sentence
Quote: Have you hugged your wolf today?  'U`  |  you are now reading"

peter@ficc.uu.net (Peter da Silva) (08/16/89)

billwolf@hubcap.clemson.edu:
>     It is probably inadvisable to teach them [non-majors] any language
>     at all; instead, a survey of user-oriented systems (Hypercard, the
>     so-called 4GLs, and so on) would probably be more appropriate. 

You have to be kidding. You're saying, here, that if you're not a CS
Major you shouldn't have any programming at all. Is that right? What
about people in scientific disciplines? Physicists don't have to
program? How about mechanical engineers? How about EEs?
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.
Business: peter@ficc.uu.net, +1 713 274 5180. | "The sentence I am now
Personal: peter@sugar.hackercorp.com.   `-_-' |  writing is the sentence
Quote: Have you hugged your wolf today?  'U`  |  you are now reading"

nevin1@cbnewsc.ATT.COM (nevin.j.liber) (08/16/89)

In article <6226@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>    No, actually, students need to be exposed to ADTs early.
>    Probably the best way to do it is for the professor to
>    provide the spec, and have the students write an implementation,
>    while the concepts behind the spec are being covered in class.

I don't know.  I tend to laugh when I get a non-trivial (and usually
incomplete) spec for a trivial program, which is what most beginning
CS courses tend to assign.  It makes ADTs *seem* like useless
overhead (they may not be, especially on larger projects, but tiny
projects aren't the way to convince someone that ADTs are worth their
cost).
-- 
 _ __	NEVIN ":-)" LIBER  nevin1@ihlpb.ATT.COM  (312) 979-4751  IH 4F-410
' )  )			 "We are almost into the '90s; *nothing* is wierd."
 /  / _ , __o  ____	   -- Buzz Kelman, Newsman & Bluesman, WLUP, 7/28/89
/  (_</_\/ <__/ / <_	As far as I know, these are NOT the opinions of AT&T.

nevin1@cbnewsc.ATT.COM (nevin.j.liber) (08/16/89)

In article <6251@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>> o	Ease of use and programming (advanced user interface).  

>    Great for a Hypercard system designed to be used by non-experts;
>    but we are training professionals here, not unsophisticated users.

I really disagree with you here.  You seem to be saying that
"sophisticated professionals" ought to be using more primitive tools.
Maybe you like "programming" all your documents in troff, or debugging
by inserting printf-type statements in your code and recompiling, but I
for one would rather just type my documents with a WYSIWYG word
processor, or use an integrated development environment to do my
debugging.
-- 
 _ __	NEVIN ":-)" LIBER  nevin1@ihlpb.ATT.COM  (312) 979-4751  IH 4F-410
' )  )			 "We are almost into the '90s; *nothing* is wierd."
 /  / _ , __o  ____	   -- Buzz Kelman, Newsman & Bluesman, WLUP, 7/28/89
/  (_</_\/ <__/ / <_	As far as I know, these are NOT the opinions of AT&T.

jamin@cogsci.berkeley.edu (Sugih Jamin) (08/16/89)

My first language was Scheme.  My second language was C.  But that is of
no importance.  As my prof. likes to say, "Teach them the concepts,
they'll pick up any language within a week."  I sort of agree with that
belief, even though I haven't learned Prolog.  But the point is, if the
choice of language is immaterial, then, as a student, I sincerely beg
professors to teach a language that they can support with 1) a good
accompanying textbook--for example, Scheme has the excellent Abelson &
Sussman, 2) a fast compiler/interpreter (Scheme or Saber C), 3) a good
debugger, and 4) an intelligent editor and an otherwise supportive
environment--the trio of GNU Emacs, C, and UNIX comes to mind.

In general, keep the frustration to the minimum.


sugih

jeff@aiai.uucp (Jeff Dalton) (08/16/89)

In article <1304@batserver.cs.uq.oz> paul@batserver.cs.uq.oz writes:
>The answer is quite clear: a modern-style functional language: Miranda (TM),
>Haskell (when it appears), or even Hope. This is because

Clear, you say?

>	* intro. courses are about establishing both a vocabulary
>	  and a mind-set

Good point, although sometimes they're about changing mind-set.
(Think of all those students who've used some losing Basic.)

However, the problem with mind-sets is that it's hard to settle a
disagreement at that level.  Not everyone prefers the functional
+ formal methods approach, and what can you say to them except
that you think the functional approach will prove more effective
in the end?  You could point to more specific things, but then
you'd have to say why those things are important.

>	* functional languages are more expressive (in a sense) than
>	  procedural languages (ie better for presenting a vocab.)

I'm not sure which sense you mean, especially since a functional
language might just be a subset of a procedural one (think of
standard ML, for example, or Lisp).

>	* functional languages admit simple formal proofs, allowing
>	  the establishment of a pro-formal methods mind-set (such
>	  as encouraged by Dijkstra's SIGCSE paper)

Unfortunately, the formal methods mind-set can lead to something
like this:

  Don't use programming language constructs that make formal
  methods difficult.

And then someone can ask:

  If formal methods are so wimpy, why should that be my problem?

>	* there is at least one superlative text book: ``Introduction
>	  to Functional Programming'' by Richard Bird and Phil Wadler
>	  (Prentice-Hall).
>
>	  ANYONE WHO HASN'T READ IT JUST ISN'T SUFFICIENTLY INFORMED
>	  TO EVEN BEGIN TO DEBATE THE ISSUE OF WHAT SHOULD BE AN
>	  INTRO PROGRAMMING LANGUAGE (seriously!)

Well, I've read it.  I think it's a good text, but it has a problem
shared by other functional texts, namely that the most complex (I'm
not sure that's the right word, but it will do) programs in the book
are still fairly trivial ones.

I suspect that the people who like this book will be those who find
the mathematics of programming interesting in itself.  There are more
interesting programs in other texts, such as Abelson and Sussman's
_Structure and Interpretation of Computer Programs_.  And I think
you'll find that preference for one sort of book or the other splits
on more or less ideological (what people sometimes call religious)
lines.

-- Jeff

genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/16/89)

In article <169@enea.se> sommar@enea.se (Erland Sommarskog) writes:
#Eugene Schwartzman (genesch@aplvax.jhuapl.edu) writes:
#>	Why Pascal, it's just as strongly typed as ADA?
#
#Certainly not. Pascal is a weakly typed language. Period.
#Look at this:
#
#    TYPE  Apple_type  = 0..Max_apples;
#          Orange_type = 0..Max_oranges;
#    ...
#    PROCEDURE Macedonia(Apple  : Apple_type;
#                        Orange : Orange_type);
#    BEGIN
#       ...
#    END;
#    ...
#    VAR   Apple  : Apple_type;
#          Orange : Orange_type;
#    BEGIN
#       ...
#       Macedonia(Orange, Apple);    <--  Probably an error!
#       ...
#    END.
#
#We forgot the order of the parameters and swapped them.  But does
#the compiler complain?  No: integer to integer, perfectly OK.  You
#call that strongly typed?  I don't.  In Ada you can say:
#    TYPE Apple_type IS NEW integer RANGE 0..Max_Apples;
#and then do similarly for Orange_type.  If you really want to mix apples
#and oranges, you can do so with an explicit type conversion.  If
#you want to do it often, you can still declare the types as in
#Pascal.

	You yourself said that the above example would produce an error; granted,
	I don't know whether you mean a logical or a compiler error...  Honestly,
	I don't know how the compiler would handle it, but if I remember anything
	from one of my classes, it would gripe and tell you that the types are
	incompatible.  Also, the two ranges are different, so sooner or later it
	would crash anyway and the problem would be discovered (hopefully in
	the testing stages :-)

#  Add to this private types, limited private types and you already
#have more devices in Ada than Pascal to enforce real data abstraction.

	Ok, but why does a *beginning* level student need them?
#
#>Why? Personal example - I wrote a *very* small
#>program (2 -3 packages, ~10 lines/package) and it took me days to figure out
#>why I kept getting certain syntax error (not that the errors themselves were
#>very helpful).
#
#One thing is true. It takes longer to time to get an Ada module
#free from compilation errors. The reason is simple. The compiler
#finds more errors for you. So you gain that when you debug. Yes,
#correct, that is not applicable on syntax errors. Tell you what,
#with a limited knowledge in Pascal and bad compiler messages you
#would have exactly the same problem.

	I am not sure how you define debugging, but I define it as fixing the
	logical problems.  So, if you spend more time figuring out syntax errors
	with Ada, and the same time on logical errors in both Ada and Pascal,
	which one takes more time to write a program in?

#  Then it could also be added that Ada is not at optimum for programs
#at that size, but it is never a real problem. Not even in a beginner's
#course. As a contra, why don't you try to write a 2.000.000 line
#artillery control system in standard Pascal. (One single source file!)

	Are you crazy? :-) Who in their right mind would write something that
	big as a single source file?  Now, you'll tell me that Pascal cannot
	be split?  To that I say that, according to a lot of my friends who
	have used Pascal in the 'real' world to write large programs,
	Pascal can be split up very easily.
#
#>but imagine a beginning student running into that.  You know what he'll 
#>do - say "fuck this", drop the class, and switch majors.  I know, because
#>I've seen many people drop simply because they were having trouble 
#>with the language.
#
#A freshman would probably have fewer problems than you.  His mind
#wouldn't be set up to think in Pascal ways.  And he would be just
#as likely to run into those problems in Pascal.  What about this one?
#      IF Something THEN
#         IF And_this_to THEN
#            Do_this;
#      ELSE
#         Do_that;
#First our student gets a syntax error on the ";" before ELSE, which
#he doesn't understand.  (Believe me, I have taught Pascal to freshmen.)
#When he gets that, he removes the semicolon; the program compiles, but
#doesn't run correctly.  And it is not really just a novice error: I have
#had colleagues who have made the same error and been fooled by the
#indentation.

	So what are you saying, that something like that can't happen in Ada?

#  As for giving up; if they give up because of the language, that might
#be just as well. With that motivation I doubt that they would be any
#good programmers anyway.

	True, but how many *good* programmers are out there vs. not-very-good
	ones?  I know, it would be an ideal world if only the best did all of
	the work, but unfortunately that is not the case, and the poor ones are
	needed to do the work the good ones won't.  EEEhhhhh, maybe not a very
	good argument, but then again, there are many CS majors who do not
	want to do any programming, but want to do things like Numerical
	Analysis, Theory, etc... that don't require programming (or at least
	not a lot of programming).  I can hear it now - "THEY SHOULD BE MATH
	MAJORS!"  Well, maybe they should, but they chose CS, and losing them
	might be bad, especially if later they turn out to be very good.
#
#>As far as Lisp being a toy language, I would like to see you do AI work 
#>in Ada....
#
#Yes, but people like me don't write AI. :-) Seriously, I don't
#doubt for a second that Lisp is good in AI and similar small systems.
#But for the stuff I'm involved in, it's not a serious alternative.
#And, I might be totally wrong, but I have this idea that this
#applies to the main bulk of programmers.

	I am not arguing with you at all on this issue.  All I am pointing out
	is that just because a language is not right for you does not mean that
	it is not right for others.  I would like to do work in AI; does that
	mean that Assembly is a toy language because it's not very useful for
	AI?  NO!  (Read all of this with a :-)
#
#>restrictive.  If you really want to teach reusability, restrictiveness, etc..
#>use Pascal, it has all of that, but Pascal also gives you lot's of freedom if
#>needed, something Ada doesn't have.
#
#This is where I have some trouble remaining polite.  That must be a joke.
#Tell me you forgot the smiley.  How the !"#$%&/() can Pascal be reusable
#when there is no module concept in the language?  You cannot reuse a
#bloody thing from your one-file program, except by copying code, and,
#believe me, that has nothing to do with reusability.
#
#And before you go on and accuse me, as you did Bill Wolfe, of not
#knowing Pascal, I'll tell you: I work with the shit.  Every day.
#(VAX Pascal though, which is a good deal better than standard
#Pascal, but totally unportable.)

	Maybe that's the problem (VAX Pascal).  Let me tell you a story...
	In one of my classes, we had an assignment - write *4* *separate*
	*modules* in Pascal that would each do a specific thing.  Write a
	test program to link them with and make sure they work.  Then, once
	they were done, they would be linked in with our professor's code
	and he would test it.  This was done without knowing what the data
	structure was.  All we were told was what would be passed into and
	out of the modules.  Now, you tell me, is there or is there not a
	module concept in Pascal (granted, I was using Pascal/VS) (also see
	above in the article)?  Maybe there isn't one in the VAX version, but
	there certainly is one.

gene schwartzman
genesch@aplvax.jhuapl.edu
_______________________________________________________________________________
| GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!!    |
| Soccer is a kick in the grass (and sometimes on astroturf)!                 |
| GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!!		              |	
| CFL -> GO EDMONTON ESKIMOS!!!!   VFL -> GO CARLTON BLUES !!!!		      |
|_____________________________________________________________________________|
Disclaimer:  These are my opinions and not of my employer.

jrc@ukc.ac.uk (John) (08/16/89)

In article <6264@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
<From paul@batserver.cs.uq.oz (Paul Bailes):
<< The answer is quite clear: a modern-style functional language [...]
<
<    Since the ACM Task Force on the Core of Computer Science has
<    specified parallel programming as one of the 11 major topics
<    to be covered in an introductory course, the language used
<    should be one in which parallel programming is supported.

Does the ACM document say the Ada model of parallelism should be taught? If
not, is the Ada sort really the best kind for beginners?

<    Quoting David M. Harland (Concurrency and Programming Languages, 1986):
<
<       The history-sensitivity of the imperative languages contrasts
<       with the total inability of functional systems to handle
<       interactions with the outside world...  

Eh?? Please explain what this is referring to! 

<	It is interesting
<       to ponder what would happen if processes were made explicit
<       in functional languages.  With the introduction of arbitrary
<       interactions, sequentiality would return ...

This is silly! Any language, even a 'parallel' one like Ada, has to perform
interactions in some sequence. Unless of course you make the user type on two
keyboards at the same time :-)

I don't get the impression from reading your quote that David Harland knows 
a great deal about functional languages. 

<    But don't worry; Ada will be quite happy to take up the slack while 
<    you functional programming types run right back to the drawing board...
<
<    Bill Wolfe, wtwolfe@hubcap.clemson.edu

Well, I agree with Paul Bailes anyway. God, I love language wars! Hey, let's
get some personal insults into the next few postings :-)

John Cupitt

prp@sei.cmu.edu (Patrick Place) (08/16/89)

In article <2685@aplcen.apl.jhu.edu>, genesch@aplvax.jhuapl.edu (Eugene Schwartzman) writes:
> 	You yourself said that the above example would produce an error; granted,
> 	I don't know whether you mean a logical or a compiler error...  Honestly,
> 	I don't know how the compiler would handle it, but if I remember anything
> 	from one of my classes, it would gripe and tell you that the types are
> 	incompatible.  Also, the two ranges are different, so sooner or later it
> 	would crash anyway and the problem would be discovered (hopefully in
> 	the testing stages :-)
In the example (deleted for brevity), Pascal would compile the code,
which, as we have said, is incorrect.  As you say, sooner or later it
will crash anyway.  Big deal.  Let us assume that we are fortunate and
the crash occurs in the procedure Macedonia: we can spend a long time
looking at the code of Macedonia, which is all correct, so nothing is
found.  It is the call that is in error.  The Ada compiler will not
permit the call to occur; it is a compile-time error, and so may be
fixed immediately.  It is much easier to fix a compile-time error than
a run-time error which may not show up at the point at which the
program crashes.
> 	I am not sure how you define debugging, but I define it as fixing the
> 	logical problems.  So, if you spend more time figuring out syntax errors
> 	with Ada, and the same time on logical errors in both Ada and Pascal,
> 	which one takes more time to write a program in?
The whole point is that moving more checking into the compiler reduces the
amount of time spent looking at "logical errors" such as those described
in the example.
> 
> 	Are you crazy? :-) Who in their right mind would write something that
> 	big as a single source file?  Now, you'll tell me that Pascal cannot
> 	be split?  To that I say that, according to a lot of my friends who
> 	have used Pascal in the 'real' world to write large programs,
> 	Pascal can be split up very easily.
Only because they are not using standard Pascal.  This is a very important
point.  Non-standard Pascals permit separate compilation.  THEY ARE
NOT PORTABLE.

Pat Place   prp@sei.cmu.edu

cik@l.cc.purdue.edu (Herman Rubin) (08/16/89)

In article <30666@ucbvax.BERKELEY.EDU>, jamin@cogsci.berkeley.edu (Sugih Jamin) writes:
> My first language was Scheme.  My second language C.  But that is of no
> importance.  As my prof. likes to say, "Teach them the concepts, they'll
> pick up any language within a week."

			...............................

One of the great difficulties with the languages is that they give a very
distorted view of what a computer does.  A computer does nothing but bit
manipulations and transfers of control.  Some of these bit manipulations
are organized in certain ways for convenience and efficiency.  Some bit
manipulations are done in hardware by some computers and not by others,
but any computer can simulate any other on a clearly defined task.

The fundamental operations are not limited to those in C, and the operations
in a language are not all fundamental for a specific computer.  Students
should learn that which operations are done in hardware, and how fast they
are, affects the algorithm to be used.

> In general, keep the frustration to the minimum.
> 
Even more so, keep the unlearning to a minimum.

> sugih


-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907
Phone: (317)494-6054
hrubin@l.cc.purdue.edu (Internet, bitnet, UUCP)

scott@bbxeng.UUCP (Engineering) (08/16/89)

In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
>
> ... Try explaining to you future employer that you were
>taught to 'learn' other languages easially when they are looking
>for expert Ada programmers. The first thing they do is send
>you to an Ada course anyway.
>

Which employers are these?  DOD contracters?  My motto is:

          *** LIVE FREE!  JUST SAY "NO" TO GOVERNMENT MONEY! ***  :-)

Actually, I think a first semester student should learn programming
with a simple interactive BASIC interpreter.  The instant feedback,
along with simple variables and loops, will help get the fundamentals
of abstract thought sharpened up.  Don't worry about bad habits at this
point.  Once the student is able to understand what arrays and 
subroutines are, then move him on to a more disciplined language.

The second semester student needs to unlearn the bad habits he picked 
up in the first semester.  Students unable to unlearn their bad habits
are not fit to be professionals, and they still have time to change their
major (seriously).

Once a student is comfortable with simple program logic, a good high level
language class and an assembler class can provide complementary points
of view on the process of computing.

What languages to teach?  It depends on the student's plans.  The
government still uses FORTRAN.  'C' is hot right now in the mainstream.
ADA is used for some government work.  A 4GL language would be a good
introduction to data base concepts.  BASIC is a useful tool for
quick and dirty programs.   Assembly language teaches the student about
CPU architecture.  Take your pick.

-- 

------------------------------------------------------
Scott Amspoker
Basis International
505-345-5232

haveraaen-magne@CS.YALE.EDU (Magne Haveraaen) (08/16/89)

In article <2449@cbnewsc.ATT.COM> nevin1@ihlpb.ATT.COM (nevin.j.liber) writes:
>In article <6226@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>>    No, actually, students need to be exposed to ADTs early.
>
>I don't know.  I tend to laugh when I get a non-trivial (and usually
>incomplete) spec for a trivial program, which is what most beginning
>CS courses tend to assign.  It makes ADTs *seem* like useless
>overhead [...]

Having taught intro CS courses at the University of Bergen (Norway)
for some time, I would like to remark that it is virtually impossible to
give a beginning student an assignment that is large enough to show the
usefulness of any good programming method without drowning the students
in work.  With smaller examples, however, students who have been hacking
at home tend to use the "hacker approach" to solve the assignments
quickly, seeing the proper method as "useless overhead".

I still think Pascal is a good introductory language, its major
shortcomings being the lack of ADTs and descriptive parameter modes.  We
have developed an extension to Pascal, called ADT-Pascal, that adds these
features in a Pascalesque way.  So at the end of the first semester, and
during the second semester, we let the students use this language.  We
also provide them with some generic ADT modules they may use.  The
second semester assignments are so large that students form teams to
solve them.  This also shows the need for specifications independent of
implementations, as well as for software component reuse.

ADT-Pascal blends nicely with Pascal, so a 20-page booklet is enough to
supplement a standard Pascal textbook on these issues.  We were
originally thinking of using Modula-2, but dropped it since Modula-2
lacks generic (or polymorphic) modules, and all the textbooks seemed to
describe horrendous methods for adding them to one's code.  Ada was out of
the question as well:  compilers were too expensive (both to buy and in
terms of hardware requirements), and there were no good introductory
textbooks using Ada.  I also feel that Ada is a bit too complex for this
purpose, although useful later on.


Magne Haveraaen		(haveraaen-magne@cs.yale.edu)
Disclaimer: These views have nothing to do with Yale, I haven't even
taught there.

tneff@bfmny0.UUCP (Tom Neff) (08/16/89)

Actually the ideal first programming language to teach the scurvy masses
in their vast, stockyard-like freshman lecture halls is my own creation,
DISMAL (Deliberately Impractical Simple Minded Algorithmic Language).
In my experience, colleges are already turning out DISMAL programmers
anyway, so why not teach the language formally at the entry level.

Institutions desiring tapes of DISMAL should notify the secretary at:

		beautyeh@bland.toronto.edu

Implementing the DISMAL language system should be fairly straightforward
on most academic computing hosts, which appear to be capable of DISMAL
performance already.
-- 
"We walked on the moon --	((	Tom Neff
	you be polite"		 )) 	tneff@bfmny0.UU.NET

jeff@aiai.uucp (Jeff Dalton) (08/17/89)

In article <2450@cbnewsc.ATT.COM> nevin1@ihlpb.ATT.COM (nevin.j.liber) writes:
>Maybe you like "programming" all your documents in troff, or debugging
>by inserting printf-type statements in your code and recompiling, but I
>for one would rather just type my documents with a WYSIWYG word processor

Never.  Give me LaTeX any day.

jrk@sys.uea.ac.uk (Richard Kennaway) (08/17/89)

In article <746@skye.ed.ac.uk> jeff@aiai.uucp (Jeff Dalton) writes:
>In article <2450@cbnewsc.ATT.COM> nevin1@ihlpb.ATT.COM (nevin.j.liber) writes:
>>I for one would rather just type my documents with a WYSIWYG word processor
>
>Never.  Give me LaTeX any day.

Never.  Give me WYSIWYG LaTeX any day.

--
Richard Kennaway          SYS, University of East Anglia, Norwich, U.K.
uucp:  ...mcvax!ukc!uea-sys!jrk		Janet:  kennaway@uk.ac.uea.sys

markv@phoenix.Princeton.EDU (Mark T Vandewettering) (08/17/89)

In article <6264@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>From paul@batserver.cs.uq.oz (Paul Bailes):
>> The answer is quite clear: a modern-style functional language [...]

	I would heartily agree, with a caveat.  It is my belief that 
	functional programming languages are both powerful and
	reasonably efficient, and permit a treatment of program 
	development which is more rigorous than that generally taught
	with more imperative languages.  Witness the success of the 
	Scheme language (not a pure functional language admittedly)
        which is largely due to the excellent text by Abelson and
	Sussman, which is probably the single best text on computer
	science that I have seen.

	The one caveat is the relative lack of a good, public domain
	(or nearly so) implementation of a language like Miranda.
	Until such an implementation is available, we cannot make
	the learning experience practical for students.

>    Since the ACM Task Force on the Core of Computer Science has
>    specified parallel programming as one of the 11 major topics
>    to be covered in an introductory course, the language used
>    should be one in which parallel programming is supported.

	However, the parallelism that you desire is an inherent part of
	functional programming.  Because of the lack of side effects,
	the only constraints placed on evaluation order are the natural
        data dependencies.  It is precisely this property which makes
	functional programming languages MORE amenable to parallelism
	than traditional imperative languages.
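
        To make this concrete, a minimal sketch (in Haskell, standing
        in for any pure functional language; names and numbers are
        invented for illustration):

            -- a and b depend only on the input, not on each other,
            -- so the sole ordering constraint is that both exist
            -- before h combines them; an implementation is free to
            -- evaluate them in parallel.
            f, g :: Int -> Int
            f x = x * x
            g x = x + 1

            h :: Int -> Int -> Int
            h a b = a + b

            result :: Int
            result = h a b
              where
                a = f 10   -- independent of b
                b = g 10   -- independent of a

            main :: IO ()
            main = print result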
	
>    Quoting David M. Harland (Concurrency and Programming Languages, 1986):
>
>       The history-sensitivity of the imperative languages contrasts
>       with the total inability of functional systems to handle
>       interactions with the outside world...
	
        Partial agreement on this count; it is the strongest argument
	against functional programming.  It is an area of active
	research however, with some recent advances.  It is also
	interesting to note that many computer programs DO act as
	"functions", they take input and churn and generate output.

>	It is interesting
>       to ponder what would happen if processes were made explicit
>       in functional languages.  With the introduction of arbitrary
>       interactions, sequentiality would return and this would remove
>       the basis of their traditional implementation.  

	Adding explicit processes doesn't mean adding interactions.
        MultiLisp, for example, has explicit processes which are in the
	form of futures or promises.  These do not change the semantics
	of the program in the slightest.  If you are talking about
	adding message passing or some such nonsense, the point is moot:
	the language would cease to be functional.
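
        For instance, a hedged sketch using Haskell's `par` and `pseq`
        (from the "parallel" library, run with a threaded runtime) as
        an analogue of MultiLisp futures; this illustrates the idea,
        it is not MultiLisp itself:

            import Control.Parallel (par, pseq)

            -- Deliberately naive, so there is real work to spark off.
            fib :: Int -> Integer
            fib n | n < 2     = fromIntegral n
                  | otherwise = fib (n - 1) + fib (n - 2)

            -- `par` hints that its first argument may be evaluated
            -- in parallel, much like a future; erase the annotations
            -- and the value computed is exactly the same.
            parFib :: Int -> Integer
            parFib n
              | n < 20    = fib n
              | otherwise = a `par` (b `pseq` (a + b))
              where
                a = parFib (n - 1)
                b = parFib (n - 2)

            main :: IO ()
            main = print (parFib 30)

        Run sequentially, the answer is unchanged; that is the
        futures point exactly.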

>       There is far
>       more to the issue than the fashionable 'single-assignment'
>       approach would have us believe.  Functional languages _rely_ 
>       upon the fact that they are not history-sensitive to evaluate 
>       expressions in parallel, and thereby are claimed to be inherently 
>       parallel languages.  This is not so.  They are not sequential, 
>       but this is far from being parallel.  The only parallelism they 
>       offer is hidden in their implementation; it is implicit.  A truly 
>       parallel language would make its concurrency explicit, and allow 
>       arbitrary computation over the agents of parallelism.  Far from the
>       applicative languages providing the most natural evolutionary path 
>       for future 'good' programming systems, they are, as currently defined,
>       fatally flawed.

	I disagree totally, as do many others.  In order to make
	parallelism work, it is REQUIRED to be implicit.  Consider the 
	Actor model of Hewitt.  It requires the explicit coordination of
	large numbers of individual computational agents.   Because
        their computation is history-sensitive, the action of the
	program is some function of the program state (the state of all
	the subactors), each of which is a function of ALL of the
	message traffic done to date.  What is lacking in the Actor
	model (and in every other model which uses explicit parallelism
        and history-sensitive behavior) is a formal basis to reason
	about the meaning of programs.

	The only way to manage the complexity of programming a network
	of thousands of processors is to let the computer take up the
	slack.  We should let the computer make choices about
	partitioning and scheduling, and leave it to the programmer to 
	develop parallel ALGORITHMS to accomplish the task at hand.
	Techniques such as strictness analysis, compile time reference 
	counting, and program transformations can be used to detect and
	exploit potential parallelism.


>    But don't worry; Ada will be quite happy to take up the slack while 
>    you functional programming types run right back to the drawing board...
	
        Excuse me, but I think we will continue our steady progress,
        thank you.

	For an interesting look into the possibilities of functional 
	programming and parallelism, try
	
		"Functional Programming on Loosely Coupled Multiprocessors", 
		Kelly, MIT Press, 1989.

	Not the absolute best book, but chock full of further references
	to guide the misinformed (1/2 :-). 

>    Bill Wolfe, wtwolfe@hubcap.clemson.edu

Mark VandeWettering (markv@acm.princeton.edu)

mitchell@community-chest.uucp (George Mitchell) (08/17/89)

In article <1504@l.cc.purdue.edu> cik@l.cc.purdue.edu (Herman Rubin) writes:
>
>One of the great difficulties with the languages is that they give a very
>distorted view of what a computer does.  A computer does nothing but bit
>manipulations and transfers of control.  Some of these bit manipulations
>are organized in certain ways for convenience and efficiency.  Some bit
>manipulations are done in hardware by some computers and not by others,
>but any computer can simulate any other on a clearly defined task.
>
This discussion has now covered the entire range from Register Transfer
Languages to 4GLs.  Is it possible that "none of the above" is a better
answer?  Although I started my journey with assembly language on a 12-bit
machine (PDP-8), I do not see why others should repeat the trip.  I will
admit that RTLs and ALs will give the student a better appreciation of the
underlying hardware, but most practitioners of computer science are
shielded from the hardware by layers of system software.  Understanding
that should be a higher priority than understanding the hardware.

Why does any language need to be taught in an introductory course?  Aren't
there enough concepts to cover without introducing a particular programming
language and requiring the student to "write programs"?  When the student
leaves the school, will the task be to write programs or to develop
software?  Before the student gets started down the same path as other
"DISMAL" programmers, why not teach the concepts of program design without
relying on a particular programming language?  I would suggest the use of
Nassi-Shneiderman diagrams as an appropriate tool for the student to
acquire the ability to develop and follow program/algorithm designs.

Since many new graduates are initially assigned as maintenance programmers,
the debugging of programs is as important as writing them.  If any language
is to be used in the course, its use should be restricted as follows:
  1.  No input to compilers
  2.  Code fragments only
  3.  Emphasis on debugging, not writing
  4.  Minimize syntactical considerations
If the above restrictions are followed, structured English should suffice.

After completing the introductory course, Ada should be a more than
satisfactory choice for the first/principal programming language.

/s/ George          703/883-6029              GB Mitchell, MS Z676
Easiest: gmitchel@mitre.org                   MITRE, 7525 Colshire Drive
Best:    mitchell@community-chest.mitre.org   McLean, VA 22102

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/18/89)

From nevin1@cbnewsc.ATT.COM (nevin.j.liber):
>>> o	Ease of use and programming (advanced user interface).  
> 
>>    Great for a Hypercard system designed to be used by non-experts;
>>    but we are training professionals here, not unsophisticated users.
> 
> I really disagree with you here.  You seem to be saying that
> "sophisticated professionals" ought to be using more primitive tools.

    No, I'm saying that we should be training for what actually will
    confront people out in the real world.  If that's nothing more
    than an operating system and a compiler, then that's what they
    have to get used to.  If it's an advanced CASE system, then that
    should be made available for training purposes as soon as it's
    economically feasible to do so.  But I am 100% against the idea
    of spoon-feeding students and giving them unrealistic expectations.


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

sommar@enea.se (Erland Sommarskog) (08/18/89)

Peter da Silva (peter@ficc.uu.net) writes:
)Bill Wolfe (billwolf@hubcap.clemson.edu) said:
))     It is probably inadvisable to teach them [non-majors] any language
))     at all; instead, a survey of user-oriented systems (Hypercard, the
))     so-called 4GLs, and so on) would probably be more appropriate.
)
)You have to be kidding. You're saying, here, that if you're not a CS
)Major you shouldn't have any programming at all. Is that right? What
)about people in scientific disciplines? Physicists don't have to
)program? How about mechanical engineers? How about EEs?

Being an electrical engineer by degree and knowing what I'm
working with, I'd say there is no difference from the CS majors.
A lot of them will be involved in programming fairly large
systems anyway. At least that's how things work over here.
-- 
Erland Sommarskog - ENEA Data, Stockholm - sommar@enea.se
"Hey poor, you don't have to be Jesus!" - Front 242

db@lfcs.ed.ac.uk (Dave Berry) (08/18/89)

In article <741@skye.ed.ac.uk> jeff@aiai.uucp (Jeff Dalton) writes:
>a functional language might just be a subset of a procedural one
>(think of Standard ML, for example, or Lisp).

A functional language is a language that has functions as first class values.
An imperative or procedural language is one that supports (re-)assignment.
Languages that support both are not subsets of either style; they are hybrids.
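
Sticking with Haskell for illustration (the names are invented), the
two ingredients side by side:

    import Data.IORef (newIORef, readIORef, writeIORef)

    -- Functional ingredient: functions are first-class values.
    twice :: (a -> a) -> (a -> a)
    twice f = f . f

    -- Imperative ingredient: (re-)assignment to a mutable cell.
    bump :: IO Int
    bump = do
      cell <- newIORef (0 :: Int)
      writeIORef cell 1           -- assignment
      x <- readIORef cell
      writeIORef cell (x + 1)     -- re-assignment
      readIORef cell

    main :: IO ()
    main = do
      print (twice (+ 3) 10)      -- 16: first-class function at work
      print =<< bump              -- 2: history-sensitive behaviour

By these definitions a language offering both, as above, is a hybrid.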

Dave Berry, Laboratory for Foundations      db%lfcs.ed.ac.uk@nsfnet-relay.ac.uk
    of Computer Science, Edinburgh Uni.	    <Atlantic Ocean>!mcvax!ukc!lfcs!db

			Concept + Acronym = Thesis

karl@ficc.uu.net (karl lehenbauer) (08/18/89)

In article <1501@shuksan.UUCP>, scott@shuksan.UUCP (Scott Moody) writes:
> Remember that the first language you are taught in CS101 is also the 
> main language you use throughout undergraduate education (outside of
> the many languages course). 

Pshaw.  No decent CS school should let you have a degree without learning
several languages.  At Indiana University, circa 1976-1980, you could not
escape without learning (at least) Pascal, LISP, Snobol, Algol, FORTRAN
and TI 980 assembly.  Some CS201 intro classes were taught in FORTRAN,
some in Pascal, and the honors class was taught in LISP (at least when
Douglas Hofstadter or Dan Friedman taught it).

Someone pointed out that the high-level languages are pretty far removed from
the hardware.  I agree and that's why I think an architecture (computer 
structures) class with assembly programming (which included toggling a few 
programs in through the front panel)  was nice.

> There are a lot of jobs in industry
> that need Ada programmers and it is still the job of the Universities
> to teach/prepare their students for the real-world. 

There are a lot of jobs in industry that need COBOL programmers but I don't
think it's the job of the Universities to teach/prepare their students for *that.*
In other words, that there are Ada jobs is not an inherently compelling reason 
to teach Ada.

> So what good is
> it to teach them pascal if they never use it, other than for the
> techniques? 

For the techniques.  Because procedural languages are a lot alike.

> Try explaining to your future employer that you were
> taught to 'learn' other languages easily when they are looking
> for expert Ada programmers. 

After college you can't learn any new languages?

> The first thing they do is send
> you to an Ada course anyway.

So I don't have to learn it in school after all, and anyway, who is this 
*they*, bucko?  You write as if Ada is the only thing people program in.  
In fact, programming in Ada is still a minute fraction of total programming.
I've programmed professionally in C, PL/M, Pascal, Forth, FORTRAN and
pdp11, Z8000, 80x86 and 68000 assembly languages.  Your Ada-view is overly
restrictive.
-- 
-- uunet!ficc!karl	"Have you debugged your wolf today?"

jeff@aiai.uucp (Jeff Dalton) (08/19/89)

In article <178@castle.ed.ac.uk> db@lfcs.ed.ac.uk (Dave Berry) writes:
>In article <741@skye.ed.ac.uk> jeff@aiai.uucp (Jeff Dalton) writes:
>>a functional language might just be a subset of a procedural one
>
>A functional language is a language that has functions as first class values.
>An imperative or procedural language is one that supports (re-)assignment.

I don't think those definitions are quite right, but they'll do for now.

>Languages that support both are not subsets of either style; they are hybrids.

That is true, but beside the point.

It is certainly possible to have a language that has a functional
subset.  Scheme, Common Lisp, and Standard ML are examples of
languages that have such a subset.  So, if I want to use a functional
language, I might use one of these sublanguages.  That was all I was
trying to say.  If you want to say "a subset isn't a language" that's
fine with me, but I don't think it's very important.

-- Jeff

sommar@enea.se (Erland Sommarskog) (08/20/89)

Herman Rubin (cik@l.cc.purdue.edu) writes:
>The fundamental operations are not limited to those in C, and the operations
>in a language are not all fundamental, for a specific computer.  Students
>should learn that which operations are hardware, and how fast they are,
>affect the algorithm to be used.

Certainly not in their first year anyway. I'd say that the newly
graduated should have some understanding of these topics; however,
there are much more important things like data abstraction, writing
reusable code, etc. One reason is that not all students will ever
face that kind of problem. If you go into information systems, your
interest in the hardware operations is low. There are so many other
complexities that you can live without another one.


-- 
Erland Sommarskog - ENEA Data, Stockholm - sommar@enea.se
The law of gravity should be forbidden except in downhills.

jont@cs.qmc.ac.uk (Jon Taylor) (08/21/89)

In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
>
> ... Try explaining to your future employer that you were
>taught to 'learn' other languages easily when they are looking
>for expert Ada programmers. The first thing they do is send
>you to an Ada course anyway.
>

... and you will perform very well on this course, because you
will have been given a sound theoretical grounding with which
to learn the language(s) of your employer's choice.


Jon Taylor.
Queen Mary College
Uni. of London.
jont@uk.ac.qmc.cs

jont@cs.qmc.ac.uk (Jon Taylor) (08/22/89)

In article <6264@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>       .......  The only parallelism they
>       offer is hidden in their implementation; it is implicit.  A truly
>       parallel language would make its concurrency explicit, and allow
>       arbitrary computation over the agents of parallelism.

I refer you here to the work done at Yale on the language
ParAlfl, a functional language which allows for explicit
concurrency. The main strength of this language, as far as I
could see, was that it was possible to develop a program on a
sequential machine and then run the program on a parallel
machine with or without the explicit parallel notations.

Mail me if you want references.

Jon Taylor
Queen Mary College
Uni. of London
jont@cs.qmc.ac.uk

karl@ficc.uu.net (karl lehenbauer) (08/23/89)

For better or for worse, most incoming CS freshmen already know how to program.
-- 
-- uunet!ficc!karl	"Have you debugged your wolf today?"

rossj@cognos.UUCP (Ross Judson) (08/23/89)

I said:
     o	Ease of use and programming (advanced user interface).  
   
Bill Wolfe said:
       Great for a Hypercard system designed to be used by non-experts;
       but we are training professionals here, not unsophisticated users.

Nevin Liber said:
   I really disagree with you here.  You seem to be saying that
   "sophisticated professionals" ought to be using more primitive tools.

Bill Wolfe said:
     No, I'm saying that we should be training for what actually will
     confront people out in the real world.  If that's nothing more
     than an operating system and a compiler, then that's what they
     have to get used to.  If it's an advanced CASE system, then that
     should be made available for training purposes as soon as it's
     economically feasible to do so.  But I am 100% against the idea
     of spoon-feeding students and giving them unrealistic expectations.
 
I say:

Bill, this is precisely the kind of reasoning that would turn our
universities into C shops.  Let's be realistic here.  When students graduate
and hit the real world, they're going to be working in C, C++, FORTRAN, or
COBOL.  That covers 99.9% of cases.  Why, then, don't we teach nothing but
the above at university?

Clearly you know the answer.  There's more to life than bread and water.

Incidentally, a truly advanced programming system will encompass both power
_and_ ease of use.  I do not believe they are mutually exclusive; they are
merely difficult to achieve, and difficult to define.

I say that we spoon-feed.  Give them a taste of the future.  If the rest of
the world is too archaic or immutable to develop the tools of the future,
perhaps that will persuade them to perform the work themselves.

-- 
uucp       - uunet!mitel!sce!cognos!rossj  | stop this rhyming! i mean it!
arpa/inter - rossj%cognos.uucp@uunet.uu.net| anybody want a peanut?

nevin1@cbnewsc.ATT.COM (nevin.j.liber) (08/23/89)

In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
>Remember that the first language you are taught in CS101 is also the 
>main language you use throughout undergraduate education (outside of
>the many languages course).

This bothers me.  Why should the first language you learn be your
primary language throughout college?  Doesn't sound like a well-rounded
CS education to me!  By using different languages, one learns different
ways of thinking about programming problems.  Sampling half a
dozen languages in a single semester (the many languages course you
refer to) is not enough!  A semester in Scheme, the next in Pascal, the
next in Smalltalk, etc., would be a better curriculum.  You need to
write non-trivial programs in a language for a reasonable amount of time
(two weeks in the many language course is NOT reasonable) in order to
learn the language and the programming paradigm it presents.  If you
only use one language, you tend to get a very narrow view of
computing.

As for me, I was lucky; throughout my undergrad education, I never used
a single language for more than a semester.  But if I could have chosen
a primary language, it would not have been Pascal or Ada or C; it
would have been Icon.  <plug time :-)>  I can express in tens of lines
of Icon what it takes in hundreds to thousands of lines in Pascal.
Since all the basic data structures (list, set, string, queue, etc.)
are built in, I can concentrate on the more important parts of the
programming assignment.  At college, where time is the most critical
constraint, this is very important.
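
To give a flavour of what built-in structures buy you, here is a
word-frequency count, sketched in Haskell standing in for Icon (the
input string is invented):

    import qualified Data.Map as Map

    -- A few lines, because strings, lists, and maps come with the
    -- language; in Pascal the table alone would be a hand-rolled
    -- data structure.
    wordCounts :: String -> Map.Map String Int
    wordCounts = Map.fromListWith (+) . map (\w -> (w, 1)) . words

    main :: IO ()
    main = print (wordCounts "to be or not to be")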

>There are a lot of jobs in industry
>that need Ada programmers and it is still the job of the Universities
>to teach/prepare their students for the real-world.

There are also a lot of jobs in industry that need C programmers, C++
programmers, assembly language programmers, etc.   Should universities
ignore this?  I would rather work with someone who knew a variety of
languages and could tell me what language would be best for the
application at hand than someone who said to use Ada 'cause that's all
he knows!

>Try explaining to your future employer that you were
>taught to 'learn' other languages easily when they are looking
>for expert Ada programmers.

I have done just that (well, they were looking for expert C
programmers, anyway).  Guess what?  I got the job!

If Company-X is only looking for Language-Y programmers, Company-X is
being very short-sighted (e.g., there is a project being written in
Language-Y which is behind schedule).  I'd rather work at a place where
they hire you because of how-you-think rather than what-you-know.  As
long as you have the rest of the talent, another programming language
can always be learned.
-- 
NEVIN ":-)" LIBER  AT&T Bell Laboratories  nevin1@ihlpb.ATT.COM  (312) 979-4751

pmontgom@sonia.math.ucla.edu (Peter Montgomery) (08/23/89)

In article <193@enea.se> sommar@enea.se (Erland Sommarskog) writes:
>Herman Rubin (cik@l.cc.purdue.edu) writes:
>>The fundamental operations are not limited to those in C, and the operations
>>in a language are not all fundamental, for a specific computer.  Students
>>should learn that which operations are hardware, and how fast they are,
>>affect the algorithm to be used.
>
>Certainly not in their first year anyway. I'd say that the newly
>graduated should have some understanding of these topics, however
>there are much more important things like data abstraction, writing
>reuauble code etc. One reason is that not all the student will ever
>face that kind of problems. If you into information systems, you're
>interest in the hardware operations are low. There are so many other
>complexities that you could live without another.
>
	I resent that this was NOT covered in my first course,
a FORTRAN/ALGOL course in 1967.  Two years later, without
having taken another course, I was vending machine manager
at my dormitory and took a poll on who liked which candy bars.
For each member and candy bar, the program had an array element 
identifying whether he liked it, i.e.,

	like(John, Snickers) = 1 if John likes Snickers, else 0

	The program tallied not only the number of votes for each item
but also the statistical correlations between who likes what (e.g.,
are Hershey's and Nestle's Milk Chocolate liked by the same folks?).
So for every pair of candy bars c1 and c2, I computed

	SUM   like(p, c1) * like(p, c2)  (p runs over all ballots)
	    p  

	The run cost about $30, a considerable amount in those days.
What I did not know (but learned when I took an assembly language
class) is that the machine (a CDC 6400) lacked an integer multiply
(even though the languages had that operation): the operands
had to be converted to floating point, multiplied together 
(a slow operation), and converted back every pass through the main loop.
Since all values were 0 or 1, I could have used a bitwise AND
in place of the multiply, but I had not been taught this operation 
(admittedly, it is not standard FORTRAN).  This alone would
speed the program considerably; it could be sped up much more
by using one AND in place of 60 multiplications (with appropriate
change in storage of data) and by using
the machine's population count instruction in place of 59 additions.
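
For the curious, here is the trick in miniature (a Haskell sketch
with invented ballot data; Data.Bits supplies the AND and the
population count):

    import Data.Bits (popCount, (.&.))

    -- One word per candy bar: bit p is set iff ballot p liked it.
    -- Integer is used so the sketch is not tied to a 60-bit word.
    likesC1, likesC2 :: Integer
    likesC1 = 0x0F3A
    likesC2 = 0x0736

    -- SUM over p of like(p,c1)*like(p,c2) collapses to one AND
    -- followed by one population count per word.
    correlate :: Integer -> Integer -> Int
    correlate c1 c2 = popCount (c1 .&. c2)

    main :: IO ()
    main = print (correlate likesC1 likesC2)

On a 60-bit machine like the 6400, each AND covers 60 ballots at once.
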
--------
        Peter Montgomery
        pmontgom@MATH.UCLA.EDU 

cooper@vlab.enet.dec.com (g.d.cooper in the shadowlands) (08/24/89)

In article <1189@sequent.cs.qmc.ac.uk>, jont@cs.qmc.ac.uk (Jon Taylor) writes...
>In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:

>> ... Try explaining to your future employer that you were
>>taught to 'learn' other languages easily when they are looking
>>for expert Ada programmers. The first thing they do is send
>>you to an Ada course anyway.


>.... and you will perform very well on this course, because you
>will have been given a sound theoretical grounding with which
>to learn the language(s) of your employer's choice.

Yes, but then you wouldn't have been hired in the first place.  The
problem with relying upon your ability to learn in job acquisition is
that the future employer doesn't want somebody who will know `plover' two
days from now but one who has been working with it for years.

		Running into this problem frequently,

				shades
============================================================================
| But I that am not shaped for sport- | Geoffrey D. Cooper                 | 
| ive trick, nor formed to court an   | cooper@vlab.enet.dec.com	   |
| amorous looking glass...            | business (508) 467-3678            |
|                                     | home (617) 925-1099                |
============================================================================
Note: I'm a consultant.  My opinions are *MY* opinions.

ams@cbnewsl.ATT.COM (andrew.m.shaw) (08/24/89)

In article <416@ryn.esg.dec.com> cooper@vlab.enet.dec.com (g.d.cooper in the shadowlands) writes:
>In article <1189@sequent.cs.qmc.ac.uk>, jont@cs.qmc.ac.uk (Jon Taylor) writes...
>>In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
>
>>> ... Try explaining to your future employer that you were
>>>taught to 'learn' other languages easily when they are looking
>>>for expert Ada programmers. The first thing they do is send
>>>you to an Ada course anyway.
>
>
>>.... and you will perform very well on this course, because you
>>will have been given a sound theoretical grounding with which
>>to learn the language(s) of your employer's choice.
>
>Yes, but then you wouldn't have been hired in the first place.  The
>problem with relying upon your ability to learn in job acquisition is
>that the future employer doesn't want somebody who will know `plover' two
>days from now but one who has been working with it for years.

Are we talking about fresh-outs, or experienced ee/cs's?  A programmer
with a new bachelor's degree is not presumed to know anything anyway.

Clearly any employer that thinks that a course in a new language is
sufficient preparation for anything more than the most trivial application
is sadly misguided.  You have to use a language over time before you
are proficient enough to use it well. 

Moreover, my experience has been that after a programmer has learned n
languages, the n+1th, unless its *theoretical basis* is new to the programmer,
comes effortlessly and is better learned from the manual and examples of good
code than through classroom training.

Thus, the employer who wants a consultant to work with 'plover' today
needs someone experienced and does not care what courses he took in school.
On the other hand, the poor student who, as was suggested elsewhere, learns
only 'plover' and no theory, is a real candidate for the dreaded technological
obsolescence.

"Give a man a fish and you feed him for one day; teach him to fish and you
feed him for a lifetime."

		Andrew Shaw

[The opinions expressed herein are my own and are not to be construed as 
indicative of those of my employer, which has, no doubt, its own views]

scott@shuksan.UUCP (Scott Moody) (08/25/89)

In article <2633@cbnewsc.ATT.COM>, nevin1@cbnewsc.ATT.COM (nevin.j.liber) writes:
> In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
> >Remember that the first language you are taught in CS101 is also the 
> >main language you use throughout undergraduate education (outside of
> >the many languages course).
> 
> This bothers me.  Why should the first language you learn be your
> primary language throughout college?  Doesn't sound like a well-rounded
> CS education to me!  By using different languages, one learns different
> ways of thinking about programming problems.  Sampling half a


I agree that learning a lot of different languages is best, but for short
quarter courses, learning a new language can be difficult. Remember
that it usually takes at least 1 quarter to learn Pascal, and even then
half the people are not experts yet (barely proficient).
So I think the best compromise is to start with a language that
can be used in some of the later courses. Introduce tasks during
an operating system course, or a simulation course, etc.; CASE tools
for software engineering courses, ...

My experience at UWash was one language for most of the curriculum. Things
might have changed by now.

--scott

paul@batserver.cs.uq.oz (Paul Bailes) (08/25/89)

In article <5823@ficc.uu.net> karl@ficc.uu.net (karl lehenbauer) writes:
>
>For better or for worse, most incoming CS freshmen already know how to program.
>-- 

Just what trivial function do you describe as ``to program''? If you take it to
mean something along the lines of ``tell a computer to do what you want''
(subject to any meaningful performance/correctness constraints), then after
3-4 years of university people still can't!

Paul

wgh@ubbpc.UUCP (William G. Hutchison) (08/25/89)

In article <2633@cbnewsc.ATT.COM>, nevin1@cbnewsc.ATT.COM (nevin.j.liber) writes:
> In article <1501@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
> >Remember that the first language you are taught in CS101 is also the 
> >main language you use throughout undergraduate education (outside of
> >the many languages course).
> 
> This bothers me.  Why should the first language you learn be your
> primary language throughout college?  Doesn't sound like a well-rounded
> CS education to me!  By using different languages, one learns different
> ways of thinking about programming problems.  [ ... ]

 The Whorf hypothesis for natural languages seems to work for artificial
symbol systems as well (don't linguists deny that programming languages
are really languages?).

> >There are a lot of jobs in industry
> >that need Ada programmers and it is still the job of the Universities
> >to teach/prepare their students for the real-world.
> 
> There are also a lot of jobs in industry that need C programmers, C++
> programmers, assembly language programmers, etc.   Should universities
> ignore this?  [ ... ]
> -- 
> NEVIN ":-)" LIBER  AT&T Bell Laboratories  nevin1@ihlpb.ATT.COM  (312) 979-4751

 Universities should not ignore the "real world", just refuse to be corrupted
by it!  I just want to point out that this is not an absolute dichotomy
between the "intellectually pure" and the "applied".

 All one needs to do is to find a "neat" language which is somewhat like
a practical, widely used language.  The introductory language can be the neat
language, then application courses can use the second choice, with some brief
teaching of methods to simulate the "neat" stuff in the grubby language.

 Scheme, as used at MIT, fits the bill as a "neat" language, but it is too
far from widely-used languages (MIT graduates are smart enough to make the
transition themselves, but not all college grads have the time or initiative
to do this).

 Here is a possible pair of languages:  use Eiffel for introductory courses,
because it is designed to be object-oriented and "neat" from the ground up.
Eiffel probably won't be widely used for applications (for political reasons),
so I suggest C++ as the "practical" language.  

 C++ is a reasonable compromise as an object-oriented language. I do not wish to
disparage Bjarne Stroustrup: the hangups in C++ are due to the fact that it
is tied to C (if it weren't an evolutionary path from C, nobody would be using
it).

 Can anybody else suggest other language pairs of this sort?
-- 
Bill Hutchison, DP Consultant	rutgers!cbmvax!burdvax!ubbpc!wgh
Unisys UNIX Portation Center	"Unless you are very rich and very eccentric,
P.O. Box 500, M.S. B121         you will not enjoy the luxury of a computer
Blue Bell, PA 19424		in your own home", Edward Yourdon, 1975.

mwm@eris.berkeley.edu (Mike (I'll think of something yet) Meyer) (08/26/89)

In article <1514@shuksan.UUCP> scott@shuksan.UUCP (Scott Moody) writes:
<I agree that learning a lot of different languages is best, but for short
<quarter courses, learning a new language can be difficult.

In that case, the education of those taking the course has been sadly
neglected (assuming they don't have to pick up a completely new way of
looking at things, of course. This assumption will be unstated but
applicable for the rest of the article.) Knowing what languages are
available for use on a system, what their strengths and weaknesses
are, how to use them effectively, and how to get them to cooperate in
a single program are important parts of being a programmer. Not as
important as algorithm design, but not far from it - because the tools
you have will affect the algorithms you choose.

<Remember
<that it usually takes at least 1 quarter to learn Pascal, and even then
<half the people are not experts yet (barely proficient).

But that includes trying to teach the system the language is being
used on, as well as some basic programming skills. After you've been
through that, you should be able to pick up almost any Pascal-like
language well enough to write code in with a week of classwork and the
reference manual. You won't be an expert, but you'll be able to get
the classwork done.  At least, that's how I had it done to me. The
intro course taught any of Pascal, Algol, Fortran or Cobol. A later
course did a quick overview of lots of languages, concentrating on how
they were different from the others. After that, instructors taught
courses in whatever language they felt like - and if it wasn't covered
in those two courses, they spent a week on it, and had you buy the
reference manual. Some let you write in any language you wanted to that
met certain guidelines (e.g. - has to have structured programming
constructs). Some did both, teaching in a preferred language, but
letting you write what you wanted.

Those advocating a language because that's what industry needs are
making a fatal mistake - they're pushing for technological
obsolescence for the students. If you train somebody for the currently
"hot" language in industry, there's a good chance they'll have trouble
finding a job using it in ten years. Think about what was going on in
languages ten years ago: Pascal was the "hot" language, and employers
were looking for people to write the stuff. Was there an Ada compiler?
How about Modula-2? Those seem to be where the Pascal people have
gone. C was being used a little, but there wasn't much demand for it
in industry. Now, people are moving to C++.

If your goal is to have the highest possible placement rate after
graduation, go ahead and teach a single language. While you're at it,
drop the liberal arts and mathematics requirements, so you can cut the
time to a degree by a couple of years. If your goal is to provide
people with training that will be useful to them throughout a career
as a programmer, teach them multiple languages, and teach them how to
learn new languages.

	<mike
--
The sun is shining slowly.				Mike Meyer
The birds are flying so low.				mwm@berkeley.edu
Honey, you're my one and only.				ucbvax!mwm
So pay me what you owe me.				mwm@ucbjade.BITNET

billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu (William Thomas Wolfe,2847,) (08/27/89)

From article <6907@cognos.UUCP>, by rossj@cognos.UUCP (Ross Judson):
> Bill, this is precisely the kind of reasoning that would turn our
> universities into C shops.  Let's be realistic here.  When students graduate
> and hit the real world, they're going to be working in C, C++, FORTRAN, or
> COBOL.  That covers 99.9% of cases.  Why, then, don't we teach nothing but
> the above at university?

    The above figures are incorrect; it was estimated a year ago that
    the United States Ada industry has an activity of $1.25 billion,
    supports 13,000 jobs, and comprises 3% of the U.S. software market.
    Considering Ada's accelerating growth, it's probably around 5 to 10%
    right now.  

    Now students are not going to hit the real world for four years, so
    the idea is to a) estimate what they will actually be confronted with
    four years from now, b) see to it that their expectations correspond to
    the estimate derived in a), and c) train them well enough to give them 
    a look into the future.  My view is that the introductory course should
    concentrate on b), thus giving them realistic expectations; IMHO, this
    calls for Ada training in anticipation of 1993's conditions.  Later work
    can then survey some of the older languages and the newer research ideas. 

> Incidentally, a truly advanced programming system will encompass both power
> _and_ ease of use.  I do not believe they are mutually exclusive; they are
> merely difficult to achieve, and difficult to define.

    Precisely.

> I say that we spoon-feed.  Give them a taste of the future.  If the rest of
> the world is too archaic or immutable to develop the tools of the future,
> perhaps that will persuade them to perform the work themselves.

    I just think the "taste of the future" should be reserved for AFTER 
    their expectations of the immediate present have been calibrated.


    Bill Wolfe, wtwolfe@hubcap.clemson.edu

peter@ficc.uu.net (Peter da Silva) (08/28/89)

In article <6907@cognos.UUCP>, rossj@cognos.UUCP (Ross Judson) writes:
> [about CS students]  Give them a taste of the future.  If the rest of
> the world is too archaic or immutable to develop the tools of the future,
> perhaps that will persuade them to perform the work themselves.

This can be interpreted as saying that students should be treated as magic
tools for turning parchment into neat software. Now, I'm not saying that
you hold this attitude, but it does seem prevalent in the academic
community.

As I mentioned to someone else in private mail, students are people. Treating
them as purely a resource for change does them a disservice. If this is your
primary criterion for choosing a language to teach students, then I hope
you're not in a position to make that choice.
-- 
Peter da Silva, *NIX support guy @ Ferranti International Controls Corporation.
Biz: peter@ficc.uu.net, +1 713 274 5180. Fun: peter@sugar.hackercorp.com. `-_-'
"Just once I'd like to meet an alien menace that isn't immune to bullets"  'U`
   -- The Brigadier, Dr Who.

scott@shuksan.UUCP (Scott Moody) (08/29/89)

> If your goal is to provide
> people with training that will be useful to them throughout a career
> as a programmer, teach them multiple languages, and teach them how to
> learn new languages.
> 
> 	<mike

Mike, I totally agree with you. I was only trying to point out what it
was like when I was at school (79-82) and how hard it was to teach and learn
the new ideas of CS let alone new languages. So the approach then was
to give them one or two languages and then teach them the fundamentals that
they could apply using those languages.

--scott

db@lfcs.ed.ac.uk (Dave Berry) (08/31/89)

>In article <6264@hubcap.clemson.edu> billwolf%hazel.cs.clemson.edu@hubcap.clemson.edu writes:
>>    Quoting David M. Harland (Concurrency and Programming Languages, 1986):
>>       The history-sensitivity of the imperative languages contrasts
>>       with the total inability of functional systems to handle
>>       interactions with the outside world...

In article <9970@phoenix.Princeton.EDU> markv@phoenix.Princeton.EDU (Mark T Vandewettering) writes:
>	I disagree totally, as do many others.  In order to make
>	parallelism work, it is REQUIRED to be implicit.  Consider the 

I believe that languages based on CCS will be the natural development of
functional languages to include explicit parallelism and communication with
the outside world.  Like functional languages, they may be reasoned about
straightforwardly; the languages are calculi.  They don't use assignment, not
even single-assignment; state is stored as arguments to processes just as state
in sequential languages can be stored as arguments to functions.
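
The "state as arguments" idea in miniature (a Haskell sketch, not
CCS; the command vocabulary is invented):

    -- A counter whose state is threaded as an argument: there is no
    -- assignment, only fresh calls carrying the new state.
    counter :: Int -> [String] -> [Int]
    counter _ []           = []
    counter n ("inc" : cs) = (n + 1) : counter (n + 1) cs
    counter n (_ : cs)     = n : counter n cs

    main :: IO ()
    main = print (counter 0 ["inc", "inc", "read", "inc"])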

Of course programmers will face the same problems of deadlock and scheduling
using these languages as using any other; these problems are part of the
task rather than of the language.

Reference: Robin Milner, "Communication and Concurrency", Prentice Hall,
1989.  ISBN 0-13-114984-9; paperback 0-13-115007-3, 19 pounds.


>	The one caveat is the relative lack of a good, public domain
>	(or nearly so) implementation of a language like Miranda.

Poly/ML is fairly cheap to academic sites.  The licensing arrangements
are being revised at the moment, but I assume that this policy will continue.
New Jersey ML will probably be available cheap for academic sites as well.
Edinburgh University distributes a core-language version of ML (i.e. no
modules) for teaching; this is available for the cost of distribution.
I think Hope+ is available fairly cheaply from Imperial College London.

ML uses eager evaluation, and Hope+ uses applicative-order evaluation with lazy
datatypes, so if you want full lazy evaluation these languages aren't suitable.
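
For readers who haven't met it, full lazy evaluation in a few lines
(sketched in Haskell, a lazy language in the Miranda family):

    -- A conceptually infinite list is fine under lazy evaluation;
    -- an eager language must make the delay explicit.
    powersOfTwo :: [Integer]
    powersOfTwo = iterate (* 2) 1

    main :: IO ()
    main = print (take 5 powersOfTwo)   -- prints [1,2,4,8,16]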

Dave Berry, Laboratory for Foundations      db%lfcs.ed.ac.uk@nsfnet-relay.ac.uk
    of Computer Science, Edinburgh Uni.	    <Atlantic Ocean>!mcvax!ukc!lfcs!db

   "Another hope, another dream, another truth, installed by the machine."

karl@ficc.uu.net (Karl Lehenbauer) (09/07/89)

I wrote:
>For better or for worse, most incoming CS freshmen already know how to program.

In article <1381@batserver.cs.uq.oz>, paul@batserver.cs.uq.oz (Paul Bailes) writes:
> Just what trivial function do you describe as ``to program''? If you take it to
> mean something along the lines of ``tell a computer to do what you want''
> (subject to any meaningful performance/correctness constraints), then after
> 3-4 years of university people still can't!

I was merely commenting that almost everyone coming into a college-level intro 
CS class has had some experience writing code.  My point is that new students 
are not coming into CS as blank slates, not that they have achieved some
specific competence level, particularly not one so high that, as you assert,
most grads can't meet it.
-- 
-- uunet!ficc!karl	"Have you debugged your wolf today?"